David Abramson
Professor of Computer Science
Faculty of Information Technology, Monash University, Clayton VIC 3800, Australia
Robust Science and Engineering using Parametric Computing on the Computational Grid
Grid computing offers the potential to execute very large computational experiments by distributing the work across remote resources. This allows users to experiment with different design options, exploring parameter space more fully than ever before.
Over the past several years, we have developed a family of tools called Nimrod. Nimrod uses a simple declarative parametric modeling language to express a parametric experiment, and provides machinery that automates the tasks of formulating, running, monitoring, and collating the results of the many individual experiments. Equally important, the Nimrod tools incorporate a distributed scheduling component that can assign individual experiments to idle computers in a local area network, or to remote high-end resources in the Grid. Together, these features mean that even complex parametric experiments can be defined and run with little programmer effort; in many cases a new experiment can be established in minutes.
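The parametric experiments Nimrod automates can be pictured as a sweep over the Cartesian product of parameter ranges, with one job per combination. The following is a minimal Python sketch of that idea; the parameter names are hypothetical and this is not Nimrod's actual plan-file syntax:

```python
from itertools import product

# Hypothetical design parameters for a sweep (illustrative only):
# each experiment is one point in the Cartesian product of the ranges.
parameters = {
    "pressure": [1.0, 2.0, 4.0],
    "temperature": [300, 350],
}

def enumerate_jobs(params):
    """Yield one dict of parameter assignments per individual experiment."""
    names = list(params)
    for values in product(*(params[n] for n in names)):
        yield dict(zip(names, values))

jobs = list(enumerate_jobs(parameters))
print(len(jobs))  # 3 x 2 = 6 individual experiments
```

In a system like Nimrod, each such assignment would be substituted into a job template and dispatched to an idle machine by the scheduler; the sketch above only shows the enumeration step.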
In this seminar I will describe the Nimrod family of tools and show how they have been applied to a range of application areas, including Bioinformatics, Climate studies, Quantum Chemistry, Drug Design, Network Simulation, Electronic CAD, Ecological Modelling and Business Process Simulation.
Peter Arzberger
University of California, San Diego, USA
Building international e-communities in cyberinfrastructure, research, and education
In 2002, several institutions around the Pacific Rim founded the Pacific Rim Application and Grid Middleware Assembly (PRAGMA, http://www.pragma-grid.net). The group focuses on advancing grid technology through team building and by running applications on the emerging international cyberinfrastructure. Since then, PRAGMA has grown and gained experience in running applications on its heterogeneous testbed, and it has spun off several other projects.
We will introduce PRAGMA and its accomplishments, and describe other activities, including the Global Lake Ecological Observatory Network (GLEON, http://www.gleon.org), a grassroots network of limnologists and information technology experts with the common goal of building a scalable, persistent network of lake ecology observatories; Pacific Rim Experiences for Undergraduates (PRIME, http://prime.ucsd.edu), an international research apprenticeship for undergraduates; and the Pacific Rim International UniverSities (PRIUS, http://prius.ics.es.osaka-u.ac.jp/en/index.html), an international enhancement of graduate education in cyberinfrastructure. Some of the collaborative technologies being integrated and disseminated by the OptIPuter (http://www.optiputer.net) will also be discussed, in particular the use of high-definition visual and audio infrastructure for interactions. In addition, we will discuss technologies developed by the National Biomedical Computation Resource (http://nbcr.net), with a focus on international collaborations.
Key to the successes of these projects is the focus on collaborative interactions and the emergence of e-science communities.
Dieter Kranzlmuller
Johannes Kepler University of Linz, Austria
Interactive Grids - The GVid/GVK Approach for Scientific Visualization
The power of grids has been an enabling factor for scientific achievements in many application domains. Grids are accepted as a research utility, and today, production grid infrastructures offer pervasive grid services to users. Yet most grid applications use a batch-oriented approach, which is well suited to parameter studies but less suitable for many other application types. Building on today's grids, the glogin tool provides interactive access to grid nodes as well as transport of data via dedicated tunnels. With this service, advanced interactive applications can be built on grids. The GVid/GVK approach demonstrates this by providing sophisticated visualization capabilities to the user: with GVK, users can visualize and manipulate data while it is being produced on the grid. As a consequence, more and more application domains can exploit the power of the grid.
Cong Duc Pham
University of Pau, France
Communication networks in the next decade
In the past ten years, communication networks have benefited from three major technological revolutions. The first came from the remarkable development of optical fiber technologies (wavelength multiplexing, for example), which allowed the deployment of very high-speed core networks (vHSN): the so-called information highways. With operational optical fibers capable of transferring data at rates of tens of Gbit/s, a very active research community rapidly built vHSN infrastructures such as the early vBNS and Abilene networks in the US, the CA*NET project in Canada and, more recently, the VTHD project in France and the GEANT infrastructure in the EU. These vHSN have brought to reality a large variety of bandwidth-consuming and interactive applications, such as high-performance and grid computing and telemedicine. The second revolution happened more recently and is complementary to the first: home access to operators' and service providers' networks has seen the massive uptake of cheap DSL technologies, which boosted broadband access to the Internet over the widely deployed copper telephone line, commonly called the local loop or the last mile. By allowing data rates of several Mbit/s to the home, this second revolution has reached, and is still reaching, millions of end users, creating a whole new area for residential broadband services such as broadband Internet, interactive games, IP telephony, media-on-demand, and peer-to-peer applications, to name a few. Finally, the last, ongoing revolution is in the domain of wireless and mobile communications. Few technologies have had a more profound effect on people's lives than wireless communications, which offer the unique feature of completely transparent communication, seamlessly integrating computers and electronic devices into the human world.
It is envisioned that the number and variety of communicating devices will increase rapidly over the coming years. With both broadband Internet and wireless communication capabilities embedded in an ever larger number and variety of everyday devices, interactions, leading to some form of cooperation, between all these devices will become necessary and increasingly complex. As the Internet experience has shown, where millions of interconnected computers enable complex applications such as grid computing, on-line interactive gaming, and personal video-conferencing, a large variety of communicating devices able to send and receive digital information of all forms, from every place to everywhere in the world, can lead to unexpected and innovative cooperative applications. This talk will review these technological revolutions and present some of the challenges associated with very high-speed networks and small communicating devices such as wireless sensors. It will then consider how sustainable the development of these communication networks is.
Gerhard Reinelt
Institute for Computer Science, University of Heidelberg, Germany
Applications of Combinatorial Optimization in Biology
The area of computational biology generates many interesting combinatorial optimization problems. In this talk we will survey some models and results for the physical mapping of chromosomes, the determination of clusters in metabolic networks, and the computation of shortest paths and minimum cuts in transition networks describing the dynamics of biomolecules.
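As one concrete instance of the problems mentioned, shortest paths in a weighted transition network can be computed with Dijkstra's algorithm. The following is a minimal sketch on a toy graph; the states and weights here are illustrative, not biological data:

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source in a weighted digraph
    given as {node: [(neighbour, weight), ...]}."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy transition network: states A..D, edge weights as transition costs
g = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 6)], "C": [("D", 3)]}
print(dijkstra(g, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 6}
```

In the transition networks of the talk the nodes would be conformational states of a biomolecule and the weights would encode transition costs; the algorithm itself is unchanged.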
Simon See
APSTC, Sun Microsystems and Nanyang Technological University, Singapore
Efficient High-Performance Petascale Systems for Scientific and Engineering Computation
Recently, demand for large-scale (petascale) computing machines from the scientific and engineering community has increased apace. In order to remain economically competitive, the United States, Japan, and the European Union have been, and are, investing billions of dollars to build machines and facilities to feed the appetite of researchers. Although Moore's law for microprocessors still holds, we are seeing a switch to parallel microprocessors, a milestone in the history of computing: the industry (e.g. Sun, IBM, Intel, and AMD) has laid out roadmaps for multicore designs. Current programming approaches are likely to yield diminishing returns as large numbers of cores are realised, and interconnect bandwidth, power consumption, and floor space become important design factors. In this talk, the author will discuss some of these issues and possible approaches to solving them.
Satoshi Sekiguchi
Director, Grid Technology Research Center
National Institute of Advanced Industrial Science and Technology, Japan
Grid Data Center for Utility Computing in the Marketplace
With significant national and international grid projects now complete, grid technologies are increasingly used in day-to-day business. It is now foreseeable that sustainable infrastructure services for IT resources such as servers, storage, networking, and web service platforms can be provided. The Grid Data Center is a business model, enabled by advanced grid technologies, that is a natural extension of the Internet Data Center and would accelerate the delivery of real utility computing for innovative new businesses.