Scientists Reveal at SC11 Conference Advances in 100 Gbps Networks Needed for Next-Generation Research and Discovery
- Written by: Cat
- Category: ACADEMIA
Today at the 24th annual SC Conference (SC11) -- the foremost international conference on high performance computing -- researchers demonstrated advances in 100 Gbps networks, within the U.S. and internationally, that are necessary for the next generation of research and discovery.
“Next generation science increasingly requires investigations based on extremely large volumes of data that must be transported across wide distances with exceptionally high performance,” said Bill Fink, advanced technology researcher at the NASA Goddard Space Flight Center. “For example, NASA is developing a next generation network platform to support a wide range of strategic research projects including in the areas of advanced networking, climate science, earth science and astrophysics.”
Many new high-performance, data-intensive research investigations, collectively called petascale science, will be applied to a growing range of discovery domains, including weather and climate simulation, nuclear simulation, cosmology, quantum chemistry, lower-organism brain simulation, and fusion science. Current networks provisioned at 10 Gbps do not provide sufficient rate, timeliness, and volume performance for many emerging applications. Petascale science involves not only the creation of massive datasets at supercomputing, instrumentation, and experimental facilities, but also the subsequent analysis of that data by a user community that may be distributed across many laboratories and universities in the U.S. and around the world. Exceptionally time-efficient wide-area data flows are therefore a persistent requirement across many advanced research disciplines. These projects are developing techniques to optimize WAN file transfer at 100 Gbps, in part by designing data transfer utilities, protocols, and techniques that enable extremely high sustained end-to-end flows, both disk-to-disk and memory-to-memory.
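The memory-to-memory flows described above are typically benchmarked with network testing tools such as nuttcp or iperf. As a rough illustration of what such a measurement does, the following is a minimal loopback throughput probe in Python; the port number, payload size, and all function names are illustrative, not taken from the article or from any of the tools it describes.

```python
# Minimal memory-to-memory TCP throughput probe: a toy analogue of the
# sustained end-to-end flow measurements discussed in the article.
# All parameters (port, payload size) are illustrative assumptions.
import socket
import threading
import time


def run_sink(port, ready):
    """Accept one connection, receive bytes as fast as possible, count them."""
    total = 0
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        ready.set()                          # signal the sender we are listening
        conn, _ = srv.accept()
        with conn:
            while True:
                chunk = conn.recv(1 << 20)   # 1 MiB reads
                if not chunk:
                    break
                total += len(chunk)
    return total


def measure_throughput(port=54873, payload_mib=64):
    """Stream payload_mib MiB of in-memory data over loopback; return (bytes, Gbps)."""
    ready = threading.Event()
    result = {}

    def sink():
        result["bytes"] = run_sink(port, ready)

    t = threading.Thread(target=sink)
    t.start()
    ready.wait()
    buf = bytes(1 << 20)                     # 1 MiB zero-filled buffer, reused
    start = time.perf_counter()
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect(("127.0.0.1", port))
        for _ in range(payload_mib):
            cli.sendall(buf)
    t.join()
    elapsed = time.perf_counter() - start
    gbps = result["bytes"] * 8 / elapsed / 1e9
    return result["bytes"], gbps


if __name__ == "__main__":
    sent, gbps = measure_throughput()
    print(f"moved {sent} bytes at {gbps:.2f} Gbps (loopback)")
```

A loopback run like this measures only the host's memory and TCP stack, which is exactly why memory-to-memory tests matter: they establish the ceiling that disk-to-disk transfers over a real 100 Gbps WAN can then be compared against.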
The NASA Center for Climate Simulation is also using high-performance computing to move hundreds of terabytes of high-resolution climate forecasts from its Goddard Institute for Space Studies and Global Modeling and Assimilation Office groups. These forecasts will be major contributors to the next United Nations Intergovernmental Panel on Climate Change Assessment Report, to be published in 2013.
NASA has also established a partnership with the International Center for Advanced Internet Research at Northwestern University (iCAIR) and the Laboratory for Advanced Computing at the University of Chicago (LAC) to investigate novel architectures, technologies (including new protocols), and techniques for data-intensive scientific investigation based on 100 Gbps capabilities. As part of this research, iCAIR and LAC are conducting experimental investigations using novel cloud technology for data-intensive science on a national Open Science Data Cloud testbed, supported by the Open Cloud Consortium.
NASA has had an ongoing collaborative relationship with the Mid-Atlantic Crossroads (MAX) for over 10 years, experimenting with and implementing new network technologies. MAX also provides NASA Goddard Space Flight Center's (GSFC) High End Computer Networking (HECN) group with connectivity to high-speed research and education networks.