
Tuesday, April 12, 2011

Ruth Pordes Keynote: U.S. Shared Cyberinfrastructure

Ruth Pordes presented Tuesday’s keynote address to a packed auditorium.

Since much of her career has been dedicated to leadership roles with U.S. Department of Energy (DOE) and U.S. National Science Foundation (NSF) technology projects, Pordes is well versed in the topic of U.S. Shared Cyberinfrastructure (CI). Her current affiliations include: associate head of Fermi National Accelerator Laboratory's Computing Division for Grids and Outreach; executive director of the Open Science Grid (OSG); member of the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider; and adviser to the European Grid Infrastructure (EGI)-InSPIRE and U.S. Network for Earthquake Engineering Simulation (NEES) projects.

Pordes takes her responsibility as a steward of the U.S. national investment in CI very seriously. She stressed the importance of assuring legislators and taxpayers that their programs are well managed and that continued funding should remain a national priority.

In addition to providing an overview of the technology offered by U.S. government agencies, Pordes described a national commitment to the continued funding of petascale machines, with a major focus on research toward exascale computing. There is widespread enthusiasm and support for testing and using commercial cloud technologies. Additionally, she anticipates the continued presence of university clusters for specific research goals and local sharing.

Pordes stressed the value of the Campus Champions model, developed by TeraGrid and embraced by the OSG. The program offers training and support to select faculty and staff who, in turn, share this expertise with their communities, offering first-time users a knowledgeable and familiar face to facilitate the use of advanced technologies available locally, regionally, nationally, and globally. Successful use increases user confidence while garnering buy-in from campus administrators. There are currently 94 champions from 75 U.S. university campuses, agencies, and industry affiliates. Solidarity among them fosters inter-institutional collaboration and the establishment of innovative virtual organizations. Champions serve as a conduit for feedback from the distributed and evolving user community, and the model may relieve competition for resources in an oversubscribed culture by satisfying some of the demand at the local level.

OSG has had a long history of engagement with campuses through the more than 80 sites that are part of the OSG infrastructure. The project continues to reach out to new communities among an estimated 300 CI-ready U.S. campuses. Other U.S. activities include an EDUCAUSE report from campus CIOs aimed at influencing provosts.

Pordes talked about the role of science gateways and communities in facilitating access to discipline-specific resources. In the third quarter of 2010, 35% of all TeraGrid users submitted jobs via a science gateway, and there are currently 34 gateways serving a wide array of research communities. Pordes described the iPlant Collaborative as an example: a multidisciplinary community of scientists, teachers, and students who work together, via a shared pool of resources, to advance the understanding of plant science by applying computational thinking and approaches to grand-challenge problems in plant biology.

Bioenergy research is a priority for the U.S., and a shared CI has been established for the community. DOE has funded three bioenergy centers over three years: the BioEnergy Science Center at Oak Ridge National Laboratory, the Great Lakes Bioenergy Research Center, and the Joint BioEnergy Institute at Lawrence Berkeley National Laboratory. Additional communities of interest include: the Lattice Quantum Chromodynamics Consortium; the Biomedical Informatics Research Network (BIRN); the Earth System Grid Federation (ESGF), which includes a federated database for a global collection of climate data; Advanced LIGO, which extends the physics reach of LIGO; and the Network for Earthquake Engineering Simulation (NEES). NEES gathers measurements from widely distributed seismic sensors to determine the impact of earthquakes; its data-intensive applications involve the manipulation, analysis, and storage of multiple file formats, which presents both analysis and storage challenges.

Pordes described the collaboration between OSG and TeraGrid, and OSG's official role as a provider of high-throughput computing resources in the next phase of NSF-funded CI, known as "XD," which will supplant TeraGrid on July 1, 2011. In the spirit of NSF's Cyberinfrastructure Framework for 21st Century Science (CF21) program, TeraGrid and OSG jointly subscribe to the principles of federated service and ubiquitous access. She explained the differences between OSG and TeraGrid in hardware offerings, user communities, and access, particularly the allocation process: OSG does not have an allocation process, while requests for TeraGrid time are peer-reviewed on a quarterly basis. She speculated that a unified method may emerge as the collaboration matures. In 2010, NSF funded ExTENCI, a grant to facilitate the development of tools and the deployment of OSG/TeraGrid bridging technologies such as Lustre-WAN, virtual machines, and science clouds.

OSG has a long history of interoperation with EGI and its predecessor, EGEE (phases I-III). She anticipates continued interoperation with EGI operations, software, and policy services, as well as with the European Middleware Initiative (EMI) through support for future gLite and ARC releases, which will increase usability across OSG and XD alike. She identified future challenges as OSG and TeraGrid evolve, specifically the support of UNICORE, Genesis II, and Nimbus. She emphasized the advantages of a common software stack, including further deployment of OSG's Virtual Data Toolkit (VDT), which will provide campuses with access to clouds, facilitate inter- and intra-campus shared CI, and make it possible to leverage Fedora's EPEL repository for software packaging and distribution. OSG and TeraGrid remain vigilant to ensure that software offerings keep pace as needs change.

One of OSG's current focus areas is expanded support for campus-wide shared CI, including integration with local campus identity management services so that grid certificates are no longer required. An integrated software package that does not require root privileges will be easy for a lightly trained campus liaison to install. There will be more coordinated education, training, and documentation efforts, along with a campus job submission process capable of routing jobs to multiple batch schedulers or directly to on-campus resources. OSG already has a prototype implementation, using Condor glidein and flocking capabilities, on six campuses.
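For readers unfamiliar with the submission model behind that prototype, here is a minimal sketch of what a campus submit point might look like from the user's side, written against HTCondor's present-day Python bindings (which postdate this prototype; the script and file names are hypothetical placeholders). In a flocking setup, jobs the local pool cannot match are forwarded automatically to the remote pools named in the site's Condor configuration, so the submit step itself is unchanged.

```python
# Minimal sketch: queue a job at the local Condor scheduler (the campus
# submit point) via the HTCondor Python bindings. With flocking enabled
# in the site configuration, jobs that find no local match are forwarded
# to remote pools without any change on the submit side.
import htcondor

schedd = htcondor.Schedd()  # connects to the local schedd

# Job description; keys mirror classic condor_submit syntax.
# "analyze.sh" and "input.dat" are hypothetical placeholders.
job = htcondor.Submit({
    "executable": "analyze.sh",
    "arguments": "input.dat",
    "output": "job.out",
    "error": "job.err",
    "log": "job.log",
    "request_cpus": "1",
    "request_memory": "1GB",
})

with schedd.transaction() as txn:
    cluster_id = job.queue(txn)  # returns the new cluster ID

print(f"Submitted job cluster {cluster_id}")
```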

Pordes described how the NSF is rising to meet the data challenge. Since January of this year, all new projects have been required to provide a data management plan. The DataNet program and its projects, DataONE and the Data Conservancy, are at the beginning of their lifecycles and address the needs of both specific and broad scientific domains. Digital libraries are becoming more common on the national scene, and among providers of national infrastructure there is interest in developing a federated data archive for curation and access.

There is continued investment by DOE in leadership-class machines at Argonne and Oak Ridge national laboratories. Multipurpose HPC is available at Lawrence Berkeley National Laboratory (NERSC). Single-purpose resources are available at DOE's Brookhaven National Laboratory, Fermilab, Stanford Linear Accelerator Center (SLAC), Jefferson Laboratory (JLAB), and the National Renewable Energy Laboratory (NREL). Across all labs there is interest and activity in the use of GPUs for high-end computation.

There is U.S. participation in an international effort to develop exascale software (exascale.org). The organization is concerned with optimized processors, compilers, tools, data and workflow management, application adaptation, algorithms, performance, and fault tolerance. Also considered is exascale network preparation, including the integration of high-performance (100-1,000 gigabit) network capabilities into drivers, operating systems, middleware, and applications over the next few years.

--------------------------------------------------------

As external relations coordinator for the U.S. National Science Foundation's (NSF) TeraGrid, one of the most comprehensive and diverse CIs for open scientific research in the world, I thought I had a better-than-average understanding of the technology available to the U.S. research community. In less than one hour this morning, Pordes made me realize how much more I have to learn!

Elizabeth Leake, TeraGrid External Relations
