Award Abstract # 1445606
Bridges: From Communities and Data to Workflows and Insight

NSF Org: OAC
Office of Advanced Cyberinfrastructure (OAC)
Recipient: CARNEGIE MELLON UNIVERSITY
Initial Amendment Date: November 20, 2014
Latest Amendment Date: October 27, 2020
Award Number: 1445606
Award Instrument: Cooperative Agreement
Program Manager: Robert Chadduck
rchadduc@nsf.gov
(703)292-2247
OAC - Office of Advanced Cyberinfrastructure (OAC)
CSE - Directorate for Computer and Information Science and Engineering
Start Date: December 1, 2014
End Date: November 30, 2021 (Estimated)
Total Intended Award Amount: $9,650,000.00
Total Awarded Amount to Date: $20,895,168.00
Funds Obligated to Date: FY 2015 = $9,650,000.00
FY 2016 = $7,559,167.00
FY 2018 = $3,686,000.00
History of Investigator:
  • Shawn Brown (Principal Investigator)
    stbrown@psc.edu
  • John Urbanic (Co-Principal Investigator)
  • Paola Buitrago (Co-Principal Investigator)
  • Nicholas Nystrom (Former Principal Investigator)
  • Michael Levine (Former Co-Principal Investigator)
  • Ralph Roskies (Former Co-Principal Investigator)
  • Joseph Scott (Former Co-Principal Investigator)
  • Jason Sommerfield (Former Co-Principal Investigator)
Recipient Sponsored Research Office: Carnegie-Mellon University
5000 FORBES AVE
PITTSBURGH
PA  US  15213-3890
(412)268-8746
Sponsor Congressional District: 12
Primary Place of Performance: Carnegie-Mellon University
PA  US  15213-3815
Primary Place of Performance Congressional District: 12
Unique Entity Identifier (UEI): U3NKNFLNQ613
Parent UEI: U3NKNFLNQ613
NSF Program(s): XD-Extreme Digital,
Innovative HPC,
Data Cyberinfrastructure
Primary Program Source: 01001516DB NSF RESEARCH & RELATED ACTIVITIES
01001617DB NSF RESEARCH & RELATED ACTIVITIES
01001819DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 7433
Program Element Code(s): 747600, 761900, 772600
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

1. Abstract: Nontechnical Description

The Pittsburgh Supercomputing Center (PSC) will provide an innovative high-performance computing (HPC) and data-analytic system, Bridges, which will integrate advanced memory technologies to empower new communities, bring desktop convenience to HPC, connect to campuses, and intuitively express data-intensive workflows.

To meet the requirements of nontraditional HPC communities, Bridges will emphasize memory, usability, and effective data management, leveraging innovative new technologies to transparently benefit applications and lower the barrier to entry.
Three tiers of processing nodes with shared memory ranging from 128GB to 12TB will address an extremely broad range of user needs including interactivity, workflows, long-running jobs, virtualization, and high capacity. Flexible node allocation will enable interactive use for debugging, analytics, and visualization. Bridges will also include a shared flash memory device to accelerate Hadoop and databases.

Bridges will host a variety of popular gateways and portals through which users will easily be able to access its resources. Its many nodes will allow long-running jobs, flexible access to interactive use (for example, for debugging, analytics, and visualization), and access to nodes with more memory. Bridges will host a broad spectrum of application software, and its familiar operating system and programming environment will support high-productivity programming languages and development tools.

Bridges will address data management at all levels. Its shared Project File System, connected to processing nodes by a very capable, appropriately scaled fabric, will provide high-bandwidth, low-latency access to large datasets. Storage on each node will provide local filesystem space that is frequently requested by users and will prevent congestion to the shared filesystem. A set of nodes will be optimized for and dedicated to running databases to support gateways, workflows, and applications. Dedicated web server nodes will enable distributed workflows.

Bridges will introduce powerful new CPUs and GPUs, and a new interconnection fabric to connect them. These new technologies will be supported by an extremely broad set of applications, libraries, and easy-to-use programming languages and tools.
Bridges will interoperate with and complement other NSF Advanced Cyberinfrastructure resources and large scientific instruments such as telescopes and high-throughput genome sequencers, and it will provide convenient bridging to campuses.

Bridges will enable important advances for science and society. By supporting pioneers who set examples in fields that have not traditionally used HPC, and by lowering the barrier to entry, this project will spur others to recognize the power of the technology and transform their fields, as has happened in traditional HPC fields such as physics and chemistry. The project will engage students in research and systems internships, develop and offer training to novices and experts, extend the impact of the new system to minority-serving institutions and EPSCoR states, impact the undergraduate and graduate curriculum at many universities, raise the level of computational awareness at four-year colleges, and support the introduction of computational thinking into high schools.

2. Abstract: Technical Description

The Pittsburgh Supercomputing Center will substantially increase the scientific output of a large community of scientific and engineering researchers who have not traditionally used high-performance computing (HPC) resources. This will be accomplished by the acquisition, deployment, and management of Bridges, an HPC system designed for extreme flexibility, functionality, and usability. Bridges will be supported by operations, user services, and networking staff attuned to the needs of these new user communities, and it will offer a wide range of software appropriate for nontraditional HPC research communities. Users will be able to access Bridges through a variety of popular gateways and portals, and the system will provide development tools for gateway building.

Innovative capabilities to be introduced by Bridges are:

1. Three tiers of processing nodes will offer 128GB, 3TB, and 12TB of hardware-supported, coherent shared memory per node to address an extremely broad range of user needs including interactivity, workflows, long-running jobs, virtualization, and high capacity. The 12TB nodes, featuring a proprietary, high-bandwidth internal communication fabric, will be particularly valuable for genome sequence assembly, graph analytics, and machine learning. Bridges will also include a shared flash memory device to accelerate Hadoop and databases. Flexible node allocation will enable interactive use for debugging, analytics, and visualization.

2. Bridges will provide integrated, full-time relational and NoSQL databases to support metadata, data management and efficient organization, gateways, and workflows. Database nodes will include SSDs for high IOPS and will be allocated through an extension to the XRAC process. Dedicated web server nodes with high-bandwidth connections to the national cyberinfrastructure will enable distributed workflows. The system topology will provide balanced bandwidth for nontraditional HPC workloads and data-intensive computing.

3. Bridges will introduce powerful new CPUs (Intel Haswell and Broadwell), GPUs (NVIDIA GK210 and GP100), and the innovative, high-performance Intel Omni Scale Fabric to support increasingly productive development of advanced applications. These technologies will be complemented by an extremely broad set of applications, libraries, and easy-to-use programming languages and tools such as OpenACC, parallel MATLAB, Python, and R (a brief illustrative OpenACC sketch follows this list).

4. A shared Project File System (PFS) will provide high-bandwidth, low-latency access to large datasets. Each node will also provide distributed, high-performance storage to support many emerging applications, hold intermediate and temporary data, and reduce congestion on the shared PFS.
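
To illustrate the directive-based programming model named in item 3, the following is a minimal OpenACC sketch in C of the kind a user might compile for the system's GPUs. It is an illustrative example only, not code from the Bridges project; the function name, array size, and the compiler invocation shown in the comment are assumptions.

    /* Minimal OpenACC sketch: offload a vector update (saxpy) to a GPU.
     * Illustrative only; not taken from the Bridges project.
     * Example build, assuming an OpenACC-capable compiler such as PGI:
     *   pgcc -acc saxpy.c -o saxpy
     */
    #include <stdio.h>
    #include <stdlib.h>

    /* y = a*x + y, with the loop offloaded via an OpenACC directive */
    static void saxpy(int n, float a, const float *x, float *y)
    {
        #pragma acc parallel loop copyin(x[0:n]) copy(y[0:n])
        for (int i = 0; i < n; ++i)
            y[i] = a * x[i] + y[i];
    }

    int main(void)
    {
        const int n = 1 << 20;             /* ~1M elements; arbitrary size */
        float *x = malloc(n * sizeof *x);
        float *y = malloc(n * sizeof *y);

        for (int i = 0; i < n; ++i) {
            x[i] = 1.0f;                   /* simple test values */
            y[i] = 2.0f;
        }

        saxpy(n, 3.0f, x, y);              /* each y[i] becomes 5.0 */
        printf("y[0] = %f\n", y[0]);

        free(x);
        free(y);
        return 0;
    }

The same loop runs unchanged on a CPU if the directive is ignored; the pragma only annotates where parallel execution and data movement are safe, which is what makes the directive-based approach attractive to nontraditional HPC users.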

Bridges will enable important advances for science and society. By supporting pioneers who set examples in fields that have not traditionally used HPC, and by lowering the barrier to entry, this project will spur others to recognize the power of the technology and transform their fields, as has happened in traditional HPC fields such as physics and chemistry. The project will engage students in research and systems internships, develop and offer training to novices and experts, extend the impact of the new system to minority-serving institutions and EPSCoR states, impact the undergraduate and graduate curriculum at many universities, raise the level of computational awareness at four-year colleges, and support the introduction of computational thinking into high schools.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Brown, N. and Sandholm, T. "Superhuman AI for heads-up no-limit poker: Libratus beats top professionals" Science, 2017. 10.1126/science.aao1733
Brown, Noam; Sandholm, Tuomas "Safe and Nested Subgame Solving for Imperfect-Information Games" Neural Information Processing Systems (NIPS-17): Best Paper, 2017. http://arXiv.org/abs/1705.02955
Dong, Liang; Kumar, Hemant; Anasori, Babak; Gogotsi, Yury; Shenoy, Vivek B. "Rational Design of Two-Dimensional Metallic and Semiconducting Spintronic Materials Based on Ordered Double-Transition-Metal MXenes" The Journal of Physical Chemistry Letters, v.8, 2017, p.422. 10.1021/acs.jpclett.6b02751
Hoogerheide, David P.; Noskov, Sergei Y.; Jacobs, Daniel; Bergdoll, Lucie; Silin, Vitalii; Worcester, David L.; Abramson, Jeff; Nanda, Hirsh; Rostovtseva, Tatiana K.; Bezrukov, Sergey M. "Structural features and lipid binding domain of tubulin on biomimetic mitochondrial membranes" Proceedings of the National Academy of Sciences, v.114, 2017, p.E3622. 10.1073/pnas.1619806114
Hwang, Hyea; McCaslin, Tyler G.; Hazel, Anthony; Pagba, Cynthia V.; Nevin, Christina M.; Pavlova, Anna; Barry, Bridgette A.; Gumbart, James C. "Redox-Driven Conformational Dynamics in a Photosystem-II-Inspired β-Hairpin Maquette Determined through Spectroscopy and Simulation" The Journal of Physical Chemistry B, v.121, 2017, p.3536. 10.1021/acs.jpcb.6b09481
Ishtar Nyawira; Kristi Bushman "A Simplified Approach to Deep Learning for Image Segmentation" PEARC '18: Proceedings of the Practice and Experience on Advanced Research Computing, 2018. 10.1145/3219104.3229286
Ishtar Nyawira; Kristi Bushman; Iris Qian; Annie Zhang "Understanding Neural Pathways in Zebrafish through Deep Learning and High Resolution Electron Microscope Data" PEARC '18: Proceedings of the Practice and Experience on Advanced Research Computing, 2018. 10.1145/3219104.3229285
Jingyang Wang; Binit Lukose; Michael O. Thompson; Paulette Clancy "Ab initio modeling of vacancies, antisites, and Si dopants in ordered InGaAs" Journal of Applied Physics, v.121, 2017, p.045106. 10.1063/1.4974949
Nystrom, N. A.; Levine, M. J.; Roskies, R. Z.; Scott, J. R. "Bridges: A Uniquely Flexible HPC Resource for New Communities and Data Analytics" In Proceedings of the 2015 Annual Conference on Extreme Science and Engineering Discovery Environment (St. Louis, MO, July 26-30, 2015), XSEDE15. ACM, New York, NY, USA, 2015. http://dx.doi.org/10.1145/2792745.2792775
Paola A. Buitrago, Nicholas A. Nystrom, Rajarsi Gupta, and Joel Saltz "Delivering Scalable Deep Learning to Research with Bridges-AI" High Performance Computing. CARLA 2019. Communications in Computer and Information Science , v.1087 , 2019 10.1007/978-3-030-41005-6_14
Solomon, Brad; Kingsford, Carl "Fast search of thousands of short-read sequencing experiments" Nature Biotechnology, v.34, 2017, p.300. 10.1038/nbt.3442

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

The Bridges research computing platform provided an innovative production computing resource that converged high-performance computing (HPC), artificial intelligence (AI), and Big Data to empower new research communities, bring desktop convenience to high-end computing, expand remote access, and help researchers work more intuitively. Bridges provided heterogeneous computing capabilities that supported complex workflows and disparate computing types under a common framework, so researchers could perform highly complex and challenging computing and data analytics. With its unique blend of capacity, high-memory, and GPU-based computing, Bridges was capable of supporting virtually all levels of traditional computing while bringing new user communities into the NSF research computing ecosystem.

The impact of Bridges on the science and engineering community has been vast: over its lifetime it supported over 2,700 projects and more than 26,000 users and produced over 1,500 peer-reviewed publications, with 40-50% of these projects serving novel communities and applications that were not traditionally users of high-performance computing. Over 265 grants and 10 million CPU-hours were awarded for use of the resource in university and workshop curricula. Bridges has had broad impact, providing resources to several novel research computing communities including genomics, epidemiology, neuroscience, and artificial intelligence.

With the addition of Bridges-AI, the platform brought state-of-the-art artificial intelligence capabilities to the academic community and paved the way for several future research architectures. Web-based science gateways and portals, such as those developed in the HuBMAP project (https://hubmapconsortium.org/), and the Bridges Community Data Set programs have helped propel the areas of data-centric and open-science research. Bridges has had significant impact across all areas of science and engineering throughout its lifetime and has led the charge in the deployment of heterogeneous research computing platforms.


Last Modified: 03/31/2022
Modified by: Shawn T Brown
