
Supercharging Science with Supercomputers

NSF's supercomputing investments help solve society's biggest challenges.

In today's fast-paced, data-driven world, computational power is key to developing life-saving drugs, predicting hurricanes and driving innovation across countless industries.

The U.S. National Science Foundation's investments in cutting-edge supercomputers catalyze scientific breakthroughs, bolster national security and sharpen the nation's competitive edge.

Stampede supercomputer
The Stampede supercomputer, located at the University of Texas at Austin's Texas Advanced Computing Center, has enabled research teams to predict where and when earthquakes may strike, how much sea levels could rise and how fast brain tumors grow.

Credit: Texas Advanced Computing Center

What is a supercomputer?

The term "supercomputer" typically applies to the world's fastest computers at any given time. Most modern supercomputers are systems of interconnected computers or processors that run different parts of the same program simultaneously, performing complex operations at exceptionally fast computing speeds.

Supercomputer speed is measured in floating-point operations per second (FLOPS), that is, calculations per second. As technology advances, the list of the world's fastest supercomputers constantly changes. In the 1970s, supercomputers reached 160 megaflops (160 million FLOPS); today's machines perform quadrillions of FLOPS (petaflops), making them roughly one million times more powerful than the fastest laptop.
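To make these units concrete, here is a quick back-of-the-envelope comparison (a minimal sketch; the 1-petaflop figure is a round illustrative number, not a measurement of any particular machine):

```python
# FLOPS scale comparison: rough arithmetic on the units in this article.
MEGA = 10**6   # 1 megaflop = one million calculations per second
PETA = 10**15  # 1 petaflop = one quadrillion calculations per second

cray_1 = 160 * MEGA        # Cray-1 era speed: ~160 megaflops
petascale_system = 1 * PETA  # illustrative petascale system: 10^15 FLOPS

speedup = petascale_system / cray_1
print(f"A 1-petaflop system is ~{speedup:,.0f}x faster than the Cray-1")
# about 6.25 million times faster
```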

Powering up

The first commercially available supercomputer, the Cray-1, went into production in 1977 at the NSF-supported National Center for Atmospheric Research (NSF NCAR). Its processing speed of around 160 megaflops, now outpaced by even the most basic modern laptops by several orders of magnitude, made it the world's fastest supercomputer at that time.

The Cray-1 became an essential research tool for weather forecasting, fluid dynamics and materials science. Its unprecedented speed set a new standard for supercomputing.

After the Cray-1 was decommissioned, it was donated to the Smithsonian National Air and Space Museum, which maintains it as part of its permanent collection.

A man stands among supercomputer racks in a 1970s-era room with orange-red tile floors.
The CRAY-1A, the first commercially available supercomputer, pictured at the National Center for Atmospheric Research in 1978.

Credit: University Corporation for Atmospheric Research/UCAR

A map of the United States with lines connecting many points across the country.
NSFNET, launched in 1986 by NSF to connect academic researchers to a new system of supercomputer centers, became the backbone of the early internet.

Credit: Donna Cox and Robert Patterson, courtesy of the National Center for Supercomputing Applications (NCSA) and the Board of Trustees of the University of Illinois

Superinfrastructure

Throughout the 1980s and 1990s, NSF programs like the Supercomputer Centers program and the Partnerships for Advanced Computational Infrastructure provided researchers nationwide with the supercomputing resources to pursue ever more advanced research and technological innovation.

These efforts also laid the groundwork for modern high-performance computing, enabling faster, more efficient and user-friendly systems with real-world uses — from genome mapping and drug discovery to weather forecasting, aircraft design, special effects in films, personalized e-commerce recommendations and training artificial intelligence-powered chatbots like ChatGPT.

Some key technological advancements brought about by NSF funding include:

  • Launching NSFNET in 1986 to connect researchers with NSF supercomputer centers and serving as the internet's backbone until the mid-1990s. Learn more about NSF's investments in the internet.
  • Advancing parallel computing architectures that use many processors simultaneously to solve complex problems faster and more efficiently, like forecasting weather patterns or analyzing medical data.
  • Developing universal file formats to standardize data sharing, improve systems compatibility and reduce the risk of data loss due to outdated software or hardware.
  • Enhancing data visualization for interpreting large datasets, such as 3D models of wind velocities in thunderstorms and simulations of human blood circulation.
  • Creating user-friendly desktop software to help users organize, access and navigate digital information.
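The parallel approach described in the second bullet, splitting one problem across many processors and combining the partial results, can be sketched in miniature (a toy illustration using Python's standard library; threads stand in for the many processors of a real machine, and `parallel_sum_of_squares` is an invented example, not code from any NSF system):

```python
# Toy "divide and combine" computation: carve one big sum into chunks and
# hand the chunks to a pool of workers, mirroring how a supercomputer
# divides a single problem among many processors running simultaneously.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Each worker independently sums the squares in its own slice."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    step = n // workers
    # Split [0, n) into one contiguous chunk per worker.
    chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Workers run concurrently; the partial results are then combined.
        return sum(pool.map(partial_sum, chunks))
```

On a real supercomputer the same split-compute-combine pattern spans thousands of nodes connected by a fast network, rather than threads in one process.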

Discoveries at warp speed

Through programs like the NSF Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support (ACCESS), NSF connects researchers to supercomputer resources, enabling breakthroughs such as:

Wildfire risks to human health

Using the Cheyenne supercomputer at the NCAR-Wyoming Supercomputing Center, researchers found that larger, more intense wildfires in the Pacific Northwest have shifted North America's air pollution patterns, causing a late summer spike in harmful pollutants across the continent.

Advancing preeclampsia research

Simulations on the Expanse supercomputer at the San Diego Supercomputer Center revealed key genetic biomarkers indicative of preeclampsia complications, which affect roughly 1 in 25 U.S. pregnancies, paving the way for earlier diagnosis and improved treatments.

Space weather prediction

In 2021, scientists used the Frontera supercomputer at the Texas Advanced Computing Center (TACC) to improve space weather forecasting, helping to protect power grids and communication satellites.

Earthquake risk prediction

Using the Frontera supercomputer at TACC, researchers trained machine learning models to predict areas and structures most at risk of collapse after an earthquake, advancing emergency response and recovery efforts.

Black hole image capture

Simulations on the Blue Waters supercomputer at the University of Illinois National Center for Supercomputing Applications enabled researchers to capture the first image of a black hole in 2019, reshaping the understanding of these cosmic giants. Learn more about the image.

Gravitational wave detection

The Stampede supercomputer at the Texas Advanced Computing Center (TACC) helped confirm the first detection of gravitational waves in 2015 by the NSF Laser Interferometer Gravitational-Wave Observatory, a discovery that earned Rainer Weiss, Barry C. Barish and Kip S. Thorne the 2017 Nobel Prize in Physics.