Advancing U.S. Leadership in High-Performance Computing

Over the past 60 years, the United States has been a leader in the development and deployment of cutting-edge computing systems. High-Performance Computing (HPC) systems, through their high levels of computing power and large amounts of storage capacity, have been and remain essential to economic competitiveness, scientific discovery, and national security.

Today, President Obama issued an Executive Order establishing the National Strategic Computing Initiative (NSCI) to ensure the United States continues leading in this field over the coming decades. This coordinated research, development, and deployment strategy will draw on the strengths of departments and agencies to move the Federal government into a position that sharpens, develops, and streamlines a wide range of new 21st century applications. It is designed to advance core technologies to solve difficult computational problems and foster increased use of the new capabilities in the public and private sectors.

HPC has historically focused on using numerical techniques to simulate a variety of complex natural and technological systems, such as galaxies, weather and climate, molecular interactions, electric power grids, and aircraft in flight. The largest HPC systems are referred to as supercomputers. One measure of supercomputer performance is flops, or floating-point operations per second, the number of arithmetic operations performed each second. Over the next decade, the goal is to build supercomputers capable of one exaflop (10^18 operations per second). It is also important to note that HPC in this context is not just about the speed of the computing device itself. As the President’s Council of Advisors on Science and Technology has concluded, high-performance computing “must now assume a broader meaning, encompassing not only flops, but also the ability, for example, to efficiently manipulate vast and rapidly increasing quantities of both numerical and non-numerical data.”
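To make the flops measure concrete, here is a minimal sketch, in Python, of how achieved flops are commonly estimated in practice: time a dense matrix multiplication and divide an approximate operation count by the elapsed time. The matrix size and the roughly 2*N^3 operation count for an N-by-N multiply are illustrative choices, not an official benchmark.

    # A minimal sketch of how achieved flops are often estimated: time a dense
    # matrix multiplication and divide the operation count by the elapsed time.
    # The matrix size N is an arbitrary illustrative choice.
    import time
    import numpy as np

    N = 2048
    A = np.random.rand(N, N)
    B = np.random.rand(N, N)

    start = time.perf_counter()
    C = A @ B
    elapsed = time.perf_counter() - start

    # A dense N x N matrix multiply performs roughly 2 * N**3 floating-point
    # operations (one multiply and one add per inner-loop step).
    achieved_flops = 2 * N**3 / elapsed

    EXAFLOP = 1e18  # one exaflop = 10^18 floating-point operations per second
    print(f"~{achieved_flops:.2e} flops, or {achieved_flops / EXAFLOP:.1e} of an exaflop")

On typical desktop hardware this kind of measurement comes out in the gigaflop range, which helps convey how far an exaflop machine is beyond everyday computers.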

As an example, Computational Fluid Dynamics (CFD) has been an important tool in aircraft design since the 1970s. Through CFD simulations, the aircraft industry has significantly reduced the need for wind tunnel and flight testing, but current technology can handle only simplified models of the airflow around a wing, and only under limited flight conditions. A recent study commissioned by NASA determined that machines able to sustain exaflop-level performance could incorporate full modeling of turbulence, as well as more dynamic flight conditions, into their simulations.

With the availability of large data sets, including web pages, genome datasets, and the outputs of scientific instruments, data analytics has emerged as a new form of large-scale computing, extracting meaningful insights from diverse data collections. These big data approaches have had a revolutionary impact both in the commercial sector and in scientific discovery. Over the next decade, these systems will manage and analyze data sets of up to one exabyte (10^18 bytes).
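For a rough sense of that scale, the short Python sketch below estimates how long a single device would take just to read one exabyte. The 10 GB per second throughput and the device count are hypothetical round numbers chosen only for illustration.

    # Back-of-the-envelope arithmetic on exabyte-scale data. The throughput and
    # device count below are hypothetical round numbers, not measured specs.
    EXABYTE = 10**18           # bytes
    throughput = 10 * 10**9    # assumed read rate for one device: 10 GB/s

    seconds = EXABYTE / throughput
    years = seconds / (365 * 24 * 3600)
    print(f"One device: {seconds:.1e} s (~{years:.1f} years) just to scan the data")

    # Spreading the scan across many devices shrinks the time proportionally,
    # which is why exabyte-scale analytics relies on large parallel systems.
    devices = 10_000
    hours = seconds / devices / 3600
    print(f"{devices:,} devices in parallel: ~{hours:.1f} hours")

The point is not the specific numbers but that at this scale, managing and moving data becomes as important as raw arithmetic speed.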

The Precision Medicine Initiative (PMI) illustrates how data-driven HPC can give clinicians tools to better understand the complex biological mechanisms underlying a patient’s disease, and to better predict the most effective treatments. Central to PMI is the ability to process large volumes of health and genomic data. As DNA sequencing technology improves, the volume of data will continue to increase, and so too will the computational requirements. NSCI will help shorten the time it takes to analyze sequenced samples and improve accuracy.

As NSCI drives forward these two goals of exaflop computing ability and exabyte storage capacity, it will also find ways to combine large-scale numerical computing with big data analytics. This will enable new forms of computation, including simulations of weather that are coupled with actual observations from weather satellites. It will also enable new analytic methods that require more extensive numerical processing, such as emerging techniques that use artificial intelligence to automatically learn new capabilities from large numbers of examples. There are also national security benefits, including using modeling and simulation to improve IED-resistant vehicle designs.
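As one illustration of coupling simulation with observations, the Python sketch below uses a simple "nudging" (relaxation) step, a basic form of data assimilation, to pull a toy model toward synthetic observations. The model, the noise level, and every constant here are illustrative assumptions, not anything drawn from an operational forecasting system.

    # A minimal sketch of coupling a simulation with observations via "nudging"
    # (Newtonian relaxation): each time an observation arrives, the simulated
    # state is relaxed partway toward it. All values here are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)

    dt = 0.1          # time step
    decay = 0.5       # toy dynamics: dx/dt = -decay * x
    nudge = 0.3       # relaxation strength toward observations
    true_x = 1.0      # hidden "true" state, used only to synthesize observations
    model_x = 0.2     # model starts from a poor initial condition

    for step in range(50):
        # Advance the hidden truth and the model with the same toy dynamics.
        true_x += dt * (-decay * true_x)
        model_x += dt * (-decay * model_x)

        # Every 10 steps, a noisy observation of the truth arrives (the satellite
        # measurement in the weather analogy) and the model is nudged toward it.
        if step % 10 == 0:
            obs = true_x + rng.normal(scale=0.05)
            model_x += nudge * (obs - model_x)

    print(f"truth={true_x:.3f}  model={model_x:.3f}")

Real systems such as numerical weather prediction use far more sophisticated assimilation methods and vastly larger states, but the basic pattern of blending simulation with observed data is the same.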

By strategically investing now, we can prepare for increasing computing demands and emerging technological challenges, building the foundation for sustained U.S. leadership for decades to come, while also expanding the role of high-performance computing to address the pressing challenges faced across many sectors.

Tom Kalil is Deputy Director for Technology and Innovation at the White House Office of Science and Technology Policy.

Jason Miller is Deputy Assistant to the President and Deputy Director of the National Economic Council.