PPPL and Princeton join high-performance software project

July 25, 2016

Princeton University and the U.S. Department of Energy's Princeton Plasma Physics Laboratory (PPPL) are participating in the accelerated development of a modern high-performance computing code, or software package. Supporting this development is the Intel Parallel Computing Center (IPCC) Program, which provides funding to universities and laboratories to improve high-performance software capabilities for a wide range of disciplines.

The project updates the GTC-Princeton (GTC-P) code, which was originally developed for fusion research applications at PPPL and has evolved into highly portable software deployed on supercomputers worldwide. The National Science Foundation (NSF) strongly supported advances in the code from 2011 to 2014 through the "G8" international extreme-scale computing program, which included the United States and seven other highly industrialized countries during that period.

New Activity

Heading the new IPCC activity for the University's Princeton Institute for Computational Science & Engineering (PICSciE) is William Tang, a PPPL physicist and PICSciE principal investigator (PI). Working with Tang is Co-PI Bei Wang, Associate Research Scholar at PICSciE, who leads this accelerated modernization effort. Joining them in the project are Co-PIs Carlos Rosales of the NSF's Texas Advanced Computing Center at the University of Texas at Austin and Khaled Ibrahim of the Lawrence Berkeley National Laboratory.

The current GTC-P code has advanced understanding of the turbulence and confinement of the superhot plasma that fuels fusion reactions in doughnut-shaped facilities called tokamaks. Understanding and controlling fusion fuel turbulence is a grand challenge of fusion science, and great progress has been made in recent years. Such turbulence can determine how effectively a fusion reactor will contain the energy generated by fusion reactions, and thus can strongly influence the eventual economic attractiveness of a fusion energy system. Further progress on the code will enable researchers to study conditions that arise as tokamaks scale up to the dimensions of ITER -- the flagship international fusion experiment under construction in France.

Access to Intel computer clusters

Through the IPCC, Intel will provide access to systems for exploring the modernization of the code, including clusters equipped with the most recent Intel "Knights Landing" (KNL) processors.

The upgrade will become part of the parent GTC code, which is led by Prof. Zhihong Lin of the University of California, Irvine, with Tang as co-PI. That code is also being modernized and will be proposed, together with GTC-P, to be included in the early science portfolio for the Aurora supercomputer. Aurora will begin operations at the Argonne Leadership Computing Facility, a DOE Office of Science User Facility at Argonne National Laboratory, in 2019. Powering Aurora will be Intel "Knights Hill" processing chips.

Last year, the GTC and GTC-P codes were selected to be developed as an early science project designed for the Summit supercomputer, which will be deployed in 2018 at the Oak Ridge Leadership Computing Facility, also a DOE Office of Science User Facility, at Oak Ridge National Laboratory. That modernization project differs from the one to be proposed for Aurora because Summit is being built around an architecture powered by NVIDIA Volta graphics processing units and IBM Power 9 central processing chips.

Moreover, the code planned for Summit will be designed to run on the Aurora platform as well.

Boosting U.S. computing power

The two new machines will boost U.S. computing power far beyond Titan, the current leading U.S. supercomputer at Oak Ridge, which can perform 27 quadrillion -- or million billion -- calculations per second. Summit and Aurora are planned to perform some 200 quadrillion and 180 quadrillion calculations per second, respectively. Said Tang: "These new machines hold tremendous promise for helping to accelerate scientific discovery in many application domains, including fusion, that are of vital importance to the country."
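Using only the peak rates quoted above (in quadrillions of calculations per second, i.e. petaflops), a quick back-of-the-envelope calculation shows the scale of the planned jump over Titan; the machine names and figures come from the article, while the script itself is just an illustrative sketch:

```python
# Peak rates quoted in the article, in petaflops
# (quadrillions of calculations per second).
TITAN_PFLOPS = 27    # current leading U.S. supercomputer (Oak Ridge)
SUMMIT_PFLOPS = 200  # planned for Oak Ridge, 2018
AURORA_PFLOPS = 180  # planned for Argonne, 2019

for name, pflops in [("Summit", SUMMIT_PFLOPS), ("Aurora", AURORA_PFLOPS)]:
    # Speedup relative to Titan's peak rate
    print(f"{name}: ~{pflops / TITAN_PFLOPS:.1f}x Titan")
```

Both planned machines thus represent roughly a sevenfold increase in peak computing power over Titan.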
-end-
PPPL, on Princeton University's Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas -- ultra-hot, charged gases -- and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy's Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

DOE/Princeton Plasma Physics Laboratory
