The incredible shrinking particle accelerator

October 05, 2016

Particle accelerators are on the verge of transformational breakthroughs--and advances in computing power and techniques are a big part of the reason.

Long valued for their role in scientific discovery and in medical and industrial applications such as cancer treatment, food sterilization and drug development, particle accelerators, unfortunately, occupy a lot of space and carry hefty price tags. The Large Hadron Collider at CERN in France and Switzerland, for example--the world's largest and most powerful particle accelerator--has a circumference of 17 miles and cost $10 billion to build. Even smaller accelerators, such as those used in medical centers for proton therapy, need large spaces to accommodate the hardware, power supplies and radiation shielding. Such treatment facilities typically fill a city block and cost hundreds of millions of dollars to build.

But efforts are under way to make this technology more affordable and accessible by shrinking both the size and the cost without losing the capability. One of the most exciting developments is the plasma accelerator, which uses lasers or particle beams rather than radio-frequency waves to generate the accelerating field. Researchers have already shown the potential for laser plasma acceleration to yield significantly more-compact accelerators. But further development is needed before these devices--envisioned as almost literally "tabletop" in many applications--make their way into everyday use.

This is where advanced visualization tools and supercomputers such as the Edison and Cori supercomputers at Lawrence Berkeley National Laboratory's National Energy Research Scientific Computing Center (NERSC) come in.

"To take full advantage of the societal benefits of particle accelerators, game-changing improvements are needed in the size and cost of accelerators, and plasma-based particle accelerators stand apart in their potential for these improvements," said Jean-Luc Vay, a senior physicist in Berkeley Lab's Accelerator Technology and Applied Physics Division (ATAP).

Vay is leading a particle accelerator modeling project as part of the NESAP program at NERSC and is the principal investigator on one of the new exascale computing projects sponsored by the U.S. Department of Energy (DOE). "Turning this from a promising technology into a mainstream scientific tool depends critically on large-scale, high-performance, high-fidelity modeling of complex processes that develop over a wide range of space and time scales," he said.

Vay and a team of mathematicians, computer scientists and physicists are working to do just that by developing software tools that can facilitate simulating, analyzing and visualizing the increasingly large datasets produced during particle accelerator studies.

Accelerator modeling is an opportunity to help lead the way to exascale applications, noted ATAP Division Director Wim Leemans. "We've spent years preparing for this opportunity," he said, pointing to the already widespread use of modeling in accelerator design and the tradition of collaboration between physics and computing experts that has been a hallmark of ATAP's modeling work.

"One of the driving factors in our research is the transition to exascale and how data visualization is changing," explained Burlen Loring, a computer systems engineer who is part of the collaboration, along with Oliver Rübel, David Grote, Remi Lehe, Stepan Bulanov and Wes Bethel, all of Berkeley Lab, and Henri Vincenti, a Berkeley Lab postdoctoral researcher from CEA in France. "With exascale systems, traditional visualization becomes prohibitive as the simulations get larger and the machines get larger--storing all the data doesn't work, and the file systems and data bandwidth rates aren't keeping up with the compute capacity."

In Situ to the Rescue

Now, in a paper published June 9 in IEEE Computer Graphics and Applications (IEEE CG&A), the team describes a new approach to this challenge: WarpIV. WarpIV is a plasma and accelerator simulation, data visualization and analysis toolkit that marries two software tools already widely used in high energy physics: Warp, an advanced particle-in-cell simulation framework, and VisIt, a 3D scientific visualization application that supports most common visualization techniques. Together, they give users the ability to perform in situ visualization and analysis of their particle accelerator simulations at scale--that is, while the simulations are still running and using the same high performance computing resources--thus reducing memory usage and saving computer time.
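The core idea of in situ analysis can be sketched in a few lines: rather than dumping every timestep to disk for later ("post hoc") analysis, the simulation hands its live, in-memory data to an analysis routine every few iterations. This is a minimal illustrative sketch of that pattern, not WarpIV's actual code; all names are hypothetical.

```python
# Sketch of the in situ pattern: analysis runs inside the simulation
# loop, on the same compute resources, so nothing is written to disk.

def analyze(particles):
    """Stand-in for a visualization/analysis pass over in-memory data."""
    return sum(particles) / len(particles)  # e.g. a mean diagnostic

def run_simulation(n_steps=200, analysis_interval=50):
    particles = [float(i) for i in range(1000)]  # toy in-memory state
    results = {}
    for step in range(1, n_steps + 1):
        particles = [p * 1.001 for p in particles]  # toy physics update
        if step % analysis_interval == 0:
            # Analysis sees the live data directly -- no serialization,
            # no file-system I/O between simulation and visualization.
            results[step] = analyze(particles)
    return results

results = run_simulation()
```

The trade-off is that analysis decisions must be made up front, before the run, since the raw data is never stored in full.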

"We have this push to transition a significant portion of our visualization work over to the in situ domain," Loring said. "This work is a step in that direction. It is our first take on in situ for laser plasma accelerators and our first chance to use it on a real science problem."

A primary function of WarpIV is to manage and control the end-to-end, integrated simulation and in situ visualization and analysis workflow. To achieve this, WarpIV supports four main modes of operation--batch, monitoring, interactive and prompt--each of which in turn supports a different approach to in situ scientific discovery. WarpIV also uses a factory pattern design to define simulation models, which allows users to create new simulation and in situ analysis models in a self-contained fashion, and it drives visualization and analysis through Python-based scripts.
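A factory pattern of the kind described above typically means models register themselves by name and are instantiated through a single entry point. The following is a hypothetical sketch of that design, assuming an illustrative registry and class names that are not WarpIV's actual API.

```python
# Sketch of a factory-pattern model registry: each simulation model is
# self-contained and created by name through one factory function.

MODEL_REGISTRY = {}

def register_model(name):
    """Class decorator that adds a model class to the registry."""
    def wrap(cls):
        MODEL_REGISTRY[name] = cls
        return cls
    return wrap

class SimulationModel:
    """Base interface every model implements."""
    def setup(self): ...
    def step(self): ...

@register_model("ion_accelerator_3d")
class IonAccelerator3D(SimulationModel):
    mode = "batch"  # one of the four modes: batch, monitoring,
                    # interactive, prompt

def create_model(name):
    """Factory entry point: build a registered model by name."""
    return MODEL_REGISTRY[name]()

model = create_model("ion_accelerator_3d")
```

The benefit of this design is that adding a new simulation or analysis model requires no changes to the framework itself, only a new registered class.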

"One of the design factors that will make it easy for scientists to use WarpIV is the ability to use Python scripts that are autogenerated in VisIt," Loring explained. "The scientist takes a representative dataset before they make their runs and comes up with visualization scripts. Then they open the representative dataset in VisIt and use the recording feature to automatically record their actions into a Python script. Then WarpIV takes these scripts and runs them in the in situ environment."
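The record-and-replay workflow Loring describes can be illustrated with a toy sketch: GUI actions are captured as a list of scripted calls, which are then re-executed against each live dataset in situ. The function names here are illustrative stand-ins, not VisIt's real scripting API.

```python
# Sketch of record-and-replay: a user's interactive actions on a
# representative dataset are captured once, then replayed in situ
# against every simulation update.

recorded_actions = []

def record(action, **kwargs):
    """Capture one user action as (name, parameters)."""
    recorded_actions.append((action, kwargs))

# "Recording" phase: the user explores a representative dataset once.
record("add_plot", plot_type="Pseudocolor", variable="electron_density")
record("set_view", zoom=2.0)
record("draw_plots")

def replay(actions, dataset):
    """Re-run the recorded actions against a new (in situ) dataset."""
    log = []
    for action, kwargs in actions:
        log.append(f"{action}({kwargs}) on {dataset}")
    return log

# In situ phase: the same script is applied to each live update.
log = replay(recorded_actions, dataset="timestep_000050")
```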

Another key feature of WarpIV is its integrated analytics--notably, filtered particle species, which enable users to pick out particular features of interest from the hundreds of millions of particles required for accurate simulation.

"Very often when you do a visualization, particularly in situ, you want to minimize how much time you spend on it, and you can do this by focusing on particular features," Rübel explained. "In this case, for example, you need large numbers of particles to simulate the process, but the features you are interested in, such as the beam that is extracted from the background plasma, are going to be much smaller than that. So finding these features and doing the analysis while the simulation is running, this is what we call filtered species. It is a mechanism we developed not just to do plots, but to find what it is you want to plot."
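The filtered-species idea Rübel describes reduces to selecting a small feature of interest, such as the extracted beam, out of a huge particle population before any plotting happens. Below is a minimal hypothetical sketch; the energy threshold and field names are illustrative, not WarpIV's.

```python
# Sketch of "filtered species": keep only the particles that belong to
# the feature of interest (here, beam particles above an energy cut)
# and run the downstream analysis on that small subset.

def filter_species(energies, threshold_mev):
    """Return indices of particles passing the energy cut."""
    return [i for i, e in enumerate(energies) if e > threshold_mev]

# Toy population: mostly low-energy background plasma plus a tiny beam.
background = [0.1] * 9_990
beam = [150.0] * 10
energies = background + beam

beam_idx = filter_species(energies, threshold_mev=100.0)
# Downstream plots and statistics now touch 10 particles, not 10,000.
fraction = len(beam_idx) / len(energies)
```

At production scale the same selection is applied to hundreds of millions of particles while the simulation is still running, which is what keeps the in situ analysis cost low.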

Toward 3D Modeling

WarpIV, which Rübel initially prototyped in 2013, grew out of a collaboration between two DOE SciDAC projects: SDAV (Scalable Data Management, Analysis and Visualization) and COMPASS (Community Project for Accelerator Science and Simulation). The work was subsequently supported by DOE's CAMPA (Consortium for Advanced Modeling of Particle Accelerators) program.

The WarpIV toolkit, which continues to undergo development, was officially rolled out in June 2016 and is available via Bitbucket. Initial testing has yielded positive results in terms of scalability, performance, usability and impact on science.

For example, in the research that resulted in the IEEE CG&A paper, the team ran a series of ion accelerator simulations in 2D and 3D to analyze WarpIV's performance and scalability. Comparison of these simulations revealed significant quantitative differences between the 2D and 3D models, highlighting the critical need for high-resolution 3D simulations in conjunction with advanced in situ visualization and analysis to enable the accurate modeling and study of new breeds of particle accelerators.

In one 3D series, they tracked the run time for five categories of operations at simulation updates every 50 iterations and found that, at each update, the visualization, analysis and I/O operations consumed 11-15 percent of the total time, while the rest was used by the simulation--a ratio the researchers consider "quite reasonable." They also found that the in situ approach reduced the I/O cost by a factor of more than 4,000.

There is great demand for 3D simulation codes that run in a reasonable time and perform accurate accelerator modeling with correct quantitative predictive power, Vay emphasized.

"We want to be able to conduct experiments on ion acceleration, so in this case it is very important to have a working simulation tool to predict and analyze all kinds of experiments and test theories," said Bulanov, a research scientist in the Berkeley Lab Laser Accelerator Center who works closely with Vay. "And if the simulations can't keep pace with the experiment, it would slow us down significantly."

Having in situ tools like WarpIV will be increasingly valuable as supercomputers transition to more complex manycore architectures, Vay added.

"WarpIV provides visualization in 3D that we would not have been able to obtain easily using our previous visualization tools, which were not scaling as well to many computational cores," he said.

NERSC is a DOE Office of Science User Facility and is the primary high-performance computing facility for scientific research sponsored by the U.S. Department of Energy's Office of Science. Lawrence Berkeley National Laboratory is a U.S. Department of Energy national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California for the U.S. DOE Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States.

DOE/Lawrence Berkeley National Laboratory
