
Applying machine learning to the universe's mysteries

January 30, 2018

Computers can beat chess champions, simulate star explosions, and forecast global climate. We are even teaching them to be expert problem-solvers and fast learners.

And now, physicists at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and their collaborators have demonstrated that computers are ready to tackle the universe's greatest mysteries. The team fed thousands of images from simulated high-energy particle collisions to train computer networks to identify important features.

The researchers programmed powerful arrays known as neural networks to serve as a sort of hivelike digital brain in analyzing and interpreting the images of the simulated particle debris left over from the collisions. During this test run the researchers found that the neural networks had up to a 95 percent success rate in recognizing important features in a sampling of about 18,000 images.
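As a rough sketch of how such a success rate might be tallied (the `model` and `test_loader` names are illustrative placeholders, not the authors' actual code or data), a trained classifier can be scored on a held-out set of labeled simulated images:

```python
# Rough sketch (illustration only): scoring a trained classifier on a
# held-out set of labeled simulated collision images. `model` and
# `test_loader` are hypothetical placeholders, not the authors' code.
import torch

def success_rate(model, test_loader):
    model.eval()                                       # switch off training-only behavior
    correct, total = 0, 0
    with torch.no_grad():                              # no gradients needed for scoring
        for images, labels in test_loader:             # batches of simulated images
            predictions = model(images).argmax(dim=1)  # most likely class per image
            correct += (predictions == labels).sum().item()
            total += labels.size(0)
    return correct / total                             # e.g. ~0.95 for a 95 percent success rate
```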

The study was published Jan. 15 in the journal Nature Communications.

The next step will be to apply the same machine learning process to actual experimental data.

Powerful machine learning algorithms allow these networks to improve in their analysis as they process more images. The underlying technology is used in facial recognition and other types of image-based object recognition applications.

The images used in this study - relevant to particle-collider nuclear physics experiments at Brookhaven National Laboratory's Relativistic Heavy Ion Collider and CERN's Large Hadron Collider - recreate the conditions of a subatomic particle "soup": a superhot fluid state known as the quark-gluon plasma, believed to have existed just millionths of a second after the birth of the universe. Berkeley Lab physicists participate in experiments at both of these sites.

"We are trying to learn about the most important properties of the quark-gluon plasma," said Xin-Nian Wang, a nuclear physicist in the Nuclear Science Division at Berkeley Lab who is a member of the team. Some of these properties are so short-lived and occur at such tiny scales that they remain shrouded in mystery.

In experiments, nuclear physicists use particle colliders to smash together heavy nuclei, like gold or lead atoms that are stripped of electrons. These collisions are believed to liberate particles inside the atoms' nuclei, forming a fleeting, subatomic-scale fireball that breaks down even protons and neutrons into a free-floating form of their typically bound-up building blocks: quarks and gluons.

Researchers hope that by learning the precise conditions under which this quark-gluon plasma forms - how much energy is packed in, and its temperature and pressure as it transitions into a fluid state - they will gain new insights into its component particles, their properties, and the universe's formative stages.

But exacting measurements of these properties - the so-called "equation of state" involved as matter changes from one phase to another in these collisions - have proven difficult. The initial conditions in the experiments can influence the outcome, so it is challenging to extract equation-of-state measurements that are independent of those conditions.
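As a standard textbook illustration (not the specific parameterization used in the study), an equation of state relates the medium's pressure to its energy density and temperature, and its slope sets the speed of sound in the medium, a quantity that changes character across a phase transition:

```latex
% Standard form, not the study's specific parameterization:
% p is pressure, \varepsilon is energy density, T is temperature,
% and the derivative (at constant entropy) gives the squared speed of sound.
p = p(\varepsilon, T), \qquad
c_s^2 = \left(\frac{\partial p}{\partial \varepsilon}\right)_{s}
```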

"In the nuclear physics community, the holy grail is to see phase transitions in these high-energy interactions, and then determine the equation of state from the experimental data," Wang said. "This is the most important property of the quark-gluon plasma we have yet to learn from experiments."

Researchers also seek insight about the fundamental forces that govern the interactions between quarks and gluons, what physicists refer to as quantum chromodynamics.

Long-Gang Pang, the lead author of the latest study and a Berkeley Lab-affiliated postdoctoral researcher at UC Berkeley, said that in 2016, while he was a postdoctoral fellow at the Frankfurt Institute for Advanced Studies, he became interested in the potential for artificial intelligence (AI) to help solve challenging science problems.

He saw that one form of AI, known as a deep convolutional neural network - with architecture inspired by the image-handling processes in animal brains - appeared to be a good fit for analyzing science-related images.
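A minimal sketch of the general kind of network described here follows; the layer sizes, the 48-pixel single-channel input, and the two-class output are illustrative assumptions, not the architecture from the study.

```python
# Minimal sketch of a small deep convolutional classifier of the general
# kind described above. Layer sizes, the 1x48x48 input, and the two-class
# output are illustrative assumptions, not the study's architecture.
import torch
import torch.nn as nn

class CollisionImageCNN(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # learn local image patterns
            nn.ReLU(),
            nn.MaxPool2d(2),                             # downsample 48x48 -> 24x24
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 24x24 -> 12x12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 12 * 12, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),                  # one score per class
        )

    def forward(self, x):
        return self.classifier(self.features(x))
```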

"These networks can recognize patterns and evaluate board positions and selected movements in the game of Go," Pang said. "We thought, 'If we have some visual scientific data, maybe we can get an abstract concept or valuable physical information from this.'"

Wang added, "With this type of machine learning, we are trying to identify a certain pattern or correlation of patterns that is a unique signature of the equation of state." So after training, the network can pinpoint on its own the portions of an image, and the correlations within it, if any exist, that are most relevant to the problem scientists are trying to solve.
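A hedged sketch of how such a network might be trained to separate two simulated equation-of-state scenarios is shown below; the random placeholder data, the two-class labels, and the hyperparameters are assumptions for illustration, and `CollisionImageCNN` refers to the sketch above.

```python
# Illustrative training loop (assumptions: random tensors stand in for
# labeled simulated collision images; labels 0/1 stand for two hypothetical
# equation-of-state scenarios; hyperparameters are placeholders).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

images = torch.randn(256, 1, 48, 48)        # placeholder "images"
labels = torch.randint(0, 2, (256,))        # placeholder class labels
train_loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)

model = CollisionImageCNN(num_classes=2)    # defined in the earlier sketch
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    for batch_images, batch_labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(batch_images), batch_labels)  # compare predictions to labels
        loss.backward()                                    # backpropagate the error
        optimizer.step()                                   # adjust network weights
```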

Accumulation of data needed for the analysis can be very computationally intensive, Pang said, and in some cases it took about a full day of computing time to create just one image. When researchers employed an array of GPUs that work in parallel - GPUs are graphics processing units that were first created to enhance video game effects and have since exploded into a variety of uses - they cut that time down to about 20 minutes per image.
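For scale, treating "about a full day" as roughly 24 hours, the reported reduction works out to roughly a 70-fold speedup per image:

```latex
% Approximate, since "a full day" is not an exact figure.
\frac{24 \times 60\ \text{min}}{20\ \text{min}} = 72
```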

They used computing resources at Berkeley Lab's National Energy Research Scientific Computing Center (NERSC) in their study, with most of the computing work carried out on GPU clusters at GSI in Germany and at Central China Normal University in China.

A benefit of using sophisticated neural networks, the researchers noted, is that they can identify features that weren't even sought in the initial experiment, like finding a needle in a haystack when you weren't even looking for it. And they can extract useful details even from fuzzy images.

"Even if you have low resolution, you can still get some important information," Pang said.

Discussions are already underway to apply the machine learning tools to data from actual heavy-ion collision experiments, and the simulated results should be helpful in training neural networks to interpret the real data.

"There will be many applications for this in high-energy particle physics," Wang said, beyond particle-collider experiments.
-end-
Other participants in the study were from the Frankfurt Institute for Advanced Studies, Goethe University, GSI Helmholtzzentrum für Schwerionenforschung (GSI), and Central China Normal University.

The work was supported by the U.S. Department of Energy's Office of Science, the National Science Foundation, the Helmholtz Association, GSI, SAMSON AG, Goethe University, the National Natural Science Foundation of China, the Major State Basic Research Development Program in China, and the Helmholtz International Center for the Facility for Antiproton and Ion Research.

NERSC is a DOE Office of Science user facility.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel Prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy's Office of Science. For more, visit http://www.lbl.gov.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

DOE/Lawrence Berkeley National Laboratory
