Engineers use graph networks to accurately predict properties of molecules and crystals

June 10, 2019

Nanoengineers at the University of California San Diego have developed new deep learning models that can accurately predict the properties of molecules and crystals. By enabling almost instantaneous property predictions, these deep learning models provide researchers the means to rapidly scan the nearly infinite universe of compounds to discover potentially transformative materials for various technological applications, such as high-energy-density Li-ion batteries, warm-white LEDs, and better photovoltaics.

To construct their models, a team led by nanoengineering professor Shyue Ping Ong at the UC San Diego Jacobs School of Engineering used a new deep learning framework called graph networks, developed by Google DeepMind, the brains behind AlphaGo and AlphaZero. Graph networks have the potential to expand the capabilities of existing AI technology to perform complicated learning and reasoning tasks with limited experience and knowledge--something that humans are good at.

For materials scientists like Ong, graph networks offer a natural way to represent bonding relationships between atoms in a molecule or crystal and enable computers to learn how these relationships relate to their chemical and physical properties.
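To make the representation concrete, here is a minimal plain-Python sketch of a molecule encoded as a graph, using water as an example. This is illustrative only and is not the MEGNet API: the atom list, bond list, and "message passing" step below are assumptions chosen to show the idea that nodes carry atom attributes, edges carry bond attributes, and a global state carries molecule-level attributes.

```python
# Minimal sketch of a molecule-as-graph encoding (not the MEGNet API).
# Nodes = atoms, edges = bonds, plus a molecule-level "global" state.

# Water (H2O): node features here are just atomic numbers.
atoms = [8, 1, 1]  # O, H, H

# Bonds as (sender, receiver, bond_length_in_angstroms) triples;
# each O-H bond is listed in both directions, as in graph networks.
bonds = [
    (0, 1, 0.96), (1, 0, 0.96),
    (0, 2, 0.96), (2, 0, 0.96),
]

# Molecule-level state, e.g. temperature in kelvin.
state = [300.0]

def aggregate_neighbours(atoms, bonds):
    """One trivial message-passing step: each atom sums the
    features of the atoms it is bonded to."""
    summed = [0.0] * len(atoms)
    for sender, receiver, _length in bonds:
        summed[receiver] += atoms[sender]
    return summed

# The oxygen atom receives 1 + 1 from the hydrogens; each hydrogen
# receives 8 from the oxygen.
print(aggregate_neighbours(atoms, bonds))
```

A real graph network model would replace the raw atomic numbers with learned embeddings and repeat learned update steps over nodes, edges, and the global state, but the data layout above is the essential point: bonding relationships become edges the model can learn from.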

The new graph network-based models, which Ong's team dubbed MatErials Graph Network (MEGNet) models, outperformed the state of the art in predicting 11 out of 13 properties for the 133,000 molecules in the QM9 data set. The team also trained the MEGNet models on about 60,000 crystals in the Materials Project. The models outperformed prior machine learning models in predicting the formation energies, band gaps and elastic moduli of crystals.

The team also demonstrated two approaches to overcome data limitations in materials science and chemistry. First, the team showed that graph networks can be used to unify multiple free energy models, resulting in a multi-fold increase in training data. Second, they showed that their MEGNet models can effectively learn relationships between elements in the periodic table. This machine-learned information from a property model trained on a large data set can then be transferred to improve the training and accuracy of property models with smaller amounts of data--this concept is known in machine learning as transfer learning.
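The transfer-learning idea can be sketched in a few lines. This toy example is illustrative only, not the authors' code: the element symbols, vectors, and the `init_small_data_model` helper are hypothetical, chosen to show how element embeddings learned on a large dataset can seed a model trained on a much smaller one.

```python
# Toy sketch of transfer learning for element embeddings
# (illustrative only; not the MEGNet implementation).

# Pretend these vectors were learned by a model trained on a large
# formation-energy dataset: each element maps to a feature vector.
pretrained_embeddings = {
    "Li": [0.12, -0.40],
    "O":  [0.55,  0.10],
    "Fe": [-0.30, 0.25],
}

def init_small_data_model(elements, source_embeddings):
    """Initialise a new model's element embeddings by transfer:
    reuse a pretrained vector when available, else start at zero."""
    dim = len(next(iter(source_embeddings.values())))
    return {
        el: list(source_embeddings.get(el, [0.0] * dim))
        for el in elements
    }

model_embeddings = init_small_data_model(["Li", "O", "Xe"], pretrained_embeddings)
print(model_embeddings["Li"])  # reused from the large-data model
print(model_embeddings["Xe"])  # unseen element starts from zeros
```

The transferred vectors give the small-data model a head start: chemically meaningful relationships between elements, learned where data is plentiful, do not have to be re-learned where data is scarce.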
The study is published in the journal Chemistry of Materials. The MEGNet models and software are available open-source via GitHub (https://github.com/materialsvirtuallab/megnet). To enable others to reproduce and build on their results, the authors have also provided the training dataset at https://figshare.com/articles/Graphs_of_materials_project/7451351. A web application for crystal property prediction using MEGNet models is available at http://crystals.ai.

Paper title: "Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals." Co-authors include Chi Chen, Weike Ye, Yunxing Zuo and Chen Zheng, UC San Diego.

This work is supported by the Samsung Advanced Institute of Technology (SAIT) Global Research Outreach (GRO) Program, utilizing data and software resources developed by the DOE-funded Materials Project (DE-AC02-05-CH11231). Computational resources were provided by the Triton Shared Computing Cluster (TSCC) at UC San Diego; the Comet supercomputer, a National Science Foundation-funded resource at the San Diego Supercomputer Center (SDSC) at UC San Diego; the National Energy Research Scientific Computing Center (NERSC); and the Extreme Science and Engineering Discovery Environment (XSEDE) supported by the National Science Foundation (Grant No. ACI-1053575).

University of California - San Diego
