Learning with light: New system allows optical 'deep learning'

June 12, 2017

CAMBRIDGE, Mass. -- "Deep Learning" computer systems, based on artificial neural networks that mimic the way the brain learns from an accumulation of examples, have become a hot topic in computer science. In addition to enabling technologies such as face- and voice-recognition software, these systems could scour vast amounts of medical data to find patterns that could be useful diagnostically, or scan chemical formulas for possible new pharmaceuticals.

But the computations these systems must carry out are highly complex and demanding, even for the most powerful computers.

Now, a team of researchers at MIT and elsewhere has developed a new approach to such computations, using light instead of electricity, which they say could vastly improve the speed and efficiency of certain deep learning computations. Their results appear today in the journal Nature Photonics in a paper by MIT postdoc Yichen Shen, graduate student Nicholas Harris, professors Marin Soljacic and Dirk Englund, and eight others.

Soljacic says that many researchers over the years have made claims about optics-based computers, but that "people dramatically over-promised, and it backfired." While many proposed uses of such photonic computers turned out not to be practical, a light-based neural-network system developed by this team "may be applicable for deep-learning for some applications," he says.

Traditional computer architectures are not very efficient when it comes to the kinds of calculations needed for certain important neural-network tasks. Such tasks typically involve repeated multiplications of matrices, which can be very computationally intensive in conventional CPU or GPU chips.
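To make the computational burden concrete, a single neural-network layer is, at its core, one matrix multiplication: every output value is a weighted sum of every input value. A minimal sketch in Python with NumPy, using illustrative sizes (not figures from the paper):

```python
import numpy as np

# One dense neural-network layer as a matrix multiplication.
# Sizes here are made up for illustration.
rng = np.random.default_rng(0)
inputs = rng.standard_normal(256)          # activations from the previous layer
weights = rng.standard_normal((128, 256))  # one row of weights per output neuron

outputs = weights @ inputs                 # 128 x 256 = 32,768 multiply-adds
print(outputs.shape)                       # (128,)
```

Even this small layer requires tens of thousands of multiply-add operations, and deep networks stack many such layers and repeat them over huge datasets, which is why the matrix step dominates the cost on conventional chips.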

After years of research, the MIT team has come up with a way of performing these operations optically instead. "This chip, once you tune it, can carry out matrix multiplication with, in principle, zero energy, almost instantly," Soljacic says. "We've demonstrated the crucial building blocks but not yet the full system."

By way of analogy, Soljacic points out that even an ordinary eyeglass lens carries out a complex calculation (the so-called Fourier transform) on the light waves that pass through it. The way light beams carry out computations in the new photonic chips is far more general but has a similar underlying principle. The new approach uses multiple light beams directed in such a way that their waves interact with each other, producing interference patterns that convey the result of the intended operation. The resulting device is something the researchers call a programmable nanophotonic processor.
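The idea that interference can compute can be illustrated numerically. A standard building block in such photonic meshes is the Mach-Zehnder interferometer, two beamsplitters around adjustable phase shifters, which applies a 2x2 unitary (power-conserving) matrix to the complex amplitudes of two light beams; meshes of these elements can realize larger matrices. The following is a textbook-style sketch, not the paper's exact device model, and the parameter values are arbitrary:

```python
import numpy as np

# A Mach-Zehnder interferometer as a 2x2 unitary acting on two
# optical amplitudes. theta and phi model tunable phase shifters.
def mzi(theta, phi):
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)   # 50/50 beamsplitter
    internal = np.diag([np.exp(1j * theta), 1.0])    # internal phase shifter
    external = np.diag([np.exp(1j * phi), 1.0])      # external phase shifter
    return external @ bs @ internal @ bs

U = mzi(0.7, 1.3)
# Unitarity means the interference pattern conserves optical power:
print(np.allclose(U.conj().T @ U, np.eye(2)))        # True
```

Because each element is passive interference plus a phase setting, applying the matrix to incoming light costs essentially no energy once the device is tuned, which is the advantage Soljacic describes.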

The result, Shen says, is that optical chips using this architecture could, in principle, carry out calculations performed in typical artificial intelligence algorithms much faster, and using less than one-thousandth as much energy per operation, as conventional electronic chips. "The natural advantage of using light to do matrix multiplication plays a big part in the speed up and power savings, because dense matrix multiplications are the most power hungry and time consuming part in AI algorithms," he says.

The new programmable nanophotonic processor, which was developed in the Englund lab by Harris and collaborators, uses an array of waveguides that are interconnected in a way that can be modified as needed, programming that set of beams for a specific computation. "You can program in any matrix operation," Harris says. The processor guides light through a series of coupled photonic waveguides. The team's full proposal calls for interleaved layers of devices that apply an operation called a nonlinear activation function, in analogy with the operation of neurons in the brain.
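The alternation the team describes, a programmable matrix operation followed by a nonlinear activation, mirrors a standard neural-network forward pass. A conventional sketch with invented layer sizes, using ReLU as a stand-in for whatever optical nonlinearity the full system would employ:

```python
import numpy as np

# Interleaved layers: a matrix operation (played by the photonic mesh
# in the proposed system) followed by a nonlinear activation.
# Sizes and the ReLU nonlinearity are illustrative assumptions.
rng = np.random.default_rng(1)
layers = [rng.standard_normal((16, 32)), rng.standard_normal((4, 16))]

def forward(x):
    for w in layers:
        x = np.maximum(w @ x, 0.0)   # matrix op, then nonlinearity
    return x

print(forward(rng.standard_normal(32)).shape)  # (4,)
```

In the photonic version, only the nonlinear steps would require active components; the matrix steps happen as light propagates through the tuned waveguide mesh.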

To demonstrate the concept, the team set the programmable nanophotonic processor to implement a neural network that recognizes four basic vowel sounds. Even with this rudimentary system, they were able to achieve a 77 percent accuracy level, compared to about 90 percent for conventional systems. There are "no substantial obstacles" to scaling up the system for greater accuracy, Soljacic says.

Englund adds that the programmable nanophotonic processor could have other applications as well, including signal processing for data transmission. "High-speed analog signal processing is something this could manage" faster than other approaches that first convert the signal to digital form, since light is an inherently analog medium. "This approach could do processing directly in the analog domain," he says.

The team says it will still take a lot more effort and time to make this system practical; however, once the system is scaled up and fully functioning, it could find many use cases, such as data centers or security systems. The system could also be a boon for self-driving cars or drones, says Harris, or "whenever you need to do a lot of computation but you don't have a lot of power or time."
The research team also included MIT graduate students Scott Skirlo and Mihika Prabhu in the Research Laboratory of Electronics, Xin Sun in mathematics, and Shijie Zhao in biology, Tom Baehr-Jones and Michael Hochberg at Elenion Technologies, in New York, and Hugo Larochelle at Université de Sherbrooke, in Quebec. The work was supported by the U.S. Army Research Office through the Institute for Soldier Nanotechnologies, the National Science Foundation, and the Air Force Office of Scientific Research.

ADDITIONAL BACKGROUND:

ARCHIVE: New method for analyzing crystal structure

http://news.mit.edu/2016/new-method-analyzing-photonic-crystals-1125

ARCHIVE: Study opens new realms of light-matter interaction

http://news.mit.edu/2016/forbidden-light-emissions-sensors-0714

Massachusetts Institute of Technology
