
Learning with light: New system allows optical 'deep learning'

June 12, 2017

CAMBRIDGE, Mass. -- "Deep Learning" computer systems, based on artificial neural networks that mimic the way the brain learns from an accumulation of examples, have become a hot topic in computer science. In addition to enabling technologies such as face- and voice-recognition software, these systems could scour vast amounts of medical data to find patterns that could be useful diagnostically, or scan chemical formulas for possible new pharmaceuticals.

But the computations these systems must carry out are highly complex and demanding, even for the most powerful computers.

Now, a team of researchers at MIT and elsewhere has developed a new approach to such computations, using light instead of electricity, which they say could vastly improve the speed and efficiency of certain deep learning computations. Their results appear today in the journal Nature Photonics in a paper by MIT postdoc Yichen Shen, graduate student Nicholas Harris, professors Marin Soljacic and Dirk Englund, and eight others.

Soljacic says that many researchers over the years have made claims about optics-based computers, but that "people dramatically over-promised, and it backfired." While many proposed uses of such photonic computers turned out not to be practical, a light-based neural-network system developed by this team "may be applicable for deep-learning for some applications," he says.

Traditional computer architectures are not very efficient when it comes to the kinds of calculations needed for certain important neural-network tasks. Such tasks typically involve repeated multiplications of matrices, which can be very computationally intensive in conventional CPU or GPU chips.
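
To make that bottleneck concrete, here is a minimal sketch in Python with NumPy (the layer sizes are illustrative assumptions, not figures from the paper) of the matrix multiplication at the heart of one fully connected network layer:

    import numpy as np

    # A single fully connected neural-network layer is, at its core,
    # one matrix multiplication followed by an elementwise nonlinearity.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((256, 784))   # weight matrix (illustrative sizes)
    x = rng.standard_normal(784)          # input vector, e.g. a flattened image

    z = W @ x                 # the matrix multiplication: ~256*784 multiply-adds
    y = np.maximum(z, 0.0)    # nonlinear activation (ReLU) applied to the result

    # Deep networks repeat this pattern layer after layer, which is why
    # matrix multiplication dominates the cost on conventional CPUs and GPUs.
    print(y.shape)  # (256,)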

After years of research, the MIT team has come up with a way of performing these operations optically instead. "This chip, once you tune it, can carry out matrix multiplication with, in principle, zero energy, almost instantly," Soljacic says. "We've demonstrated the crucial building blocks but not yet the full system."

By way of analogy, Soljacic points out that even an ordinary eyeglass lens carries out a complex calculation (the so-called Fourier transform) on the light waves that pass through it. The way light beams carry out computations in the new photonic chips is far more general but has a similar underlying principle. The new approach uses multiple light beams directed in such a way that their waves interact with each other, producing interference patterns that convey the result of the intended operation. The resulting device is something the researchers call a programmable nanophotonic processor.
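
The article does not spell out the underlying math, but one common recipe for programming an arbitrary weight matrix into lossless, interference-based optics is the singular value decomposition, which splits the matrix into two unitary factors (realizable as interferometer meshes) and a diagonal factor (realizable as attenuation or gain). The sketch below, in Python with NumPy, simply verifies that factorization numerically; whether the team's processor uses exactly this decomposition is an assumption here, not a claim from the article:

    import numpy as np

    # Hypothetical illustration: a lossless interferometer mesh can realize a
    # unitary matrix, so an arbitrary "weight" matrix M could be programmed
    # into optics by factoring it as M = U @ diag(s) @ Vh (the SVD): two
    # unitaries (meshes) sandwiching a diagonal (attenuators or gain).
    rng = np.random.default_rng(1)
    M = rng.standard_normal((4, 4))       # an arbitrary weight matrix

    U, s, Vh = np.linalg.svd(M)           # M = U @ diag(s) @ Vh
    reconstructed = U @ np.diag(s) @ Vh

    print(np.allclose(M, reconstructed))  # True: the factorization is exact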

The result, Shen says, is that optical chips using this architecture could, in principle, carry out the calculations performed in typical artificial intelligence algorithms much faster, and using less than one-thousandth as much energy per operation, compared with conventional electronic chips. "The natural advantage of using light to do matrix multiplication plays a big part in the speedup and power savings, because dense matrix multiplications are the most power-hungry and time-consuming part of AI algorithms," he says.

The new programmable nanophotonic processor, which was developed in the Englund lab by Harris and collaborators, uses an array of waveguides that are interconnected in a way that can be modified as needed, programming that set of beams for a specific computation. "You can program in any matrix operation," Harris says. The processor guides light through a series of coupled photonic waveguides. The team's full proposal calls for interleaved layers of devices that apply an operation called a nonlinear activation function, in analogy with the operation of neurons in the brain.
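
As a software analogy for the architecture Harris describes, with programmable matrix stages interleaved with nonlinear activation stages, a forward pass might be sketched as follows (Python with NumPy; the saturating nonlinearity, layer count, and matrix sizes are illustrative assumptions, not details from the paper):

    import numpy as np

    def saturating_activation(z, i_sat=1.0):
        # Illustrative stand-in for an optical nonlinear activation:
        # transmission that saturates as intensity grows (an assumption
        # here, not the specific nonlinearity proposed by the team).
        return z / (1.0 + np.abs(z) / i_sat)

    def photonic_forward(x, matrices):
        # Alternate programmable matrix stages (the interferometer meshes)
        # with nonlinear stages, mirroring the layers of a neural network.
        for W in matrices:
            x = saturating_activation(W @ x)
        return x

    rng = np.random.default_rng(2)
    layers = [rng.standard_normal((4, 4)) for _ in range(3)]  # 3 layers, illustrative
    out = photonic_forward(rng.standard_normal(4), layers)
    print(out)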

To demonstrate the concept, the team set the programmable nanophotonic processor to implement a neural network that recognizes four basic vowel sounds. Even with this rudimentary system, they were able to achieve a 77 percent accuracy level, compared to about 90 percent for conventional systems. There are "no substantial obstacles" to scaling up the system for greater accuracy, Soljacic says.

Englund adds that the programmable nanophotonic processor could have other applications as well, including signal processing for data transmission. "High-speed analog signal processing is something this could manage" faster than other approaches that first convert the signal to digital form, since light is an inherently analog medium. "This approach could do processing directly in the analog domain," he says.

The team says it will still take a lot more effort and time to make this system practical; once it is scaled up and fully functioning, however, it could find many use cases, such as in data centers or security systems. The system could also be a boon for self-driving cars or drones, says Harris, or "whenever you need to do a lot of computation but you don't have a lot of power or time."
-end-
The research team also included MIT graduate students Scott Skirlo and Mihika Prabhu in the Research Laboratory of Electronics, Xin Sun in mathematics, and Shijie Zhao in biology, Tom Baehr-Jones and Michael Hochberg at Elenion Technologies, in New York, and Hugo Larochelle at Université de Sherbrooke, in Quebec. The work was supported by the U.S. Army Research Office through the Institute for Soldier Nanotechnologies, the National Science Foundation, and the Air Force Office of Scientific Research.

ADDITIONAL BACKGROUND:

ARCHIVE: New method for analyzing crystal structure

http://news.mit.edu/2016/new-method-analyzing-photonic-crystals-1125

ARCHIVE: Study opens new realms of light-matter interaction

http://news.mit.edu/2016/forbidden-light-emissions-sensors-0714

Massachusetts Institute of Technology
