
Neural networks everywhere

February 14, 2018

CAMBRIDGE, MASS.--Most recent advances in artificial-intelligence systems such as speech- or face-recognition programs have come courtesy of neural networks, densely interconnected meshes of simple information processors that learn to perform tasks by analyzing huge sets of training data.

But neural nets are large, and their computations are energy intensive, so they're not very practical for handheld devices. Most smartphone apps that rely on neural nets simply upload data to internet servers, which process it and send the results back to the phone.

Now, MIT researchers have developed a special-purpose chip that increases the speed of neural-network computations by three to seven times over its predecessors, while reducing power consumption by 94 to 95 percent. That could make it practical to run neural networks locally on smartphones or even to embed them in household appliances.

"The general processor model is that there is a memory in some part of the chip, and there is a processor in another part of the chip, and you move the data back and forth between them when you do these computations," says Avishek Biswas, an MIT graduate student in electrical engineering and computer science, who led the new chip's development.

"Since these machine-learning algorithms need so many computations, this transferring back and forth of data is the dominant portion of the energy consumption. But the computation these algorithms do can be simplified to one specific operation, called the dot product. Our approach was, can we implement this dot-product functionality inside the memory so that you don't need to transfer this data back and forth?"

Biswas and his thesis advisor, Anantha Chandrakasan, dean of MIT's School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science, describe the new chip in a paper that Biswas is presenting this week at the International Solid-State Circuits Conference.

Back to analog

Neural networks are typically arranged into layers. A single processing node in one layer of the network will generally receive data from several nodes in the layer below and pass data to several nodes in the layer above. Each connection between nodes has its own "weight," which indicates how large a role the output of one node will play in the computation performed by the next. Training the network is a matter of setting those weights.

A node receiving data from multiple nodes in the layer below will multiply each input by the weight of the corresponding connection and sum the results. That operation - the summation of multiplications - is the definition of a dot product. If the dot product exceeds some threshold value, the node will transmit it to nodes in the next layer, over connections with their own weights.
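In software terms, that per-node computation takes only a few lines. The sketch below is a minimal Python illustration of a single node, not the researchers' code; the example values and the simple threshold activation are assumptions for clarity.

```python
# Minimal sketch of one node's computation (illustrative values, not the chip's code).
inputs = [0.5, -1.2, 0.8]    # outputs from nodes in the layer below
weights = [0.9, 0.3, -0.4]   # learned weights of the incoming connections
threshold = 0.0              # assumed activation threshold

# Dot product: multiply each input by its connection weight and sum the results.
dot_product = sum(x * w for x, w in zip(inputs, weights))

# The node passes the value on only if it exceeds the threshold.
output = dot_product if dot_product > threshold else 0.0
```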

A neural net is an abstraction: The "nodes" are just weights stored in a computer's memory. Calculating a dot product usually involves fetching a weight from memory, fetching the associated data item, multiplying the two, storing the result somewhere, and then repeating the operation for every input to a node. Given that a neural net will have thousands or even millions of nodes, that's a lot of data to move around.

But that sequence of operations is just a digital approximation of what happens in the brain, where signals traveling along multiple neurons meet at a "synapse," the junction between neurons. The neurons' firing rates and the electrochemical signals that cross the synapse correspond to the data values and weights, respectively. The MIT researchers' new chip improves efficiency by replicating the brain more faithfully.

In the chip, a node's input values are converted into electrical voltages and then multiplied by the appropriate weights. Only the combined voltages are converted back into a digital representation and stored for further processing.

The chip can thus calculate dot products for multiple nodes - 16 at a time, in the prototype - in a single step, instead of shuttling between a processor and memory for every computation.
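As a rough software analogy (the chip does this with analog voltages, not with the code below), computing many dot products in one step is a single matrix-vector multiplication. This NumPy sketch assumes 16 nodes that all read the same 64-element input vector; the sizes and random values are illustrative.

```python
import numpy as np

num_nodes, num_inputs = 16, 64                     # assumed sizes for illustration
inputs = np.random.randn(num_inputs)               # one shared input vector
weights = np.random.randn(num_nodes, num_inputs)   # one row of weights per node

# One "step": all 16 dot products at once, rather than 16 separate
# fetch-multiply-accumulate loops through memory.
dot_products = weights @ inputs                    # shape: (16,)
```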

All or nothing

One of the keys to the system is that all the weights are either 1 or -1. That means they can be implemented within the memory itself as simple switches that either close a circuit or leave it open. Recent theoretical work suggests that neural nets trained with only two weight values should lose little accuracy - somewhere between 1 and 2 percent.
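With weights restricted to +1 or -1, each multiplication collapses to either keeping or negating an input, so a dot product reduces to additions and subtractions. A minimal sketch of that idea, with made-up values:

```python
# Illustrative binary-weight dot product, weights restricted to {+1, -1}.
inputs = [0.5, -1.2, 0.8, 2.0]
weights = [1, -1, -1, 1]   # each weight acts like a switch: keep or negate its input

# No multiplications needed: add the input when the weight is +1, subtract when -1.
dot_product = sum(x if w == 1 else -x for x, w in zip(inputs, weights))
```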

Biswas and Chandrakasan's research bears that prediction out. In experiments, they ran a full-precision implementation of a neural network on a conventional computer and its binary-weight equivalent on their chip. Their chip's results were generally within 2 to 3 percent of the conventional network's.
-end-


Massachusetts Institute of Technology
