
Biologically plausible learning demonstrated in a memristor-based neural network

July 12, 2019

Lobachevsky University scientists, together with their colleagues from the National Research Center "Kurchatov Institute" (Moscow) and the National Research Center "Demokritos" (Athens), are working on the hardware implementation of a spiking neural network based on memristors. The key elements of such a network, along with pulsed neurons, are artificial synaptic connections that can change the strength (weight) of the connection between neurons during learning.

For this purpose, memristive devices based on metal-oxide-metal nanostructures developed at the UNN Physics and Technology Research Institute (PTRI) are suitable, but their use in specific spiking neural network architectures developed at the Kurchatov Institute requires demonstration of biologically plausible learning principles.

The biological mechanism of learning in neural systems is described by Hebb's rule, according to which learning occurs through an increase in the strength of connection (synaptic weight) between simultaneously active neurons, which indicates a causal relationship in their excitation. One refinement of this fundamental rule is spike-timing-dependent plasticity (STDP), in which the weight change depends on the relative arrival times of the pulses.

In accordance with STDP, the synaptic weight increases if the postsynaptic neuron generates a pulse (spike) immediately after the presynaptic one, and, conversely, the synaptic weight decreases if the postsynaptic neuron generates a spike just before the presynaptic one. Moreover, the smaller the time difference Δt between the pre- and postsynaptic spikes, the more pronounced the weight change.
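The timing dependence described above can be sketched as a simple pairwise rule. This is a minimal illustration, not the model used in the paper; the amplitudes and the time constant are assumed values chosen only to show the shape of the STDP window.

```python
import math

# Illustrative STDP window (constants are assumptions, not taken from the
# paper): the weight change decays exponentially with the time difference
# dt = t_post - t_pre between the pre- and postsynaptic spikes.
A_PLUS = 0.01    # maximum potentiation step
A_MINUS = 0.012  # maximum depression step
TAU_MS = 20.0    # decay time constant, ms

def stdp_dw(dt_ms):
    """Weight change for a single pre/post spike pair."""
    if dt_ms > 0:    # post fires after pre -> potentiation
        return A_PLUS * math.exp(-dt_ms / TAU_MS)
    elif dt_ms < 0:  # post fires before pre -> depression
        return -A_MINUS * math.exp(dt_ms / TAU_MS)
    return 0.0
```

For example, a pair separated by Δt = 5 ms produces a much larger weight change than one separated by Δt = 50 ms, reflecting the stronger causal link between closely timed spikes.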

According to one of the researchers, Head of the UNN PTRI laboratory Alexei Mikhailov, in order to demonstrate the STDP principle, memristive nanostructures based on yttria-stabilized zirconia (YSZ) thin films were used. YSZ is a well-known solid-state electrolyte with high oxygen ion mobility.

"Due to a specified concentration of oxygen vacancies, which is determined by the controlled concentration of yttrium impurities, and the heterogeneous structure of the films obtained by magnetron sputtering, such memristive structures demonstrate controlled bipolar switching between different resistive states in a wide resistance range. The switching is associated with the formation and destruction of conductive channels along grain boundaries in the polycrystalline ZrO2 (Y) film," notes Alexei Mikhailov.

An array of memristive devices for this research was implemented as a microchip mounted in a standard cermet casing, which facilitates the integration of the array into a neural network's analog circuit. The full technological cycle for creating memristive microchips is currently implemented at the UNN PTRI. In the future, the devices can be scaled down to a minimum size of about 50 nm, as the Greek partners have established.

"Our studies of the dynamic plasticity of the memristive devices," continues Alexei Mikhailov, "have shown that the form of the conductance change as a function of Δt is in good agreement with the STDP learning rules. It should also be noted that if the initial value of the memristor conductance is close to the maximum, it is easy to reduce the corresponding weight but difficult to enhance it, whereas for a memristor with minimum conductance in the initial state, it is difficult to reduce the weight but easy to enhance it."
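The asymmetry described above resembles what the computational literature calls a "soft-bound" or multiplicative update, where the step size depends on how far the current conductance is from its limits. The sketch below is an assumption-laden illustration of that idea (the normalized range and learning-rate factor are invented for the example), not the device model from the paper.

```python
W_MIN, W_MAX = 0.0, 1.0  # normalized conductance range (assumed)
ETA = 0.1                # learning-rate factor (assumed)

def soft_bound_update(w, potentiate):
    """Conductance-dependent ("soft-bound") weight step.

    Near W_MAX a further increase is small while a decrease is large;
    near W_MIN the opposite holds, mirroring the observed asymmetry
    of the memristive devices.
    """
    if potentiate:
        return w + ETA * (W_MAX - w)
    return w - ETA * (w - W_MIN)
```

With w = 0.95 (near the maximum), the potentiation step is 0.005 while the depression step is 0.095; near the minimum the ratio reverses.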

According to Vyacheslav Demin, director-coordinator in the area of nature-like technologies of the Kurchatov Institute, who is one of the ideologues of this work, the established pattern of change in the memristor conductance clearly demonstrates the possibility of hardware implementation of the so-called local learning rules. Such rules for changing the strength of synaptic connections depend only on the values of variables that are present locally at each time point (neuron activities and current weights).

"This essentially distinguishes such principle from the traditional learning algorithm, which is based on global rules for changing weights, using information on the error values at the current time point for each neuron of the output neural network layer (in a widely popular group of error back propagation methods). The traditional principle is not biosimilar, it requires "external" (expert) knowledge of the correct answers for each example presented to the network (that is, they do not have the property of self-learning). This principle is difficult to implement on the basis of memristors, since it requires controlled precise changes of memristor conductances, as opposed to local rules. Such precise control is not always possible due to the natural variability (a wide range of parameters) of memristors as analog elements," says Vyacheslav Demin.

Local learning rules of the STDP type implemented in hardware on memristors provide the basis for autonomous ("unsupervised") learning of a spiking neural network. In this case, the final state of the network does not depend on its initial state, but depends only on the learning conditions (a specific sequence of pulses). According to Vyacheslav Demin, this opens up prospects for the application of local learning rules based on memristors when solving artificial intelligence problems with the use of complex spiking neural network architectures.
A paper on the results of this research was recently published in the journal Microelectronic Engineering.

Lobachevsky University
