RUDN University mathematicians reduce neural network size sixfold without re-training

February 05, 2021

A team of mathematicians from RUDN University found a way to reduce the size of a trained neural network sixfold without spending additional resources on re-training it. The approach is based on finding the correlation between the weights of the neural connections in the initial network and in its simplified version. The results of the work were published in the journal Optical Memory and Neural Networks.

The structures of artificial neural networks and of neurons in a living organism are based on the same principles. Nodes in a network are interconnected; some of them receive a signal, and some transmit it by activating or suppressing the next element in the chain. Processing any signal (for example, an image or a sound) requires many network elements and the connections between them. However, computer models have limited capacity and storage volume. To work with large data volumes, specialists have to invent ways to lower these requirements, including so-called quantization. It reduces the consumption of resources but requires re-training the system. A team of mathematicians from RUDN University found that the latter step can be avoided.

"Several years ago we carried out efficient and cost-effective quantization of weights in a Hopfield network. It is an associative memory network with symmetrical connections between elements that are formed following Hebb's rule. In the course of its operation, the activity of the network is reduced to a certain equilibrium state, and when it is reached, a task is considered solved. The insights obtained in that study were later applied to feedforward deep learning networks that are very popular in image recognition today. As a rule, these networks require re-training after quantization, but we found a way to avoid it," said Iakov Karandashev, PhD, an Assistant Professor at the Nikolskii Mathematical Institute, RUDN University.

The main idea behind simplifying artificial neural networks is the so-called quantization of weights, i.e., reducing the number of bits used to store each weight. Quantization amounts to averaging the signal: for example, if it is applied to an image, all pixels representing different shades of the same color become identical. Mathematically, it means that neural connections that are similar in certain parameters should share the same weight (or importance) expressed as a single number.
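As a rough sketch of what "fewer bits per weight" means in practice, the example below applies uniform quantization to a weight array so that nearby values collapse onto a small set of shared levels; the function name and the 3-bit setting are illustrative assumptions, and the scheme in the paper may differ.

```python
import numpy as np

def quantize_weights(w, bits=3):
    """Uniformly quantize a weight array to 2**bits shared levels."""
    levels = 2 ** bits
    w_min, w_max = w.min(), w.max()
    step = (w_max - w_min) / (levels - 1)
    codes = np.round((w - w_min) / step).astype(np.uint8)  # small integer codes to store
    w_q = codes * step + w_min                              # reconstructed weight values
    return codes, w_q

# usage: similar float weights collapse onto the same quantized value
w = np.random.default_rng(1).normal(size=1000).astype(np.float32)
codes, w_q = quantize_weights(w, bits=3)
print(len(np.unique(w_q)), "distinct weight values instead of", len(np.unique(w)))
```

Storing the small integer codes instead of 32-bit floats is what shrinks the memory footprint of the network.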

The RUDN University team carried out calculations and derived formulae that establish the correlation between the weights of a neural network before and after quantization. Based on these, the scientists developed algorithms with which a trained neural network can classify images. In their experiment, the mathematicians used a test set of 50,000 photos divided into 1,000 classes. After training, the network was quantized with the new method and not re-trained, and the results were compared with other quantization algorithms.
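The sketch below illustrates the kind of correlation check described above on a stand-in weight matrix; it does not reproduce the paper's analytic formulae, and the layer size and bit width are assumptions made purely for illustration.

```python
import numpy as np

# Stand-in for one trained layer's weights (not data from the study).
rng = np.random.default_rng(2)
w = rng.normal(scale=0.05, size=(512, 512))

# Simple uniform 3-bit quantization, as in the earlier sketch.
levels = 2 ** 3
step = (w.max() - w.min()) / (levels - 1)
w_q = np.round((w - w.min()) / step) * step + w.min()

# Empirical correlation between original and quantized weights.
corr = np.corrcoef(w.ravel(), w_q.ravel())[0, 1]
print(f"correlation between original and quantized weights: {corr:.4f}")
# A correlation close to 1 is the signal that the quantized network can be
# used for classification directly, without re-training.
```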

"After quantization, the classification accuracy decreased by only 1%, but the required storage volume was reduced six times. Experiments show that our network doesn't need re-training due to a strong correlation between initial and quantized weights. This approach could help save resources when completing time-sensitive tasks or working on mobile devices," added Iakov Karandashev from RUDN University.
-end-


RUDN University
