New data processing module makes deep neural networks smarter

September 16, 2020

Artificial intelligence researchers at North Carolina State University have improved the performance of deep neural networks by combining feature normalization and feature attention modules into a single module that they call attentive normalization (AN). The hybrid module significantly improves the accuracy of the networks while adding negligible computational cost.

"Feature normalization is a crucial element of training deep neural networks, and feature attention is equally important for helping networks highlight which features learned from raw data are most important for accomplishing a given task," says Tianfu Wu, corresponding author of a paper on the work and an assistant professor of electrical and computer engineering at NC State. "But they have mostly been treated separately. We found that combining them made them more efficient and effective."

To test their AN module, the researchers plugged it into four of the most widely used neural network architectures: ResNets, DenseNets, MobileNetsV2 and AOGNets. They then tested the networks against two industry standard benchmarks: the ImageNet-1000 classification benchmark and the MS-COCO 2017 object detection and instance segmentation benchmark.

"We found that AN improved performance for all four architectures on both benchmarks," Wu says. "For example, top-1 accuracy in the ImageNet-1000 improved by between 0.5% and 2.7%. And Average Precision (AP) accuracy increased by up to 1.8% for bounding box and 2.2% for semantic mask in MS-COCO.

"Another advantage of AN is that it facilitates better transfer learning between different domains," Wu says. "For example, from image classification in ImageNet to object detection and semantic segmentation in MS-COCO. This is illustrated by the performance improvement in the MS-COCO benchmark, which was obtained by fine-tuning ImageNet-pretrained deep neural networks in MS-COCO, a common workflow in state-of-the-art computer vision.

"We have released the source code and hope our AN will lead to better integrative design of deep neural networks."

The paper, "Attentive Normalization," was presented at the European Conference on Computer Vision (ECCV), which was held online Aug. 23-28. The paper was co-authored by Xilai Li, a recent Ph.D. graduate from NC State, and Wei Sun, a Ph.D. student at NC State. The work was done with support from the National Science Foundation, under grants 1909644, 1822477, and 2013451, and by the U.S. Army Research Office, under grant W911NF1810295.

North Carolina State University
