DUAL takes AI to the next level

December 30, 2020

"Today's computer applications generate a large amount of data that needs to be processed by machine learning algorithms," says Yeseong Kim of Daegu Gyeongbuk Institute of Science and Technology (DGIST), who led the effort.

Powerful 'unsupervised' machine learning involves training an algorithm to recognize patterns in large datasets without providing labelled examples for comparison. One popular approach is clustering, which groups similar data points into classes. These algorithms are used for a wide variety of data analyses, such as identifying fake news on social media, filtering spam in our e-mails, and detecting criminal or fraudulent activity online.
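The grouping idea can be made concrete with a generic k-means sketch. This is an illustration of clustering in general, not DUAL's hardware implementation; the dataset, initialization strategy, and parameter choices below are all assumptions for the example.

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Minimal k-means: repeatedly assign each point to its nearest
    centroid, then move each centroid to the mean of its members."""
    # spread the initial centroids across the dataset by striding
    centroids = points[::max(1, len(points) // k)][:k].copy()
    for _ in range(iters):
        # distance from every point to every centroid, shape (n, k)
        dists = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = points[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return labels, centroids

# two well-separated blobs: k-means recovers the grouping with no labels given
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.5, (50, 2)),
                  rng.normal(5, 0.5, (50, 2))])
labels, centroids = kmeans(data, k=2)
```

Note that no labelled examples are supplied anywhere: the structure emerges purely from distances between points, which is what makes the approach unsupervised.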

"But running clustering algorithms on traditional cores results in high energy consumption and slow processing, because a large amount of data needs to be moved from the computer's memory to its processing unit, where the machine learning tasks are conducted," explains Kim.

Scientists have been looking into processing-in-memory (PIM) approaches. But most PIM architectures are analog-based and require analog-to-digital and digital-to-analog converters, which consume a large share of the chip's power and area. They also work better with supervised machine learning, which relies on labelled datasets to train the algorithm.

To overcome these issues, Kim and his colleagues developed DUAL, which stands for digital-based unsupervised learning acceleration. DUAL enables computations on digital data stored inside a computer memory. It works by mapping all the data points into a high-dimensional space; imagine data points stored in many locations within the human brain.
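A minimal software sketch can illustrate the hyperdimensional encoding idea this builds on: each input is mapped to a long binary "hypervector", and similar inputs land near each other in that space. The random base/level hypervectors, XOR binding, and majority-vote bundling below are common hyperdimensional-computing ingredients used here as assumptions for the example, not the paper's exact scheme.

```python
import numpy as np

D = 10_000  # hypervector dimensionality
rng = np.random.default_rng(0)

def encode(x, feature_hvs, level_hvs, n_levels):
    """Encode a feature vector as one D-bit hypervector: bind (XOR)
    each feature's random base hypervector with the hypervector of
    its quantized value, then bundle all features by majority vote."""
    levels = np.clip((x * n_levels).astype(int), 0, n_levels - 1)
    bound = feature_hvs ^ level_hvs[levels]          # one row per feature
    return (bound.sum(axis=0) > len(x) / 2).astype(np.uint8)

def hamming(a, b):
    """Normalized Hamming distance: fraction of differing bits."""
    return np.count_nonzero(a != b) / D

n_features, n_levels = 16, 8
feature_hvs = rng.integers(0, 2, (n_features, D), dtype=np.uint8)
level_hvs = rng.integers(0, 2, (n_levels, D), dtype=np.uint8)

a = rng.random(n_features)
b = a + rng.normal(0, 0.01, n_features)   # near-duplicate of a
c = rng.random(n_features)                # unrelated point
ha, hb, hc = (encode(v, feature_hvs, level_hvs, n_levels) for v in (a, b, c))
```

Because nearby inputs share most quantized levels, `ha` and `hb` agree on most bits, while the unrelated `hc` sits roughly half the space away. Distance comparisons of this bitwise kind are exactly the sort of operation that digital in-memory hardware can perform without shuttling data to a processor.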

The scientists found DUAL efficiently speeds up many different clustering algorithms across a wide range of large-scale datasets, and significantly improves energy efficiency compared to a state-of-the-art graphics processing unit. The researchers believe this is the first digital-based PIM architecture that can accelerate unsupervised machine learning.

"Existing state-of-the-art in-memory computing research focuses on accelerating supervised learning algorithms through artificial neural networks, which increases chip design costs and may not guarantee sufficient learning quality," says Kim. "We showed that combining hyperdimensional and in-memory computing can significantly improve efficiency while providing sufficient accuracy."
-end-


DGIST (Daegu Gyeongbuk Institute of Science and Technology)
