
Information theory holds surprises for machine learning

January 24, 2019

New research challenges a popular conception of how machine learning algorithms "think" about certain tasks.

The conception goes something like this: because of their ability to discard useless information, a class of machine learning algorithms called deep neural networks can learn general concepts from raw data, such as recognizing cats in general after encountering tens of thousands of images of different cats in different situations. This seemingly human ability is said to arise as a byproduct of the networks' layered architecture. Early layers encode the "cat" label along with all of the raw information needed for prediction. Subsequent layers then compress the information, as if through a bottleneck. Irrelevant data, like the color of the cat's coat or the saucer of milk beside it, is forgotten, leaving only general features behind. Information theory provides bounds on just how optimal each layer is, in terms of how well it balances the competing demands of compression and prediction.
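For readers who want the formal statement behind that last sentence: in the standard information bottleneck formulation (due to Tishby and colleagues; the notation here is the textbook form, not anything quoted from the new paper), an intermediate representation T of an input X with label Y is judged by the objective

    L = I(X; T) − β · I(T; Y)

to be minimized over the mapping from inputs to representations, where I(·;·) denotes mutual information. A small I(X; T) means strong compression, a large I(T; Y) means strong prediction, and the parameter β sets the exchange rate between the two.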

"A lot of times when you have a neural network and it learns to map faces to names, or pictures to numerical digits, or amazing things like French text to English text, it has a lot of intermediate hidden layers that information flows through," says Artemy Kolchinsky, an SFI Postdoctoral Fellow and the study's lead author. "So there's this long-standing idea that as raw inputs get transformed to these intermediate representations, the system is trading prediction for compression, and building higher-level concepts through this information bottleneck."

However, Kolchinsky and his collaborators Brendan Tracey (SFI, MIT) and Steven Van Kuyk (University of Wellington) uncovered a surprising weakness when they applied this explanation to common classification problems, where each input has one correct output (e.g., each picture is either of a cat or of a dog). In such cases, they found that classifiers with many layers generally do not give up some prediction for improved compression. They also found that there are many "trivial" representations of the inputs which are, from the point of view of information theory, optimal in terms of their balance between prediction and compression.

"We found that this information bottleneck measure doesn't see compression in the same way you or I would. Given the choice, it is just as happy to lump 'martini glasses' in with 'Labradors', as it is to lump them in with 'champagne flutes,'" Tracey explains. "This means we should keep searching for compression measures that better match our notions of compression."

While the idea of compressing inputs may still play a useful role in machine learning, this research suggests it is not sufficient for evaluating the internal representations used by different machine learning algorithms.

At the same time, Kolchinsky says the concept of a trade-off between compression and prediction will still hold for less deterministic tasks, like predicting the weather from a noisy dataset. "We're not saying that information bottleneck is useless for supervised [machine] learning," Kolchinsky stresses. "What we're showing here is that it behaves counter-intuitively on many common machine learning problems, and that's something people in the machine learning community should be aware of."
The paper has been accepted to the 2019 International Conference on Learning Representations (ICLR 2019).

A copy of the preprint is available on arXiv.

Santa Fe Institute

