
Gestures and visual animations reveal cognitive origins of linguistic meaning

April 25, 2019

Gestures and visual animations can help reveal the cognitive origins of meaning, indicating that our minds can assign a linguistic structure to new informational content "on the fly"--even if it is not linguistic in nature.

These conclusions stem from two studies, one in linguistics and the other in experimental psychology, appearing in Natural Language & Linguistic Theory and Proceedings of the National Academy of Sciences (PNAS).

"These results suggest that far less is encoded in words than was originally thought," explains Philippe Schlenker, a senior researcher at Institut Jean-Nicod within France's National Center for Scientific Research (CNRS) and a Global Distinguished Professor at New York University, who wrote the first study and co-authored the second. "Rather, our mind has a 'meaning engine' that can apply to linguistic and non-linguistic material alike.

"Taken together, these findings provide new insights into the cognitive origins of linguistic meaning."

Contemporary linguistics has established that language conveys information through a highly articulated typology of inferences. For instance, "I have a dog" asserts that I own a dog, but it also suggests (or "implicates") that I have no more than one: the hearer assumes that if I had two dogs, I would have said so (as "I have two dogs" is more informative).

Unlike asserted content, implicated content isn't targeted by negation: "I don't have a dog" thus means that I don't have any dog, not that I don't have exactly one. Further inferential types are characterized by further properties: the sentence "I spoil my dog" still conveys that I have a dog, but this is neither asserted nor implicated; rather, it is "presupposed"--i.e. taken for granted in the conversation. Unlike asserted and implicated information, presuppositions are preserved in negative statements, and thus "I don't spoil my dog" still presupposes that I have a dog.
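The different behavior of these inference types under negation can be illustrated with a toy sketch (this is an illustration only, not code from either study; the class and function names are invented for the example):

```python
# Toy model of how three inference types behave under negation:
# asserted content is targeted, implicatures disappear, and
# presuppositions "project" (i.e. survive negation unchanged).

from dataclasses import dataclass


@dataclass
class Inference:
    content: str
    kind: str  # "asserted", "implicated", or "presupposed"


def negate(inferences):
    """Return the inferences conveyed by the negated sentence."""
    result = []
    for inf in inferences:
        if inf.kind == "asserted":
            # Negation targets asserted content directly.
            result.append(Inference(f"not ({inf.content})", "asserted"))
        elif inf.kind == "presupposed":
            # Presuppositions are preserved under negation.
            result.append(inf)
        # Implicated content is simply dropped under negation.
    return result


# "I spoil my dog": asserts the spoiling, presupposes dog ownership.
spoil = [
    Inference("I spoil my dog", "asserted"),
    Inference("I have a dog", "presupposed"),
]

negated = negate(spoil)
# "I don't spoil my dog" still presupposes that I have a dog.
assert Inference("I have a dog", "presupposed") in negated
```

Running `negate` on the "I spoil my dog" example mirrors the pattern described above: the negated sentence denies the spoiling while the dog-ownership presupposition carries through untouched.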

A fundamental question of contemporary linguistics is: Which of these inferences come from arbitrary properties of words stored in our mental dictionary and which result from general, productive processes?

In the Natural Language & Linguistic Theory work and the PNAS study, written by Lyn Tieu of Australia's Western Sydney University, Schlenker, and CNRS's Emmanuel Chemla, the authors argue that nearly all inferential types result from general, and possibly non-linguistic, processes.

Their conclusion is based on an understudied type of sentence in which a gesture replaces an ordinary word. For instance, in the sentence "You should UNSCREW-BULB," the capitalized expression stands for a gesture of unscrewing a bulb from the ceiling. Even if the hearer is seeing the gesture for the first time (and thus cannot have stored it in a mental dictionary), it is understood thanks to its visual content.

This makes it possible to test how its informational content (i.e. unscrewing a bulb that's on the ceiling) is divided on the fly among the typology of inferences. In this case, the unscrewing action is asserted, but the presence of a bulb on the ceiling is presupposed, as shown by the fact that the negation ("You shouldn't UNSCREW-BULB") preserves this information. By systematically investigating such gestures, the Natural Language & Linguistic Theory study reaches a ground-breaking conclusion: nearly all inferential types (eight in total) can be generated on the fly, suggesting that all are due to productive processes.

The PNAS study investigates four of these inferential types with experimental methods, confirming the results of the linguistic study. But it also goes one step further by replacing the gestures with visual animations embedded in written texts, thus answering two new questions: First, can the results be reproduced for visual stimuli that subjects cannot possibly have seen in a linguistic context, given that people routinely speak with gestures but not with visual animations? Second, can entirely non-linguistic material be structured by the same processes?

Both answers are positive.

In a series of experiments, approximately 100 subjects watched videos of sentences in which some words were replaced either by gestures or by visual animations. They were asked how strongly they derived various inferences that are the hallmarks of different inferential types (for instance, inferences derived in the presence of negation). The subjects' judgments displayed the characteristic signature of four classic inferential types (including presuppositions and implicated content) in gestures but also in visual animations: the informational content of these non-standard expressions was, as expected, divided on the fly by the experiments' subjects among well-established slots of the inferential typology.
-end-
Natural Language & Linguistic Theory paper: https://rdcu.be/bb7yF

PNAS paper: https://www.pnas.org/lookup/doi/10.1073/pnas.1821018116

New York University
