
Towards a new generation of human-inspired language models

01.28.25 | Vrije Universiteit Brussel


"Children learn their native language by communicating with the people around them in their environment. As they play and experiment with language, they attempt to interpret the intentions of their conversation partners. In this way, they gradually learn to understand and use linguistic constructions. This process, in which language is acquired through interaction and meaningful context, is at the core of human language acquisition," says Katrien Beuls.

"The current generation of large language models (LLMs), such as ChatGPT, learns language in a very different way," adds Paul Van Eecke. "By observing vast amounts of text and identifying which words frequently appear together, they generate texts that are often indistinguishable from human writing. This results in models that are extremely powerful in many forms of text generation—such as summarizing, translating, or answering questions—but that also exhibit inherent limitations. They are susceptible to hallucinations and biases, often struggle with human reasoning, and require enormous amounts of data and energy to build and operate."

The researchers propose an alternative model in which artificial agents learn language as humans do—by engaging in meaningful communicative interactions within their environment. Through a series of experiments, they demonstrate how these agents develop linguistic constructions that are directly linked to their surroundings and sensory perceptions. This leads to language models that:

- are less prone to hallucinations and biases, because their understanding of language is grounded in direct interaction with the world;
- use data and energy more efficiently, resulting in a smaller ecological footprint;
- are more deeply rooted in meaning and intention, allowing them to understand language and context in a more human-like way.
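To make the idea of learning language through situated communicative interactions more concrete, the sketch below shows a minimal "naming game" in Python: a small population of agents repeatedly names objects in a jointly perceived scene, invents words when needed, and adjusts word scores based on communicative success, so that a shared, grounded lexicon gradually emerges. This is an illustrative toy under simplifying assumptions, not the researchers' actual model; the Agent class, the object list, and the scoring scheme are inventions for this example.

# Minimal naming-game sketch: agents ground words in a shared scene through
# repeated communicative interactions. Illustrative only; not the authors'
# implementation. Agent, OBJECTS and the scoring scheme are assumptions.
import random

OBJECTS = ["red-cube", "blue-ball", "green-cone"]  # the shared "environment"


class Agent:
    def __init__(self):
        # lexicon maps each object to a dictionary of word -> score
        self.lexicon = {obj: {} for obj in OBJECTS}

    def name_for(self, obj):
        """Speaker: use the best-scoring word for the object, inventing one if needed."""
        words = self.lexicon[obj]
        if not words:
            new_word = "w" + "".join(random.choices("aeioubdklmt", k=5))
            words[new_word] = 0.5
        return max(words, key=words.get)

    def interpret(self, word):
        """Hearer: return the object this word most strongly evokes, or None."""
        best, best_score = None, 0.0
        for obj, words in self.lexicon.items():
            if words.get(word, 0.0) > best_score:
                best, best_score = obj, words[word]
        return best

    def align(self, obj, word, success):
        """Adjust word scores after the interaction (lateral inhibition on success)."""
        words = self.lexicon[obj]
        if success:
            words[word] = min(1.0, words.get(word, 0.5) + 0.1)
            for other in list(words):
                if other != word:
                    words[other] = max(0.0, words[other] - 0.1)
        else:
            words[word] = max(0.0, words.get(word, 0.5) - 0.1)


def play(agents, rounds=2000):
    """Repeated pairwise interactions about a jointly perceived topic."""
    for _ in range(rounds):
        speaker, hearer = random.sample(agents, 2)
        topic = random.choice(OBJECTS)                 # shared, situated context
        word = speaker.name_for(topic)
        success = hearer.interpret(word) == topic
        if success:
            speaker.align(topic, word, True)
            hearer.align(topic, word, True)
        else:
            speaker.align(topic, word, False)          # speaker weakens the failed word
            hearer.lexicon[topic].setdefault(word, 0.5)  # hearer adopts the new word


if __name__ == "__main__":
    population = [Agent() for _ in range(5)]
    play(population)
    for obj in OBJECTS:
        print(obj, "->", population[0].name_for(obj))

Running the script for a few thousand interactions typically yields one dominant word per object across the population, mirroring, in a highly simplified way, how shared linguistic conventions can emerge from grounded interaction alone.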

"Integrating communicative and situated interactions into AI models is a crucial step in developing the next generation of language models. This research offers a promising path toward language technologies that more closely resemble how humans understand and use language," the researchers conclude.

Reference:

Katrien Beuls and Paul Van Eecke (2024). Humans Learn Language from Situated Communicative Interactions. What about Machines? Computational Linguistics, 50(4): 1277–1311. https://doi.org/10.1162/coli_a_00534

Contact Information

Sam Jaspers
Vrije Universiteit Brussel
sam.jaspers@vub.be
