
Teens are becoming concerned about their attachment to AI chatbots

04.13.26 | Drexel University


It’s estimated that more than half of all U.S. teens regularly use companion chatbots powered by large language models and generative artificial intelligence (AI) technology. The programs, such as Character.AI, Replika and Kindroid, are intended to provide companionship, according to the companies that make them. But a recent study from Drexel University suggests that teens are concerned that these attachments are becoming unhealthy and affecting their lives offline.

The study, which will be presented at the Association for Computing Machinery’s Conference on Human Factors in Computing Systems in April, looked at a sample of more than 300 Reddit posts from users identifying themselves as 13 to 17 years old who had specifically posted about their dependency and overreliance on Character.AI. It found that in many cases, teens began using the technology for emotional and psychological support or entertainment, but their use evolved into dependency and even patterns associated with addiction. Some reported that their overuse disrupted sleep, caused academic struggles and strained relationships.

“This study provides one of the first teen-centered accounts of overreliance on AI companions,” said Afsaneh Razi, PhD, an assistant professor in Drexel’s College of Computing & Informatics, whose ETHOS lab, which studies how people’s interactions with computing and AI systems affect their social behavior, wellbeing and safety, led the research. “It highlights how these interactions are affecting the lives of young users and introduces a framework for chatbot design that promotes healthy interactions.”

About a quarter of the posts suggested that the teens were using Character.AI for some sort of emotional or psychological support, ranging from coping with distress, loneliness and isolation to seeking advice for mental health struggles. Just over 5% reported using it for brainstorming, creative activities or entertainment.

And while the posts seem to indicate these interactions started as harmless, or even helpful, they evolved into a stronger attachment that became as difficult to break as an addiction, according to the researchers.

“By mapping teens’ experiences to the known components of behavioral addiction, we were able to see clear patterns like conflict, withdrawal and relapse showing up in their posts, which suggests this is more than just frequent or enthusiastic use,” said Matt Namvarpour, a doctoral student in the department of Information Science and ETHOS lab, who is the first author of the research. “Many teens described starting with something that felt helpful or harmless, but over time it became something they struggled to step away from, even when they wanted to.”

Within the 318 posts they analyzed, researchers found evidence of all six of the components associated with behavioral addiction: salience, mood modification, tolerance, withdrawal, conflict and relapse.

“What makes this especially tricky is that chatbots are interactive and emotionally responsive, so the experience can feel more like a relationship than a tool,” Namvarpour said. “Because of that, stepping away is not just stopping a habit, it can feel like distancing from something meaningful, which makes overreliance harder to recognize and address.”

While addiction to technology, such as video games, has been studied and identified as a psychological condition, the unique interactivity of AI chatbots makes users particularly susceptible to forming problematic attachments, according to the researchers. Because of this, they suggest that extra care must be taken with their design in order to protect users.

“Personalization, multimodality and memory set AI companions apart from earlier technologies and make overreliance harder to disentangle from authentic-feeling relationships,” the researchers wrote. “This underscores the need for further research on the unique characteristics of these relationships and how challenges specific to companion chatbots should be addressed.”

The team offered a design framework to help address this concern. It focuses on understanding the needs of chatbot users, how and why they may form attachments and how the bots can be trained to curtail them while being respectful and supportive. They also recommend that the programs provide an easy and clean exit for users.

“It’s important for designers to ensure that chatbots are offering guidance that helps users build confidence in their abilities to form relationships offline, as a healthy way of finding emotional support, without using cues that may lead them to anthropomorphize the technology and develop attachments to it,” Razi said. “Our framework also calls on designers to provide a variety of off-ramps for users to easily disengage with the program on their own terms and without a sense of abruptness or finality.”

Including features like usage tracking, emotional check-in prompts and personalized usage limits could also be effective ways to carefully curtail use, the researchers suggested. They also recommended including input from users and mental health professionals in the design process.
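To make the suggested safeguards concrete, here is a minimal sketch of how usage tracking, check-in prompts and a personalized daily limit might fit together. The class name, thresholds and prompt labels are hypothetical illustrations, not details from the study or from any actual chatbot product.

```python
from dataclasses import dataclass

@dataclass
class UsageGuard:
    """Hypothetical sketch of the usage-tracking features the researchers
    describe; the thresholds below are illustrative defaults, not values
    taken from the study."""
    daily_limit_min: int = 60        # personalized daily cap, in minutes
    checkin_interval_min: int = 20   # prompt an emotional check-in this often
    minutes_used: int = 0

    def record(self, minutes: int) -> list[str]:
        """Log chat time and return any prompts the interface should show."""
        prompts = []
        before = self.minutes_used
        self.minutes_used += minutes
        # Emotional check-in each time another interval boundary is crossed
        if self.minutes_used // self.checkin_interval_min > before // self.checkin_interval_min:
            prompts.append("check_in")
        # Gentle off-ramp the first time the personalized daily limit is reached
        if before < self.daily_limit_min <= self.minutes_used:
            prompts.append("offer_exit")
        return prompts

guard = UsageGuard()
print(guard.record(25))  # ['check_in'] -- crossed the 20-minute mark
print(guard.record(40))  # ['check_in', 'offer_exit'] -- crossed 40 min and the 60-min cap
```

The design choice of offering an exit prompt rather than cutting the session off mirrors the researchers’ call for off-ramps that let users disengage “on their own terms and without a sense of abruptness or finality.”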

“Designers now carry the responsibility to build systems with empathy, nuance and attention to detail to not only protect teens from harm, but also help them cultivate resilience, growth and greater fulfillment in their lives,” they concluded.

To expand on this research, the team pointed to studying larger communities of users from a wider demographic range, potentially through surveys or interviews, as well as users of other chatbots and of platforms other than Reddit.

Article Information

Title: Understanding Teen Overreliance on AI Companion Chatbots Through Self-Reported Reddit Narratives
DOI: 10.1145/3772318.3790597
Publication date: 13-Apr-2026
Method of research: Data/statistical analysis

Contact Information

Britt Faulstick
Drexel University
bef29@drexel.edu

