
How far can automation and AI support psychotherapy?

April 6, 2026 | University of Utah


Psychotherapy has always been a deeply human endeavor: a patient talking, a therapist listening and responding, and healing happening through words. But with the rapid rise of conversational artificial intelligence, particularly large language models (LLMs), that paradigm is shifting fast.

A team of University of Utah researchers is tackling this change, but not by asking, “Will robots replace therapists?” Rather, they explore more practical questions: What exactly are we automating, and how much?

“The history of new technology like this is almost always about collaboration, and it's about how it supports the human expert in doing the work they can do,” said Zac Imel, a professor of educational psychology and lead author of a new study titled “A Framework for Automation in Psychotherapy.” “It might be useful to think about frameworks for understanding the different types of work that could be done through automation, and that's what this paper is.”

The study is the result of a cross-campus collaboration among researchers from the U’s College of Engineering, School of Medicine and College of Education.

Simply put, automation is when machines perform tasks humans have previously done. In therapy, that could range from a chatbot delivering prewritten coping tips to AI systems that take and organize notes, analyze therapy sessions and provide feedback to clinicians, or even talk directly to patients.

Varying degrees of automation

Co-author Vivek Srikumar uses self-driving cars as an analogy for the varying levels of automation.

“The automobile industry has been introducing driver assistance systems in our cars for many years now, and the extreme end is self-driving cars,” said Srikumar, an associate professor at the Kahlert School of Computing. “This paper can be seen from that perspective. The extreme version of AI in psychotherapy is an AI therapist, but there are different levels of automation that might be associated with different amounts of risk. You might have different capabilities or assistance that is provided to therapists, to clients, to organizations by AI.”

Imel and Srikumar are long-time collaborators who teamed up with Brent Kious, an associate professor of psychiatry, to craft the automation framework, which was posted in advance of publication by Current Directions in Psychological Science.

The team outlined four categories, representing different levels of automation along a continuum.

The team evaluated each category for its potential utility and risk levels, which vary widely. A scripted chatbot, an AI coaching tool for therapists, and a fully autonomous AI therapist are fundamentally different technologies with different risks. However, it’s often not clear to users, or even health systems, which technology they are using.

Weighing risks and benefits

“By cataloging the various levels of automation, the same question takes on different flavors at various levels, questions about risk, questions about consent, who gets to consent and how much consent and the impact of potential mistakes and the questions about who and how much responsibility is borne by various parties,” Srikumar said. “All of these things, the questions remain the same, but the impact of these questions changes.”

The team is particularly interested in improving the way clinicians are evaluated and mentored to improve the level of care provided to patients.

“We are currently partnering with SafeUT, Utah’s statewide text-based crisis line, to develop tools that help evaluate crisis counselors’ sessions so that they can get feedback to maintain key skills and even develop new ones as we learn more about crisis counseling,” Kious said.

Evaluation and training are where large language models can support therapists without coming close to replacing them, Imel said. Current methods are no match for the scale of need in mental health care.

Automating without replacing human therapists

“To evaluate a psychotherapy session is tremendously labor-intensive. It's slow, it's unreliable, it rarely gets used,” Imel said. “You're not recording your sessions and then mailing them off to an expert who can listen to them and evaluate them and give you feedback and then send it back to you so you can learn from it.” Here, appropriately trained LLMs can capture core components of treatment and provide that information back to therapists quickly, often in real time.

The researchers note that anyone can now turn to ChatGPT for counseling that might resemble psychotherapy. LLMs are designed to be engaging and sound empathetic, and are trained on vast datasets, but they don’t necessarily use evidence-based psychotherapy techniques. Accordingly, they carry huge risks since they are known to fabricate information, encode biases and respond unpredictably.

“Why would one want to deploy the riskiest version of a tool when there are so many lighter versions of it that we can already deploy that are going to make life easier?” Srikumar said. “A note-taking application, for example, something that maintains notes across a session. These are already going to improve the quality of life for clinicians, the quality of service.”

The team also envisions a role for AI in crisis hotlines someday.

“It’s a really challenging environment where you don't know anything about the people you're talking to. They're calling in, you may only have five or six talk turns to connect with them. You have a very confined space to try and help this person and get them safe and reduce risk,” Srikumar said. “What I do foresee is that future crisis counseling systems will be heavily augmented by AI because the scale is too big to be satisfied without automation.”

The study, titled “A Framework for Automation in Psychotherapy,” appears in the April edition of Current Directions in Psychological Science. Lead author Zac Imel is a co-founder of Lyssn, a tech company in Seattle developing AI-based quality-improvement programs for behavioral health services. Co-authors include researchers with the University of Washington, University of Pennsylvania and the Alan Turing Institute.

Journal: Current Directions in Psychological Science
DOI: 10.1177/09637214251386047
Method of Research: Systematic review
Subject of Research: Not applicable
Article Title: A Framework for Automation in Psychotherapy
Article Publication Date: 1-Apr-2026

COI Statement: Zac Imel is a cofounder and minority equity shareholder in Lyssn, a technology company focused on improving the quality of psychotherapy. The authors declared that there were no other potential conflicts of interest with respect to the authorship or the publication of this article.


Contact Information

Brian Maffly
University of Utah
brian.maffly@utah.edu

How to Cite This Article

APA:
University of Utah. (2026, April 6). How far can automation and AI support psychotherapy? Brightsurf News. https://www.brightsurf.com/news/80EO92E8/how-far-can-automation-and-ai-support-psychotherapy.html
MLA:
"How far can automation and AI support psychotherapy?" Brightsurf News, 6 Apr. 2026, https://www.brightsurf.com/news/80EO92E8/how-far-can-automation-and-ai-support-psychotherapy.html.