
Uncovering the highly skilled emotional work of content moderators

07.10.25 | King's College London

Healthcare platform moderators use strategies to manage distressing material while staying engaged enough to protect vulnerable users, finds a new study.

The researchers call for regulators and platform providers to take steps to reduce the toll of this highly skilled emotional work.

The study, published in the Journal of Management Studies, looks at the work of the moderation team at Care Opinion, a UK-based non-profit online health platform. The platform serves as a ‘TripAdvisor for healthcare’ but with a strong social service ethos, given its role in shaping public perceptions of healthcare providers and their staff. Moderators on the platform often encounter deeply personal or distressing stories and operate with limited but important discretion: to alter a story’s content, sometimes after discussion with its author; to withdraw a story from publication; or to exercise a duty of care to safeguard a story’s author.

The study, by researchers at King’s Business School at King's College London and the University of Sussex, found that moderators engage in five main practices as part of their work.

“Our findings show that far from being emotionless cogs, moderators have to manage their emotions about highly charged information in order to conform to the platform’s neutral stance. Yet they still engage emotionally and retain feelings of empathy or distress. This is what enables them to bend the rules and occasionally to offer care – even in a tightly controlled system,” says study co-author Dimitra Petrakaki, Professor of Technology and Organisation, University of Sussex Business School.

The authors’ findings have important implications for many of the stakeholders involved in healthcare and other platforms where sensitive or distressing content is shared.

Platform operators and designers should embed support systems like well-being check-ins and debrief tools into moderator dashboards and ensure staff have training on emotion regulation and empathetic communication.

Policy makers and regulators should develop digital occupational safety and health standards that mandate safeguards such as rotation policies and access to counselling.

Content moderators themselves should understand the skilled, emotional work they perform, rather than seeing themselves as mere “rule-enforcers”. They should develop mechanisms for collective support, such as peer communities of practice.

“Moderators told us about the challenges of their role: that there is some content they just can’t moderate, and the weight of responsibility they feel when editing someone’s story. It’s important that, as a society, we better value the emotional work of content moderators and build a greater understanding of what it means to perform this new role across different sectors, countries and cultures,” says study co-author Andreas Kornelakis, Reader in Comparative Management at King’s Business School.

Article Information

Journal: Journal of Management Studies
DOI: 10.1111/joms.13219
Article title: What Do Content Moderators Do? Emotion Work and Control on a Digital Health Platform
Publication date: 18 March 2025
Method of research: Case study
Subject of research: People

Contact Information

Tanya Wood
King's College London
tanya.wood@kcl.ac.uk

How to Cite This Article

APA:
King's College London. (2025, July 10). Uncovering the highly skilled emotional work of content moderators. Brightsurf News. https://www.brightsurf.com/news/12D5ERY1/uncovering-the-highly-skilled-emotional-work-of-content-moderators.html
MLA:
"Uncovering the highly skilled emotional work of content moderators." Brightsurf News, 10 July 2025, https://www.brightsurf.com/news/12D5ERY1/uncovering-the-highly-skilled-emotional-work-of-content-moderators.html.