Content moderators are influenced by online misinformation

11.19.24 | PNAS Nexus

Repeated exposure to lies online may influence the beliefs of professional content moderators, with consequences for online platforms. Hundreds of thousands of content moderators, typically based in non-Western countries, identify and weed out problematic and false content on social platforms. However, constant exposure to misinformation could convince some moderators that false claims are true, a phenomenon known as the "illusory truth effect."

Hause Lin and colleagues assessed the extent of this effect among professional content moderators in India and the Philippines and explored whether encouraging an accuracy mindset reduces it. The authors asked 199 content moderators to rate 16 COVID-19 news headlines, first on their interestingness and then, after a break, on their accuracy, along with 32 new COVID-19 news headlines. As predicted by the illusory truth effect, headlines seen for the second time were 7.1% more likely to be judged accurate than non-repeated headlines. However, in a similar experiment in which content moderators were asked to rate accuracy first, thereby encouraging an accuracy mindset, repeated headlines were not rated as more accurate than new ones. Experiments with members of the public in India and the Philippines yielded similar results.

According to the authors, the findings indicate that the illusory truth effect is not idiosyncratic to Western populations and that content moderators may become less effective over time because of chronic exposure to falsehoods, which could compromise the safety and integrity of online platforms. Accuracy mindset prompts could help, the authors note.

PNAS Nexus

Accuracy prompts protect professional content moderators from the illusory truth effect

19-Nov-2024

Research by G.P. and D.G.R. has been funded by Meta and Google. TaskUs authors are employees of TaskUs. M.S. is an employee of TikTok and a former employee of TaskUs. D.S. is an employee of Google. G.P. was a Faculty Research Fellow at Google in 2022. D.G.R. is on the PNAS Nexus editorial board.

Contact Information

Hause Lin
Massachusetts Institute of Technology
hause@mit.edu

How to Cite This Article

APA:
PNAS Nexus. (2024, November 19). Content moderators are influenced by online misinformation. Brightsurf News. https://www.brightsurf.com/news/12D75RO1/content-moderators-are-influenced-by-online-misinformation.html
MLA:
"Content moderators are influenced by online misinformation." Brightsurf News, Nov. 19 2024, https://www.brightsurf.com/news/12D75RO1/content-moderators-are-influenced-by-online-misinformation.html.