How people investigate -- or don't -- fake news on Twitter and Facebook

March 18, 2020

Social media platforms, such as Facebook and Twitter, provide people with a lot of information, but it's getting harder and harder to tell what's real and what's not.

Researchers at the University of Washington wanted to know how people investigated potentially suspicious posts on their own feeds. The team watched 25 participants scroll through their Facebook or Twitter feeds while, unbeknownst to them, a Google Chrome extension randomly added debunked content on top of some of the real posts. Participants had various reactions to encountering a fake post: Some outright ignored it, some took it at face value, some investigated whether it was true, and some were suspicious of it but then chose to ignore it. These results have been accepted to the 2020 ACM CHI Conference on Human Factors in Computing Systems.

"We wanted to understand what people do when they encounter fake news or misinformation in their feeds. Do they notice it? What do they do about it?" said senior author Franziska Roesner, a UW associate professor in the Paul G. Allen School of Computer Science & Engineering. "There are a lot of people who are trying to be good consumers of information and they're struggling. If we can understand what these people are doing, we might be able to design tools that can help them."

Previous research on how people interact with misinformation asked participants to examine content from a researcher-created account, not from someone they chose to follow.

"That might make people automatically suspicious," said lead author Christine Geeng, a UW doctoral student in the Allen School. "We made sure that all the posts looked like they came from people that our participants followed."

The researchers recruited participants ages 18 to 74 from across the Seattle area, explaining that the team was interested in seeing how people use social media. Participants used Twitter or Facebook at least once a week and often browsed the platforms on a laptop.

The team then developed a Chrome extension that randomly layered fake posts or memes, ones already debunked by the fact-checking website Snopes.com, on top of real posts, temporarily making it appear that people in the participants' feeds were sharing them. So instead of seeing a cousin's post about a recent vacation, a participant would see that cousin sharing one of the fake stories.
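
The paper does not include the extension's code; the following TypeScript sketch is only a plausible illustration of that overlay approach, with the selectors, file paths, and story data invented for the example.

```typescript
// A minimal, hypothetical content-script sketch of the overlay idea described
// above. Selectors, file paths and the story list are invented placeholders;
// the study's actual extension is not published with the paper.

interface FakeStory {
  headline: string;
  imagePath: string;
}

// Placeholder pool of stories previously debunked by Snopes.com.
const DEBUNKED_STORIES: FakeStory[] = [
  { headline: "Example debunked headline", imagePath: "images/debunked-meme.png" },
];

// Assumed, platform-specific selector for posts in the feed.
const POST_SELECTOR = '[data-testid="tweet"], [role="article"]';

function overlayFakePost(post: HTMLElement): void {
  const story =
    DEBUNKED_STORIES[Math.floor(Math.random() * DEBUNKED_STORIES.length)];

  // Replace only the post body, so the real author still appears to share it.
  const body = post.querySelector<HTMLElement>("p") ?? post;
  const overlay = document.createElement("div");
  overlay.textContent = story.headline;

  const image = document.createElement("img");
  image.src = chrome.runtime.getURL(story.imagePath); // image bundled with the extension
  overlay.appendChild(image);

  body.replaceChildren(overlay); // the change is purely client-side and temporary
  post.dataset.studyModified = "true";
}

// Watch the feed and modify a random fraction of newly rendered posts.
new MutationObserver(() => {
  document.querySelectorAll<HTMLElement>(POST_SELECTOR).forEach((post) => {
    if (post.dataset.studySeen) return; // consider each post only once
    post.dataset.studySeen = "true";
    if (Math.random() < 0.1) overlayFakePost(post);
  });
}).observe(document.body, { childList: true, subtree: true });
```

Because a content script like this modifies pages only in the participant's browser, nothing is actually posted or changed on the platforms themselves.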

The researchers either installed the extension on the participant's laptop or had the participant log into their accounts on a researcher's laptop that had the extension enabled. The team told participants that the extension would modify their feeds (the researchers did not say how) and would track their likes and shares during the study, though in fact it tracked nothing. The extension was removed from participants' laptops at the end of the study.

"We'd have them scroll through their feeds with the extension active," Geeng said. "I told them to think aloud about what they were doing or what they would do if they were in a situation without me in the room. So then people would talk about 'Oh yeah, I would read this article,' or 'I would skip this.' Sometimes I would ask questions like, 'Why are you skipping this? Why would you like that?'"

Participants could not actually like or share the fake posts. On Twitter, a "retweet" would share the real content beneath the fake post. The one time a participant did retweet content under the fake post, the researchers helped them undo it after the study was over. On Facebook, the like and share buttons didn't work at all.
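
Again as a hypothetical sketch rather than the study's published code, the Facebook-style behavior, where the like and share buttons on injected posts simply do nothing, could be approximated with a capture-phase click handler (selectors are placeholders):

```typescript
// Hypothetical: swallow like/share clicks on posts the extension has modified,
// so participants cannot actually interact with the fake content.
document.addEventListener(
  "click",
  (event) => {
    const target = event.target as HTMLElement;
    const modifiedPost = target.closest('[data-study-modified="true"]');
    const button = target.closest('[aria-label="Like"], [aria-label="Share"]');
    if (modifiedPost && button) {
      event.preventDefault();
      event.stopPropagation(); // the platform's own handlers never see the click
    }
  },
  true // capture phase, so this handler runs before the page's listeners
);
```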

After the participants encountered all the fake posts -- nine for Facebook and seven for Twitter -- the researchers stopped the study and explained what was going on.

"It wasn't like we said, 'Hey, there were some fake posts in there.' We said, 'It's hard to spot misinformation. Here were all the fake posts you just saw. These were fake, and your friends did not really post them,'" Geeng said. "Our goal was not to trick participants or to make them feel exposed. We wanted to normalize the difficulty of determining what's fake and what's not."

The researchers concluded the interview by asking participants to share what types of strategies they use to detect misinformation.

In general, the researchers found that participants ignored many posts, especially those they deemed too long, overly political or not relevant to them.

But certain types of posts made participants skeptical. For example, people noticed when a post didn't match someone's usual content. Sometimes participants investigated suspicious posts -- by looking at who had posted them, evaluating the content's source or reading the comments below -- and other times, people just scrolled past them.

"I am interested in the times that people are skeptical but then choose not to investigate. Do they still incorporate it into their worldviews somehow?" Roesner said. "At the time someone might say, 'That's an ad. I'm going to ignore it.' But then later do they remember something about the content, and forget that it was from an ad they skipped? That's something we're trying to study more now."

While this study was small, it does provide a framework for how people react to misinformation on social media, the team said. Now researchers can use this as a starting point to seek interventions to help people resist misinformation in their feeds.

"Participants had these strong models of what their feeds and the people in their social network were normally like. They noticed when it was weird. And that surprised me a little," Roesner said. "It's easy to say we need to build these social media platforms so that people don't get confused by fake posts. But I think there are opportunities for designers to incorporate people and their understanding of their own networks to design better social media platforms."
-end-
Savanna Yee, a UW master's student in the Allen School, is also a co-author on this paper. This research was funded by the National Science Foundation.

For more information, contact Roesner at franzi@cs.washington.edu and Geeng at cgeeng@cs.washington.edu.

Grant number: CNS-1651230

University of Washington
