Exposure to 'fake news' during the 2016 US election has been overstated

March 02, 2020

Since the 2016 U.S. presidential election, debates have raged about the reach of so-called "fake news" websites and the role they played during the campaign. A study published in Nature Human Behaviour finds that the reach of these untrustworthy websites has been overstated.

To assess the audience for "fake news," researchers at Dartmouth, Princeton and the University of Exeter measured visits to these dubious and unreliable websites during the period before and immediately after the election using an online survey of 2,525 Americans and web traffic data collected by YouGov Pulse (Oct. 7 - Nov. 16, 2016) from respondents' laptops or desktop computers. This method avoids the problems with asking people to recall which websites they visited, an approach that is plagued with measurement error.

According to the findings, fewer than half of all Americans visited an untrustworthy website. Moreover, untrustworthy websites accounted for only 6 percent of Americans' news diets on average.

Visits to dubious news sites differed sharply along ideological and partisan lines. Content from untrustworthy conservative sites accounted for nearly 5 percent of people's news diets, compared with less than 1 percent for untrustworthy liberal sites. Respondents who identified themselves as Trump supporters were also more likely to have visited an untrustworthy site (57 percent) than those who indicated that they were Clinton supporters (28 percent).

The data also revealed that Facebook was the most prominent gateway to untrustworthy websites; respondents were more likely to have visited Facebook than Google, Twitter or a webmail platform such as Gmail in the period immediately before visiting an untrustworthy website.

Finally, the study demonstrates that fact-checking websites were relatively ineffective in reaching the audiences of untrustworthy websites. Only 44 percent of respondents who visited such a website also visited a fact-checking site during the study period, and almost none of them read a fact-check debunking the specific claims made in a potentially questionable article.

"These findings show why we need to measure exposure to 'fake news' rather than just assuming it is ubiquitous online," said Brendan Nyhan, a professor of government at Dartmouth. "Online misinformation is a serious problem, but one that we can only address appropriately if we know the magnitude of the problem."
-end-
Nyhan is available for comment at Brendan.J.Nyhan@dartmouth.edu. Andrew M. Guess at Princeton University and Jason Reifler at the University of Exeter also served as co-authors of the study.

Dartmouth College
