False information on Twitter overpowers efforts to correct it by a ratio of about 3 to 1, according to a study published today by the American Press Institute.
Research conducted by Andrew Guess of Columbia University used automated tools to collect a sample of nearly 100,000 tweets relating to the Ebola scare or to Obamacare. In the Ebola case, the study found that only 27 percent of tweets about Ebola's purported spread in the U.S. were attempts to debunk that misinformation.
A separate study in the API-commissioned research on fact-checking and accountability journalism dives into people’s misperceptions about important facts and whether those misperceptions can be corrected. Those findings include:
- Many Americans — liberal and conservative alike — hold wildly incorrect ideas about public policy issues, including welfare, Social Security, and China’s share of U.S. debt, according to a study by Emily Thorson of George Washington University.
- Respondents in that study also said they thought it was important that citizens know policy facts.
- Corrective information can be remarkably successful at eliminating false beliefs, even among those who were very confident in their misperceptions.
The studies are part of a series commissioned through API’s Fact-Checking Project, an initiative to increase and improve fact-checking journalism through research and training. The program is funded by the Democracy Fund, the William and Flora Hewlett Foundation and the Rita Allen Foundation.
The two studies released today also suggest that people's erroneous beliefs can be changed and that, with persistence, journalists can successfully battle bad information on social media. Both findings are a cause for optimism that fact checking in journalism can lead to a better-informed public.
False tweets take over
Guess examined fact checking on Twitter from January 2014 through the November elections. In the Obamacare case study, he looked at tweets falsely claiming the president’s health care reform would result in the loss of more than 2 million jobs. During the first three months of 2014, 93 percent of tweets related to the issue endorsed the false claim, while only 7 percent were “corrective” tweets.
But after a few weeks, misinformation in both the Ebola and Obamacare cases tapered off, and the numbers of incorrect and corrective tweets became more nearly equal, according to the study.
The study also examined attitudes towards fact checking on Twitter. Fifteen percent of all tweets related to fact checking contained positive sentiments about the practice, as compared to 9 percent negative and 54 percent neutral.
Many Americans ‘confident’ in misperceptions, but can be corrected
In another study, Thorson conducted a representative survey designed to examine the misperceptions that underlie many citizens’ policy views.
Thorson found that about two-thirds of Americans believed China holds a majority of U.S. debt, with 29 percent saying they were “very confident” in that view. In fact, less than 10 percent of U.S. debt is held by China.
About 40 percent of respondents believed that Social Security benefits are paid from money that recipients contributed to a savings account while they were working, rather than by taxes on people who are currently employed. About 23 percent were "very confident" in this incorrect belief.
And about half of those surveyed incorrectly believed there is no limit on the amount of time someone can receive TANF (Temporary Assistance for Needy Families) benefits, with about 15 percent of respondents very confidently holding that misperception. In reality, the 1996 welfare reform imposed a 60-month limit, and many states have imposed even stricter time limits.
Unlike some political misperceptions, these false beliefs were widespread among Republicans and Democrats alike, Thorson found. This pattern suggests that these misperceptions are not the result of exposure to politicians’ false statements. Instead, they likely occur when respondents attempt to “fill in the blanks” about complex policy issues.
Thorson's study also found that corrective information can substantially reduce these misperceptions. In her research, Thorson asked participants to answer a set of factual questions and then showed them the correct answers. When participants were asked the same questions several weeks later, they were significantly more likely to answer correctly. For example, the share of people who were "wrong and confident" about China's share of U.S. debt dropped from 29 percent to 15 percent.
Data source: "Identifying and Correcting Policy Misperceptions," Emily Thorson, George Washington University / American Press Institute

| Issue | % Incorrect answers before seeing correction | % Incorrect answers after seeing correction |
| --- | --- | --- |
| Welfare | 53.5 | 31.3 |
| China debt | 67 | 45.4 |
| Social Security | 39.3 | 29.7 |
Respondents also said it is important that citizens know the answers to these types of policy questions. Between 80 and 87 percent of respondents said it is “important” or “very important” to know the correct information about China’s ownership of debt, the time limit on welfare benefits, and the origins of Social Security benefits.
A need for tackling misinformation and misperceptions
Together, the studies show the virulence and persistence of misinformation and misperceptions, but they also give journalists reason to prioritize fact checking, already a fast-growing media practice according to additional research published last week by API.
“The role of fact-checkers in this process is clear: They provide much of the source material with which Twitter users confront mistaken beliefs,” said Guess. “The messages they promote appear to help make debate on the platform more accurate.”
And the need for journalists to tackle misperceptions transcends political coverage. Thorson’s study indicates that better explanations of policies and programs — like Social Security — in news stories can efficiently correct mistaken beliefs.
"Overall, these results suggest that small changes in how the media covers politics and policies could have a big positive impact," said Thorson. "Simply including corrective information in coverage of issues like TANF, Social Security, and the national debt could substantially reduce many of these misperceptions."
In Thorson’s study, the fact that these misperceptions cross party lines also may create an opening for fact-checkers. “We know from research in political science and psychology that because these misperceptions did not arise primarily from partisan misinformation, they are easier to correct,” Thorson said.
The full reports can be read here:
Identifying and Correcting Policy Misperceptions
In the coming weeks, API will publish more findings from its fact-checking research, including a report by journalist Mark Stencel examining the impact of fact-checking on the behavior of those in the political arena.
For questions about the American Press Institute’s Fact-Checking Project or for more information about the research reports, contact Jane Elizabeth, API senior research manager, jane.elizabeth@pressinstitute.org.