Factually: Seven angles on a conspiracy theory

How do you cover a conspiracy theory? Journalists who write about misinformation know that the trick is to debunk the falsehoods without amplifying them or generating any suggestion of legitimacy. Context is critical, as is an exploration of potential harms for believers.

The pseudoscience-ridden, conspiracy-driven “Plandemic” video, which contains a number of baseless theories about the COVID-19 pandemic, provides a case study in the many ways journalists can approach such a story. Since the video first appeared early last week, the range of angles journalists have used to cover it shows that a good debunking can be embedded in any number of story genres. Here are seven:

1. Fact-checkers worked methodically, claim by claim. As would be expected, traditional fact-checkers did plenty of straightforward claim debunking. PolitiFact’s Daniel Funke chose what he called the “most misleading” claims, like the falsehood that someone who’s ever had a flu vaccine has been injected with coronaviruses.

FactCheck.org chose eight claims to debunk. Snopes did three. NPR’s Scott Neuman gave background on the story, then debunked five claims. There was some overlap, but it was also interesting that the fact-checkers didn’t all check the same claims – the video provided a buffet of falsehoods to choose from.

2. An investigative reporter’s treatment: ProPublica health care writer Marshall Allen started from a personal perspective – a request from his brother, a pastor in Colorado, for advice about how to answer people who were persuaded by the video. Then he put his investigative skills to work, asking basic questions such as: “Is the presentation one-sided?” and “Is there an independent pursuit of the truth?”

The effect was to bring readers into the inquiry and walk them through the process of disproving the video’s basic premises.

3. Profiling the main character: Who exactly is the scientist in the video, Judy Mikovits? There have been several profiles, but one of the most authoritative came from Science Magazine, which closely followed the story of Mikovits, a molecular biologist who in 2009 co-authored a since-retracted piece in Science about the cause of chronic fatigue syndrome. (In its piece, Science Magazine made clear that its news department is separate from the editorial department of Science that published and retracted the paper).

The authors of this week’s Mikovits profile, Jon Cohen and Martin Enserink, in 2011 wrote a detailed reconstruction of the 2009 episode.

4. Helping people respond to believers: For The Atlantic, Joe Pinsker, who covers families and education, talked to experts about the best way to respond to family and friends who believe conspiracy theories like the one outlined in the video.

The advice is similar to guidance we’ve heard before – don’t judge, don’t patronize, don’t personalize things. One key point: “It’s also important to know when to give up.”

5. An appeal to logic: Lifehacker’s Beth Skwarecki started her piece acknowledging that some of the assumptions underlying the video might appeal to people – that Big Pharma has a lot of power, for example, and that the public is getting conflicting messages.

Then she proceeded to dismantle its claims, focusing on some basic ways that they simply defy objective assessment. One example: “Anybody who thinks Big Pharma needs a conspiracy to make money hasn’t been paying attention to how Big Pharma actually works.”

6. Digging into the tech: The Verge’s Casey Newton, MIT Technology Review’s Abby Ohlheiser and TechCrunch’s Taylor Hatmaker wrote about the video’s virality, the challenge that social platforms face in policing it, and the effect of algorithms on its online performance.

7. Providing context: In their report for NBC, disinformation reporters Brandy Zadrozny and Ben Collins provided deep context, embedding nuggets in their story that weren’t found elsewhere. Among them: Misinformation researcher Joan Donovan’s insight that when tech companies take down this kind of content, it takes on a “limited-edition” aura that proponents can use to promote it.

BuzzFeed’s Jane Lytvynenko also provided context, explaining that Mikovits has been “attempting to insert herself into the COVID-19 pandemic narrative since late March.”

– Susan Benkelman, API

. . . technology

  • Twitter has started adding labels to tweets that include disputed information about the novel coronavirus, joining a recent spate of actions by tech companies to confront COVID-19 misinformation on their platforms.

    • Asked (on Twitter) whether the policy would apply to President Donald Trump, Yoel Roth, the company’s head of site integrity, tweeted: “These labels will apply to anyone sharing misleading information that meets the requirements of our policy, including world leaders.”

  • In a first-of-its-kind data analysis in Germany, fact-checking network Correctiv released a report Tuesday finding that most of its audience encountered coronavirus misinformation via WhatsApp and YouTube. The network stressed this was not a representative study, but found that out of 1,800 audience fact-check requests:

    • 46% included links from YouTube, and 34% were distributed by WhatsApp

    • The researchers spoke to representatives from WhatsApp and YouTube who highlighted their efforts to crack down on coronavirus misinformation.

    • Both companies have funded separate IFCN fact-checking grants.

. . . politics

  • President Donald Trump this week made a baseless claim that television commentator Joe Scarborough should be investigated for murder. Trump has a running feud with the MSNBC personality. Still, said Washington Post fact-checker Salvador Rizzo, “it remains astounding to see the president make a thinly veiled murder accusation devoid of evidence.”

  • Far-right fringe groups are upset about social media companies’ removal of COVID-19-related content that the tech platforms deem misinformation, Politico reported this week. The Trump-supporting fringe groups have “loudly pointed to the incidents as concrete evidence of the vague conspiracy theories they have long pushed about social media giants trying to silence conservative voices,” wrote reporter Tina Nguyen.

. . . science and health

  • National Geographic has a fact-checker, too. Actually, Natasha Daly writes about animal welfare, exploitation and conservation. But amid the COVID-19 pandemic, she’s seen a lot of questionable stories about animals taking back their habitats during stay-at-home orders. “I could tell pretty quickly they were fake,” she told Columbia Journalism Review.

  • In Canada, nearly seven in 10 respondents to an online survey from the Social Media Lab at Ryerson University said they had personally encountered misinformation about the coronavirus crisis on social media platforms, or on popular aggregator websites like Reddit, the Toronto Star reported.

Italian fact-checking network Open debunked a claim that the Italian government was hiding the truth about COVID-19 by banning autopsies.

Cesare Sacchetti, a repeat misinformation offender whom Open referred to as “an old acquaintance,” tweeted an official document from Italy’s health ministry, claiming it was proof of a conspiratorial state cover-up.

To debunk Sacchetti’s latest hoax, Open used a time-tested method in the fact-checking world: The team read the document. Open’s powers of reading comprehension revealed the document was guidance for municipalities for when and how to safely perform autopsies during the pandemic. Nothing in the directive banned autopsies outright.

What we liked: This fact-check is a good example of how misinformation can spread when an official document is taken out of context or divorced from its original meaning. As we saw last week with discredited medical researcher Judy Mikovits, official documents and findings can be misrepresented to give a totally bogus claim the appearance of credibility. Open’s fact-check reminds us to interrogate our sources and be mindful of whether a claim is supported by evidence.

— Harrison Mantas, IFCN 

  1. IFCN Associate Director Cristina Tardáguila spoke with Facebook’s Journalism Project about the network’s collaborative approach to fighting the COVID-19 infodemic.
  2. A study out Monday from the Knight Foundation finds that 78% of Americans think COVID-19 misinformation is a serious problem, with 36% saying the amount of information is overwhelming.
  3. Twitter created a curated list of the IFCN’s Indian signatories to help fight COVID-19 misinformation.
  4. Authors of the European Journalism Centre’s Verification Handbook for Disinformation and Media Manipulation held a panel discussion this week about their work.

Misinformation researcher Joan Donovan has been named director of Harvard’s Shorenstein Center on Media, Politics, and Public Policy.

That’s it for this week! Feel free to send feedback and suggestions to factually@poynter.org. And if this newsletter was forwarded to you, or if you’re reading it on the web, you can subscribe here. Thanks for reading.

Susan and Harrison 