Craig Silverman is the quotable, go-to source for your publication’s stories on media errors. Outside his job as director of content at Spundge, he writes the popular Regret the Error column at Poynter, which chronicles media errors and examines what went wrong. He’s also the person behind this tweetable story to keep handy for whenever a breaking news story goes awry: “This is my story about breaking news errors that just happened.”
The Verification Handbook, of which Silverman is the editor, was released Tuesday. Housed by the European Journalism Centre and authored by a number of corrections-conscious journalists, the initial version of the book provides tools, techniques and guidelines for how to handle user-generated content during emergency situations.
I caught up with Silverman to chat about some of the greater problems with corrections on the web. With a full year of media corrections behind us, we talked about where publishers might focus energy regarding corrections in 2014, such as getting corrections to the right people, noting the problem of ambient news awareness, teaching readers to spot fakes, and debunking as a content strategy.
News consumers now get news from multiple sources, through multiple delivery systems and across multiple devices. There isn’t just one go-to newspaper with its inside front-cover corrections. How are today’s readers getting corrections to something they read on a site obscure to them, say via a link in someone’s tweet? Are they?
CRAIG SILVERMAN: This is a great question, in part because I think few news organizations have realized what the multiple source, multiple device world changes about corrections.
My sense is most people are not seeing the corrections they need to see, based on their reading habits. Part of the reason is that the ethic and execution of corrections have not been fully ported to the digital world. News organizations may add a correction to an offending article or video page, but they largely fail on what I might clumsily call “the last mile” problem for corrections: how do you ensure it actually finds the right people? People almost never come back to a story they previously read, which means they won’t see the correction. This is different from someone who subscribes to a newspaper, or who watches the same evening news program.
On social media, some people may delete an incorrect tweet, or Facebook post. Or they may issue a correction, but the last mile again comes into play: how do you ensure the correction spreads just as far as the original error? I often refer back to this chart produced by Gilad Lotan to show how an initial, mistaken tweet compared to the subsequent correction.
What all of this means is journalists and news organizations have to first, make the correction, and second, help it reach the audience.
Newsrooms now put a tremendous amount of effort into spreading and sharing the content they produce. We have to do the same for our corrections.
If you add a correction to a story, you also need to push it out to the platforms where you promoted the initial piece. You need to flag it for people, and try to help the correction spread by identifying influential people who may have shared or retweeted the original. That’s what we have to do if we really care about people getting the correct information.
The final note on this is we are seeing the emergence of what I personally see as the “Notification Wars.” Smartphones are tremendously important devices for consuming, creating and sharing information. I’d argue they are now the most important, in fact, when it comes to the developed world.
Apps such as Circa, BreakingNews, and even Apple’s new OS X Mavericks and others are working to find the right way to enable and push alerts based on what you need and want to know. I hope that corrections and updates become part of this battle. Who’s going to do it best, and lead the way for others? It seems Circa is already thinking about it. (And, disclosure, I spoke to Circa’s David Cohn about corrections earlier this year.)
Let’s talk about stories that end up as hoaxes for a minute and revisit a relatively recently debunked story. Say you’re part of the general, non-journalism public. You saw an Elan Gale notes-on-a-plane headline in your news feed on Facebook, but didn’t click into it. You didn’t see a “correction” to the story, anywhere, later. Does it matter whether the story was real or fabricated?
SILVERMAN: There is an element of ambient news awareness that exists today thanks to platforms like Twitter and Facebook. You can see people talking about, and sharing, something without ever actually reading or digging into it yourself. If you see lots of chatter about something that supposedly happened on an airplane but don’t click through, there is still a sense of consciousness about it.
The question, as you noted, is whether that same sense of ambient awareness will often include seeing a correction or debunking. It’s going to depend on the situation, though I think most debunkings may fall short in terms of spread.
To answer the question on the most basic level, I think it matters whether the story was real or fabricated, and it matters that we’re able to sniff out hoaxes like that. Honestly, it’s a good content strategy to be the debunker. More news organizations need to embrace this, rather than go for the easy pass-along traffic that can bite you in the ass. (Can I say ass? I just did!)
Overall, if we in the press can’t do this kind of work consistently, then there will be an effect on society. It matters that we see this reflected in ambient news awareness, as it’s a good indicator that corrections and debunkings are managing to spread and cut through.
This past fall Facebook started allowing users to edit posts and comments. Twitter does not. Do editing features for social media posts solve any part of these problems?
SILVERMAN: To paraphrase Homer Simpson, the ability to edit social media posts is potentially the cause of, and solution to, lots of problems.
Being able to edit out a small typo, or to fix a post so that as it spreads it doesn’t carry misinformation, is a good thing. But it’s not good if journalists — who have a responsibility to be honest and forthright in admitting errors — use this feature to scrub away errors and pretend like it never happened. We need to make sure we fix and correct.
So, change the error but also acknowledge it, d’oh!
Is the burden of fact-checking everything on the internet all on the news organization? Who is responsible for verifying what? What are the responsibilities for the involved parties: the people publishing the news, the people sharing the news, and the reader?
SILVERMAN: The democratization of publishing brings an element of shared responsibility. I do think, however, that journalists have a greater responsibility. We have codes of ethics and the tradition of correction to uphold.
Aside from that, if we want to build credibility and a strong relationship with the public, we need to demonstrate a dedication to accuracy. We have to show we’re worthy of attention and loyalty. [News Corp’s SVP for Strategy] Raju Narisetti often talks about the promiscuous nature of today’s information consumer. What are you going to do to get them to come back?
You can get short-term gain by putting up everything you can find that might get traction. But for the vast majority of journalists and news organizations, that’s not going to build long-term value.
An example I’ll give is the Gawker sites, which are often brought up in the discussion about spreading hoaxes. After all, they [had] the viral wizard Neetzan Zimmerman on staff. But what I see more and more from them is excellent debunking work. Deadspin’s Manti Te’o dead girlfriend debunking was a huge hit, both in terms of traffic and in terms of respect and credibility. I think they are reorienting to capture that value. As Nick Denton recently said, “the crowd will eventually choose the juicy truth over a heartwarming hoax.”
For me, the element of shared responsibility relates to the need for everyone to be equipped with the skills and knowledge to apply a skeptical eye to what we see and share. We should all know the telltale signs of a fake Twitter account, of an urban legend gone digital. So I think there is a responsibility for us as journalists to model that behavior and teach and evangelize it broadly. These skills are essential for all of us.
The fact is, the average person is free to share and talk about whatever they like. It’s not about putting barriers up or shaming them when they help spread a hoax. It’s about equipping everyone with a better bullshit detector so we can all attain a basic level of verification and debunking.
Not all corrections or changes to stories are because of hoaxes. Sometimes changes are more subtle, such as word choice in a headline. Sometimes non-error changes grab attention, however (e.g., sometimes a word choice change can appear to readers as “softening” a story). What are good criteria for what should be noted after any online change? Really spell it out, if you can.
SILVERMAN: A change should be noted — as a correction, editor’s note or apology — publicly when:
- There is a factual error.
- There is a typo or other form of mistake that creates a factual error, or a level of confusion for the audience.
- Something was removed or added to a story as a result of new information, criticism, legal concerns or ethical issues.
- Plagiarism or fabrication was present.
When in doubt, just note it. Why? Because research shows that readers actually trust a news outlet more when they see corrections. It communicates that there is an accountability structure in place. So err on the side of correction; it will help, not hurt, your organization’s reputation.
News organizations aren’t the only ones on the internet who are practicing some form of journalism. There are a number of sites or blogs or individual bloggers who may not have the same standards for corrections. Is there any way journalists or anyone else can contribute to a culture of corrections? Where does it start?
SILVERMAN: Bloggers actually ended up doing a little bit of correction innovation. In the relatively early blogging days, you’d often see <strike>strikethrough</strike> used to cross out a typo or error. This was a lovely use of the medium, as it showed what was incorrect and also included the correct information after it. In that respect, bloggers modeled good behavior, and showed how digital corrections can work. We can learn from that.
It all starts with a broad commitment to acknowledge and even publicize mistakes. That is the core of the culture, the ethic of correction.
I’m happy when I see CNN, for example, push out a correction on the web, on mobile, on Twitter and in its email newsletter, as it did with its big SCOTUS mistake last year. We should lean into the discomfort of admitting our errors and help push them out. Good things come back when we do, in the form of trust and engagement.
I’m also happy when I see news organizations adding a “report an error” button to all stories, and a corrections form to their websites so people can easily report errors.
We should be open to learning from the community and from other people who are creating and publishing. We should also model the right behavior, which is to show that corrections are not only essential but in fact a good thing.
I’ve always said: we need more corrections, not fewer. Today, that could be amended to read: we need more corrections on more platforms, not fewer.