Earlier this month, on the same day Rolling Stone magazine issued an apology for errors in a jarring story about campus rape, a small newspaper in western Virginia published an investigative series with similar sensitivities. The series told the stories of dozens of nurses who stole and became addicted to their patients’ drugs, threatening the lives and health of the patients and themselves, and bringing shame and chaos to their families and employers.

But in its transparency, The News Leader’s “Addicted Nurses” series was different: No one in the story was anonymous, documents were provided, videos were recorded. And the staff’s fact checkers were named.

The story began a year earlier when the paper looked into the case of James Colus Walker, a former local nurse addicted to cocaine. By the end of the year-long project, the staff had tracked 2,366 nurses who had been cited for such violations.


William Ramsey is the local editor of the 20,000-circulation newspaper and the lead editor on the “Addicted Nurses” project. For the American Press Institute’s Fact-Checking Project, Ramsey explained how and why the publication examined and changed its fact-checking efforts to produce the series.

William Ramsey

For the “Addicted Nurses” series, The News Leader was able to publish interviews with nurses who were addicted to drugs — who even stole their patients’ drugs — and use their real names. How were you able to accomplish this?

In our small newsroom, we help each other. So reporter Patricia Borns gave us some names from state records, and we all slowly contacted people through Facebook, phone calls and door knocking. Our best source was a nurse whom Borns slowly built a rapport with. The fact that we don’t use anonymous sources or fake names, per standing policy, helped persuade people to tell their stories on the record. Also, we sent certified letters to two dozen of the Virginia nurses whose public records we were reporting, and we got a trickle of calls from people wanting to talk.

The Virginia Nurses Association would not be interviewed for this story. Why not, and do you think this affected your ability to provide crucial information to your readers?

We’re not quite sure why association officials wouldn’t agree to an interview. We tried repeatedly. I think it’s related to the culture of silence around this workforce topic in Virginia. Luckily, enough clean and addicted nurses spoke with us. We didn’t need the VNA.

Can you describe how the fact checking was conducted for this series? Did you use a checklist? A spreadsheet? A particular process?

We had a multi-pronged approach. We generated a list of every factual statement (not actual copy) from the main stories and sent it to state officials, who used investigators and PIOs to verify the information. This was critical, since a portion of our reporting featured narratives rebuilt from disjointed case records. We also sampled a percentage of our hand-built database and determined an error rate, which was really low. We made those fixes and re-sampled another portion, which held up. For accuracy in [Borns’] writing, we extracted facts from her project’s main story and made a Google spreadsheet for the team, using it to log verification of each fact, the source, the person checking and a note when a change was made to the draft.
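The sampling step Ramsey describes — spot-checking a random slice of a hand-built database, fixing what fails, then re-sampling to confirm the error rate holds — can be sketched roughly like this. This is a minimal illustration, not the newsroom’s actual tooling; the record structure and the idea of recording each manual check as a boolean are assumptions for the example.

```python
import random

def draw_sample(records, fraction=0.10, seed=1):
    """Pick a random subset of database rows for manual checking.

    A fixed seed keeps the draw reproducible so a second person
    can re-check the same rows.
    """
    rng = random.Random(seed)
    n = max(1, int(len(records) * fraction))
    return rng.sample(records, n)

def error_rate(check_results):
    """check_results: one boolean per sampled row,
    True meaning the row held up against the source documents."""
    return check_results.count(False) / len(check_results)

# Hypothetical database of citation records (field names invented).
records = [{"case_id": i, "citation": "..."} for i in range(2366)]

sample = draw_sample(records, fraction=0.10)
# In practice each sampled row is verified by hand against state
# records; here we just show the shape of the result list.
results = [True] * (len(sample) - 1) + [False]
rate = error_rate(results)
```

After fixing the rows that failed, drawing a second, independent sample (a different seed) and seeing a similarly low rate is what gives confidence that the corrections generalize beyond the rows you happened to check.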

For personal stories, I did some interviewing of the reporter’s sources that I don’t normally do. For example, I spent two hours in the living room of one nurse’s parents, checking background details. And we found a patient’s family member to round out our behind-the-scenes understanding of an issue at one of the health care facilities. Also, I showed some early confidential drafts of our main story to a nurse I trust, to get expert reaction.

You made a point of highlighting your fact checking, even acknowledging the staffers who fact checked the series. This is not typical. Why was the fact checking different for this project?

We were able to take our time. There was no rush to publish. And we needed to be an authority on this subject to make statewide claims about the system. We’re a small daily paper far from the state capital. We’d lose credibility if we made a mistake on the subject-matter details. The nurses who shared their difficult personal stories with us also deserved accuracy at a different level than is possible when you’re covering daily news.

In the fact checking of this series, were there any lessons learned that will be used at the News Leader in the future, or could be replicated at other news organizations?

I hope so. We tried two new ideas I liked: war room Fridays and a black hat review.

For the Friday sessions, we took over a conference room and brought in reporters not connected to the project. On one Friday, for example, our government reporter spent the day checking story drafts against state records.

For the “black hat” review, borrowed from the software development industry, we took turns playing a critic’s role, peppering ourselves in a hostile interview about process, sources and conclusions. It gave us actionable information to improve the content before it published.

What is the return on investment for the labor-intensive fact checking of “Addicted Nurses”? Are you able to measure fewer errors or increased credibility?

We received little critical feedback on accuracy from readers or the state. We published online on a Friday at noon, and multiple Virginia public relations officials and experts started reading the digital content, a ton of material. By late Saturday afternoon, they had given us about a half-dozen reasonable small tweaks or corrections, which we made instantly online and carried into print; the print run started the next day in the Sunday edition and continued for five days. And on the data side, our fact checking got us to a much lower database error rate than I thought possible, really tiny.

We have had one correction. A drug rehab facility we mentioned where medical staffers go for treatment was listed correctly by name but had the wrong town after it. That’s all.

