“The results of [investigative] reporting do not come cheaply, but they are a bargain to society,” James T. Hamilton writes in his new book Democracy’s Detectives: The Economics of Investigative Journalism, out this month from Harvard University Press. Through his research, Hamilton, the Hearst Professor of Journalism at Stanford University, finds that while investigative journalism often comes at a high cost to news organizations, the benefits it provides to society are often even greater.
Hamilton’s research was primarily funded by Duke University and Stanford University. The National Science Foundation contributed a grant for the book’s chapter on computational journalism.
We talked to Hamilton about how he calculated the real-world impact of investigative journalism and what news organizations can do to fund this important work and measure its impact.
Why is the economic impact of accountability journalism often hard to quantify, and how did your study attempt to do that?
Problems end up getting addressed by government in part because they are hard to quantify or to solve through contracts. The implementation of government policy often involves the delegation of decision-making power, which generates problems of hidden action (did the person do as directed?) and hidden information (what options were available when a choice was made?). If a newspaper doesn’t take the time to revisit a past investigation and measure its impact, the paper may be missing out on a new story to unravel.
Charting the economic impacts of accountability journalism is hard because government policies may involve public goods that are hard to measure, because government officials try to hide their actions, and because competition drives you toward new stories rather than old victories.
In the book’s three case studies, I try to do a partial benefit-cost analysis of investigative work. I first estimate the costs to a newspaper or local TV station of producing an investigation. I next estimate how the world changed because of the reporting. In the cases I looked at, this involved calculating police shootings in Washington, D.C., that did not happen, murders in North Carolina by people out on probation that did not take place, and hospitalizations in Los Angeles from unsanitary restaurants that did not occur. I then used the values that economists use in benefit-cost analyses, such as the value of a statistical life saved or a hospital visit avoided.
I find that for each dollar invested in an investigative story, there can be over $100 in benefits to society. These benefits, though, are spread across people who may never subscribe to the newspaper that did the work, which means that a paper cannot reap the full value of the change it produces.
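As a rough illustration of that partial benefit-cost arithmetic, here is a minimal sketch in Python that multiplies avoided harms by standard economic values and divides by the cost of the story. Every figure in it is a hypothetical placeholder, not a number from the book.

```python
# A minimal sketch of the partial benefit-cost arithmetic described above.
# All dollar figures are hypothetical placeholders, not numbers from the book.

VALUE_OF_STATISTICAL_LIFE = 9_000_000   # illustrative value of one statistical life saved
COST_PER_HOSPITAL_VISIT = 15_000        # illustrative value of one hospitalization avoided

def benefit_cost_ratio(story_cost, lives_saved, hospital_visits_avoided):
    """Return dollars of societal benefit per dollar spent on the investigation."""
    benefits = (lives_saved * VALUE_OF_STATISTICAL_LIFE
                + hospital_visits_avoided * COST_PER_HOSPITAL_VISIT)
    return benefits / story_cost

# Example: a $250,000 investigation credited with 3 deaths and 40 hospitalizations avoided.
print(benefit_cost_ratio(story_cost=250_000, lives_saved=3, hospital_visits_avoided=40))
# -> roughly 110 dollars of benefit per dollar invested
```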
What practical lessons should journalists learn from your book and research? What about non-journalists?
Pursue government records. In Democracy’s Detectives, I examine stories submitted to the Investigative Reporters and Editors prize competitions from 1979 to 2010. About 14 percent of IRE stories involved government records requests.
The stories with the most impact were more likely to involve FOIA or FOI requests. About 40 percent of the stories that triggered a review of policies involved government records requests. Government records were requested in at least a quarter of the stories that resulted in people being fired, triggered audits or investigations, or led to changes in policy. Of the stories where laws were passed because of the reporting, 24 percent involved government records requests.
Team up and partner. For investigative work recognized with the Pulitzer, Selden Ring, Goldsmith, and Worth Bingham awards, I find that over time there has been an increase in awards that cite the work of four or more people and in the percentage of awards involving three cooperating media outlets. Division of labor works, which means that people who have acquired expertise can do more when they come together as a team on an investigative project. Media outlets that collaborate to discover and develop a story can magnify the reach and impact of the work.
Remember that one person can make a difference. Pat Stith, a reporter for The News & Observer in Raleigh, N.C., did more than 300 investigations from 1966 to 2008. Of these, 149 generated substantive changes, 110 sparked deliberations and further investigations, and 49 triggered individual impacts such as firings or resignations. His reporting generated 31 new laws in North Carolina. Working with Melanie Sill and Joby Warrick, he produced a Pulitzer-winning series called “Boss Hog.” Their revelations about the political economy of hog farming in North Carolina led to financing for new inspectors, new regulations, and a moratorium on the expansion or construction of hog farms.
A takeaway for government officials is that public affairs journalism involves a market failure. When a reporter tells a story that changes laws, there is no mechanism that fully rewards a newspaper for the social benefits arising when public policy changes.
The stories that hold institutions accountable are public goods, but readers may free-ride and let others pay for the journalism that uncovers new facts. Government could address this by adding journalism as a field in research and development competitions focused on algorithms, data, and technology. The National Science Foundation, National Archives, and Library of Congress could add reporting as a public good to be supported through computational research. This policy change would be content neutral and platform agnostic: it does not involve elevating certain topics or favoring a particular medium. It would, though, support the development of tools that help reporters find and tell public affairs stories.
In your article for Nieman Reports, you write that “the greater availability of data may make accountability reporting itself more accountable.” How do you think data can be used to make accountability reporting more accountable?
In the days when three-part investigative series rolled out in the Sunday paper, editors would often include a short note about where reporters got their data. Now investigative reporters can share their data sets and code on GitHub, post papers about their methodology for critique, and provide reporting recipes that journalists at other outlets can follow as they localize a national story.
This sharing of methods, data, and code makes it possible to examine assumptions, check for errors, and replicate results, in a manner akin to research norms in social science. The Washington Post’s sharing of its database of shootings by police officers and the Los Angeles Times data desk’s sharing of the code and data behind many of its stories show how papers can make their work more transparent and allow others to build on it.
ProPublica is also a standout on transparency. Its Surgeon Scorecard series examined Medicare data from 2009 to 2013 on more than 16,000 surgeons, provided a database that lets readers look up death and complication rates for individual doctors, and posted a white paper on its methodology. That made it easier for other researchers to critique the analysis, and ProPublica published the feedback it received along with its responses to questions about the work.
Computational journalism is a key theme of Democracy’s Detectives. What do you mean by “computational journalism,” and how do you think that kind of journalism helps watchdog coverage?
Computational journalism is an evolving field that involves the use of computation to change how stories are discovered, told, distributed, monetized, and archived. It generally involves larger data sets and more sophisticated algorithms than the computer-assisted reporting of the 1990s.
Recent advances in computational journalism involve reporting by algorithms (such as the automated stories from Narrative Science and Automated Insights), stories about algorithms (such as writing about the algorithms companies use to segment customers), and stories through algorithms (such as the Atlanta Journal-Constitution’s use of machine learning to identify cases of sexual abuse by doctors).
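As a loose illustration of the “stories through algorithms” category, the sketch below uses scikit-learn (an assumed tool, not one Hamilton names) to rank unread documents by how closely they resemble ones a reporter has already flagged as newsworthy. It is a generic toy example, not the Journal-Constitution’s actual pipeline.

```python
# Toy illustration of "stories through algorithms": train a classifier on documents
# reporters have already labeled, then rank unread documents for human review.
# Generic sketch only; not any newsroom's actual system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

labeled_docs = ["board revoked license after patient complaint",
                "routine fee schedule update"]
labels = [1, 0]  # 1 = potentially newsworthy disciplinary case, 0 = routine paperwork

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(labeled_docs, labels)

unread_docs = ["disciplinary order issued against physician",
               "annual meeting minutes approved"]
scores = model.predict_proba(unread_docs)[:, 1]  # probability a document merits review
for doc, score in sorted(zip(unread_docs, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {doc}")
```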
Research in computational journalism can support watchdog coverage in several ways. On the supply side, the development of tools that help journalists discover stories can lower the costs of doing investigations. On the demand side, telling stories in more personalized and engaging ways can lead to the kind of product differentiation that allows an outlet to charge for the work or draw larger audiences for advertisers.
What can smaller newsrooms with fewer technology tools at their disposal learn from this?
Investigations at small newsrooms can yield big results. I find that for every 100 stories submitted by small newspapers to IRE prize competitions, 11 resulted in further investigations by others, seven generated discussions of reform, three prompted government hearings, six triggered resignations, and three led to firings. Investigations at newspapers, though, often take time, averaging six months for newspaper stories submitted to IRE.
There are many organizations that want to lower the cost of doing accountability work for smaller newsrooms. IRE offers on-site Total Newsroom Training sessions that newspapers can apply for. Individual reporters can also seek advice through IRE’s NICAR listserv. When Daniel Gilbert at the Bristol Herald Courier (circulation 33,000) was investigating natural gas royalties in Virginia, he attended IRE training to learn how to build and analyze databases. The result was his series “Underfoot, Out of Reach,” which won the Pulitzer Prize for Public Service. Other resources for reporters at small papers include DocumentCloud, which makes it easier to analyze documents and share them with online readers, and the Fund for Investigative Journalism, which provides grants for accountability reporting.
Do the results of your research help make the case for crowd-sourced funding for accountability journalism? What about foundation funding?
When newspapers change public policy, the results spill over onto non-readers and non-subscribers, which means the papers do not get fully rewarded in the market for their work.
The News & Observer did an investigative series on the probation system that decreased murders in North Carolina; if the paper had been able to capture just 10 percent of the net benefits to society from that series, it could have hired more than 90 new reporters. Going to individual donors or foundations to support the public goods provided by accountability reporting could generate more investigative work.
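The shape of that back-of-the-envelope calculation might look like the sketch below; the benefit and salary figures are hypothetical placeholders, not numbers from the book or the series.

```python
# Back-of-the-envelope shape of the "90 new reporters" point above.
# Both figures below are hypothetical placeholders, not the book's numbers.
net_benefits_to_society = 60_000_000   # assumed net societal benefit of the series
capture_rate = 0.10                    # share of that benefit the paper hypothetically captures
cost_per_reporter = 65_000             # assumed annual cost of one additional reporter

reporters_fundable = capture_rate * net_benefits_to_society / cost_per_reporter
print(round(reporters_fundable))  # ~92 reporters under these assumptions
```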
In my analysis of contributions of $500 or more to three online public affairs sites (MinnPost, Texas Tribune and Voice of San Diego), I found that their large-scale donors were people who were politically active and had donated in federal elections. Foundations could play a larger role in supporting research on what motivates individual support for nonprofit media. Understanding the framing and context for donations would help determine how to raise support from the crowd.
Understanding better how to measure the impact of accountability work would also help make the case for foundation investments in investigative work. Both are research questions where more work could generate results and advice that would scale across newspapers and nonprofits.
How does the economic impact of accountability reporting compare to the benefits of other social-impact investments?
In my case studies, I find that for each $1 invested in a story, there are net policy benefits to society in the first year that policies change of $287 for The News & Observer probation series and $143 for the Washington Post police shooting series. For comparison, when the Office of Management and Budget looked at the ratio of annual benefits to annual costs for some regulations, the ratios were 3.0 for a Department of Labor rule on hazard communication and 5.5 for a Department of Energy conservation standard. From society’s perspective, investigative work looks like a relatively good investment.
Another way to see this comes from work by Richard Tofel of ProPublica. He found that ProPublica’s annual budget of around $10 million was generating reporting that resulted in about 8 to 10 significant policy changes each year. That expected value of one meaningful policy change for each million dollars of nonprofit spending is a ratio most donors would appreciate and support.
Does your research find any financial benefit to the publisher? If so, what specifically?
Investigative reporting is original work, about substantive issues, that someone wants to keep secret. This means it is costly, underprovided in the marketplace, and often opposed. It gets done when a publisher has the resources to cover the costs, has an incentive to tell a new story, cares about impacts, and overcomes obstacles. When you tell important stories that are unique, you develop a brand for quality and a reputation for offering what cannot be easily found elsewhere.
I am not able to measure how this brand translates into advertising and subscription revenue; I am able, though, to describe what happens when publishers do offer investigative work.
Outlets associated with family ownership, such as The New York Times, The Seattle Times and The Washington Post, have great track records in investigative reporting awards. Newspapers overall are more likely to focus on institutional stories than other media.
Larger newspapers are more likely to cover topics with wider geographic scope, such as the operation of the federal government and defense. Smaller newspapers devote a greater share of their investigative stories to very local topics such as education or community development and housing.
The branding is long-lasting and companywide. Companies distinctive for their number of IRE submissions from 2002 to 2005 were Tribune Company, McClatchy Newspapers, and Knight Ridder. They also had higher submission rates over a 30-year period, were more likely to have done civic journalism projects in the 1990s, and had CEOs who talked more about social responsibility from the 1970s through the 1990s.
While the market generates many investigative stories, the amount provided is less than ideal because newspapers cannot fully capture the positive spillovers when they change laws and lives. Generating more accountability work will depend on attracting more journalists, computer scientists, philanthropists and journalism educators to build better tools for story discovery and telling.