What is the impact of journalism, and how can it be measured beyond audience reach and website traffic?

Those questions are tackled by Charles Lewis, executive editor of the Investigative Reporting Workshop at the American University School of Communication, and Hilary Niles, a graduate assistant at Investigative Reporters & Editors, co-authors of a new report, “Measuring Impact: The art, science and mystery of nonprofit news assessment.”

Lewis and Niles draw on the existing literature and various models for assessing impact to identify several principles as the cornerstones of a common framework for impact assessment.

This interview has been edited for length and clarity.

1. You define “impact” as the ultimate social outcome, not just audience reach and website traffic. Can you explain why this distinction between social outcome and website traffic is important?

HILARY NILES: It's particularly important with regard to nonprofit news, but really to most journalism. It's more meaningful.

The goal of journalism is to play an important role in a democracy for the sake of accountability and informing the populace. So website traffic doesn't necessarily mean anything — you can't draw any conclusion about the impact on that populace just from the number of hits a page gets.

It’s shifting attention to that ultimate social goal. It’s aligning your evaluation process more closely with your mission, and the mission of the nonprofit news organization is not to generate a lot of website traffic. The mission of a nonprofit news organization is to play that role in democracy.

2. The report’s foundation rests on this idea of different “strata” of engagement (Schaffer & Polgreen, 2012), or the progressive measurements of reach, engagement, and impact. You write, “Understanding which strata to aim for is foundational in assessing the success of journalism, because more reach does not always equate to greater impact.” Why did you decide to use this framework? Can you clarify each of these strata of engagement?

NILES: In the course of the literature review, it seemed to us that prior research into this topic had led people to understand these different aspects. Explaining them as strata is helpful in conceptualizing the way it works — both the way it works in the community and the way a news organization can position its own goals and frame its own evaluation processes.

We chose to adopt it because it was a well-established and well-explained paradigm, and we wanted to build on that and then clarify it. We already knew we wanted to home in on how to evaluate social impact, so it helped us sharpen the focus of our research, which was on that final stratum: impact.

As for the strata themselves, the first is reach — and this applies to any medium. An example from print journalism that I would use to illustrate reach is circulation numbers. You know that's the number of newspapers you don't have at the end of the day, but you don't know how many people have actually read what's inside, or which articles they've read.

With website traffic, which is measurable on a much more granular level, you know more, but you don't know what a story does to someone as a person. You don't know whether it teaches them something or shows them something they didn't know before; you don't know whether it compels them to strike up a conversation about it at dinner or in line at the post office; and you definitely don't know the rest of it. So that's where you get into the level of engagement.

There are actually a couple of strata of engagement. If people are engaging with your content by sharing it on Facebook, tweeting a link, or commenting on the page, then you know that you're moving them to some degree of action. It's more than a passive act — they're being more than a receptacle; they're engaging you back. That's exciting, and that's what you want. The other level of engagement is when they're engaging not just with you but with their community, in some way sparked by the journalism.

That’s where it gets closer to impact, and subsequently harder to measure. The more offline you go — and that’s where you want to go, you want to bring the engagement offline to get to impact — but that’s also where it gets harder to measure.

3. You note that “veteran reporters and editors, particularly of the investigative ilk, have an inherent, almost visceral dislike of audience measurement and engagement strategies and other metrics-producing data.” How do you get those veteran reporters and editors to buy into the value of this data? What is the value? Why does impact matter?

NILES: A conventional perspective on metrics-producing data has been: we inform the public, and what readers do with it is their own business. The paradigm shift comes in caring what they do with it. It's not that the conventional paradigm doesn't care; it's that the conventional paradigm doesn't think it's journalism's business what readers do with it.

At the newsroom I’m working in, we started a really simple spreadsheet to try to track our impact in all different manners. To be realistic, in the nonprofit journalism world, it’s the case that you make to your funders. When nonprofit publishers are writing a grant, or reporting on a grant that they’ve received, they need to be able to demonstrate that the money is going to good use.

The paid advertising model is predicated more on reach, but in the nonprofit and personal sponsorship model — whether it's foundation funding or donations from individuals and sponsors who are invested in the mission of the nonprofit — the funders are looking for impact. So, certainly, part of it is about making your case to the funders who keep your newsroom afloat.

It’s useful to have an expanded conception of what the impact of the reporting is when a story doesn’t get nearly as many hits or comments

I think there can also be value in data for editorial decision making. Every newsroom has limited resources, and any good journalist has three to 15 times more story ideas than time to report and write them. So reporters and editors have to decide where to invest the newsroom's personnel and resources. And if the mission of the organization is to have real social impact, and you have gone out of your way to track that impact, then some of the lessons can help inform editorial decision making — not in a way that panders to funders, but in a way that helps the newsroom maintain its focus on its mission.

When you’re doing this kind of work, it’s especially important to pay attention to all of the strata of impact. I look at Google Analytics almost every night, I know that reach is not all, but it doesn’t stop me from being invested and looking at how many hits my stories get. It’s useful to have an expanded conception of what the impact of the reporting is when a story doesn’t get nearly as many hits or comments, but I know from reporting that story that there are public records that were not available before I started reporting this that are going to be available now. It helps reporters — it helps me — to have a more thorough sense of the impact my work is having in my community.

4. Not only are the terms reach, engagement, and impact often conflated — but the terms advocacy and mission-driven are as well. What is the difference there? Is there a difference?

NILES: There can be a difference — it depends on the mission. If the mission is to advocate, then that's advocacy. As for my opinion about advocacy and objectivity in journalism, I think there's a certain foundational threshold of social values.

As a journalist, I don’t feel comfortable advocating on one side or another of almost every politically charged topic, except for issues of accountability and justice. I think that’s determined by the social fabric and what the baseline social values are. If it goes above those baseline social values — to inform and encourage community engagement, or to hold public officials accountable — then it gets into the territory of advocacy. But not all mission-driven journalism is advocacy journalism.

5. The report debunks the assumption of ubiquitous digital connectivity — and reminds us that a digital divide does persist. Surveying those “offline” communities is expensive, and you often get smaller sample sizes. Is there a way to overcome those challenges and measure offline impact?

NILES: I don’t have the answer to that, but I think that the answer would depend on each newsroom and the community that that newsroom is trying to serve.

I work for an online publication and we have some print partnerships — some of our content gets picked up by print publications around the state — but we are an online news website. I think there's a parallel in a point I picked up from Tom Rosenstiel: that one of the most common myths about journalism today is the idea that the platform is the source.

There’s a parallel to that because if we get a tip from a reader, or from any source and sometimes that might come to us digitally, we track it. We have started tracking that as impact. How many times do our reporters and editors and publisher get tips? That doesn’t necessarily have anything to do with digital connectivity. How many record requests do you file? How many scoops do you get? Are you the first person to report on a topic in the news cycle? I think that there are plenty of ways to track offline impact.

6. Another thing that struck me in the report was the observation that technology choice and use are often the focus of audience research — such as “How do you get your news? From where/what?” — rather than inquiries into how consumption of news changes the decision-making process of the audience or alters the course of their community. How would you expand current audience research to capture those effects of news consumption? Can you ask what people do with a piece of news?

NILES: With all surveys, when you’re asking people to self-report on their own behavior, there’s a certain degree to which you cannot take everything they say at face value. But why couldn’t you ask? You can totally ask.

For the sake of efficiency in a survey, it's probably good to have the standard questions — nominal or ordinal responses such as where you get your news, on what mediums, or how often you do x, y, z. But I think it also helps to ask some open-ended questions so you can draw those stories of engagement out from your audience. What we often do as journalists is tell stories, so I think inviting stories back from your audience is a good thing to be open to and to solicit.
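
A minimal sketch of what such a mixed survey instrument might look like in code, with illustrative question wording that is not drawn from the report:

```python
# Hypothetical survey items mixing closed (nominal/ordinal) and open-ended
# questions, as Niles suggests; the wording is illustrative only.
survey = [
    {"id": "q1", "kind": "nominal",
     "text": "Where do you get your news?",
     "options": ["print", "online", "radio", "TV"]},
    {"id": "q2", "kind": "ordinal",
     "text": "How often do you discuss local news with others?",
     "options": ["never", "monthly", "weekly", "daily"]},
    {"id": "q3", "kind": "open",
     "text": "Tell us about a time a story changed a decision you made."},
]

def validate_response(question, answer):
    """Closed items must match a listed option; open items accept free text."""
    if question["kind"] == "open":
        return bool(answer.strip())
    return answer in question["options"]
```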

The mission isn’t to have a certain number of audience members engaging with a certain technology. The mission is more about the impact.

We can ask different questions of our audience because achieving the different strata — reach, engagement, impact — is not predicated on technology choice. All of it is predicated, in my opinion, on how effectively the journalism is produced, how well it's reported and how well it's presented.

I think it’s interesting to know about technology choice — it definitely matters. In particular, I can say from the perspective of an online publication, we can find out what platform, what operating systems most of our viewers are accessing our content from. That helps us design our product for our maximum aesthetic and functionality.

But again, it goes back to the mission. The mission isn't to have a certain number of audience members engaging with a certain technology. The mission is more about the impact. If we want to know more about who we're reaching or how successful we are in achieving that mission, then we need to ask questions geared more toward it.

7. The report mentions Lauren Hasler of the Wisconsin Center for Investigative Journalism as someone doing innovative work in this field. In your opinion, who else is leading the field in impact assessment? What are some innovative solutions that have come out on this problem so far?

NILES: One of the reasons we chose to highlight the work of the Wisconsin Center is that they're not only doing the work, they're also sharing the methodology. I think that, at the most basic level, that's the most innovative work being done right now — the collaborative way in which the work is being done. I can say that, personally, I got a lot out of Richard Tofel's methodology from ProPublica.

I also loved geeking out on some of the toolkits that have been developed by other foundations and researchers. There's a forthcoming report that we mention from the LFA Group, “Deepening Engagement for Lasting Impact: Measuring Media Performance and Results.” From what Chuck and I saw in our sneak peek, it looks like a really useful synthesis of the research to date and a great set of worksheets that a publisher or an editor, even at a small newsroom, can walk through. The LFA Group has done a ton of great research into this and produced some excellent reports. FSG is another research group that has done great work in this area.

When Chuck and I first started this research, we did not know how much work had already been done on the topic. I was impressed by how quickly the people looking into this established the basic concepts, and by how quickly they moved to producing really practical tools that newsrooms could use to evaluate their own work.

8. The focus of the report is primarily on nonprofit news, but how do you see these tools and frameworks being transferred to for-profit news, or other news organizations? Are they applicable?

CHARLES LEWIS: The newspaper industry and others have been studying for decades how reporting and “news” can be better packaged and organized and more visual and otherwise more engaging — to reverse the decrease in newspaper sales and subscriptions that began in the early 1950s.

At the same time, foundations have poured tens of millions of dollars into studying issues of public apathy, cynicism, “citizen engagement” and all of that in relation to information about current affairs, also known as “news.”

Against that rather imposing background of past work and introspection, I honestly am not sure how helpful or insightful the Investigative Reporting Workshop’s “measuring impact” report is or will be for the newspaper industry. At the same time, I would think some of what has been written might be useful and interesting in a broad and comparative way.

A more intriguing issue, not asked by you but fascinating to me, is the evolving relationship between for-profit news organizations and nonprofit news organizations. Now we are seeing joint hiring of Pulitzer Prize-winning reporters by the Post and American University/The Investigative Reporting Workshop. What is the future of such collaborations and partnerships that also involve joint publication? That is off topic from what you asked, but relevant vis-à-vis nonprofit organizations, philanthropy, etc.

9. I’ll expand that question a bit. You mention toward the end of the report what newsrooms can do — right now — to honor their audiences and their own reporting efforts by measuring their impact in real ways, right now. So what can a newsroom, nonprofit or otherwise, do right now to begin measuring their impact?

NILES: Anything — just start. As I was doing my part of the research for this paper, I would come across really sophisticated studies and tools and methodologies that can be applied, but that level of investment and complexity is just not practical for a lot of newsrooms — most newsrooms. I think what most newsrooms can do is just start simple and start by paying attention.

I have a meditation philosophy about it. You just observe your breath. You don't judge your breath, you don't judge how you're breathing. And the sheer fact of observation, of sustained observation, will improve your breathing. That's how it felt to me. The ultimate takeaway is just to start anywhere. That's why I put together that really simple spreadsheet for our newsroom. And it's fun. We started to have conversations about impact, just in passing, just short little comments, and it's helpful to just tune your mind to it.

10. What’s the single most important lesson you took away in writing this report?

NILES: I am not sure it's so much a lesson, but the observation I was most struck by, or most inspired by, was the collaborative nature of the direction this field is going. I think the nonprofit news sector lends itself to collaboration because, by their nature, nonprofits are not as proprietary about their business as for-profit companies. And it echoes the collaborative and open-source trend we see in so many sectors these days — not just journalism. Still, I think the collaborative direction that impact assessment is taking is significant for a couple of reasons.

First, it’s just plain necessary. News is a hectic business and nonprofit newsrooms are such lean operations that it’s neither feasible nor sensible for every organization to explore impact assessment in a silo. Instead, newsrooms can compare notes on how they do it and hopefully learn from one another. Then, from the foundation perspective, the collaboration would seem to help streamline their work with newsrooms — if everyone is speaking the same language about impact assessment, the conversation is bound to be more productive.

That said, and I hope Chuck and I made this point in the report, assessment tools must be nimble. Even the most collaborative of newsrooms are not cookie-cutter versions of each other. Nor are the communities they serve, for that matter, or the needs of those communities where the impact is intended.

Someone I spoke with just the other day about fragmentation of local governments said he had coined the term “coordinated autonomy” to describe his philosophy of efficient and decentralized state government. I think that describes pretty well the direction of collaborative efforts that we found in our research.
