We examined 428 newspaper stories from the websites of eight local newspapers across six states (California, Florida, Idaho, Pennsylvania, Texas, and Washington). Only news articles focusing on 2016 non-presidential election campaigns were analyzed as part of this project, including federal (U.S. Senate, U.S. House), statewide (e.g. gubernatorial, attorney general, supreme court), and local races (e.g. mayor, council, supervisor, school board). We excluded articles on ballot and referendum campaigns. For each news outlet, we examined articles published in the month prior to each state’s primary election. The unit of analysis was the individual article.

We coded the following characteristics of each news article:

  • Relevance: We first coded whether news stories flagged as “political” by each newsroom were relevant to the purposes of this project by mentioning down-ballot political campaigns. Of 1,824 total stories that had anything to do with government or officials, 428 (24%) were relevant to elections and were retained for further analysis.
  • Type: Whether the article reported hard news (78%) or integrated opinion (22%) in some manner (e.g. editorial, blog commentary, letter to the editor).[ref We include the type of news article as a control in our regression analyses of story page views, average time on page, and social referrals. Only for page views do hard news articles attract significantly more views (B = .40, SE = .17, p < .05).]
  • Campaign type: Whether the article referenced a federal (46%) and/or state-based (61%) campaign.
  • Headline: Whether the article had a more traditional, summary headline (89%) or a clickbait-oriented headline (11%).
  • Fact-check: Whether the news article was structured as a fact-check, i.e., presenting campaign claims followed by the journalist’s evaluation of those claims (2%).
  • Veracity check: Whether the news article included the journalist’s own judgment related to a campaign claim (e.g. false, true, a mistake; 6%).
  • Strategy coverage: We next examined the political strategy-related content in the news article, including whether public opinion/polling information (20%), fundraising or campaign spending information (41%), or horserace references (e.g. “favorite,” “frontrunner,” “underdog,” “sure loser,” “too close to call;” 41%) were present. We combined these references into one additive measure for political strategy (M = 1.03, SD = 1.02, Range = 0 to 3).
  • Issue mentions: We coded for mentions of the economy in general (14%), jobs and unemployment (11%), budgetary matters (22%), income inequality and wages (13%), international trade (4%), energy (12%), national security (11%), immigration (8%), public safety (25%), health care (18%), education (23%), infrastructure and transportation (19%), the environment (13%), and social issues including abortion, LGBTQ rights, women’s issues, and drugs (23%). We combined these mentions into one additive measure for issue mentions (M = 2.18, SD = 2.46, Range = 0 to 14).
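The two additive measures above are simple sums of binary presence/absence codes. A minimal sketch of that construction (the dictionary keys are illustrative stand-ins, not the study’s actual codebook variable names):

```python
# Hypothetical coding record for one article; 1 = feature present, 0 = absent.
# Key names are illustrative, not the study's codebook labels.
article = {
    "poll": 1, "fundraising": 0, "horserace": 1,           # strategy items
    "economy": 1, "jobs": 0, "budget": 1, "education": 0,  # a few of the 14 issue items
}

# Each index is the sum of its binary component codes.
strategy_index = article["poll"] + article["fundraising"] + article["horserace"]
issue_index = (article["economy"] + article["jobs"]
               + article["budget"] + article["education"])

print(strategy_index)  # ranges 0-3 across the three strategy items
print(issue_index)     # would range 0-14 with all issue items included
```

The full issue index sums all 14 coded issue categories in the same way.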

Before beginning coding, we conducted a reliability analysis on a minimum of 10% of the total sample of news stories to ensure that the team of five coders was correctly identifying campaign news features. We used Krippendorff’s alpha as a measure of reliability, where scores above 0.67 are considered acceptable.
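For readers unfamiliar with the statistic, Krippendorff’s alpha compares observed pairwise disagreement among coders to the disagreement expected by chance given the overall distribution of codes. A minimal sketch for nominal codes (the function name and data layout are ours, and this assumes complete or near-complete codings per unit):

```python
from collections import Counter

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal codes.

    units: a list of coding units, each a list of the codes assigned
    to that unit by the coders who rated it.
    """
    # Build the coincidence matrix of ordered value pairs within units.
    o = Counter()
    for unit in units:
        m = len(unit)
        if m < 2:
            continue  # a unit coded only once has no pairable values
        counts = Counter(unit)
        for c, nc in counts.items():
            for k, nk in counts.items():
                pairs = nc * (nc - 1) if c == k else nc * nk
                o[(c, k)] += pairs / (m - 1)

    # Marginal totals per code value and grand total of pairable values.
    totals = Counter()
    for (c, _k), v in o.items():
        totals[c] += v
    n = sum(totals.values())

    # Observed vs. expected disagreement (nominal difference function).
    d_o = sum(v for (c, k), v in o.items() if c != k) / n
    d_e = (n * n - sum(v * v for v in totals.values())) / (n * (n - 1))
    return 1.0 - d_o / d_e

# Perfect agreement between two coders yields alpha = 1.0.
alpha = krippendorff_alpha_nominal([[1, 1], [0, 0], [1, 1], [0, 0]])
print(alpha)  # 1.0
```

Values of 1.0 indicate perfect agreement; the 0.67 threshold noted above is Krippendorff’s conventional floor for tentative acceptability.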

Reliability analysis (Krippendorff’s alpha)

News article characteristics
  • Local campaign (article relevance): 0.91
  • Type: 1.00
  • Federal campaign (1 = U.S. Senate): 0.90
  • State campaign: 0.84
  • Headline (1 = non-traditional): 0.79
  • Fact-check article: 1.00
  • Veracity check: 0.79

Strategy
  • Poll: 0.92
  • Fundraising: 0.90
  • Horserace: 0.74

Issues
  • Economy (in general): 0.84
  • Jobs/employment: 0.78
  • Budget-related: 0.79
  • Income inequality: 0.77
  • Trade: 1.00
  • Energy: 0.73
  • National security: 0.84
  • Immigration: 0.85
  • Public safety: 0.70
  • Health care: 0.75
  • Education: 0.90
  • Infrastructure: 0.90
  • Environment: 1.00
  • Social: 0.73

American Press Institute and Engaging News Project

About the authors

Joshua M. Scacco (jscacco@purdue.edu) is an Assistant Professor of Media Theory & Politics in the Brian Lamb School of Communication and courtesy faculty in the Department of Political Science at Purdue University. He also serves as a Faculty Research Associate for the Engaging News Project. Lauren Hearit is a doctoral student in the Brian Lamb School of Communication at Purdue University. Lauren Potts is a master’s student in the Brian Lamb School of Communication at Purdue University. Jeff Sonderman is Deputy Director of the American Press Institute. Natalie Jomini Stroud is an Associate Professor in the Department of Communication Studies, Assistant Director of Research at the Annette Strauss Institute for Civic Life, and Director of the Engaging News Project at the University of Texas at Austin.

The authors thank Alex Curry, Katie Steiner, Cameron Lang, and Alishan Alibhai for their logistical and coding assistance at various stages of this project. The Engaging News Project and the American Press Institute are grateful for funding and support from the Democracy Fund for this research. We also appreciate support from the Moody College of Communication at the University of Texas at Austin and the Brian Lamb School of Communication at Purdue University.
