Behind API’s Tech Talks, our two products (Metrics for News and Source Matters) and the friendly customer service we provide to our 100+ local newsrooms are two engineers who work tirelessly to keep our products up and running and who collaborate with our customer-facing teammates to process feedback and inform future product development.

Meet senior application engineer Stephen Jefferson and web applications engineer Marita Pérez Díaz. The two have been closely following all the generative AI trends this year — including the assumption that certain products or technologies are silver bullets that fix huge categories of problems on their own. We sat down with Stephen and Marita to discuss the trends they’re seeing, their favorite resources and what you need to know to continue evolving through this era.

[Photos of Marita Pérez Díaz and Stephen Jefferson]

As journalists and engineers who have been working in news for the majority of your careers, how are you feeling about generative AI and local news? What concerns do you have, and what are you looking forward to?

Marita: I’m very concerned about the sheer volume of content that will be generated [using generative AI], and about deepfakes. How will newsrooms keep up with fact-checking? It’s going to be huge; it’s already huge. How will tools be developed to keep up with the misinformation?

But I’m also concerned about people who are ignoring AI, who only see it as a trend. This is happening so fast that if people don’t get trained and educated on how to use it to enhance their work, they will fall behind in the industry. [News leaders] need to stay informed to make the best decisions for their newsrooms.

So many people are starting to use AI tools but keeping their old structures for delivering news to the user in place. Now we have to take the user into account more than ever. In 2021, the Modular Journalism project created a way to build articles out of modules depending on what the audience wanted; the article would change depending on the audience persona. I find that very interesting. It doesn’t mean all news has to be experimented with that way, but the goal is to figure out better, updated ways to deliver news.

Stephen: AI is not a totally separate period of technology; it’s a continuation of what we’ve been working on for a while, where we can reuse resources we already have.

I’m concerned that there are organizations thinking they can just jump into it [using AI in their work] when they should follow some of the resources out there. There’s a big gap of misunderstanding, and it doesn’t seem like the journalism [industry] is taking it as seriously as it should. How are they thinking about experiments and the time spent on this?

There are task forces spinning up and guidebooks and guidelines and ethics policies for AI, but none of those are new concepts — especially for fact-checking and personal data and privacy. We need to apply some of those practices.

Stephen, how will AI affect local newsrooms and what should they do to prepare? Editor’s note: Stephen hosted a presentation on AI preparations for newsrooms in May.

S: The fun and nerve-wracking thing about that presentation in May was that it was about preparation and readiness for AI in journalism. So many newsrooms were prepping at that time, and just the day before, Google announced a hub [dedicated to the effort]. It was a timely topic to present, but there was also so much to learn.

Preparing for AI isn’t a single, linear solution. It’s a whole continuing process of making sure you have the right things in place to stand on, so you have eyes in the air for new announcements from partners or new regulations or policies to guide you. How do organizations see themselves positioned within that? Having an approach helps them understand, when an opportunity comes up, whether or not they should participate.

Marita, how might local newsrooms take advantage of AI tools and approaches to meet their editorial goals?

M: I would recommend taking advantage of education, guidelines and training first, not only on what to do but also on what NOT to do with AI. It is easy to make mistakes, especially ethically or by delivering wrong information generated by machines. Not learning how to use it and not being transparent about the use of AI could have a huge negative impact.

I believe journalists could learn the basics of coding as well, especially if they are part of a small local newsroom that is understaffed. That could facilitate communication with the engineers on the team or let reporters set up tools that contribute to their work. For example, when I was a journalist in a small newsroom, I automated some processes using Zapier, which helped me capture subscribers for MailChimp using a Google Form and Instant Articles. Canva’s AI tools could also help a reporter set up social media posts in an automated way.
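For readers curious what that kind of subscriber automation can look like in code, here is a minimal Python sketch. It assumes the Google Form responses have been exported to a CSV file and uses Mailchimp’s Marketing API endpoint for adding audience members; the file name, column name, audience ID and API key are placeholders, so check the current Mailchimp documentation before relying on it.

```python
# Minimal sketch: add Google Form respondents (exported to a CSV) to a
# Mailchimp audience. Credentials, file name and column name are placeholders.
import csv
import requests

MAILCHIMP_API_KEY = "YOUR-API-KEY-us21"   # placeholder API key
MAILCHIMP_DC = "us21"                     # data center suffix from the API key
LIST_ID = "YOUR_LIST_ID"                  # placeholder audience (list) ID


def subscribe(email: str) -> None:
    """Add one email address to the audience as a subscribed member."""
    url = f"https://{MAILCHIMP_DC}.api.mailchimp.com/3.0/lists/{LIST_ID}/members"
    resp = requests.post(
        url,
        auth=("anystring", MAILCHIMP_API_KEY),  # Basic auth: any username, key as password
        json={"email_address": email, "status": "subscribed"},
        timeout=10,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    # form_responses.csv is a hypothetical export of the Google Form's response sheet.
    with open("form_responses.csv", newline="") as f:
        for row in csv.DictReader(f):
            subscribe(row["Email"])  # assumes a column named "Email"
```

In practice a no-code tool like Zapier handles the same handoff without any script, but a short script like this can be a useful fallback when a newsroom outgrows its free automation tier.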

A good strategy that many experts recommend is to reflect first on what problems your newsroom would like to solve, and then look for an AI tool that could help solve the issue. But even tools we use daily, like Google Workspace, Google Meet or Zoom, have already integrated AI features we can take advantage of; for example, reporters won’t have to take meeting notes manually anymore and can free up that time for something more important.

In many communities, reporters could also connect to open data sources and automate reporting on local crime, restaurant inspections or school sports. In South Florida, for example, open data could be used to tell your audiences automatically whether beaches are open.
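As a rough illustration, here is a minimal Python sketch of that kind of automated brief. The endpoint URL and field names are invented for the example, not a real South Florida dataset; a newsroom would swap in whatever open-data feed its local government actually publishes.

```python
# Minimal sketch of an automated "are the beaches open?" brief built from a
# hypothetical open-data endpoint that returns JSON. URL and fields are assumptions.
import requests

DATA_URL = "https://example.gov/api/beach-status.json"  # hypothetical endpoint


def beach_brief() -> str:
    """Fetch beach statuses and turn them into a one-line publishable brief."""
    statuses = requests.get(DATA_URL, timeout=10).json()
    # Expected shape (assumed): [{"beach": "South Beach", "open": true}, ...]
    lines = []
    for entry in statuses:
        state = "open" if entry["open"] else "closed"
        lines.append(f"{entry['beach']} is {state} today.")
    return "Beach conditions: " + " ".join(lines)


if __name__ == "__main__":
    print(beach_brief())
```

A script like this could run on a schedule and push its output to an editor for review before publication, keeping a human in the loop.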

Establishing partnerships with other newsrooms and with local universities is also a great way to experiment with LLMs and AI in general, as computer science departments may be able to share resources that local journalists don’t have access to. In the last JournalismAI course, there was a great example of that kind of collaboration: Ojo Público worked with a professor at the Central European University to develop a tool called FUNES, which lists government contracts and assigns each a corruption risk score.

Stephen, most journalists are interested in finding ways to automate busy work. What are your thoughts on setting up automation workflows and maintaining those products over time? What should local news folks know?

S: I like this article about thinking of AI preparation like baking a cake. There was a study of whether people who want to make a cake prefer just adding water to a mix or a lengthier process of adding eggs and other ingredients. They found that people don’t want to just add water; something human is missed when there’s no deeper involvement in the process. Cake mixes were strategically changed to have people add eggs so they’d feel involved in the making. The same goes for AI tools: humans want to be part of the process. They don’t want to just write up a scope of work and say “build this”; they want that kind of interaction.

I’m a skeptic on automating workflows. I see 50% of these workflow-automation projects as unnecessary. For example, at an information architecture conference I attended, there was an organization that wanted to use AI to build a knowledge management system, basically for institutional knowledge. You don’t have to use AI for that; you just need to organize your information so it’s more accessible to you.

This comes down to how newsrooms approach “problem framing.” Many folks are employing AI to take on “repetitive tasks,” which are seen as problematic inefficiencies. But other lenses could suggest other solutions: perhaps the task being repeated could be reorganized to resolve the repetition altogether. That’s “dissolving the problem,” one of Russell Ackoff’s four problem frames. In my own work, I usually look through the “dissolving” lens first.

I worked with newsrooms on data structuring for over a decade, and… when it comes to tagging something correctly and structuring the data in the right way, journalists will do that only to an extent; they want to skip steps and make themselves more efficient by going straight to AI.

I’m not all for automating a workflow just because it’s tedious. If we keep going down the path of skipping steps, we’re going to become more dependent on those technologies. I’m nervous about that tendency to “skip” in the name of efficiency.

M: I agree that you need to be organized as a starting point. But once you get that sorted out, if something takes manual time or doesn’t need creative input, you absolutely should automate it, because it takes time away from more important things like fact-checking or developing ideas for reporting.

