The final step in successful innovation is gathering feedback, measuring your results, and using what you learn to direct how you change and evolve your approach and product. This process ensures that projects, products and initiatives proceed based on data, user behavior and feedback.
Develop iteratively
Kareem Amin, the head of product at News Corp, is working on a set of new storytelling tools to help News Corp journalists easily create content that appeals to an increasingly mobile audience.
“Our users are getting older and our products don’t have as much reach into the younger generation, and we would like to reach them on mobile devices,” he said, describing the problem to solve at the Collab/Space event in New York. In a subsequent interview, he said it’s not just about young users: people of nearly all ages are consuming more and more content on mobile devices.
Amin wanted to rethink how news content was delivered on mobile, and to test ways content could be repackaged to fit different use cases and user needs.
To achieve that goal, he broke the project down into smaller steps. Amin gathered support, data and feedback at each step to help him move to the next one. This iterative approach ensured he and his team were able to test and validate their assumptions and work before investing additional time and capital.
“The first thing is we started to brainstorm what is a good approach to telling stories on mobile devices, and we then start building custom web pages that would test [some of our ideas],” he said.
At this early stage, he didn’t have formal approval or budget. Amin recruited a designer to work with him based on the idea that the project would secure formal approval once they had something to show. With prototypes in hand, he bought Facebook ads to test how people interacted with the stories.
As shown below, one of the things he prototyped was a very visual way to present news stories using cards and graphics, as opposed to lots of text:
“Build a prototype without asking permission.” First prototype of new @newscorp product. Data viz-led #collabnyc pic.twitter.com/JUpbndQC6Q
— Andrew Losowsky (@losowsky) July 8, 2014
“It treats data as a first-class citizen,” Amin said.
They then examined the data to see how users were interacting with the content. For example, did they swipe through all the way to the end of the story?
“Completion of story was to me the most interesting [metric],” he said, noting that 85 percent of people who started a story completed it.
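A completion metric like this depends on instrumenting the prototype to report when a reader reaches the end of a story. Below is a minimal, hypothetical sketch of how that could be done in the browser; the event names and the reporting callback are assumptions, not News Corp's actual instrumentation.

```ts
// Hypothetical sketch (not News Corp's code): fire "story_started" when the
// story loads and "story_completed" the first time the final card becomes
// visible, so completion rate = completed / started can be computed later.
function trackStoryCompletion(cards: HTMLElement[], send: (event: string) => void): void {
  if (cards.length === 0) return;
  send("story_started");
  let completed = false;
  const observer = new IntersectionObserver((entries) => {
    if (!completed && entries.some((entry) => entry.isIntersecting)) {
      completed = true;
      send("story_completed");
      observer.disconnect();
    }
  });
  // Watch only the last card; reaching it counts as completing the story.
  observer.observe(cards[cards.length - 1]);
}
```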
With a basic prototype and encouraging feedback, Amin took the next step of bringing it to a News Corp executive to secure funding. He also structured his request in a way that reinforced the iterative process: each tranche of budget would be released when a specific set of goals was achieved.
“I wanted to make [the executive] comfortable, but I also wanted to impose responsibility onto ourselves to make sure we are being diligent about what we are trying to achieve,” he said. “The checkpoints were as much for the team as they were for the executive.”
Amin followed a five-step process for the project:
- Question assumptions. He said one assumption news organizations make is that it’s better to create content once and have it flow to different platforms, such as mobile and tablet. Related to that, he said in his experience newsrooms believe the extra effort to create platform-specific content does not pay off in terms of additional engagement. Amin wanted to test those assumptions.
- Bootstrap. “I asked a friend who was a designer to design a few concepts based on things I was thinking about,” he said. “That wasn’t approved yet, and then we came back and built a case around that.” Amin also referred to this early unfunded stage as getting started “without asking permission.” Without any committed resources, he bootstrapped to get something created.
- Build support. Amin made a point of sharing what he was doing with people at different News Corp properties. This ensured they knew about the project, and that they felt involved in the process. Most important was spending time with people face-to-face. “A big portion of my time was spent to socialize the project and taking time with each person who is involved to do demos,” he said. “That introduces [the product] to a lot more people who are thinking about how they can use this.”
- Place technology bets. The World Wide Web Consortium standards body has been working on something called web components. Amin knew this functionality would soon be rolled out in major web browsers, meaning it would be standardized, so he decided to develop the tools using web components (a brief sketch follows this list). “With that tech bet we have been able to do things in an easier way than it otherwise would have been,” he said. He added that it will also give them a leg up on the competition.
- Develop iteratively. Amin said the milestone-oriented budget structure was one of the key manifestations of this approach. The use of Facebook ads also ensured they tested everything they built, and created new versions based on feedback. For example, Amin said data from the ads caused them to make significant changes to story navigation. Ultimately, the data gathering and budget structure made sure that “we proved a set of things before we moved on,” he said.
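To make the technology bet concrete: a web component is a reusable custom HTML element with its own encapsulated markup and styles. The sketch below is purely illustrative and is not News Corp's code; the <story-card> element name, attributes and styling are assumptions.

```ts
// Illustrative sketch of a web component (Custom Elements + Shadow DOM),
// not News Corp's actual code: a hypothetical <story-card> element that
// renders a headline over an image, so story "cards" can be reused anywhere.
class StoryCard extends HTMLElement {
  connectedCallback(): void {
    const headline = this.getAttribute("headline") ?? "";
    const image = this.getAttribute("image") ?? "";
    // Shadow DOM keeps the card's styles from leaking into the host page.
    const root = this.attachShadow({ mode: "open" });
    root.innerHTML = `
      <style>
        :host { display: block; font-family: sans-serif; }
        img { width: 100%; display: block; }
      </style>
      <img src="${image}" alt="">
      <h2>${headline}</h2>
    `;
  }
}

customElements.define("story-card", StoryCard);
// Usage in a page: <story-card headline="..." image="/photo.jpg"></story-card>
```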
Now that he’s proven people like the prototype, Amin said the next steps are to examine scalability and monetization opportunities.
The above approach avoids what Michael Maness said is one of the biggest mistakes he sees people make in news innovation: they “get an idea and [go] straight to building a brand,” he said.
“We see that a lot,” he continued. “It’s like we have this idea and let’s put money behind it and let’s not launch it until it’s perfect.”
But that’s the opposite of the iterative approach that must be at the core of innovation, according to Maness. He described what an iterative process looks like:
As an organization, what that means is you’re going to be a lot more raw about this stuff. You[’re] going to lower the barriers and let people look through your windows. And you have to do that especially when you’re building new products because if you build something in a vacuum, I almost guarantee you it will fail. And then the other thing is you look at the iteration research around this, [it has found that] most things need four versions of themselves before they stabilize…. We see this a lot where people are like, “Well, we built this and it cost us $500,000 and now we know what we’re doing.”
And so then we get asked for another $500,000. We’re like, “No, you should have built this for $5,000 — the first one. And then you would have learned the same things.” So that staging and the iteration and the versioning of that, it’s still something we don’t see enough of.
Amy Webb agrees that news organizations struggle with adjusting their ideas and projects once they are underway. They don’t start with a plan to iterate, and so when they get results that are different than expected, they don’t know how to react.
“The problem is that there was no plan to recalibrate it,” she said. “That is really, really important.”
Gather and act on feedback
One key to being able to iterate — and to know when you need to rethink things — is to prioritize generating quality feedback from your target users. Whether those are journalists in your newsroom, or a specific group of consumers, you need to get what you’re doing in front of them, and listen.
An example of this is how NPR gathered feedback for its new NPR One app. The app is a major launch from NPR, and promises to use user behavior and algorithms to deliver “a stream of public radio news and stories curated for you.”
“The big idea of NPR One is to go after the millions of casual public radio listeners — and not the other millions already keenly habituated to public radio listening,” wrote Ken Doctor for the Nieman Journalism Lab.
So how did they get feedback from target users in order to guide development and iteration of the app? By letting visitors to NPR’s news headquarters play with the app at the end of their tours.
“With costs of user testing potentially very high, we realized early on that we had a built-in supply of free beta testers coming to us on a weekly basis,” said Joel Sucherman, the senior director of digital products at NPR.
There was, however, one drawback to this group: they were big enough fans of NPR to want to tour its new headquarters. That’s not a typical user group.
“But oftentimes we ended up with spouses who were being dragged along on those tours, so as it turns out we did have a pretty good mix of NPR fans and those who were kind of lukewarm on us,” Sucherman said.
With a group of testers identified, they developed a process for gathering feedback and staying connected to their testers.
Jeremy Pennycook, product manager for NPR One, explained the process:
We would recruit folks from the tours and ask them if they wanted to test out our new app. The folks that volunteered would then sit down one-on-one with our User Experience specialist, Vince Farquharson, and be given a series of tests. The specific test would depend on what hypothesis we were attempting to learn about at the time. But these sessions could last up to an hour. For people who showed an interest in exploring further, we would then give them the link to join a beta group we had set up and allow them to be able to download the Beta app and test it on their own. These testers would post in a Facebook group or email us in more feedback as we continued to change the experience.
From there, Farquharson compiled the findings and presented them to the development team, and also often showed them to management.
“These recommendations would then inform future sprint [development] cycles based on the outcome of the testing,” Pennycook said. “Sometimes, we would be satisfied with an interaction or feature and be able to move on to other items with confidence. Other times, the feedback would indicate we needed to go back to the drawing board to come up with a better solution.”
All of this effort was done to ensure the app was meeting the needs of users.
Fred Dust, a principal with design firm IDEO, is quoted in the paper by Brown and Groves about the importance of empathy, and of taking the time to listen and understand the needs of the intended user:
There’s real value to spending real time with the people you’re designing for, in context. Don’t let your judgment or pre-knowledge override the people you’re designing for. Empathy gets to better solutions.
Brown and Groves emphasized the importance of incorporating data and feedback from key stakeholders:
The key takeaway here for newsrooms and their managers is to create standard processes that involve audience feedback well before a final product is launched or project is published, an abrupt departure from the typical situation in which most audience participation is restricted to post-publication comments.
Measure your results
In March 2014, TIME relaunched its website. As part of this effort, the team also decided to take something of a risk with their successful email newsletters.
At the time the publication had 10 email newsletters focused on topics such as technology, business and politics. There were close to 1 million subscribers spread across the newsletters.
“The open rates were in the 20s [percentage-wise] on average, and the industry standard for media and publishing is about a 23 percent open rate,” said Schweitzer.
But those newsletters reflected the old TIME digital strategy. The new site was built with the idea of merging the magazine’s roots as the first weekly aggregator of news from all over the world with the new real-time era of information. The new homepage featured one lead story, with 11 underneath it to give the visitor a quick briefing of what they need to know right now.
“It’s the idea of doing for the minute what TIME had done for the week,” Schweitzer said.
Ten vertical email newsletters didn’t fit with the concept of giving a quick, essential briefing.
“They were all algorithmically derived, with no editorial curation, and they were focused on a vertical,” Schweitzer said.
So even with close to 1 million subscribers and industry-average open rates, the TIME team blew up its newsletters and started over.
Ultimately, Schweitzer said, they decided, “Okay, so we have 10 things that are doing decently well. Why not try and make one thing that’s fantastic?”
Ten newsletters became one: The Brief. Schweitzer and the team decided that their primary measure of success would be whether open rates increased.
“We have seen, since March, our open rates are now at over 40 percent which is just huge — I mean just really a fantastic amount of growth,” Schweitzer said.
A big driver of the growth is the constant iteration and optimization of the newsletter. Schweitzer outlined some of the ways they test and adapt The Brief, and how they measure the results:
We saw that we could do so much experimentation with optimization of the newsletter itself … We designed it for mobile so that people could read it on their phone during their commute, and we added an audio integration with SpokenLayer so people could have The Brief read to them as a podcast. We have been testing everything from different subject lines to sending it earlier to see how that affects open rates.
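Subject-line and send-time tests like the ones Schweitzer describes come down to comparing open rates between variants. The sketch below is illustrative only and is not TIME's tooling; the sample numbers are invented. It uses a two-proportion z-test to judge whether an observed lift is likely more than noise.

```ts
// Illustrative only (not TIME's tooling): compare open rates for two
// subject-line variants with a two-proportion z-test. A |z| above roughly 2
// suggests the difference is unlikely to be random noise.
function compareOpenRates(opensA: number, sentA: number, opensB: number, sentB: number) {
  const rateA = opensA / sentA;
  const rateB = opensB / sentB;
  const pooled = (opensA + opensB) / (sentA + sentB);
  const standardError = Math.sqrt(pooled * (1 - pooled) * (1 / sentA + 1 / sentB));
  return { rateA, rateB, lift: rateB - rateA, z: (rateB - rateA) / standardError };
}

// Made-up example: variant B lifts opens from 40% to 42% on 20,000 sends each.
console.log(compareOpenRates(8000, 20000, 8400, 20000)); // lift ≈ 0.02, z ≈ 4.07
```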
She said tracking key metrics and sharing them with the team has helped reinforce success.
“Everything that we’re doing, the good news is we’ve really had the data to back it up to say, ‘This is successful,'” she said. “[We can tell people], ‘Look, you did A and then B happened.’ And I think that something that has also been really great is that everybody has really gotten behind this idea of the importance of metrics and data and transparency in the newsroom.”
Decisions about how to evolve — or whether to continue with — an initiative must be guided by feedback from the intended users, as well as by data. Mixing the qualitative with the quantitative is key. So too is picking which metrics you intend to measure.
A few good, relevant metrics that everyone can agree on and understand are better than a raft of numbers that don’t provide actionable information. Those contacted for this study cautioned against picking too many metrics, or key performance indicators (KPIs).
At Medium, two key metrics guide work across teams, according to Lee, the senior editor.
“As a company we optimize for engagement, which is time spent reading on the platform, and have developed our own metrics for that,” she said. “That’s a goal that everyone is following.”
Since anyone can write on Medium, the team also tracks a second engagement metric: the number of users who return to Medium on three days within the trailing seven. (This is applied to both readers and writers on Medium.)
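A measure like that can be computed from a log of visit events. The sketch below is a minimal illustration with an assumed data shape; it is not Medium's actual pipeline or definition.

```ts
// Minimal illustration (assumed data shape, not Medium's pipeline): count
// users who were active on at least three distinct days within the trailing
// seven, given a list of visit events.
type Visit = { userId: string; timestamp: Date };

function engagedUserCount(visits: Visit[], now: Date, windowDays = 7, minDays = 3): number {
  const cutoff = now.getTime() - windowDays * 24 * 60 * 60 * 1000;
  const activeDaysByUser = new Map<string, Set<string>>();
  for (const visit of visits) {
    const t = visit.timestamp.getTime();
    if (t < cutoff || t > now.getTime()) continue; // outside the trailing window
    const day = visit.timestamp.toISOString().slice(0, 10); // e.g. "2014-07-08"
    const days = activeDaysByUser.get(visit.userId) ?? new Set<string>();
    days.add(day);
    activeDaysByUser.set(visit.userId, days);
  }
  let engaged = 0;
  for (const days of activeDaysByUser.values()) {
    if (days.size >= minDays) engaged++;
  }
  return engaged;
}
```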
Teams look at other metrics, but Lee said the engagement metrics are what guide their optimization efforts.
“We certainly follow traffic, but not all of our efforts are to increase traffic,” she said.
Vox Media helps communicate which metrics are important by building analytics dashboards that product teams can use to gather data about what they’re building. Brundrett said the choice of which dashboards to build communicates which metrics matter, and it also makes it easy for everyone to use those metrics.
He gave an example of Vox building a dashboard to measure website performance on mobile.
“We put together a dashboard called Tempo that tracks all of our load times and performance metrics, and every team can easily hook into that and load that up,” he said. “So now people understand: mobile performance is important.”
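A dashboard like Tempo depends on pages reporting their own timing data. The sketch below shows how a page could do that with the standard Navigation Timing API; the "/metrics" endpoint and the payload fields are assumptions, not Vox's implementation.

```ts
// Hypothetical sketch (not Vox's Tempo code): capture page load timing with
// the standard Navigation Timing API and report it to a collection endpoint
// that a dashboard could chart. The "/metrics" endpoint is made up.
window.addEventListener("load", () => {
  const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
  if (!nav) return;
  const payload = {
    page: location.pathname,
    ttfb: nav.responseStart - nav.requestStart, // time to first byte, in ms
    domComplete: nav.domComplete,               // ms from navigation start
    loadTime: nav.loadEventStart,               // ms until the load event fired
  };
  // sendBeacon posts asynchronously without blocking the page.
  navigator.sendBeacon("/metrics", JSON.stringify(payload));
});
```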
Once you have the data, you then have to evaluate it. This is one of the trickiest parts. Some metrics will be clear: open rates have increased by X percent, and that’s a good thing.
But in other cases, one metric may seem to be tanking while another is returning unexpected results. The key is to gather all of the data and discuss it as a team. Work to understand the story the data is telling you.
“The hard part is to understand what the trajectory of the data is,” Brundrett said. “So if you’re launching something new that is experimental [it’s important that you] understand what success means. If you’re iterating off that data you have to know when to jump and when not to jump and to be patient. You have to have a bigger vision for that. You can’t run the whole thing off the data.”
Maness encourages teams to focus on where there seems to be some momentum or encouraging trend lines. Work to understand what is happening, and why. He gave an example of a mobile app where one key metric of success might be the number of downloads.
“You could have a lot of people download it, but no one uses it,” he said. “Or you can have very few people download it, but the people who do use it twice as much as other apps.”
In the latter scenario, results lag on one metric but overachieve on another. It’s essential to understand why that’s happening, and to talk to your users to gain a better understanding, Maness said.
One final piece of advice from Amy Webb is to avoid setting unfair or unrealistic metrics early on. This, for example, could mean leaning too much on metrics that are focused on ROI or revenue. Pick something reasonable to measure as a start and then evolve as the project progresses. Act, then learn. It applies to metrics as well.
“You can’t immediately predict what ROI will be because too many variables are out of the control of the news organization,” she said. “If you at least have a clear plan for how you will measure, that is sometimes enough to appease people who will otherwise quash the project.”
It’s also important to be reasonable about setting expectations for what can be achieved in the short term, according to Drury’s Jonathan Groves.
“Pixar took a decade to develop fully; news organizations shouldn’t expect their big ideas to explode in six months, especially in today’s crowded media landscape,” he said.