Exploring the Potential Risks and Rewards of Using AI in Newsrooms

Journalists from Reuters and the AP joined an NYU assistant professor and moderator Gina Chua, executive editor at Semafor, to discuss how to get the most value from AI while minimizing the risks of deepfakes and lost public trust. The Nov. 29 event for NYU students, faculty, journalists and guests was the second panel of the evening at the launch of the Ethics & Journalism Initiative. You'll find the EJI Takeaways from the panel and a link to a video in this post.

Event Date: Nov. 29, 2023

Panelists: Hilke Schellmann, an investigative reporter, data journalist, computer scientist and NYU assistant professor; Amanda Barrett, the Associated Press’s vice president of standards and inclusion; and Mo Tamman, a Reuters investigative reporter. Moderator: Gina Chua, executive editor of Semafor.

View the complete event on YouTube.

We have recommended readings to dive deeper into this topic.

The EJI Takeaway

Don’t miss the opportunity to enhance your reporting with AI

Reuters investigative reporter Mo Tamman advised journalists not to let their apprehension about AI get in the way of utilizing it. 

“We can wring our hands all day long about the risks that we face with these new technologies,” Tamman said. “Or we can decide that we’re going to embrace them and use them to our advantage—rather than having our own fear about that existential fear of our existence overwhelm our need to use these tools.”

Just don’t forget to verify the information AI produces, a skill that NYU assistant professor and investigative journalist Hilke Schellmann thinks journalists will be able to leverage going forward. As AI expands and takes a larger role in society, she thinks journalists will be used more and more for verification. Journalists should be prepared to do that and do it well.

AI can automate tedious processes inherent to journalism

Amanda Barrett, vice president of standards and inclusion at the Associated Press, said that AI can automate a lot of processes that some journalists might find tedious, such as compiling a given week’s top stories or writing story summaries. Taking that kind of manual work off of journalists’ plates gives them more time for more meaningful work.

NYU’s Schellmann said she has built an AI tool that searches documents for keywords, which has cut the time that process takes. Additionally, AI could empower freelance journalists who don’t have the financial backing of a large news organization to go through large sets of data or documents.

Moderator and Semafor Executive Editor Gina Chua suggested that journalists create custom bots to help copy edit work and improve writing. Plus, AI can help journalists translate their work, which could improve their copy in other languages. 

AI could be a boon for local news

Because AI can go through huge sets of documents and data quickly, Tamman and Chua said it could have the power to “resurrect local news.” An AI bot could not only synthesize what was discussed at town hall meetings, but it could also notice patterns and trends. In that way, journalists who are often single-handedly covering an entire community could spot an interesting pattern or trend emerging in the minutes of meetings they couldn’t attend. The journalist could then follow up on the AI-generated lead with additional reporting.

Crosstown, a non-profit newsroom in Los Angeles, ingests data from LA neighborhoods and generates reports. It also hopes to help other local news sites do the same.

Relying on false information gathered through AI can erode credibility

Barrett noted that there are many risks in using AI for reporting. One is hallucinations, when artificial intelligence produces information that, though it may look cited and referenced, is completely made up. And AI doesn’t usually provide nuance; it can’t “produce real human stories.”

“If we rely on information that’s not true,” Barrett said, “that can erode what little credibility we have with large numbers of society.”

That said, AI offers humans the “opportunity to see things that we cannot see because we are small-brained,” Tamman said. Humans are only so accurate in their reporting—that’s why outlets fact check. We need to do the same for AI. 

“There are opportunities to learn and to assimilate information that we cannot do,” Tamman said. “It can make our world better.”

The bottom line: Verify when reporting using AI, and don’t forget that it has its own biases, too.

Familiarize yourself with AI

Barrett advised that journalists familiarize themselves with terms associated with AI (the AP just released an AI glossary), learn what information and data AI is gathering in their communities and within their beats, and know how that information is used.

“Who’s putting that information together?” Barrett said. “Is there any bias in that information?” 

Schellmann said journalists should test models themselves with a healthy dose of skepticism. Plus, those testing experiences can become stories.

Tamman takes interacting with AI to a whole new level: He says he uses a chatbot throughout his workday, and “it’s like having the perfect assistant” to help him think through problems and find new solutions, plus aid him with coding, grammar and eliminating passive voice.

“I don’t expect it to be right half the time when I’m asking a question,” Tamman said. “But I’m having a conversation with it.”