UNITED STATES (AP) – Quotes from Wyoming’s governor and a local prosecutor were the first things that struck veteran Powell Tribune reporter CJ Baker as a bit odd. Then there were the almost robotic phrases in certain newspaper stories.
The clearest hint, however, that a reporter at a competing news outlet was using generative AI to help write his stories came in a June 26 article about comedian Larry the Cable Guy, who was chosen as the grand marshal of a local parade. It concluded with an explanation of the inverted pyramid, the basic approach to writing a breaking news story.
“The 2024 Cody Parade promises to be an unforgettable celebration of American independence, led by one of comedy’s most beloved figures,” the Cody Enterprise reported. “This structure ensures that the most critical information is presented first, making it easier for readers to quickly grasp the main points.”
After some digging, Baker, who has been a reporter for more than 15 years, met with Aaron Pelczar, a 40-year-old who was new to journalism and who Baker said admitted to using AI in his stories before he resigned from the Cody Enterprise.
The publisher and editor of the Cody Enterprise, which was co-founded in 1899 by Buffalo Bill Cody, have apologized and promised to take steps to ensure it never happens again. In an editorial published Monday, editor Chris Bacon said he “failed to uncover” the use of AI and fake citations.
“It doesn’t matter that the fake quotes were the apparent mistake of a hasty rookie reporter who relied on AI. It was my job,” Bacon wrote. He apologized because “the AI was allowed to put words that were never said in the stories.”
Journalists have derailed their careers by fabricating quotes or facts in stories long before AI came along. But this latest scandal illustrates the potential pitfalls and dangers AI poses to many industries, including journalism, as chatbots can spit out spurious, if somewhat plausible, articles with just a few prompts.
AI has found a role in journalism, including automating certain tasks. Some newsrooms, including The Associated Press, use AI to support reporters, but most AP staff are not permitted to use generative AI to create publishable content.
The AP has used technology to assist with stories about financial earnings reports since 2014 and, more recently, with some sports stories. It is also experimenting with an artificial intelligence tool to translate some stories from English to Spanish. A note at the end of each of these stories explains the role technology played in their production.
Being upfront about how and when AI is used has proven important. Sports Illustrated came under fire last year for publishing AI-generated online product reviews that were presented as being written by reporters who didn’t actually exist. After the practice became known, SI said it would fire the company that produced the articles for its website, but the incident damaged the once-powerful publication’s reputation.
In his Powell Tribune article revealing Pelczar’s use of AI in his stories, Baker wrote that he had an awkward but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, “Obviously, I have never intentionally tried to misquote anyone” and promised to “correct them and issue apologies and say that these are misstatements,” Baker wrote, noting that Pelczar insisted that his mistakes should not reflect on his editors at the Cody Enterprise.
After the meeting, the Enterprise undertook a full review of all the stories Pelczar had written for the paper in the two months he had worked there. It found seven articles that included AI-generated quotes from six people, Bacon said Tuesday. He is still reviewing other stories.
“These are very credible quotes,” Bacon said, noting that people he spoke to during his review of Pelczar’s stories said the quotes sounded like something they would say, but that they had never actually spoken to Pelczar.
Baker said seven people told him they had been quoted in stories written by Pelczar but had not spoken to him.
Pelczar did not respond to a telephone message left by the AP at a number listed as his to discuss what happened. Bacon said Pelczar declined to discuss the matter with another Wyoming newspaper that had contacted him.
Baker, who regularly reads the Cody Enterprise because he is a competitor, told the AP that a combination of phrases and quotes in Pelczar’s stories aroused his suspicions.
Pelczar’s story about a shooting in Yellowstone National Park included the line: “This incident serves as a stark reminder of the unpredictable nature of human behavior, even in the most serene environments.”
Baker said it sounded like the summaries of his stories that a certain chatbot seems to generate, in that it adds a sort of “life lesson” at the end.
Another story, about a poaching conviction, included quotes from a wildlife official and a prosecutor that appeared to come from a news release, Baker said. However, there was no news release and the agencies involved did not know where the quotes came from, he said.
Two of the stories in question included false quotes from Wyoming Gov. Mark Gordon, which his staff only learned about when Baker called them.
“In one case, (Pelczar) wrote a story about a new OSHA rule that included a quote from the governor that was completely fabricated,” Michael Pearlman, a spokesman for the governor, said in an email. “In a second case, he appeared to make up part of a quote, and then combined it with part of a quote that was included in a press release announcing the new director of our Wyoming Game and Fish Department.”
The most obvious AI-generated copy appeared in the story about Larry the Cable Guy.
It’s not hard to create AI stories. A user could feed a criminal affidavit into an AI program and ask it to write an article about the case that includes quotes from local officials, said Alex Mahadevan, director of a digital literacy project at the Poynter Institute, the leading journalism think tank.
“These generative AI chatbots are programmed to give you an answer, regardless of whether that answer is complete garbage or not,” Mahadevan said.
Megan Barton, publisher of the Cody Enterprise, wrote an editorial calling AI “the next, advanced form of plagiarism, and in the field of media and writing, plagiarism is something every media outlet has had to correct at one time or another. It’s the ugly part of the job. But, a company willing to correct (or literally write) these mistakes is a reputable company.”
Barton wrote that the paper has learned its lesson, has a system in place to recognize AI-generated stories and “will have longer conversations about how AI-generated stories are not acceptable.”
The Cody Enterprise did not have an AI policy, in part because it seemed obvious that journalists should not use AI to write their stories, Bacon said. Poynter has a template from which news outlets can create their own AI policy.
Bacon plans to have one in place by the end of the week.
“This will be a topic of discussion prior to employment,” he said.
2024-08-15 23:18:47