
Wyoming reporter caught using artificial intelligence to create fake quotes and stories

HELENA, Mont. (AP) — Quotes from Wyoming’s governor and a local district attorney were the first things Powell Tribune reporter CJ Baker found a little off. Then some of the phrases in the stories seemed almost robotic to him.

But what tipped him off that a reporter from a competing news outlet was using generative artificial intelligence to help write his stories was a June 26 article about comedian Larry the Cable Guy being chosen as marshal of the Cody Stampede Parade.

“The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy’s most beloved figures,” the Cody Enterprise reported. “This structure ensures that the most critical information is presented first, making it easy for readers to grasp the main points quickly.”

After doing some digging, Baker, who has been a reporter for more than 15 years, met with Aaron Pelczar, a 40-year-old who was new to journalism and who Baker says admitted to using AI in his stories before he resigned from the Enterprise.

The editor and publisher of the Enterprise, which was co-founded in 1899 by Buffalo Bill Cody, apologized and promised to take steps to make sure it never happened again. In an editorial published on Monday, Enterprise editor Chris Bacon said he “failed to catch” the AI copy and fake quotes.

“Never mind that the fake quotes were the apparent error of a hasty rookie reporter who trusted the AI. It was my job,” Bacon wrote. He apologized that “AI was allowed to put words that were never spoken into the stories.”

Journalists have derailed their careers by making up quotes or facts in stories long before the advent of AI. But this latest scandal illustrates the potential pitfalls and hazards that AI presents for many industries, including journalism, as chatbots can spit out fake, if somewhat plausible, articles with just a few prompts.

AI has found a role in journalism, including automating certain tasks. Some newsrooms, including The Associated Press, use AI to free up reporters for more impactful work, but most AP employees are not allowed to use generative AI to create publishable content.

The AP has used the technology to help with stories on financial earnings reports since 2014, and more recently for some sports stories. It is also experimenting with an AI tool to translate some stories from English to Spanish. At the end of each such story is a note explaining the role of the technology in its production.

It has proven important to be direct about how and when AI is used. Sports Illustrated was criticized last year for publishing AI-generated product reviews online that were presented as being written by reporters who didn’t actually exist. After the story broke, SI said it was firing the company that produced the articles for its website, but the incident damaged the publication’s once-strong reputation.

In his Powell Tribune story breaking news of Pelczar’s use of AI in articles, Baker wrote that he had an awkward but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, “Obviously I’ve never tried to misquote anybody,” and promised to “correct them and apologize and say they’re misrepresentations,” Baker wrote, noting that Pelczar had insisted that his mistakes should not reflect on his Cody Enterprise editors.

After the meeting, the Enterprise launched a full review of all the stories Pelczar had written for the paper in the two months he had worked there. They discovered seven stories that included AI-generated quotes from six people, Bacon said Tuesday. He is still reviewing other stories.

“They are very credible quotes,” Bacon said, noting that the people he spoke to while reviewing Pelczar’s articles said the quotes sounded like something they would say, but that they had never spoken to Pelczar.

Baker reported that seven people told him they had been quoted in stories written by Pelczar but had not spoken to him.

Pelczar did not respond to an AP phone message, left at a number listed as his, asking to discuss what happened. Bacon said Pelczar declined to discuss the matter with another Wyoming newspaper that reached out.

Baker, who regularly reads the Enterprise because he is a competitor, told the AP that a combination of phrases and quotes from Pelczar’s stories raised his suspicions.

Pelczar’s story about a shooting in Yellowstone National Park included the sentence: “This incident serves as a stark reminder of the unpredictable nature of human behavior, even in the most serene settings.”

Baker said the line sounded like the summaries of his stories that a certain chatbot seems to generate, in that it refers to some sort of “life lesson” at the end.

Another story — about a poaching conviction — included quotes from a wildlife official and a prosecutor that appeared to come from a news release, Baker said. However, there was no news release, and the agencies involved did not know where the quotes came from, he said.

Two of the stories in question included fake quotes from Wyoming Gov. Mark Gordon, which his staff only learned about when Baker called them.

“In one instance, (Pelczar) wrote a story about a new OSHA rule that included a quote from the governor that was entirely fabricated,” Michael Pearlman, the governor’s spokesman, said in an email. “In a second instance, he appeared to have fabricated part of a quote and then combined it with part of a quote that was included in a press release announcing the new director of our Wyoming Game and Fish Department.”

The most obvious AI-generated copy came in the Larry the Cable Guy story, which ended with the explanation of the inverted pyramid, the basic approach to writing a breaking news story.

It’s not difficult to create AI stories. Users could feed a criminal complaint into an artificial intelligence program and ask it to write an article about the case, including quotes from local officials, said Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, the preeminent journalism think tank.

“These generative AI chatbots are programmed to give you an answer, whether the answer is complete garbage or not,” Mahadevan said.

Cody Enterprise publisher Megan Barton wrote an editorial calling AI “the new advanced form of plagiarism, and in media and writing, plagiarism is something that every media outlet has had to correct at some point or another. It’s the ugly part of the job. But, a company willing to correct (or literally write) these mistakes is a reputable one.”

Barton wrote that the paper has learned its lesson, has a system in place to recognize AI-generated stories and will “have longer conversations about how AI-generated stories are not acceptable.”

The Enterprise didn’t have an AI policy, in part because it seemed obvious that journalists shouldn’t use it to write stories, Bacon said. Poynter has a template from which news outlets can build their own AI policy.

Bacon plans to have one by the end of the week.

“That will be a topic of discussion prior to employment,” he said.
