Florida nonprofit news reporters ask board to investigate their editor’s AI use
Suncoast Searchlight launched this past February with the goal of bringing rigorous local investigative reporting to greater Sarasota, Fla. As of last week, the nonprofit publication had four full-time staff reporters and two editors breaking stories about communities across Sarasota, Manatee, and DeSoto counties. In recent days, though, the small newsroom has become a flashpoint for the ethics of using generative AI to edit stories without disclosure.
On Tuesday, November 11, all four reporters at Suncoast Searchlight sent a letter to the nonprofit’s board of directors accusing their editor-in-chief, Emily Le Coz, of using generative AI tools, including ChatGPT, to edit stories and hiding that use from staff. The letter stated that Le Coz’s use of these tools introduced hallucinated quotes, a reference to a nonexistent state law, and other factual inaccuracies into multiple reporters’ story drafts. After reporters questioned her about the edits, they say she did not immediately disclose her use of AI tools but instead claimed she’d made the errors herself.
“We fear that there may be extensive undisclosed AI-generated content on our website and have questions about what retroactive disclosure is needed for our readers,” the reporters wrote. “We also expressed to Le Coz the breach of trust that her actions have created among us.”
The reporters asked the board, which includes prominent media executives and investigative journalists, as well as local business people, to intervene. They outlined several requests, including implementing an AI policy and a thorough fact-checking process, conducting an internal audit to identify AI-generated writing that might have been published on the site, and asking Le Coz to promise not to use AI tools for editing in the future.
The following morning, less than 24 hours after the board received the letter, Le Coz and her deputy editor fired one of the reporters who’d signed it. McKenna Oxenden, who had started in her role 33 days earlier, says they told her she was being terminated for performance issues, including story output and news judgment.
Oxenden, who was previously a reporter at the Houston Landing and a breaking news fellow at The New York Times, told me she was not on a performance improvement plan (PIP) and hadn’t received any previous performance-related warnings. Two of the several examples cited in the meeting as evidence of poor performance, a missed story deadline and a failure to follow up with sources, occurred on November 11. That was “the same day we had a long meeting that ended with staffers telling Emily that we essentially had no trust in her,” Oxenden said.
In LinkedIn and X posts last Friday, Oxenden wrote, “It is not lost on me that I was fired one day after several staffers, myself included, raised concerns to our executive editor about her undisclosed AI usage during the editing and writing process.”
The three other journalists who signed the letter were not fired.
Oxenden told me that she felt obligated to bring the issue to the board because of how challenging it was for the reporting team to hold Le Coz accountable for her undisclosed ChatGPT usage. “If any employee was found to be using AI in this manner, I firmly believe that their boss should be notified. The board happens to be Emily’s boss,” she said.
On Thursday, November 13, board chair Keith Woods, who retired from his position as NPR’s chief diversity officer earlier this year, responded to the staff letter, writing that several board members had spoken with Le Coz about her use of AI tools. After those conversations, he said, the board had “full confidence in her editing processes, her sense of ethics, and the integrity of her published work.” The board also agreed that the newsroom should adopt an AI policy in the coming weeks.
In the social media posts about her firing, Oxenden claimed the board “did not ask to speak with any staffers nor examine the evidence we had to prove the questionable AI usage” before sending that email reaffirming Le Coz’s leadership.
The board is now conducting an internal review. “While the investigation remains ongoing, no issues have surfaced concerning the accuracy of Searchlight’s journalism or the ethics of the editorial process. The board remains highly confident in the integrity of the Searchlight team’s journalism,” Woods said in a statement to Nieman Lab on behalf of the board. “We are proud of their work and value the contributions of every team member.”
Woods maintained that, so far, no errors have been identified in published stories as a result of AI use. “It is important to state clearly that no concerns were raised about the accuracy of Searchlight’s work product,” he added.
The board declined to answer questions about Oxenden’s firing, saying they were unable to comment on personnel matters. Le Coz, the editor-in-chief, did not respond to my requests for comment.
“Where did this quote come from??”
Earlier this month, Le Coz edited a staff reporter’s story about a Florida Senate housing bill. While reviewing Le Coz’s edits, the reporter noticed a fabricated quote attributed to one of her sources, according to a document detailing the incident shared with Suncoast Searchlight staff and the board of directors earlier this week that I reviewed.
In the draft, someone opposing the bill is quoted as saying, “This state is so diverse — Tallahassee shouldn’t be telling people how to live.” The second half of that quote was not in any of the reporter’s notes. When the reporter asked Le Coz where the quote came from, she said she’d written it as an example of the type of quote she wanted in that section, but forgot to flag that in the document.
The reporter then went into the version history of the Google document. Earlier that day, Le Coz was working on the document when suddenly several blocks of text were added over the course of four minutes. In that document version was “language resembling an AI chatbot prompt, which was quickly replaced with paragraphs of unexplained text that reflected the prompt-like instructions,” wrote the reporter in the document sent to the board. This included “confusing statements and numerous outright fabrications which were not present in my original draft.”
For example, this new text said that the “roots of Senate Bill 180 trace back to the ‘Local Government Emergency and Recovery Management Act,’ a 2023 measure passed in the wake of Hurricane Ian.” That law does not exist. The copy also included odd phrases like “infrastructure timing” and falsely stated that Florida Gov. Ron DeSantis signed the housing bill into law in May 2025. The correct date was in the reporter’s original draft. A second new quote was also added, which stated the bill wasn’t helping hurricane victims. That quote was attributed to an unnamed source, referred to in the copy as a “local official involved in post-Ian planning.” The reporter never spoke to such an official. (Screenshots of the Google document version history corroborate these details.)
Le Coz removed some of those factual inaccuracies before sharing her finished pass with the reporter, but not all of them. The reporter removed the partially hallucinated quote and other inaccuracies before the story was published on November 7. It has since been republished by regional and statewide outlets, including the Tampa Bay Times. Language disclosing the use of AI text generation or research tools was not included in the Suncoast Searchlight version of the story or in its republished versions.
The day after the story ran, the reporter confronted Le Coz with sections of the final story she worried contained undisclosed AI-generated text. In the same document shared with the board, the reporter wrote that Le Coz “initially [completely] denied using AI to rewrite my story, but later stated that she had messed up and said she didn’t want me to feel like she cheated.”
This was not the first hallucinated quote Le Coz introduced into a reporter’s story draft. “Several staffers had warned me to fact-check my stories extra carefully and watch for new content that was put in stories,” Oxenden, the former staff reporter, told me, referring to her first weeks on the job this fall. “They said there were instances where hallucinated quotes were placed in their story with no explanation.”
For example, Suncoast Searchlight regularly produces shortened versions of its stories in order for them to run in partner publications. The reduced word count makes the stories more attractive to local publishers like Observer Media Group, which owns several print newspapers, magazines, and news sites serving Sarasota, Longboat Key, and Manatee County, Fla.
In April, a different reporter found a hallucinated quote and an altered quote in one of these trimmed story drafts about school cuts to mental health programming. Le Coz had edited the shortened story. The reporter added a comment to the document asking, “Where did this quote come from??” Le Coz resolved the comment without responding, according to screenshots I reviewed. Only on November 11, after being confronted by the reporting team about her AI usage, did Le Coz acknowledge that she had “experimented” with using ChatGPT to assist in creating shortened versions of stories in the partnership program, according to meeting notes and a document detailing the incident sent to the board and other staff members earlier this week. Le Coz told staff she abandoned the experiment soon after because the tool introduced mistakes.
In that same November 11 meeting, Le Coz claimed none of these “experiments” with ChatGPT ultimately made it into the hands of partner publications. Screenshots of a Google Drive folder I reviewed, however, show that the same shortened story about mental health programming had been shared with a representative of Observer Media Group. No disclosure that generative AI writing or editing tools were used to produce that story was included in the document.
As AI text generators and search tools are increasingly normalized in newsrooms, tensions have boiled over at several outlets where staff and management have differing adoption philosophies. Disclosure is one of the most hotly debated topics. Should readers be notified when a generative search tool was used to research a story, but that story was still fact checked before publication? What about when ChatGPT was used to produce a block of AI-generated text, but not an article wholesale? In Suncoast Searchlight’s case, no such disclosures have occurred with its readers or with its partner publications — despite the investigative outlet centering transparency in its work.
Far less debated is the view that AI usage should be disclosed internally in newsrooms: that reporters and editors have a responsibility to share with their colleagues which AI tools they are using, as well as when and how they are using them. That’s especially true of a product like ChatGPT that produces errors often, and confidently. The risk of masking this usage isn’t merely the chance that a hallucinated quote ends up in print, but eroded trust among colleagues.
In tapping the board of directors, reporters at Suncoast Searchlight said trust had been lost and called on a bench of experienced journalists to intervene. Board member Chris Davis is a deputy investigations editor at The New York Times and the former executive editor and vice president of investigations for Gannett and USA Today. Also included in the initial response email to staffers was board member Kelly McBride, the senior vice president and chair of the Craig Newmark Center for Ethics and Leadership at the Poynter Institute, who has been a leading voice in the ethics of AI adoption in newsrooms. McBride was one of the authors of Poynter’s AI ethics guidelines, an editorial standards starter kit that has been adopted by newsrooms across the U.S. and internationally.
Suncoast Searchlight currently has no public or internal AI editorial policy. In response to the concerns of staffers, the board said it will be adopting one. “We will continue our review, adopt AI guidelines, and work with the newsroom to ensure Suncoast Searchlight’s continued success in covering news of import in Sarasota, Manatee and DeSoto counties,” said Woods in a statement on behalf of the board.
Woods did not answer questions about whether the board will be auditing other stories edited by Le Coz or looking into Oxenden’s firing as part of its investigation.
On her decision to go public last week, Oxenden said, “I felt strongly about standing up for journalism ethics and morals when I was an employee and that didn’t change when I lost my job.”
