[Photo: Tim Calabro (left) interviewing Dr. Maury Smith. Credit: Ben Deflorio]

For as long as he has owned the Randolph-based White River Valley Herald, editor and publisher Tim Calabro has had trouble getting reporters to cover local selectboard meetings. Usually held at night, they can be long and boring. Most of the paper’s paid correspondents and volunteers would rather write about something else.

But public meeting coverage is a big part of local news. Looking for a way to provide it to readers, Calabro decided last year to try using artificial intelligence, or AI. Urged on by his brother-in-law, a New York State engineer who had been investigating AI models, Calabro downloaded a set of meeting minutes and fed it into ChatGPT.

The application, created by OpenAI, is what’s known as a large language model. It’s been “trained” on a vast trove of text available online and draws on that information to respond to commands, or prompts. Generative AI tools of this kind can produce music, artwork, reports, data analysis and a host of other outputs, including a written narrative.

Calabro asked ChatGPT to create a summary of the meeting. He compared the result with the minutes, to see how well it had captured what happened.

“It wasn’t awful,” Calabro said.

Encouraged, Calabro started to hone his skills using it. He directed ChatGPT to write stories using the Associated Press Stylebook, a standard for grammar and usage employed in many newsrooms. Calabro checks the AI-generated summaries against the meeting notes to make sure they are accurate, and he adds context, such as biographical or historical information, if needed.

ChatGPT doesn’t adhere to some newsroom conventions, such as referring to people by their last names after they are introduced. But overall, Calabro is pleased with the results, and he’s published about a dozen meeting stories created by the tool. They’ve appeared in the paper and online alongside other stories. The ChatGPT write-ups read much like the ones completed by humans.

“Summary from meeting minutes” runs at the bottom, but nothing indicates that ChatGPT did the summarizing. Calabro said he doesn’t think it’s worth pointing out each time.

But he did give readers a brief notice about the practice in a September 21 column. “Like the invention of the computer itself, AI promises to fundamentally change work,” Calabro wrote. “Whether that’s good or bad will depend on how thoughtfully we can decide how to utilize this new tool.”

Calabro’s willingness to publish AI-generated content sets him apart from most of his peers in Vermont. Many are watching closely as national media companies experiment with AI, but they haven’t ventured into publishing AI-generated stories themselves. While AI can handle some of the more routine tasks journalists carry out — Seven Days journalists, for example, use Otter.ai, an app that records and transcribes interviews — many newspaper editors are uneasy about using it in place of human intelligence.

“We are not using it and have no plans to do so,” Seven Days publisher and editor in chief Paula Routly said.

Steve Pappas, the executive editor of the Barre-Montpelier Times Argus and the Rutland Herald, said he doesn’t allow AI to replace in-person reporting because it can’t provide context or identify the most important concepts in a public meeting. “I would much rather have vetted information in the papers until such time as I know AI can be trusted and reliable,” he said.

Dave Mance, the editor of Vermont Almanac: Stories From & for the Land, outlined the dangers of AI in a September fundraising appeal. He raised the specter of a future in which writers, photographers and artists would be replaced by robots.

“As critics of AI point out, the danger is not in the act of a robot generating an image, it’s in the idea that people will cease to care that there’s a difference between computer-generated and human-generated art,” Mance wrote.

[Photo: Tim Calabro with his dog Sadie. Credit: Ben Deflorio]

Despite widespread concern about how AI will change the world, interest in the tools soared late last year when developers made their programs available to the public, often without charge. In business, education, health care, art and other pursuits, innovators are mining AI for solutions and ideas.

Some news outlets outside Vermont jumped right in, creating disasters that became news stories of their own. The website CNET and its associated website, Bankrate, were widely criticized for using AI to write dozens of articles that required lengthy corrections. And the deputy editor of Gizmodo’s io9 site lashed out at its parent company, G/O Media, in July after it published a problematic AI-generated piece.

“It is shoddily written, it is riddled with basic errors … It is shameful that this work has been put to our audience and to our peers in the industry as a window to G/O’s future,” James Whitbrook wrote on X, the site formerly known as Twitter.

Reputable news outlets are exploring a more judicious deployment of AI. Sports Illustrated is using it to generate brief articles and story ideas, according to the Columbia Journalism Review, which also noted that the Associated Press has employed AI for years to fill in numbers in corporate earnings reports. On October 10, the AP announced that it’s experimenting with five AI-based products to help small news organizations deliver information about public safety incidents, weather, city council meetings and news. All will be overseen by human editors, according to Nieman Lab, a fellowship and research program for journalists.

Ties between AI and community journalism are growing. On September 25, the Boston Globe reported that a pair of entrepreneurs had created an online news site called Inside Arlington that uses AI to write up reports on city government meetings. The founders said they’d like to offer the technology nationwide. With advertising revenue steadily migrating to the internet, local media outlets need all the help they can get. Roughly 2,500 U.S. newspapers have disappeared since 2005.


With three reporters and one correspondent, Vermont’s Newport Daily Express has a tough time providing meeting news in its coverage area, which spans all 18 towns in Orleans County. General manager Tabitha Armstrong said her advertising department has talked about using AI to write ad copy. But news is a different story.

“We work so hard to be taken seriously in this industry,” she said. “As soon as people think their newspapers are using AI, their credibility goes downhill.”

Greg Popa, editor and publisher of the Vermont Community Newspaper Group, which includes five newspapers in north-central Vermont, said many readers don’t realize the work, relationships and history that writers put into covering a story. He doesn’t think AI can replace that.

Mance agreed. “If you take the human element out of that, you take this kind of important concept of community out of it, too … We’re left with this algorithmically delivered drivel that we absorb through our computer screens.”

There are other concerns about ChatGPT. Some journalists and other creators don’t want their work used to train AI. Forbes reported in July that the Associated Press reached a deal with OpenAI to license archived news stories. But authors, performers and others have joined class-action lawsuits seeking to stop OpenAI and other companies from using their work.

While applications such as ChatGPT mimic human writing, they don’t “know” what they’re writing about. Language models simply predict what words are likely to come next — which means that they frequently insert fabrications known as “hallucinations.” A Columbia Journalism Review piece headlined “Is AI software a partner for journalism, or a disaster?” outlined ways CNET’s AI-generated pieces not only broadcast errors but also plagiarized journalists’ work. It quoted New York University psychology and neural science emeritus professor Gary Marcus, who called AI “a giant autocomplete machine.”

There is nothing automatic about Calabro’s paper, formerly known as the Herald of Randolph. Calabro, a Royalton native, worked for years as the paper’s photographer before he bought it from founder and longtime publisher and editor Dicky Drysdale in 2015. The Herald usually just breaks even, but ad sales are down these days, and the paper is on track to lose money this year. Calabro, one of the paper’s two paid staff writers, also empties the trash and makes the coffee. His wife, a high school teacher, gets up at 2 a.m. to help him deliver copies.

The Herald covers 16 towns, and Calabro depends on the enthusiasm of his correspondents to learn what’s happening in them.

“This is sort of a quasi-volunteer, community-journalism-role thing,” Calabro said of his crew, which includes a meteorologist, a lawyer and a wastewater consultant.

“Some are really gung-ho about getting to every selectboard meeting and covering the hell out of them, and some are more interested in just making sure events in town get funneled in our direction. That’s helpful, too.”

Calabro said he’d rather rely on a human journalist — and, for the moment, an intern is covering meetings — but when she’s gone, he’ll go back to using AI. It’s not taking work away from a reporter, he reasoned, and can at least give readers a basic understanding of what their selectboard is doing.

“The information is important either way,” Calabro said. “It should be in people’s hands.”

The original print version of this article was headlined “Bye-Bye Byline? | Short on reporters, a Vermont newspaper turns to AI”


Anne Wallace Allen covered business and the economy for Seven Days 2021-25. Born in Australia and raised in Massachusetts, Anne graduated from Bard College and Georgetown University and spent several years living and working in Europe and Australia before...