Would you know if AI wrote your news? Would it matter to you?
In recent months, an online news outlet called Compass Vermont has published stories about the effect of tariffs on Vermont businesses, the price of Beta Technologies’ stock, deployments of the Vermont Air National Guard and a new line of socks from Darn Tough. Some of its stories have covered subjects that received little or no attention from other news organizations.
And it has attracted some enthusiastic subscribers to its feed on the platform Substack.
“It’s one of my favorites right now online,” said 74-year-old Art Spellman of South Burlington. Spellman said he reads many news sources and likes Compass because it delivers stories he doesn’t find elsewhere in local media, including coverage of the National Guard’s F-35 Fighter Wing.
But Spellman did not realize this: Evidence suggests that Compass produces its stories, at least in part, with the aid of artificial intelligence.
Compass is another sign of the influx of generative AI in the world of journalism. AI tools now assist the newsrooms of media companies from the New York Times, which has in-house versions to analyze data and track online commentary, to the Herald in Randolph, whose editor has said ChatGPT produces stories from the minutes of selectboard meetings that he has no staff to cover. Seven Days reporters use an AI transcription service, Otter.ai, to transcribe interviews — including for this story.
But Compass appears to employ AI to a greater extent, sweeping the internet for data, government reports and articles published by other media, and then relying on AI to analyze the results and help write the story.
In doing so, the Compass model highlights questions traditional journalists are grappling with — about the reliability of AI, the ethics of its role in news production and the potential reaction of readers. Will they even care?
“AI, in and of itself, is not evil,” said Alex Mahadevan, director of the media literacy program and the AI Innovation Lab at Poynter, a Florida nonprofit that provides training and resources for journalists. Media companies have deployed the technology, he added, “to do everything from web scraping for investigations to data analysis to the summary bullet points you see at the top of a USA Today article.”
What matters, he said, is transparency. According to a study that Poynter conducted with the University of Minnesota last year, “Audiences want news organizations to disclose when they’ve used AI substantially,” Mahadevan said.
Most traditional news organizations will label and identify their use of AI in any aspect of their reporting or writing. That’s what the Cleveland Plain Dealer in Ohio does, according to its editor’s recent column explaining that AI writes stories for some of its journalists so they have time to do more in-person reporting. Many news outlets are developing policies that set guardrails around the use of AI, limit the application of these tools and require disclosure to readers, as Seven Days has done when, for example, the technology helped generate a cover illustration for the newspaper last March.
Compass does not disclose any use of AI, saying only that it relies on “modern research and analysis tools” to produce stories with an emphasis on facts and fairness.
“At the risk of sounding lofty, asking us the specifics of how we do it is like asking Coke for their cola recipe,” Compass said on its previous About page, before changing it within the last week. “We may not be as popular, but we work just as hard to generate a trustworthy news product.”
Compass founder Tom Davis introduces himself on the page as a “veteran media editor and publisher” who started Compass in 2020. He also works full time, at least 35 hours a week, as Northfield’s economic development director.
Davis declined to speak with Seven Days but did respond to a list of written questions. “I saw an opportunity to build a lean, digital-first publication focused on document-based reporting and public policy analysis,” he wrote. “My goal was to create an outlet that prioritizes primary sources, transparency, and careful examination of public claims in a way that modern tools now make more feasible.”
Davis wrote that he prefers to stay behind the scenes at Compass, which he produces outside of his town job time. Compass offers free subscriptions. (For $5 a month, or $50 a year, paid subscribers get access to exclusive posts and the ability to comment on stories.) Subscribers typically get two or three stories a day via email. Davis wrote that he handles “reporting, writing and editorial oversight” himself and has no other staffers.
He did not acknowledge that Compass uses AI to write stories. After Seven Days repeatedly asked about it, Compass updated its About page to include this statement: “We do not publish automated content. Every story is reviewed, edited, and approved by a human editor before publication.”
Two recent stories indicated that artificial intelligence is involved: They inadvertently included an AI bot’s response to the writing directions it had been given.
One was a January story headlined “Burlington in Crisis Mode,” about the recent departures of some city hall staff. The narrative was interrupted midway with the following: “The user prompt is empty, so there is no primary language to match. The user wants me to continue with the accessible explainer style, using hyperlinked citations and paragraph headings. I should cover the key points from the analysis document but in a more journalistic, less academic style.”
That section has since been removed.
Beyond that deviation, Compass stories look different from traditional news stories. There are no bylines naming individual reporters. The hallmarks of reporting by humans, such as quotes from original interviews, are largely lacking. And the high volume of content relies on information drawn from sources that often go unidentified.
On New Year’s Eve, for instance, Compass posted a story based on an “analysis” of a year-end constituent newsletter sent by U.S. Rep. Becca Balint (D-Vt.). Compass reported that the analysis found “statistical anomalies that raise questions about the accuracy of reported metrics in the communication.” The story offered several examples. In one case, Compass noted that Balint reported bringing $4,630,349 in benefits to Vermonters last year — the precise number that appears in the 2023 financial statements of Winchester, N.H. The story implied that Balint’s statistics were dubious.
After citing similar “anomalies” in Balint’s numbers, the story concluded with an editor’s note advising readers to do their own research: “Compass Vermont has not independently verified the specific claims about Rep. Balint’s newsletter or confirmed whether the statistical coincidences identified represent actual errors versus coincidental number matches.”
The story said Compass reached out to Balint’s office for clarification but received no response. Davis wrote to Seven Days that he seeks comment from the subject of a story “if new interpretation is involved.” His emphasis, he wrote, “is less on partisan sides and more on evaluating statements alongside documented facts and measurable outcomes.”
Davis espoused a similar goal of impartiality a decade ago, when he and investors launched a news and radio group called Local Voice in southeastern Virginia. There, he put together small teams of journalists to cover local communities and frequently talked about providing unbiased reporting, according to Dave Forster, one of the first editors for the online Southside Daily in Virginia Beach.
“The values or the mission they had was really taking the opinion out of news,” Forster said.
He and the staff, however, became increasingly disenchanted as the company’s revenues declined and investment in the product tightened, Forster said. Eventually, he and others left. Davis later sold the Local Voice business.
Compass is a different model, Davis acknowledged. Its stories lean heavily on indirect sources and include prolific links to show readers where it obtains the information. That has backfired at least once. In a February article about a decline in Vermonters dining out, a supposed link to “regional reporting” on the topic sent readers to an adults-only porn site. Davis said he was unaware of the error.
Kristen Fountain, new coordinator of the Vermont Journalism Coalition, a recently formed organization representing the state’s news outlets, said its board has yet to develop a specific AI policy but emphasizes that its members rely on their own reporting.
“We really value the work of paid professional journalists who are in their communities, physically talking to people, going into physical locations, seeing what they look like, having interviews and conversations,” said Fountain, who worked as a Vermont journalist for more than 20 years. “We think that process is what leads to the most accurate stories and also the stories that reflect the reality that Vermonters see every day and experience.”
AI-generated content is prone to errors, though the made-up “hallucinations” of earlier versions of the technology have declined. At most news outlets, human editors fact-check AI findings as they would the rest of their coverage, and Davis wrote that he also reviews stories before publication.
But inaccuracies do make it into Compass stories. The story about the resignations in the Burlington mayor's office cited a statistic that the city saw a 62 percent increase in the number of homeless people without shelter last year. The figure isn't attributed; according to a Seven Days story, it comes from a July assessment of homelessness across the entire state, not in Burlington alone, as Compass reported. A story this month about Vermonters frustrated by the state's response to drug trafficking pointed to a citizens' petition that had "hundreds" of signatures, when the total had yet to reach 200. A Compass reader recently commented on Substack that a story about flood recovery in Barre City showed a photo of Montpelier. (Most photos in Compass run without captions or source credits.)
Of course, all journalists make mistakes. Standard practice at most news organizations is to acknowledge and correct those errors. Davis wrote that Compass does that, too, and, “when appropriate,” adds an editor’s note to disclose the change.
Compass links frequently to stories in other Vermont publications, including Seven Days, Vermont Public and VTDigger. Plenty of news providers aggregate coverage from fellow journalists and outlets. But Andrew Deck, a reporter who covers AI in media for Nieman Journalism Lab at Harvard University, said he has concerns about AI-generated publications that use automation while relying on others’ human labor.
Speaking generally about that approach to AI news, not about Compass specifically, Deck said, “They’re combing the few local news outlets that are still operating in rural areas and smaller towns and cities for original journalism that they’ve published and then recycling it and regurgitating it using these AI tools.” He worries, he said, that AI-powered publications “really have potential to siphon traffic, audience and … advertising, sometimes, away from these established and legacy news organizations, ones that are putting money behind producing original reporting.”
A couple of loyal Compass readers said they hadn’t considered that.
“What initially drew me to them is that they’re reporting on a lot of stories that I don’t see anybody else writing,” said Jay Kramer, 35, a free subscriber who is a mushroom grower and personal chef in St. Johnsbury. Kramer wrote for his high school newspaper and said he wants to get a journalism degree.
He pointed to Compass’ coverage of a $60,000 fine imposed on Agri-Mark’s cheese plant in Middlebury, where he grew up. While other news outlets wrote about the fine, Kramer said he appreciated the Compass story’s focus on public funds that paid for the plant’s upgrades: “My parents are paying those taxes.”
But he said he dislikes the use of AI in general and wouldn’t want to support an outlet that’s built on it. “That sucks,” Kramer said. “It just feels dishonest if it’s AI, because it’s not markedly AI. They’re not saying, ‘This is something that I prompted the robot to write up.’”
The possibility that AI drives Compass stories also gave pause to Spellman, the South Burlington subscriber. “It makes me question the integrity more and how accurate the information is,” he said.
Yet he will still read Compass, he said, because of its pledge to represent both sides of an issue.
“So I can make an honest judgment what I believe,” he said. “I mean, what’s the truth? That’s the bottom line for me, is, what’s the truth.” ➆
The original print version of this article was headlined “Navigating AI | Compass Vermont, an online outlet, raises questions about the role of artificial intelligence in producing the news”
This article appears in The Media Issue • 2026.