The importance of human oversight in AI-driven reporting
- Sebastian Griffin
- 7 hours ago
- 3 min read

AI is transforming how we produce, consume, and distribute information. Tools powered by artificial intelligence can accelerate research, summarize enormous data sets, draft papers, and help explain how the algorithms that analyze us actually work. But the speed of progress carries a caution with it: faster isn’t always better, especially when it comes to government documents.
The recent publication of the U.S. Department of Health and Human Services report 'Make America Healthy Again' provides a reminder of this. The report was packaged as an ambitious, sweeping vision for improving health outcomes, but it came under scrutiny after it was revealed that dozens of citations were either plagiarized or fabricated, with some references pointing to studies that did not actually exist.
The HHS episode doesn’t reveal a new problem so much as a new twist on an old one. Government agencies have long relied on employees and researchers to generate reports and citations, and the presumption, both then and now, is that such materials are rigorously vetted before publication. The real failure was not the AI but the review process. Best practices don’t change: regardless of who or what produces a report, the human official signing off on it is responsible for verifying it.
This lack of verification isn't new. In 2023, as reported by Reuters, a U.S. judge imposed sanctions on two New York lawyers who submitted a legal brief that included six fictitious case citations generated by an artificial intelligence chatbot, ChatGPT.
Traditional media has run into these problems as well. Early in 2025, the BBC tested the capabilities of AI tools such as ChatGPT, Copilot, Gemini, and Perplexity. The result? More than half of the AI answers tested contained significant factual errors, such as misquoted sources, fabricated facts, or bad advice.
These are the kinds of errors that should give us pause, especially as AI becomes more deeply embedded in education, media, and policymaking.
At Mountain States Policy Center, we are all about innovation. For example, MSPC is blazing the trail with the launch of an AI-based assistant, “WONK.” Built to make research on legislation more accessible and more transparent, WONK lets citizens, legislators, and reporters quickly get details on legislative bills. By using artificial intelligence, MSPC isn’t just talking about modernizing government and policy conversations; we’re doing it.
We also believe in accountability and transparency. At a time when large language models can produce policy memos, budget summaries, and even state reports in minutes, the role of human reviewers becomes more important, not less.
AI can be a valuable tool, but it does not have the judgment, the context, or the moral compass of human reasoning. It cannot grasp the consequences of an erroneous statistic or a fabricated fact once that information is relied upon. It doesn’t know when a citation warrants double-checking, nor does it consider how an error might corrode public trust, especially in policymaking.
To reap the benefits of AI without sacrificing public trust, policymakers should establish basic guardrails, such as requiring that AI-assisted government reports, press releases, and policy memos be reviewed and verified by a human before they are published.
Stakeholders should also develop AI literacy training (what AI can and cannot do) for civil servants who work with sensitive data, legal documents, and public health information, and ensure every public servant has access to it. A well-trained staff can avoid the very mistakes now filling the newspapers.
This isn’t about stifling innovation. It is about keeping the public trust at the forefront and making sure new technologies serve truth, not just speed. The future should be rooted in accountability, not fear. AI is here to stay, and we should all learn how to use it as a tool rather than a speedy crutch.