May 14, 2026

Dr. Hallucination is on shift

Ontario auditors find doctors' AI note-takers routinely botch basic facts, and commenters are horrified

TL;DR: Ontario auditors found that many provincially approved AI note-taking tools were missing key facts, mixing up drugs, and inventing details in patient records. Commenters split among alarm, gallows humor, and practical fixes, with many stunned that note accuracy barely factored into the scoring process.

Ontario’s grand experiment with AI note-taking for doctors has landed in full-blown “you had one job” territory. A provincial audit found that many approved systems didn’t just miss details — they reportedly made things up, mixed up drugs, and even added treatment suggestions that nobody said out loud. The most jaw-dropping stat? Nine of 20 systems allegedly fabricated information, 12 inserted wrong drug details, and 17 missed important mental health information. For tools meant to help produce medical notes, readers saw that as less “helpful assistant” and more “chaos intern with a stethoscope.”

And the comments? Absolutely buzzing. One reader shared a workplace horror story where an AI meeting note tool invented a vendor promise that never happened, nearly triggering executive rage. That anecdote had big “this is why nobody trusts the robot summary” energy. But not everyone was clutching pearls: one of the spiciest replies joked that a 60 percent medical error rate sounded “about normal,” which turned the thread into a dark-comedy debate over whether the machines are uniquely bad or just depressingly human. Others focused on practical fixes, like timestamped recordings so every summary can be checked against what was actually said.

The extra scandal fuel? The audit says note accuracy counted for just 4 percent of the evaluation score, while having an Ontario presence counted for 30 percent. Unsurprisingly, commenters read that and basically screamed: “Wait, accuracy was the side quest?”

Key Points

  • An audit of 20 AI Scribe systems approved for Ontario healthcare found frequent inaccuracies, omissions, and fabricated content in clinical notes.
  • Nine of the 20 systems fabricated information and suggested treatment changes that were not mentioned in the source recordings.
  • Twelve of the 20 systems inserted incorrect drug information, and 17 missed key mental health details discussed in the recordings.
  • The auditor said Ontario’s procurement scoring gave only 4 percent weight to note accuracy, while 30 percent depended on vendors having a presence in Ontario.
  • More than 5,000 Ontario physicians are participating in the AI Scribe program, and the Ministry said there have been no known reports of patient harm associated with the technology.

Hottest takes

"They never promised anything" — zOneLetter
"60% sounds about normal lol" — ceejayoz
"something like this is critical for things as important as healthcare" — Hobadee
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.