March 25, 2026

Paging Dr. Drama: Notes vs Bots

Health NZ staff told to stop using ChatGPT to write clinical notes

Banned from ChatGPT, told to trust “approved” bots — staff cry pressure, patients cry foul

TLDR: Health NZ warned staff off free AI like ChatGPT for clinical notes and threatened discipline, while rolling out its approved scribe, Heidi. Comments split between “follow policy” advocates and critics crying hypocrisy and raising privacy fears, with added worry that slick-looking AI notes can still hide dangerous errors.

Health NZ just dropped a memo telling mental health staff to stop using free AI tools like ChatGPT, Claude, and Gemini for clinical notes — and yes, even if you anonymise. The warning: do it and you could face discipline. The official line? It’s about privacy, data security, and accountability. The irony machine revved up fast because HNZ is also rolling out its own “approved” AI scribe, Heidi, which emergency doctors say saves time, per RNZ.

Commenters immediately split into camps. One crowd says this is basic patient safety — use approved tools, not random chatbots. Another calls it whiplash: ban the free stuff, push the branded bot. A union voice says the memo’s threat-first tone will only make burnt-out staff hide their workarounds instead of asking for help. Meanwhile, patients chime in with a big side-eye, claiming they’re nudged into accepting AI note‑taking and that “local only” promises don’t always match reality.

The spiciest fear? LLM notes that look perfect but could be wrong. One GP reportedly ditched Heidi over “too many errors,” while others insist specialised tools are miles better than free chatbots. Net vibe: a messy three‑way tussle between privacy rules, exhausted clinicians, and skeptical patients — with AI stuck in the middle wearing a name badge.

Key Points

  • HNZ issued a memo prohibiting staff from using free AI tools like ChatGPT, Claude and Gemini to draft clinical notes.
  • The ban applies even if AI-generated notes are anonymised and later transcribed into handwritten or typed records.
  • HNZ’s AI policy requires tools to be registered with the NAIAEAG; this includes the approved AI scribe tool “Heidi.”
  • HNZ warned that using unapproved AI tools could lead to formal disciplinary action and declined to disclose incident numbers.
  • The Public Service Association criticised the memo’s approach and urged investment in training and approved tools, citing staff workload pressures.

Hottest takes

“Patients are guilted into allowing the doctors to use it” — samglass09
“This is just about not using free/public AI tools” — keithnz
“LLM-written clinical notes probably look fine. That’s the whole problem” — gurachek
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.