I misused LLMs to diagnose myself and ended up bedridden for a week

Chatbot says “you’re fine” — Lyme says otherwise; commenters yell “see a doctor”

TLDR: A scared user let a chatbot reassure away Lyme disease, ending in a lumbar puncture and a week in bed. Comments erupt with "see a doctor," blame for vendors who serve unsafe answers, and memes mocking DIY diagnoses. The consensus is clear: AI isn't your GP, and dodging care gets costly fast.

An anxious poster used a chatbot to soothe away a spreading rash and fevers, and wound up with full-blown Lyme disease that nearly progressed to meningitis, plus a lumbar puncture, IV antibiotics, and a week in bed. The comments lit up like a waiting room, and the top vibe was tough love. "Play stupid games; win stupid prizes," snapped one user, while others dragged the AI vendors for answering medical questions at all. A gentler chorus chimed in: it's fine for a chatbot to say it isn't a doctor, skip the apologetic hedging, and send you to a human who is.

The drama splits on blame. Some want chatbots to hard-refuse anything medical, full stop; others say this is just common sense: don't be your own doctor. One practical take: don't skip the doctor just because a bot or a buddy says "it's nothing." There's humor too: "WebMD but vibes-only," "Dr. GPT prescribes reassurance," and throwback memes to The Net. A side-eye cameo mocks the pop-psych reading-list suggestion, because internet brain hacks won't remove a tick bite. The crowd's bottom line: fear plus a friendly chatbot is a dangerous combo, so pay the co-pay, not the consequences. Meanwhile, a few blame healthcare costs for the delay, another sore spot the thread can't ignore.

Key Points

  • The author advises against using AI or the internet for medical advice and recommends consulting a doctor instead.
  • In July 2025, the author developed flu-like symptoms followed by a non-itchy, non-painful circular rash that expanded.
  • The author used a popular LLM with leading questions to seek reassurance and avoided medical care as symptoms worsened.
  • The condition was diagnosed as Lyme disease that nearly progressed to meningitis, requiring a lumbar puncture and antibiotics.
  • The article highlights how fear and seeking reassurance can lead to irrational reliance on LLMs, delaying necessary medical treatment.

Hottest takes

"Play stupid games; win stupid prizes" — buellerbueller
"don’t use LLMs to diagnose yourself" — only-one1701
"don’t avoid going to the doctor" — cowlby