April 29, 2026
No Sarah, no secrets, just panic
The end of "Just ask Sarah"
When the office genius quits, the whole company suddenly remembers it should’ve written things down
TL;DR: The article says companies can’t keep relying on one person to remember why things were done a certain way, especially as AI tools do more work from written instructions alone. Commenters agreed the “Sarah” problem is real, but argued over whether better notes fix it — and roasted the whole situation with burnout jokes.
A spicy little workplace truth bomb just dropped: too many teams still run on "just ask Sarah" energy, meaning one overworked person quietly carries the backstory for every weird decision, odd rule, and messy workaround. The article’s big warning is simple: when software helpers and AI tools are asked to do more work, they can only use what’s actually written down. If the reasons behind past choices live only in one person’s head, the machine guesses — and the chaos gets repeated.
But the comment section was where the real fireworks happened. Some readers agreed with the premise and turned it into a labor rant, with one self-described office Sarah warning people to avoid becoming that person at all costs, saying it can trap your career while everyone depends on you but ignores your advice. Others were far more skeptical, arguing the article is overselling documentation like it’s some magic cure. Their hot take: written notes can also be misunderstood, and no document can fully replace a real human who knows the history.
Then came the jokes, and they were brutal. One commenter sneered that the writer should "ask Sarah how to write an article without relying on an LLM," while another delivered the darkest punchline of the thread: Sarah burned out and is now a barista. Ouch. The mood was a mix of agreement, cynicism, and gallows humor — less “save the docs” and more “maybe stop building companies around one exhausted legend.”
Key Points
- The article argues that engineering organizations often rely on a long-tenured person who informally holds critical architectural and historical context.
- It says AI agents cannot access undocumented organizational memory the way human engineers can through direct conversation.
- The article describes missing ADRs and specs as a source of “intent debt,” where agents inherit implementation patterns without the reasoning behind them.
- It contrasts recoverable human knowledge transfer with agent sessions that reset and lose context unless information is stored durably.
- The article concludes that code captures implementation outcomes but not the reasons behind design decisions, making durable documentation more important.