What's Missing in the 'Agentic' Story

Who’s your computer really working for? Users rage over bossy bots and lock‑in

TLDR: Mark Nottingham warns that so‑called “AI agents” may serve companies more than users. Commenters clash over control vs convenience, call agents a band‑aid for weak AI, and fear vendor lock‑in—boiling down to one question: who does your helpful bot actually answer to?

Veteran web guru Mark Nottingham says the quiet part out loud: your “AI agent” might not be your agent at all. His post warns that modern computers and smart assistants often serve their makers’ interests, not yours—and the comments section ignited. One camp cries “nostalgia alert,” mocking rose‑tinted memories of the 90s, when we spammed Ctrl‑S because apps crashed every five minutes.

The big brawl: control vs convenience. The loudest voices say they want to steer, not be chauffeured. “Agents” feel like bossy back‑seat drivers—pushing pop‑ups, taking actions, and turning the user into a passenger. Meanwhile, AI skeptics drag the hype: one commenter claims agents are just a clever band‑aid for Large Language Models’ limits—not space‑age magic. Translation: less “genius butler,” more “context‑trimming intern.”

Then the lock‑in panic hits. Builders warn that while MCP (the Model Context Protocol, an open tool standard) sounds nice, the actual harnesses—think Claude Code and Cursor—are proprietary. One wrong corporate move and your product breaks. The crowd demands open harnesses so we don’t “rebuild mobile” with a new, shinier leash. And the stakes? Some raise them all the way: AI must be loyal to the user, not the state—no middle ground. The memes write themselves: agent or double agent?

Key Points

  • The article challenges the assumption that computers inherently act solely on users’ behalf, an expectation formed in the era of local, task-specific software.
  • Traditional tool analogies and fiction have reinforced beliefs that technology reliably serves user interests with minimal intrusion.
  • In Internet-connected systems, users must trust multiple layers (hardware, software, network services), each embedding creators’ interests that may diverge from users’.
  • Market forces and regulation can align producer and user incentives (e.g., in chipmaking), but these mechanisms are imperfect.
  • Modern businesses exploit these alignment gaps, which makes narratives like “your AI agent” rest on unsafe assumptions about loyalty and control.

Hottest takes

"I want to be in the driver’s seat." — ryandrake
"your product is one anthropic decision away from breaking." — aykutseker
"ensure that its loyalty is to the user, and not the state." — aeon_ai
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.