I turned Markdown into a protocol for generative UI

Dev crowd split: genius hack or UI chaos

TL;DR: A new demo turns plain Markdown into a live control room where an AI streams text, code, and data to build apps on the spot. Fans call it a clever shortcut because the AI already knows the format; skeptics ask how real buttons, good design, and reliability survive the chaos. The stakes are big if agents start building our screens.

Eric Schmidt says “UIs will go away,” and this prototype dares to make it real: an AI writes text, code, and data in one Markdown stream (the simple text format you see on GitHub), then spins up live interfaces on the fly. The demo shows code firing as it’s typed, React components popping into place, and data trickling in live—equal parts magic and “is this cursed?” energy.
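Based on the three block types the demo describes, a single protocol stream might look roughly like this. This is a hypothetical sketch: the fence labels tsx agent.run and json agent.data and the mount() call come from the demo's write-up, while the OrdersTable component and db helper are invented for illustration.

````markdown
# Orders dashboard

Here is a live view of recent orders.

```tsx agent.run
// code the server executes as it streams in (db and OrdersTable are hypothetical)
const orders = await db.orders.recent();
mount(<OrdersTable data={orders} />);
```

```json agent.data
{ "orders": [{ "id": 1, "status": "shipped" }] }
```
````

Because every piece is just a labeled Markdown fence, the model never leaves the format it already generates fluently.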

The crowd is buzzing. One camp is hyped about the cheat code: LLMs already “speak” Markdown, so no new language is needed. As one fan put it, this is just giving the model a runtime for what it already knows. Others are kicking the tires on alternatives like Markdown UI and name-dropping rival formats such as OpenUI and JSON-render.

But the pushback is spicy. Skeptics warn that good design won’t fall out of a chatbot, saying there’s more to real interfaces than auto-generated forms. Practical folks ask the hard stuff: how do buttons and clicks actually work if the AI keeps changing the UI? Fans of MCP-UI (a chat UI framework) say it’s easier when you “hard code” the interface—less mystery, fewer surprises.

Meanwhile, jokesters are already branding it “hyper text”, and meme-lords are quoting the author’s own line—“Is it cursed? Yes. Works? Mostly.”—like it’s the back-of-the-box sticker. Welcome to Markdown Mayhem.

Key Points

  • Prototype uses Markdown as a single protocol stream for text, code, and data.
  • Three block types: Markdown text, tsx agent.run (server execution), and json agent.data (data streaming).
  • Introduces streaming execution so statements run as they are generated, using bun-streaming-exec with vm.Script.
  • Console output and exceptions are fed back to the LLM, enabling iterative correction during execution.
  • A mount() primitive serializes React components server-side for rendering client-side, enabling reactive UIs.
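The streaming-execution idea can be sketched with Node's built-in vm module (the demo reportedly uses bun-streaming-exec with vm.Script): each statement is compiled and run in a shared context as it arrives, with console output and exceptions captured so they can be fed back to the model. The helper names here (runStatement, logs) are illustrative, not the demo's actual API.

```typescript
import * as vm from "node:vm";

// Captured output to stream back to the LLM.
const logs: string[] = [];

// A persistent sandbox: later statements see bindings from earlier ones,
// and the sandboxed console writes into `logs` instead of stdout.
const context = vm.createContext({
  console: { log: (...args: unknown[]) => logs.push(args.join(" ")) },
});

function runStatement(statement: string): void {
  try {
    // Each streamed statement is compiled and run in the shared context.
    new vm.Script(statement).runInContext(context);
  } catch (err) {
    // Exceptions are captured, not thrown, so they can be fed back too.
    logs.push(`error: ${(err as Error).message}`);
  }
}

// Simulate statements arriving one at a time from the model's stream.
runStatement("const total = 2 + 3;");
runStatement("console.log('total is', total);");
runStatement("missingFn();"); // error captured, execution continues
```

Top-level const declarations persist across scripts in the same context (the same rule as multiple script tags in a browser realm), which is what lets statement N reference bindings from statement N-1.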

Hottest takes

"If you're still looking for a name let me suggest 'hyper text'." — zeroq
"there’s a lot more to (good) UIs than what an LLM will ever be able to bring" — iusethemouse
"Using markdown as the transport layer is clever because every LLM already speaks it fluently" — wangmander
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.