March 8, 2026
We typed to a file—AI talked back
llm9p: LLM as a Plan 9 file system
Plug-and-play AI: type to a file, the robot writes back
TLDR: llm9p lets people talk to an AI by writing to files instead of using apps, sparking a retro-meets-modern frenzy. Fans call it the perfect way to chain AI tasks; skeptics question whether Claude Code actually supports setting a system prompt, igniting a practicality-vs-nostalgia showdown that could reshape how we script with AI.
Meet llm9p, the throwback-meets-future hack that lets you talk to an AI by writing to a file and reading the reply—no fancy apps, just your computer’s everyday file tricks. It uses the old-school 9P network file system (think: treat faraway stuff like local files) to hook into Large Language Models (LLMs), with support for Anthropic’s API, the Claude Code command-line tool, and local models coming soon. The vibe? Retro wizardry colliding with modern AI hype—and everyone has thoughts.
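The write-a-prompt, read-a-reply loop described above might look like this in practice. A minimal sketch, assuming a plan9port install and that the server exposes an `ask` file at the root of its tree; the dial string, port, and mount point here are illustrative, not verified commands:

```shell
# Hypothetical session: mount the llm9p server with plan9port's 9pfuse
# and chat through ordinary file I/O.
mkdir -p /tmp/llm
9pfuse 'tcp!localhost!5640' /tmp/llm

# Write a prompt into the served file...
echo 'Explain 9P in one sentence.' > /tmp/llm/ask
# ...then read the model's reply back out of the same tree.
cat /tmp/llm/ask
```

Because the interface is just files, the same flow composes with pipes and scripts: anything that can redirect text into a file can drive the model.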
Commenter mleroy frames it as part of a bigger movement: “files as the ideal interface for agents,” linking to a lively HN thread. Fans gush that this makes AI “scriptable like Lego,” chaining prompts with pipes and shell commands. But then the plot twist: stingraycharles drops a reality check, asking how llm9p sets a “system prompt” with Claude Code when the tool doesn’t truly support it—apparently `--append-system-prompt` just posts a normal user message. Cue the “is this brilliant or cursed?” debate. Jokes fly about “cat-ing a robot” and “pipelines building Rube Goldberg AIs,” while skeptics mutter “just use curl.” It’s nostalgia vs. practicality, hacker joy vs. confusing edge cases. Whether genius or gimmick, the crowd agrees: it’s delightfully chaotic, and very on-brand for the Plan 9 fan club and shell-script maximalists.
Key Points
- llm9p exposes LLMs through the 9P filesystem protocol, enabling prompt/response via file reads and writes.
- Current backends include the Anthropic API and the Claude Code CLI; a local Ollama backend is planned.
- The project offers Go-based installation or source builds and runs a 9P server (e.g., on :5640).
- Mounting options include plan9port (9pfuse/9p tool), Inferno OS (Infernode), and Linux’s kernel 9p module.
- Users can manage interactions and settings via files (ask, tokens, model, temperature, system, context).
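On Linux, the kernel's 9p client can mount such a tree directly, after which the setting files in the last bullet become tweakable with ordinary shell redirection. A hedged sketch: the mount options are standard v9fs syntax, but the mount point and the semantics of each control file are assumptions drawn from the bullet list:

```shell
# Mount the llm9p server via the Linux kernel 9p module (v9fs),
# assuming it listens on 127.0.0.1:5640 as in the article.
sudo mkdir -p /mnt/llm
sudo mount -t 9p -o trans=tcp,port=5640 127.0.0.1 /mnt/llm

# Adjust settings by writing to the control files (semantics assumed):
echo '0.2' > /mnt/llm/temperature        # lower temperature, steadier output
echo 'Answer tersely.' > /mnt/llm/system # system prompt, where supported
cat /mnt/llm/model                       # inspect the selected model
cat /mnt/llm/tokens                      # check token usage so far
```

The appeal the fans describe follows directly: once settings are files, any tool that writes files, from a Makefile to a cron job, can reconfigure the model without an SDK.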