Ask HN: What's a standard way for apps to request text completion as a service?

Dev begs for one switch for text magic; HN says "no standard, DIY"

TL;DR: A dev asked for a built-in way to get text completion from the OS, and the crowd said there’s no universal standard yet. Suggestions ranged from OS-shipped models and API wrappers to an $LLM environment variable, with pricing and token headaches fueling the debate as apps rush to add AI features.

A developer asked the internet’s favorite peanut gallery: is there a standard way for apps to ask your computer for AI-powered text completion? The crowd’s verdict: nope, but everyone’s got a workaround and an opinion. One camp waved at Windows and macOS, claiming they ship “small models” you can tap with a wrapper, while the pragmatists shouted: just use Ollama’s generate API or Simon Willison’s cross-platform llm tool and move on.
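The "just use Ollama" camp has a point about simplicity: Ollama's generate API is a single HTTP POST to a local server. A minimal sketch, assuming Ollama is running on its default port 11434 and a model such as llama3.2 has already been pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="llama3.2"):
    # stream=False requests a single JSON response instead of chunked output
    return {"model": model, "prompt": prompt, "stream": False}

def complete(prompt, model="llama3.2"):
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The generated text comes back in the "response" field
        return json.loads(resp.read())["response"]
```

An app would call something like `complete("Finish this sentence: The quick brown")`, with the obvious catch that it only works on machines where the user has installed Ollama, which is exactly the standardization gap the thread is about.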

Then came the protocol purists, pointing to MCP (Model Context Protocol) and its "sampling" feature, which lets a server ask the host application to run a completion on its behalf. Translation: there's a fancy way, but it might be overkill for a tiny text tool. Meanwhile, a fan-favorite joke landed: Linux has $EDITOR to choose your terminal editor, so why not $LLM for your AI autocomplete? "Convene the committee," someone quipped, and the thread instantly felt like standards-war cosplay.
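The $LLM joke is at least mechanically plausible, since $EDITOR semantics are trivial to copy. A sketch of what that convention could look like; note that $LLM is the thread's hypothetical proposal, not an existing standard, and the fallback to Simon Willison's `llm` CLI is just one plausible default:

```python
import os
import shlex
import subprocess

def resolve_llm_command():
    # Mirror $EDITOR semantics: the variable names a command line to run.
    # $LLM is hypothetical; "llm" (Simon Willison's CLI) is an assumed fallback.
    return shlex.split(os.environ.get("LLM", "llm"))

def llm_complete(prompt):
    # Pipe the prompt to whichever command the user configured,
    # e.g. LLM="ollama run llama3.2" or LLM="llm -m gpt-4o-mini".
    cmd = resolve_llm_command()
    result = subprocess.run(
        cmd, input=prompt, capture_output=True, text=True, check=True
    )
    return result.stdout
```

The committee, as the thread notes, has yet to convene.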

The spiciest tension? Money and tokens. One commenter said apps want AI without juggling usage-based pricing or forcing users to paste secret keys. Local apps can ask users for a token, but for web apps, it’s messy. The vibe: everyone wants a “plug into system AI” button; no one agrees on what it should be.

Key Points

  • A developer asks if a standard OS-level interface exists for apps to request LLM text completion.
  • They seek an implementation-agnostic mechanism for invoking text completion services.
  • An example use case involves a TUI browsing JSONL files with natural language parsing.
  • The desired feature would translate natural-language queries into jq commands.
  • If no standard exists, the author asks what it would take to build and widely deploy one.
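
The JSONL-to-jq use case from the list is easy to prototype on top of any completion backend. A hedged sketch: the prompt template below is illustrative, and the completion call itself is left to whatever service the app can reach:

```python
import subprocess

def jq_prompt(question, sample_line):
    # Ask the model for a bare jq filter; one sample line conveys the schema.
    return (
        "Given JSONL lines shaped like this:\n"
        f"{sample_line}\n"
        f"Write a single jq filter that answers: {question}\n"
        "Reply with only the jq filter, no explanation."
    )

def run_filter(jq_filter, jsonl_path):
    # Requires the jq binary on PATH. A TUI should show the generated
    # filter to the user before running it, not execute model output blindly.
    out = subprocess.run(
        ["jq", jq_filter, jsonl_path], capture_output=True, text=True, check=True
    )
    return out.stdout
```

The TUI would feed `jq_prompt(...)` to its completion backend, display the returned filter, and only then hand it to `run_filter`.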

Hottest takes

"Windows and macOS does come with a small model" — billylo
"Some type of service is needed... not deal with usage based pricing" — cjonas
"support the $LLM or $LLM_AUTOCOMPLETE variables" — netsharc
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.