May 10, 2026
Cloudy with a chance of backlash
Local AI needs to be the norm
Why people are fed up with apps sending everything to the cloud
TLDR: The article argues that, whenever possible, apps should use the AI hardware already built into phones and computers instead of sending user data to outside companies. Commenters loved the privacy angle but fought over whether local AI is the future or still too expensive, slow, and messy to be realistic.
A developer just lobbed a grenade into one of tech’s favorite habits: stuffing every new app feature into a faraway server and calling it “AI.” His argument is simple enough for anyone to feel in their bones: if your phone or laptop can do the job itself, why ship your private reading, writing, and personal data off to a giant company somewhere else just to get a summary back? In his example, the Brutalist Report iPhone app generates article summaries directly on the device, which means fewer privacy worries, fewer surprise outages, and fewer “sorry, this feature is temporarily unavailable” meltdowns.
But the real fireworks were in the comments, where readers split into two camps: the local-first evangelists and the cold-water skeptics. One person cracked, “I guess Google got that memo!”, basically turning the whole debate into a subtweet of Big Tech’s latest scramble to cram AI onto your phone. Others went full apocalypse mode, wondering if local AI could be the tiny pin that pops the giant AI bubble and leaves all those sprawling data centers looking like yesterday’s hype. Then came the reality check: not everyone thinks this dream is ready for prime time. One commenter pointed out that running truly powerful AI at home can still cost car money, while another dragged the whole industry over training data and eye-watering power demands. The vibe? Equal parts hope, roast session, and class-action-energy suspicion about where everyone’s data is going.
Key Points
- The article argues that relying on cloud-hosted AI APIs for app features can make software dependent on network access, vendor uptime, billing status, and backend availability.
- It says sending user content to third-party AI providers introduces privacy and compliance issues, including questions about retention, consent, audits, breaches, and government requests.
- The article presents The Brutalist Report’s native iOS client as an example of on-device AI summarization using Apple’s local model APIs.
- A code example uses Apple’s FoundationModels tooling, including SystemLanguageModel and LanguageModelSession, to generate article summaries locally.
- For long articles, the described approach is to split text into about 10,000-character chunks, summarize each chunk into factual notes, and combine them in a second pass.
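For readers curious what that two-pass flow looks like in practice, here is a minimal sketch using Apple's FoundationModels framework (iOS 26+). `SystemLanguageModel`, `LanguageModelSession`, and `respond(to:)` are Apple's actual API names; everything else (the function name, the prompts, the error type, and the exact chunking logic) is illustrative, not taken from the article's code.

```swift
import FoundationModels

enum SummarizerError: Error { case modelUnavailable }

// Sketch of the two-pass, on-device summarization flow the article describes.
// The ~10,000-character chunk size matches the approach in the article;
// prompts and structure here are assumptions for illustration.
func summarizeArticle(_ article: String) async throws -> String {
    // Bail out gracefully if the on-device model can't be used
    // (unsupported hardware, Apple Intelligence off, model not downloaded).
    guard case .available = SystemLanguageModel.default.availability else {
        throw SummarizerError.modelUnavailable
    }

    let session = LanguageModelSession(
        instructions: "Summarize articles into short, factual notes."
    )

    // Short articles fit in a single request.
    if article.count <= 10_000 {
        return try await session.respond(to: "Summarize:\n\(article)").content
    }

    // Pass 1: split into ~10,000-character chunks and reduce each
    // chunk to factual notes.
    var notes: [String] = []
    var remaining = Substring(article)
    while !remaining.isEmpty {
        let chunk = remaining.prefix(10_000)
        remaining = remaining.dropFirst(chunk.count)
        let partial = try await session.respond(
            to: "Extract factual notes from this excerpt:\n\(chunk)"
        )
        notes.append(partial.content)
    }

    // Pass 2: combine the per-chunk notes into one summary.
    let combined = try await session.respond(
        to: "Combine these notes into a single summary:\n"
            + notes.joined(separator: "\n")
    )
    return combined.content
}
```

One practical caveat: a `LanguageModelSession` keeps its transcript, so for very long articles a real implementation might create a fresh session per chunk to avoid exhausting the model's context window.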