April 17, 2026
Backspace to the Future
I'm spending 3 months coding the old way
Brooklyn coder goes old‑school for 3 months—devs split between craft and comfort
TLDR: An AI engineer is spending three months coding without AI at a Brooklyn retreat to rebuild “mental muscles.” Commenters split: some say real power is mastering AI assistants, others praise the back‑to‑basics grind, while veterans joke that twenty minutes of manual debugging isn’t exactly hardship—showing craft vs. convenience is the new fault line.
AI engineer Miguel Conner just hit pause on robot helpers and moved to Brooklyn for a three‑month coding retreat, promising to write software “by hand.” He says agents are amazing tutors and speed boosters, but admits they can make you skip the hard thinking that builds real skill. He even cites writer Cal Newport’s idea that mental strain is like a gym workout—painful but essential. Cue the comment section cage match.
The loudest crowd? The agent wranglers. One user argued the true power move now is learning to manage AI assistants, not escaping them. Another chimed in with pure pragmatism: let the bots grind the “money job,” then go play with brainy languages for fun. On the other side, craft purists cheered the back‑to‑basics vibe, with a Recurse Center alum dropping wholesome “stay curious” energy.
Then the drama: a veteran dev flexed sore wrists and said large language models cut the typing without cutting quality—why go back? But a skeptical commenter roasted Miguel’s note about asking an AI for help after “twenty minutes” of debugging, turning it into a mini‑meme: twenty minutes? Old‑timers are cackling. Between jokes about “keyboard bootcamp” and “carpal tutor,” the thread split into two camps: build your brain vs. build your bot. Still, a reluctant truth keeps popping up: the best coders will probably do both—push their own minds, then make the machines hustle for them.
Key Points
- Miguel Conner is spending three months in Brooklyn coding mostly without AI to deepen his understanding of codebases and fundamentals.
- He previously built AI agents at Aily Labs in Barcelona, including an internal web search agent in early 2024, prior to public releases by Anthropic and OpenAI.
- His team adopted tools like Cursor early and used LLMs for tasks such as constructing knowledge graphs while testing new approaches.
- He led a weekly journal club on open-source LLMs (DeepSeek R1, AI2’s OLMo 3, Meta’s Llama 3) to evaluate tradeoffs between training in-house models and using closed models.
- He argues manual coding enhances learning and codebase grasp, while coding agents speed iteration and serve as effective tutors; he cites Cal Newport’s view on mental effort as key to craft.