April 30, 2026

C code, chaos, and robot accusations

Show HN: TRiP – a complete transformer engine in C built from scratch just by me

One coder spent 18 months building an AI brain in C, and the comments instantly got messy

TL;DR: A lone developer built an all-in-one AI engine in C over 18 months as a learning project, and people were instantly fascinated. The comments turned it into a mini-drama over whether the code was truly handmade, with others demanding hard proof it performs well.

A solo coder dropped TRiP — a homemade AI engine written in plain old C, built over 18 months of lunch breaks and weekend nights — and the crowd immediately split into two camps: "legend" and "hmm, but is this secretly robot-written?" The project is basically a from-scratch toolkit for running, training, and even chatting with transformer-style AI models, all without Python or giant frameworks. Translation for normal people: this is the kind of hardcore DIY build that makes programmers either applaud wildly or squint suspiciously at the screen.

The loudest applause came fast. One commenter practically threw confetti, praising the creator for the project and especially for not relying on "some AI bot" to do the work. That turned into the thread’s emotional core: this wasn’t just code, it was a flex of human persistence. But then came the instant plot twist every internet story needs: "This looks AI generated code, is it?" Oof. The creator had already headed off that exact drama by openly listing the few bits that were AI-assisted — things like a JSON parser, some image-window handling, and README help — while insisting the real heart of the project was hand-coded.

Then the practical crowd barged in with the least glamorous but most predictable question in tech comments: "Any data on performance?" Classic. So while one side was celebrating artisanal coding and another side was sniffing around for robot fingerprints, the third group was basically saying: cool origin story, but how fast is it? In other words, the comments delivered the full internet package: admiration, suspicion, and benchmarking obsession.

Key Points

  • TRiP is a transformer engine written in C and built over 18 months as an educational project to understand transformer internals.
  • The project supports inference, training, tokenizer creation, chat, and vision tasks for Llama 2, Gemma, PaliGemma, and GPT-2 models.
  • Supported checkpoint formats include SafeTensors from Hugging Face and formats used by llama2.c and gpt2, with bf16, float16, and float32 weight support.
  • Training features include full backpropagation with AdamW, cosine annealing learning rate scheduling, and gradient clipping; inference supports greedy, top-k, and top-p sampling.
  • The build requires gcc, OpenMP, libjpeg, and libx11, and the author notes that float32 currently performs better than bfloat16 or float16 on CPUs.

Hottest takes

"not using some ai bot for it" — devlsx
"Any data on performance?" — upupupandaway
"This looks AI generated code" — thenewguy077
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.