March 20, 2026
When 3x faster meets 10x spicier takes
We rewrote our Rust WASM Parser in TypeScript – and it got 3x Faster
They ditched Rust for TypeScript — commenters say the real speedup was smarter code
TL;DR: A Rust+WebAssembly parser was rewritten in TypeScript and sped up by cutting JS↔WASM boundary overhead and using a smarter streaming approach. Commenters say the language swap isn't the hero; the algorithm changes and the rethink are. Side drama flared over a confusing "Open UI" name and unclear product messaging.
A bold headline claimed a 3x speed boost after rewriting a Rust+WebAssembly parser in TypeScript — but the comments instantly turned into a courtroom drama. The loudest verdict: it wasn’t a language knockout, it was smarter strategy. One commenter says the real win was switching from a heavy, “do-everything-again” approach to a lighter “remember-and-skip” one, cutting extra work during streaming. In human speak: they stopped re-checking the same stuff and got faster.
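The "remember-and-skip" idea can be sketched as a streaming parser that caches results for chunks it has already seen instead of re-parsing everything on each pass. This is a minimal illustrative sketch, not the project's actual code; `StreamingParser`, `parseChunk`, and the whitespace-token "parsing" are all hypothetical stand-ins.

```typescript
// Hypothetical sketch: cache per-chunk parse results so repeated chunks
// in a stream are skipped instead of re-parsed from scratch.

type Ast = { tokens: string[] };

// Stand-in for real parsing work (here: trivial whitespace tokenization).
function parseChunk(chunk: string): Ast {
  return { tokens: chunk.split(/\s+/).filter(Boolean) };
}

class StreamingParser {
  private cache = new Map<string, Ast>();
  parseCalls = 0; // counts how often real parsing work happened

  feed(chunk: string): Ast {
    const hit = this.cache.get(chunk);
    if (hit) return hit; // remember-and-skip: no re-parse
    this.parseCalls++;
    const ast = parseChunk(chunk);
    this.cache.set(chunk, ast);
    return ast;
  }
}
```

Feeding the same chunk twice does the parsing work only once; the second call is a cache lookup.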
Another twist: WebAssembly (a way to run non-JS code in the browser) added a “boundary tax” — shuffling data back and forth between two worlds. Fans joked the speedup came from “not walking across the hall every time you need a file.” Attempts to return data as native objects were actually slower, so a plain JSON string won. Fewer, bigger moves beat lots of tiny ones.
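The "boundary tax" can be made concrete with a toy model: every call into a WASM module is one hallway trip, so fetching a result field-by-field pays N trips while returning one JSON string pays one trip plus a `JSON.parse` on the JS side. The `FakeWasm` object below is a mock that counts crossings; none of these names come from the article.

```typescript
// Toy model of the JS/WASM boundary: each method call counts as one crossing.
interface FakeWasm {
  crossings: number;
  getTokenCount(): number;
  getToken(i: number): string;
  parseToJson(src: string): string; // one crossing, one big string copy
}

function makeFakeWasm(tokens: string[]): FakeWasm {
  return {
    crossings: 0,
    getTokenCount() { this.crossings++; return tokens.length; },
    getToken(i: number) { this.crossings++; return tokens[i]; },
    parseToJson(_src: string) { this.crossings++; return JSON.stringify({ tokens }); },
  };
}

// Pattern A: lots of tiny moves — 1 + N crossings for N tokens.
function readFieldByField(wasm: FakeWasm): string[] {
  const out: string[] = [];
  const n = wasm.getTokenCount();
  for (let i = 0; i < n; i++) out.push(wasm.getToken(i));
  return out;
}

// Pattern B: one big move — a single crossing, then V8's JSON.parse.
function readAsJson(wasm: FakeWasm, src: string): string[] {
  return JSON.parse(wasm.parseToJson(src)).tokens;
}
```

For three tokens, pattern A makes four crossings where pattern B makes one, which is the "fewer, bigger moves" intuition from the comments.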
Cue the meta chaos: one user praised the blog’s slick design, name-dropping Fumadocs, while another slammed the company’s “Open UI” branding for clashing with the long-standing W3C group Open UI. Meanwhile, someone simply asked what the tool actually does — proof that communication, not compilers, may be the real bottleneck. Meme of the day: “Rewrite-driven performance gains.” Or as one commenter put it: it’s not the language, it’s the rethink.
Key Points
- The openui-lang parser, initially implemented in Rust and compiled to WASM, incurred significant cross-runtime overhead per call.
- WASM boundary costs (string copies, JSON serialization/deserialization) dominated latency, not Rust execution speed.
- Replacing JSON serialization with serde-wasm-bindgen to return JsValue directly was 9–29% slower in benchmarks.
- JSON round-tripping was faster because it minimized boundary crossings and leveraged V8's optimized JSON.parse.
- Porting the parser to TypeScript eliminated the WASM boundary and yielded the reported 3× overall speedup.