April 15, 2026
Point, click, gauge, rage
Gemini Robotics-ER 1.6
Robots that point, plan, and read gauges — fans cheer while skeptics yell “just buy a $50 meter”
TLDR: Google launched Gemini Robotics-ER 1.6 to help robots understand space, plan tasks, and read gauges via the Gemini API. Commenters split between “near-human smarts” hype and sharp skepticism, demanding latency numbers and joking that a cheap $50 digital gauge might beat a pricey, slow demo. The stakes are real for both homes and industry.
Google just dropped Gemini Robotics-ER 1.6, promising robots that can point precisely, understand multiple camera views, and even read old-school analog gauges, a trick refined with Boston Dynamics. Devs can play with it now via the Gemini API and Google AI Studio, complete with a handy Colab. But the comments? Absolute chaos. Some dream big: one fan imagines near-human behavior if inference (the model’s “thinking speed”) gets faster. Others are roasting the demo: jeffbee calls it “the murder dog reading a gauge” and notes we’ve had cheap machine vision and $50 digital gauges forever. Practical voices want numbers, not vibes (“what’s the latency in hertz?”), while home DIYers ask whether a consumer app exists to point a camera at a gauge and auto-plot the readings. Safety hawks warn that in a home, one broken dish is a PR disaster, so the real breakthroughs might stay stuck behind the lab door. Meanwhile, Google’s pitch is that this model is the high-level brain: it can call tools like Google Search and other functions, and its new success detection (knowing when a task is done) could be the secret sauce. Meme of the day: “Pointing is the new clicking.”
Key Points
- Google introduced Gemini Robotics-ER 1.6, enhancing embodied reasoning for robots with better spatial reasoning and multi-view understanding.
- The model acts as a high-level reasoning component and can call tools such as Google Search, vision-language-action models (VLAs), and third-party functions to execute tasks.
- Version 1.6 improves over Gemini Robotics-ER 1.5 and Gemini 3.0 Flash in pointing, counting, and success detection.
- It adds a new instrument-reading capability for interpreting gauges and sight glasses, developed with Boston Dynamics.
- Gemini Robotics-ER 1.6 is available now via the Gemini API and Google AI Studio, with a developer Colab covering setup and prompts.
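For the curious, earlier Gemini Robotics-ER releases return pointing results as JSON with coordinates normalized to a 0–1000 grid. Assuming 1.6 keeps that schema (an assumption, not confirmed here), a minimal sketch of turning a model response into pixel coordinates might look like:

```python
import json

def parse_points(response_text: str, width: int, height: int) -> list[dict]:
    """Convert model pointing output to pixel coordinates.

    Assumes the schema used by earlier Gemini Robotics-ER releases:
    a JSON list of {"point": [y, x], "label": ...} entries, with
    coordinates normalized to a 0-1000 grid (y first, then x).
    """
    points = json.loads(response_text)
    return [
        {
            "label": p["label"],
            "x": round(p["point"][1] / 1000 * width),   # second value is x
            "y": round(p["point"][0] / 1000 * height),  # first value is y
        }
        for p in points
    ]

# Hypothetical response for a prompt like "point to the pressure gauge needle":
sample = '[{"point": [480, 620], "label": "pressure gauge needle"}]'
print(parse_points(sample, width=1280, height=720))
# → [{'label': 'pressure gauge needle', 'x': 794, 'y': 346}]
```

In practice, `response_text` would come back from a `generate_content` call through the Gemini API; the exact model id and prompt wording are whatever the developer Colab specifies.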