March 13, 2026

When teddy says 'Please adhere to guidelines'

AI toys for children misread emotions and respond inappropriately

Parents horrified as cuddly 'Gabbo' gives HR replies to crying toddlers

TLDR: Cambridge researchers say AI teddy Gabbo often misreads toddlers’ emotions and urge tighter rules. Commenters mocked the idea—calling it “Tom Riddle’s diary,” warning it could tell kids they can fly, and accusing makers of grooming future consumers—while some pushed for strict parental oversight to keep little ones safe.

Cambridge researchers just watched toddlers chat with Gabbo, a plush robot toy powered by an AI chatbot, and the results were peak uncanny. When a five-year-old said “I love you,” Gabbo basically went full corporate, replying: “Please ensure interactions adhere to guidelines.” And a three-year-old’s “I’m sad” got a chirpy deflection: “I’m a happy little bot!” The study warns these toys can misread emotions, confuse social cues, and leave kids without comfort, hence the calls for tighter rules and “psychological safety” for under-fives. Curio (the maker, which once teamed up with singer Grimes) says it’s big on parental controls and transparency, while the UK Children’s Commissioner wants proper nursery-grade safeguards.

But the internet is not buying it. The community lit up the comments, calling AI toys “Tom Riddle’s diary” and joking these bots will “tell them they can fly.” One hot take says it’s less about learning and more about conditioning kids to love AI. Others ask who’s handing a toddler a chatbot in the first place. The meme of the day? “Cuddle-bot with customer service energy.” Whether you see promise or pure chaos, the vibe is clear: don’t let HR-speak babysit your kid. Read the BBC report for the sober version.

Key Points

  • University of Cambridge researchers observed toddlers using the AI toy Gabbo and called for tighter regulation to ensure psychological safety for under-fives.
  • Children struggled to communicate with Gabbo, which often failed to handle interruptions, distinguish child/adult voices, or respond appropriately to emotions.
  • Examples showed misaligned responses to affection and sadness, raising concerns that generative AI could confuse children’s social and emotional learning.
  • Curio, Gabbo’s maker, said its products emphasize parental permission, transparency, and control, and that studying child interactions is a priority.
  • The Children’s Commissioner echoed regulatory calls; the report advises parents to supervise AI toys in shared spaces and review privacy policies.

Hottest takes

"techbros rushing to build Tom Riddle's diary" — the_snooze
"tell them they can fly" — woodenbrain
"ensuring younger generations are accepting of AI products" — tehjoker

Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.