October 29, 2025
Please hold for imaginary features
Tell HN: Twilio support replies with hallucinated features
Twilio support ‘invented’ features — users say customer service is broken
TLDR: A Twilio user says support—likely AI—pointed them to debugging tools that don’t exist. Commenters torched AI in customer support, arguing customers are trapped in a race to the bottom, while others pushed DIY alternatives. Everyone agreed trust matters when systems fail, and jokes flew about bots inventing cars and features.
Twilio support allegedly told a customer to find call logs and debug data in a dashboard that doesn’t exist—and the reply looked AI-written. The Hacker News thread lit up with a single theme: trust is broken. One camp says companies are swapping humans for bots that confidently make stuff up because it’s cheaper, with users left holding the outage. Another camp shrugs: every provider stinks, you just pick the least-bad smell.
The spiciest debate: stick with Big Provider convenience or roll your own? A DIY crew claims SMS is simple and voice is doable if you’re patient, even tossing shade that “shared hosting” outlasts meddling “techbros.” Others clap back that businesses pay Twilio precisely to avoid that pain—and yet they’re getting fantasy features and dead ends.
Class divide alert: multiple commenters note big-spender clients get real humans and private Slack channels, while smaller teams get the hallucination hotline. The jokes wrote themselves: “Hallucination machine, responds with hallucinations,” and one reader said a Chevy chatbot pitched a car that “did not exist.” The kicker? Folks mocked the grand AI promises—if it can’t find the logs, how’s it building the future? Meanwhile, actual engineers just wanted real answers, not magic.
Key Points
- A user reported asking Twilio support for debugging information and event logs for a voice system issue.
- Support directed the user to specific interface locations and claimed the relevant event existed in logs.
- The user could not find the described features or information anywhere in the interface.
- The user concluded the support response referenced nonexistent capabilities and was largely AI-generated.
- The post frames this as an example of AI producing unreliable, “hallucinated” support information.