Mom forced to throw away her Alexa device after 'creepy' question it asked her 4-year-old daughter
A Texas mother says she was forced to remove her Alexa device from her home after the voice assistant asked her four-year-old daughter a disturbing question.
Christy Hosterman, 32, said the incident took place while she was cooking dinner and using Alexa to help with a recipe.
At the same time, her daughter Stella asked the device to tell her a silly story, something the child often did.
"Alexa told her silly story, and then my daughter started telling her story about a princess, and then out of nowhere, Alexa said, 'Hold that thought, I’d love to see what you’re wearing,'" Hosterman revealed.
Screenshots shared by the mother show the young girl responding: "I have a skirt on."
Before Hosterman could step in, Alexa allegedly replied: "I’d love to see what you’re wearing. Let me take a look at your skirt," per Fox 19.
"I’m like, oh my gosh, why is this device asking her what she’s wearing?" Hosterman said. "I felt it was sexualizing my child."
The AI later corrected itself during the conversation, saying: "This experience isn’t quite ready for kids yet, but I am working on it!"
Mother Removes Device From Home
Hosterman said she confronted the device after noticing the exchange. Alexa responded with an apology and clarified that while its response was "confusing and inappropriate," it could not actually see anything.
But the explanation did little to ease her concerns. "I flipped out on the Alexa, it said it made a mistake and doesn’t have visual capabilities, but I don’t believe that," the mom said. "No more Alexa in our house."
Hosterman said she immediately turned the device off and submitted a support ticket to Amazon.
When she later turned the device back on, she claims the conversation transcript appeared to have been altered.
"My concern is that it recognized she was a child to begin with - and with or without the child profile, it should not have been asking that," Hosterman said.
Amazon Says Safeguards Prevented Camera Use
Amazon denied any wrongdoing and said the incident occurred because Alexa misunderstood a request and attempted to activate a feature called "Show and Tell," which allows the device to describe objects it sees through a camera.
According to the company, built-in safeguards tied to children’s profiles would have prevented the camera from turning on.
"We take customer trust extremely seriously. In this case, Alexa misunderstood a request and attempted to launch a feature that lets Alexa+ describe what it sees through the camera," the statement read.
"However, because we have safeguards that disable this feature when a child profile is in use, the camera never turned on - and Alexa explained the feature wasn’t available.
"That said, this has highlighted an area to improve the customer experience, and we worked quickly to implement changes so when a child profile is in use, and Alexa hears a request to launch this feature, Alexa will simply respond that this feature is not available," the spokesperson added.
The company also insisted that suggestions the device may have been remotely controlled by an employee are "functionally impossible."
However, tech expert Dave Hatter, who has spent 25 years writing software, said he finds it unlikely that an AI would go so far off script on its own.
"It feels to me like a potential predator - seeing there’s a child accessing this and gauging where the conversation is going - that’s more of a human being trying to steer down this direction," he said.
The unsettling exchange has sparked concerns about how AI systems interact with children as the technology becomes increasingly embedded in daily life.
Hosterman says the experience has forever changed her view of the technology and is warning other parents to stay alert. "Be aware when your child talks to Alexa," she said.
