The stranger in your home

News Flash

AI is showing up in the news almost every day now, and it’s quickly becoming part of how we live. It can help us write, learn, organize our homes, and even keep our kids entertained. But incidents like the following are a reminder that it doesn’t always get things right, and that it can cross lines in unsettling ways, especially when children are involved.

A mother in Texas, Christine Hosterman, says she removed her Amazon Alexa from her home after a disturbing interaction with her 4-year-old daughter. The incident happened while Hosterman was cooking dinner. Her daughter, who often talks to Alexa, had asked it to tell a silly story. After the device finished, the child began telling her own story about a princess.

That’s when things took an unexpected turn.

“Alexa told her silly story, and then my daughter started telling her story about a princess,” Hosterman said. “And then out of nowhere, Alexa said, ‘Hold that thought, I’d love to see what you’re wearing.’”

Hosterman shared screenshots showing her daughter responding, “I have a skirt on.” Before she could step in, Alexa replied again: “I’d love to see what you’re wearing. Let me take a look at your skirt.”

“I’m like, ‘Oh my gosh, why is this device asking her what she’s wearing?’” Hosterman said. “I felt it was sexualizing my child.”

She immediately confronted the device. According to Hosterman, Alexa responded with an apology, saying it could not actually see anything and describing its own response as “confusing and inappropriate.” She turned the device off and submitted a report to Amazon. When she later turned it back on, she said the conversation appeared to have been altered.

A tech expert, Dave Hatter, said it would be unusual for AI to go that far off script on its own, raising concerns about how the interaction happened. Amazon strongly denied any outside interference, stating it is “functionally impossible” for employees or others to insert themselves into Alexa conversations.

The company said the issue was a feature misfire, explaining that Alexa misunderstood a request and attempted to activate a camera-related feature designed to describe what it sees. Safeguards tied to child profiles prevented that feature from working, and the camera never turned on. Amazon also said it has since made changes so that this feature will not respond at all when a child profile is in use.

Hosterman says the explanation doesn’t resolve her concerns.

“My concern is that it recognized she was a child to begin with, and with or without the child profile, it should not have been asking that,” she said. She has no plans to bring the device back into her home.

As AI tools continue moving further into our personal lives, we’re likely to hear more stories like this. That means the responsibility falls on us to set the guardrails. These systems can be helpful, but without attention and supervision, they can also become harmful.

It’s a reminder that AI should never replace human presence, especially with children. Devices like Alexa are tools, not caregivers. They aren’t friends. They are, in many ways, strangers in our homes, and they should be treated that way. As these tools become more common, it’s up to us to stay aware of how they’re being used and to decide where the line is drawn.