Mother throws Amazon Alexa device out over creepy question

Source: Daily Mail Online

A mother has removed all Amazon Alexa devices from her Texas home after the AI assistant asked her four-year-old daughter what she was wearing.

Christy Hosterman, 32, was using the device to help with a dinner recipe last month when her daughter Stella asked it to tell her a silly story, a feature on the device that is commonly used by children.

When the story was finished, Stella asked the AI if she could narrate her own tale.

Alexa agreed, but interrupted the child halfway through to ask 'what she was wearing and if it could see her pants,' Hosterman revealed in a Facebook post.

Screenshots of the interaction shared by the concerned parent show that when Stella told the device 'I have a skirt on,' it asked her to 'let me take a look.'

The AI then corrected itself, saying: 'This experience isn't quite ready for kids yet, but I am working on it!'

Hosterman then confronted the device, stating that she did not approve of the remarks.

Alexa apologized and said it 'cannot actually see anything' because it lacked 'visual capabilities.' The device added that its response was 'confusing and inappropriate.'

Hosterman is now urging other parents to 'be aware when your child talks to Alexa' and says she has permanently removed the device from their home.

'I flipped out on the Alexa, it said it made a mistake and doesn't have visual capabilities, but I don't believe that. No more Alexa in our house,' she shared.

The concerned parent submitted a ticket to Amazon over the inappropriate interaction, WXIX reported.

An Amazon spokesperson claimed the device misunderstood Stella's request and tried to launch a feature that 'lets Alexa+ describe what it sees through the camera.'

'Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on -- and Alexa explained the feature wasn't available,' the spokesperson told the TV station.

Amazon claims the response likely stemmed from a 'feature misfire that our safeguards prevented from launching'.

The company said the interaction demonstrated a technological issue that its team 'worked quickly' to correct.

Alexa will no longer attempt to launch the through-the-camera feature when a child profile is in use, the spokesperson added; instead, it will tell the user that the feature is not available.

But Hosterman says Amazon's explanation does not address her concerns.

'My concern is that it recognized she was a child to begin with -- and with or without the child profile, it should not have been asking that,' she told the outlet.

Tech expert Dave Hatter has also cast doubt on Amazon's explanation, alleging there is a 'slim' chance that AI would alter its script this drastically.

Hatter, who has 25 years of software development experience, warned that a predator may have accessed the device and been influencing the conversation.

'It feels to me like a potential predator -- seeing there's a child accessing this and gauging where the conversation is going -- that's more of a human being trying to steer down this direction,' he said.

Amazon denied Hatter's claim, telling the TV station that it is 'functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa.'