Amazon Alexa. Photograph: WION Web Team
Amazon's popular voice assistant Alexa recorded a couple's private conversation in Portland, Oregon, two weeks ago and sent it to a random contact in their address book.
The couple said they were shocked to learn the device had been recording their conversations without their knowledge. The audio file was sent to a contact in Seattle.
The woman, whose home is equipped with Amazon devices, received a call two weeks ago from one of her husband's employees, who said Alexa had recorded the family's conversation about hardwood floors and sent it to him.
"I felt invaded," the woman, only identified as Danielle, said in the report. "A total privacy invasion. Immediately I said, 'I'm never plugging that device in again because I can't trust it.'"
Amazon confirmed the incident, describing it as an "unlikely string of events" that caused Alexa to send the audio clip to a contact, rather than evidence of Alexa spying on users.
Alexa, which comes with Echo speakers and other gadgets, starts recording after it hears its name or another "wake word" selected by users. This means that a word that merely sounds like "Alexa," even one from a TV commercial, can activate a device.
"Echo woke up due to a word in background conversation sounding like 'Alexa.' Then, the subsequent conversation was heard as a 'send message' request," Amazon said in a statement. "At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list."
The Echo only asks aloud to confirm a contact name if the address book contains multiple people with the same or similar-sounding names.
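The chain of events Amazon describes amounts to a small state machine: a misheard wake word, then background speech misread as an intent, a recipient, and a confirmation. The following is a minimal sketch of that failure mode, not Amazon's actual software; the class, intent phrases, and contact names are all invented for illustration.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    LISTENING = auto()
    AWAITING_RECIPIENT = auto()
    AWAITING_CONFIRMATION = auto()

# Hypothetical assistant loop showing how a misheard wake word can
# cascade into an unintended message. All names here are invented.
class Assistant:
    def __init__(self, contacts):
        self.contacts = contacts
        self.state = State.IDLE
        self.recipient = None
        self.sent = []

    def hear(self, utterance):
        text = utterance.lower()
        if self.state is State.IDLE:
            if "alexa" in text:            # wake word, or something like it
                self.state = State.LISTENING
        elif self.state is State.LISTENING:
            if "send message" in text:     # background speech misread as an intent
                self.state = State.AWAITING_RECIPIENT
            else:
                self.state = State.IDLE
        elif self.state is State.AWAITING_RECIPIENT:
            # "To whom?" -- background speech is matched against contacts
            match = next((c for c in self.contacts if c.lower() in text), None)
            if match:
                self.recipient = match
                self.state = State.AWAITING_CONFIRMATION
        elif self.state is State.AWAITING_CONFIRMATION:
            # any phrase resembling "right"/"yes" confirms the send
            if "right" in text or "yes" in text:
                self.sent.append(self.recipient)
                self.state = State.IDLE

a = Assistant(contacts=["John"])
for speech in ["a word like alexa", "please send message",
               "john said the floors", "yeah right"]:
    a.hear(speech)
print(a.sent)  # the message goes to John without anyone addressing the device
```

Each step in the cascade is individually plausible speech recognition; it is only the full sequence, occurring in ordinary background conversation, that produces the unintended send.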
Amazon added, "We are evaluating options to make this case even less likely." Assuring customers of Alexa's security is crucial to Amazon, which has ambitions for Alexa to be ubiquitous, whether dimming the lights for customers or placing orders for them with the world's largest online retailer.
University researchers from Berkeley and Georgetown found in a 2016 paper that sounds unintelligible to humans can set off voice assistants in general, raising concerns that attackers could exploit them. Amazon did not immediately comment on the matter, but it previously told The New York Times that it has taken steps to keep its devices secure.