We don’t know if you’ve heard it, but the Google Assistant sounds freaky-real. In an amazing demonstration, the virtual assistant made a hair appointment and restaurant reservations, sounding like a real person — complete with casual words and even an “mmhmmm” thrown in for believability. There’s no question voice technology will continue to transform our lives, but as with any new technology, it may take a while for solid security to catch up with innovation.
This past week, a Portland family’s Amazon Echo recorded a conversation of theirs – without their knowledge – and then sent the audio file to one of their contacts. Apparently, the device mistook words in the background for “Alexa,” then “send message,” and then another phrase that sounded like the name of one of the owner’s contacts. Fortunately, it was a boring conversation, but umm…that could have been embarrassing.
Privacy issues aren’t limited to Alexa. Researchers have devised a proof of concept that issues potentially harmful instructions to popular voice assistants like Siri, Google Assistant, Cortana, and Alexa using ultrasonic frequencies inaudible to humans instead of spoken commands. As we continue to create ways to make life easier, we also need to work just as hard to make it safer.