You Should Mute Your Smart Speaker More Often
Does your voice assistant seem a little too eager to chime in? A recent study from Ruhr University Bochum and the Max Planck Institute for Security and Privacy identified more than 1,000 words and phrases that Alexa, Siri, and Google Assistant routinely mistake for their activation commands (also known as “wake words”). Here are some examples from Ars Technica’s report on the research:
- Alexa: “unacceptable”, “elections” and “letter”
- Google Home: “Okay, cool” and “Okay, who reads”
- Siri: “town” and “hey Jerry”
- Microsoft Cortana: “Montana”
According to the researchers, these false activations are common and easy to trigger, which makes them a serious privacy issue.
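To get an intuition for why this happens: a wake word detector continuously scores incoming sound against its target phrase and activates the moment the score crosses a threshold, so anything that sounds close enough slips through. The toy sketch below illustrates only that thresholding logic; real detectors score acoustic features with neural networks, not spelling, and the letter-level distance and the 0.5 cutoff here are stand-ins invented for illustration.

```python
# Toy illustration of threshold-based wake word matching. Real detectors
# score acoustic features, not letters; edit distance is just a stand-in.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,               # delete a character
                curr[j - 1] + 1,           # insert a character
                prev[j - 1] + (ca != cb),  # substitute a character
            ))
        prev = curr
    return prev[-1]

def similarity(wake_word: str, heard: str) -> float:
    """Turn edit distance into a 0-to-1 similarity score."""
    d = edit_distance(wake_word.lower(), heard.lower())
    return 1 - d / max(len(wake_word), len(heard))

THRESHOLD = 0.5  # hypothetical cutoff; vendors tune this trade-off carefully

for phrase in ["hey siri", "hey jerry", "what time is it"]:
    score = similarity("hey siri", phrase)
    verdict = "WAKES" if score >= THRESHOLD else "ignored"
    print(f"{phrase!r}: {score:.2f} -> {verdict}")
```

With this made-up cutoff, “hey Jerry” scores 0.56 and wakes the toy detector while “what time is it” does not. Loosen the threshold and the detector misses fewer real commands but false-wakes far more often; that trade-off is exactly what the study probed.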
Alexa, what’s the problem?
Voice assistants are always listening for their wake word. That doesn’t mean they record everything, but they are permanently on alert. Once the software thinks it hears the command, whether through a smart speaker or your phone’s microphone, it records whatever sound follows and sends it to a remote server, where algorithms work out what is being requested. Sometimes that audio is also saved and later reviewed by employees working to improve the assistant’s speech recognition, and that is where the privacy problem lies: even if the captured audio doesn’t trigger anything on the server side, it can still be recorded, stored, and listened to by engineers checking whether a command was missed or misinterpreted.
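The flow described above can be summed up in a few lines. The sketch below is purely illustrative: text strings stand in for audio chunks, print() stands in for the upload, and every name and number in it (the detector, the capture window, the function names) is made up rather than taken from any vendor’s code.

```python
from typing import Iterator

WAKE_WORD = "alexa"
CAPTURE_CHUNKS = 3  # hypothetical: how much audio is kept after a wake

def wake_word_detected(chunk: str) -> bool:
    """Stand-in for the on-device keyword spotter.

    Real detectors do a fuzzy acoustic match, which is exactly why
    'unacceptable' or 'hey Jerry' can slip through; a substring test
    is the simplest possible stand-in."""
    return WAKE_WORD in chunk.lower()

def send_to_server(recording: list[str]) -> None:
    """Stand-in for the upload and server-side recognition step.

    This is the privacy-relevant moment: once audio leaves the device,
    it may be stored, reviewed by humans, or transcribed."""
    print("UPLOADED OFF-DEVICE:", " | ".join(recording))

def run_loop(mic: Iterator[str]) -> None:
    for chunk in mic:
        # Until a wake match, audio is analyzed locally and discarded.
        if wake_word_detected(chunk):
            # After a match -- intended or not -- the next stretch of
            # audio is recorded and shipped off-device.
            recording = [chunk] + [next(mic, "") for _ in range(CAPTURE_CHUNKS)]
            send_to_server(recording)

# An accidental activation: the wake word is mentioned in passing,
# and the private conversation that follows gets uploaded.
run_loop(iter([
    "did you water the plants",
    "maybe we should ask alexa about flights",  # accidental wake
    "anyway about the doctor's appointment",
    "let's keep that between us",
    "what's for dinner",
]))
```

Running it shows the failure mode at issue: one accidental match, and everything said in the capture window afterward leaves the device.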
This is not speculation; we know how these machine learning pipelines actually work, because humans still manually help the machines learn. They are not autonomous beings. This practice has repeatedly led to breaches of confidentiality, public backlash, and legal consequences. Google has drawn consistent criticism for how much user data it feeds into its advertising business, and Amazon has repeatedly leaked or mishandled its users’ video and audio recordings. Apple has the best data privacy policy of the three overall, but even its contractors have been caught listening to and transcribing overheard Siri recordings.
The point is, every accidental activation of Alexa, Siri, or Google Assistant means more of your personal conversations get recorded and potentially exposed to outsiders, and who knows what happens to that data afterward. Each of these companies lets you review and delete audio after it has been recorded, but you should also take precautions to ensure your smart devices only listen when you want them to.
Tips to avoid accidentally activating your voice assistant
- If possible, change the wake word or phrase. Alexa’s can be switched to “Echo,” “Amazon,” or “Computer” (which is fun to say out loud if you grew up on Star Trek). Google lets you choose between “Okay Google” and “Hey Google,” but Siri only responds to “Hey Siri.”
- Decrease the activation sensitivity of your Google Home device.
- Mute the microphone. Most smart speakers have a physical mute button or switch somewhere on the device.
- Turn off smart speakers and other smart home devices when not in use.
- Delete your existing recordings, and update your Amazon, Apple, and/or Google privacy settings so that your audio is never saved or reviewed by humans.
- Just don’t use smart speakers or voice assistants at all. (Sorry, but it’s the only foolproof method.)
[Ars Technica]