Hackers Can Steal Your Passwords by Listening to Your Input

So, you think you have your digital security covered. You don't reuse passwords; you don't follow suspicious links; you even use burner email addresses. You are invincible. Except, oops, what's this? Hacked anyway? Wait, you haven't been typing lately, have you? Rookie mistake.

According to Bleeping Computer, researchers have successfully trained an AI model to recognize keystrokes on a keyboard using a built-in microphone, either the computer's own or that of a compromised smartphone nearby. The worst part? The model can guess which key was pressed with up to 95% accuracy. Don't worry, though: when they trained the model over Zoom audio, accuracy dropped to 93%. We are saved.

Seriously, it's not hard to see why "acoustic attacks" are bad news: an AI model like this could be used to spy on people's typing and collect everything from confidential information to passwords. Imagine opening Slack, typing a sensitive message to your boss, then launching your bank's website and entering your username and password to verify your account. This system can capture up to 95% of those keystrokes, which means that, over time, it collects the vast majority of what you type.

How does this (hypothetical) acoustic attack work?

For starters, the attacker needs a recording of your typing, captured through your computer's microphone or another nearby device with one, such as a compromised smartphone. Another method is to join a Zoom call with the target and match the sounds of their typing against the messages that appear in the chat.

And how did the researchers train their model to recognize specific keyboard sounds? Why, they used a laptop from a company that famously boasts about privacy and security: Apple. The researchers pressed 36 individual keys on a modern MacBook Pro 25 times each, then ran the recordings through their software to pick out the tiny acoustic differences between keys. It took some trial and error, but after enough testing, the researchers could identify keystrokes with 95% accuracy when recording from a nearby iPhone, and with 93% accuracy when using the Zoom method.
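The researchers' actual pipeline used a deep-learning classifier trained on spectrograms of real recordings; the toy Python sketch below only illustrates the underlying idea of "fingerprint each key's sound, then match new presses against those fingerprints." Everything in it is made up for illustration: the audio is synthetic, the per-key frequencies are invented, and a simple nearest-centroid matcher stands in for their model.

```python
import numpy as np

rng = np.random.default_rng(0)

def fake_keystroke(key_id, n_samples=1024):
    """Synthesize a toy 'keystroke': each key gets a slightly different
    dominant frequency plus noise (a stand-in for real audio)."""
    t = np.arange(n_samples)
    freq = 0.01 + 0.002 * key_id  # made-up per-key frequency
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(n_samples)

def features(clip):
    """Crude spectral fingerprint: magnitudes of the first 128 FFT bins."""
    return np.abs(np.fft.rfft(clip))[:128]

# "Press" each of 36 keys 25 times, mirroring the study's data collection,
# and average the spectra into one template per key.
keys = range(36)
templates = {k: np.mean([features(fake_keystroke(k)) for _ in range(25)], axis=0)
             for k in keys}

def classify(clip):
    """Nearest-centroid match against the per-key average spectrum."""
    f = features(clip)
    return min(keys, key=lambda k: np.linalg.norm(f - templates[k]))

# Evaluate on fresh synthetic presses of every key.
hits = sum(classify(fake_keystroke(k)) == k for k in keys)
print(f"{hits} of 36 keys recovered")
```

With real audio, the features and classifier would have to cope with timing, loudness, and background noise, which is exactly the gap the researchers' deep-learning model closes.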

How to defend against (again hypothetical) acoustic attacks

The good news is that this particular AI model is purely for research purposes, so you don’t have to worry about running into it in the wild. However, if researchers can figure it out, attackers won’t be far behind.

Knowing this, you can protect yourself by keeping the process in mind: the attack only works if a microphone is recording your keystrokes, which means your computer or phone must already be compromised, or you must be on a Zoom call with the attacker. So keep an eye on your devices' microphone permissions and revoke access for any app that doesn't clearly need it. If your microphone is active when it shouldn't be, that's a red flag, too.

You should also mute yourself when you aren't actively speaking on a Zoom call: that's good practice anyway, but it's especially useful if there's an intruder on the call. If you're muted while typing in the chat, they can't record your keystrokes to use against you.

To avoid being hacked in the first place, follow the usual security advice: don't click unfamiliar links, don't open messages from unknown senders, and don't download or open files you weren't expecting.

Password managers are your friends

That said, suppose you have been hacked without knowing it and your phone is listening to your keystrokes. It's good practice to use a password manager wherever possible, especially one with autofill: if you can log into your accounts with a face or fingerprint scan, you never type the password at all, so there's nothing to record. You can also play white noise near your devices, so any sound recordings will be useless.
