Beware of your keyboard leaking your privacy: study finds AI can identify keystrokes by sound with up to 95% accuracy

Image source: Generated by Unbounded AI

Source: Financial Associated Press

Author: Malan

Laptop users, beware: the sound of your keystrokes can serve as a kind of "Morse code" that artificial intelligence deciphers to extract personal information, a new study suggests.

A recent paper presented at the 2023 IEEE European Symposium on Security and Privacy found that artificial intelligence can recognize keystrokes with 95 percent accuracy from sound alone. Users may leak private information such as passwords and credit card numbers simply through the sound of their typing.

The researchers reported that keystrokes on a MacBook Pro were identified correctly 95% of the time when recorded by a nearby phone, and 93% of the time when captured through Zoom recordings. In other words, an attacker could harvest this information simply over a phone call or video conference.

The paper details so-called "acoustic side-channel attacks," in which a malicious third party uses an auxiliary device, such as a phone placed next to a laptop or an unmuted microphone in software like Zoom, to record the sound of typing; the recorded audio is then deciphered by an artificial intelligence model to recover what was typed.
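To make the idea concrete, here is a minimal sketch of how such a keystroke-audio classifier might look. It is illustrative only, not the authors' actual pipeline or architecture: it assumes short, isolated keystroke clips at a hypothetical 16 kHz sample rate, converts each clip to a log-magnitude spectrogram, and feeds it to a tiny convolutional network; a random waveform stands in for real recorded audio.

```python
# Illustrative sketch of an acoustic keystroke classifier (NOT the paper's
# exact method). Assumes isolated, labelled keystroke clips.
import numpy as np
import torch
import torch.nn as nn

SAMPLE_RATE = 16_000          # assumed recording rate
CLIP_LEN = SAMPLE_RATE // 10  # ~100 ms per keystroke clip (assumption)
N_KEYS = 36                   # e.g. a-z plus 0-9

def log_spectrogram(waveform: np.ndarray) -> torch.Tensor:
    """Turn a 1-D keystroke clip into a log-magnitude spectrogram."""
    spec = torch.stft(
        torch.from_numpy(waveform).float(),
        n_fft=256, hop_length=64, return_complex=True,
    ).abs()
    return torch.log1p(spec).unsqueeze(0)  # shape: (1, freq, time)

class KeystrokeNet(nn.Module):
    """Tiny CNN mapping a spectrogram to a predicted key label."""
    def __init__(self, n_keys: int = N_KEYS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_keys)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Dummy example: a random clip stands in for a recorded keystroke.
clip = np.random.randn(CLIP_LEN)
model = KeystrokeNet()
logits = model(log_spectrogram(clip).unsqueeze(0))  # add batch dimension
predicted_key = logits.argmax(dim=1).item()
print(f"predicted key index: {predicted_key}")
```

In a real attack, a model like this would be trained on many labelled keystroke recordings from the target keyboard before it could reach the accuracy figures reported in the study.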

According to the paper, recent advances in microphones and deep learning models have made acoustic attacks on keyboards increasingly feasible.

Laptops are the prime target

Laptops have long been popular for their portability and are routinely used for work in public places. But when people take them to libraries, coffee shops, and similar venues, they rarely consider that doing so may pose the greatest risk to their personal privacy.

The paper points out that the sound of typing in public is easily recorded, and because people are usually unaware that such attacks exist, they take no precautions against them.

People instinctively glance at those around them or shield their screens when typing a password, but few pay any attention to the sound of the keystrokes themselves, which makes it easy for a third party to capture exactly what they most want to keep private.

The researchers therefore recommend using more complex passwords that mix special characters, uppercase and lowercase letters, and numbers; passwords built from complete words are easier to deduce.
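As a small illustration of that advice, the following sketch (my own example, not from the study) draws a random password from letters, digits, and punctuation using Python's standard `secrets` module, which is designed for cryptographic randomness.

```python
# Hypothetical helper for generating a mixed-character password,
# in the spirit of the researchers' advice (illustrative only).
import secrets
import string

def random_password(length: int = 16) -> str:
    """Return a password drawn from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(random_password())
```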

Another defense is two-factor authentication, for example confirming account activity by email or phone from a separate device, so that a stolen password alone is not enough. It is also worth noting that the attack applies to virtually any keyboard, so caution is warranted with devices beyond laptops as well.
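One common form of second factor is a time-based one-time password (TOTP), the rotating six-digit code shown by authenticator apps. The sketch below uses the third-party `pyotp` library purely as an example of how such codes are generated and checked; the article itself does not prescribe any particular mechanism.

```python
# Sketch of time-based one-time passwords (TOTP), one common second factor.
import pyotp

secret = pyotp.random_base32()   # shared once with the user's authenticator app
totp = pyotp.TOTP(secret)
code = totp.now()                # 6-digit code that rotates every 30 seconds
print(code, totp.verify(code))   # server-side check: True while the code is valid
```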

Study co-author Dr Ehsan Toreini of the University of Surrey said the accuracy of such models, and of such attacks, is increasing, and that as microphone-equipped smart devices become more common in homes, a public debate on AI governance becomes all the more necessary.
