If it feels like people can use just about anything to hack you these days, that’s because, well, they kind of can. They can track your shoulder movements to see what you’re typing, use a laser to hack your smartspeaker through a window, or – our personal favorite – use a lightbulb to listen to your conversations from 80 feet away.
Even your vacuum cleaner can no longer be trusted.
According to a recent study conducted by researchers from the National University of Singapore and the University of Maryland, your vacuum cleaner could be used to record your conversations – and it doesn’t even need to have a microphone to do so.
How does it work?
It all has to do with the LiDAR sensors that smart vacuums are equipped with. Short for “light detection and ranging,” LiDAR “basically works by measuring reflections from a target illuminated by laser light,” in the words of Davey Winder.
In a robot vacuum, LiDAR sensors bounce laser light off nearby surfaces and measure the reflections to map the room, track the vacuum’s location, and keep it from bumping into objects. This is where the vulnerability lies. According to Forbes, here’s what the study found:
The hack, which the researchers have called LidarPhone, is what’s known as a side-channel attack. This is when, rather than exploiting weaknesses or vulnerabilities, hackers exploit implemented strengths but for a reason other than that which might be reasonably expected. In this case, using the LiDAR sensors of a ‘smart’ robotic vacuum cleaner, whose function is to help the device navigate and map its surroundings, to act as microphones and record conversations covertly.
Across 19 hours of this covert recording, the researchers say they managed to collect conversations and music played by a computer speaker and a TV soundbar. “Sounds are essentially pressure waves that propagate through the vibrations of the medium,” the research paper explains. “Sound energy in the environment is partially induced on nearby objects creating subtle physical vibrations within those solid media.” By repurposing the vacuum cleaner’s LiDAR sensors, the researchers were able to capture those vibrations, which were then processed “to recover traces of sounds.” According to the paper, this deep-learning-driven recovery process achieved a success rate of around 90%.
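To make the principle concrete, here’s a toy sketch (not the researchers’ actual LidarPhone pipeline, and with made-up numbers throughout): a nearby sound source faintly wobbles the intensity of laser light reflected off an object, and a simple frequency analysis of those intensity samples can pull out the dominant tone.

```python
import numpy as np

# Hypothetical setup: one second of laser-reflection intensity readings
# from an object near a speaker playing a 440 Hz tone. The sound-induced
# vibration is tiny compared to the steady reflection.
fs = 4000                         # assumed sampling rate, Hz
t = np.arange(fs) / fs            # one second of sample times
tone_hz = 440                     # hypothetical tone near the object

baseline = 1.0                                        # steady reflection
vibration = 0.01 * np.sin(2 * np.pi * tone_hz * t)    # sound-induced wobble
noise = 0.002 * np.random.default_rng(0).standard_normal(fs)
intensity = baseline + vibration + noise              # what the sensor "sees"

# Recover the tone: remove the steady baseline, then find the strongest
# component in the frequency spectrum.
spectrum = np.abs(np.fft.rfft(intensity - intensity.mean()))
freqs = np.fft.rfftfreq(fs, d=1 / fs)
recovered = freqs[np.argmax(spectrum)]
print(recovered)  # 440.0
```

The real attack has to solve a far harder version of this problem, which is where the deep-learning step comes in: speech is broadband and faint, and the reflected signal is extremely noisy, so a single FFT peak is nowhere near enough to recover words.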
Should I be concerned about my smart vacuum?
Before you lose your head, here are a couple of things to keep in mind about the possibility of such an attack:
- The hacker would need to gain access to your network first in order to conduct such an attack.
- Lab-controlled conditions are much different from a “real-life” hacking scenario: lighting, distance from the target, and background noise can significantly reduce the effectiveness of the algorithm. Graham Cluley, writing for Bitdefender, notes: “In addition, most surfaces are not as good at reflecting signals as windows and mirrors, which means that a robot vacuum cleaner may be collecting data with a low signal-to-noise ratio.”
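That signal-to-noise point is easy to see with some back-of-the-envelope arithmetic. This sketch uses invented reflectivity and noise figures purely for illustration: a mirror-like surface bounces back far more of the laser’s energy than a matte wall, so the same sound-induced vibration sits much higher above the sensor’s noise floor.

```python
import math

def snr_db(reflectivity, vibration_amp=0.01, noise_std=0.002):
    """Toy SNR in dB: a sinusoidal vibration signal scaled by how
    reflective the surface is, against a fixed sensor noise floor.
    All parameter values are made up for illustration."""
    signal_power = (reflectivity * vibration_amp) ** 2 / 2  # avg power of a sine
    noise_power = noise_std ** 2
    return 10 * math.log10(signal_power / noise_power)

print(round(snr_db(reflectivity=0.9), 1))   # mirror-like surface -> 10.1 dB
print(round(snr_db(reflectivity=0.05), 1))  # matte wall -> -15.1 dB
```

With these (invented) numbers, the matte surface leaves the vibration buried roughly 25 dB deeper in noise than the mirror does, which is why ordinary household surfaces make the attack much harder in practice.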
So don’t panic yet. But it’s 2020, so anything is possible.