Hackers get into Google Assistant and Siri using ultrasonic waves
The idea of someone watching your every move is scary.
Imagine having someone watch over your every move on your smartphone. And what's worse? They also steal your information and use it to their advantage.
Hackers digging their way through smartphones is definitely not a new thing. Every now and then, we hear news about this type of activity.
In fact, according to a report, mobile and internet hacking is expected to increase tremendously this year.
Some attacks pose grave threats, while others have been stopped before ever taking off publicly. Regardless, it is still scary to know that someone around the corner is trying to exploit you.
Unfortunately, there is no relief just yet. There is a group of hackers that can get into Google Assistant and Siri using nothing but ultrasonic waves.
Hackers use ultrasonic waves to activate Google Assistant, Siri
Here is what’s up:
In 2017, researchers in China reported on a clever way to access a digital/virtual assistant, such as Google Assistant or Siri, just by using inaudible ultrasonic sound waves.
Fast forward to today, a new team at Washington University in St. Louis has been working on similar technology.
The bad news?
Their version is even more capable and scarier than the original.
The idea behind this technology is this:
Rather than interacting with a voice assistant through normal spoken commands, you can instead encode your voice on an ultrasonic carrier that your smartphone or smart speaker is still able to interpret.
This latest version of the stealthy exploit requires several pieces of ultrasonic hardware; once assembled, the device can issue silent commands from up to 30 feet away.
That is a huge jump from the 2017 version's limitation of just a few feet. Unlike the old version, the newest one also works through physical barriers such as metal, wood, and glass, though it had trouble operating through thin fabrics such as a tablecloth.
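To make the idea concrete, here is a minimal Python sketch of the general principle: amplitude-modulating a recorded voice command onto an ultrasonic carrier. The file name, carrier frequency, and modulation depth below are illustrative assumptions, not the researchers' actual parameters or hardware setup.

```python
import numpy as np
from scipy.io import wavfile

# Illustrative parameters -- assumptions for this sketch, not the real attack setup.
CARRIER_HZ = 30_000   # hypothetical ultrasonic carrier frequency (inaudible to humans)
MOD_DEPTH = 0.8       # amplitude-modulation depth

# Load a spoken command (assumed to be a mono 16-bit WAV file).
in_rate, voice = wavfile.read("command.wav")
voice = voice.astype(np.float64)
voice /= np.max(np.abs(voice))                 # normalize to [-1, 1]

# Crude 4x upsampling so the sample rate can represent the ultrasonic carrier.
out_rate = in_rate * 4
voice_up = np.repeat(voice, 4)

# Amplitude-modulate the voice onto the ultrasonic carrier.
t = np.arange(len(voice_up)) / out_rate
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
modulated = (1 + MOD_DEPTH * voice_up) * carrier
modulated /= np.max(np.abs(modulated))         # keep within 16-bit range

# A microphone's own nonlinearity effectively demodulates this signal back into
# the original audible command, even though people nearby hear nothing.
wavfile.write("ultrasonic_command.wav", out_rate, (modulated * 32767).astype(np.int16))
```

The resulting file contains no audible speech, which is what makes the commands "silent" to anyone standing near the targeted phone or speaker.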
The possible risks…
Can you now imagine the risks?
In the wrong hands, this technique could be used to pull personal information from your smartphone, hijack and invade smart home gadgets, and more.
Technology is always evolving, so who knows what this technique is still capable of, right?
The commanding style is much the same as controlling a Google Home or Amazon Echo with laser beams. Luckily, the complexity of this hack makes it an unlikely candidate to show up out of nowhere.
Despite all this, there is an easy way to completely derail this potential threat.
Users should disable always-on listening. If your smartphone is not constantly waiting to hear a voice assistant command, it basically can't be hijacked this way.
And even when an assistant is invoked, you may want to consider removing personal results from the list of items it can access.
Furthermore, regularly review your Google Activity history and activity log so you can see exactly what kinds of commands Assistant has carried out.
How can the services that are supposed to protect our data actually make us feel protected? And how can we protect ourselves when we trust everything online?