Third-party Amazon Echo and Google Home apps are a minefield of scary security flaws
Smart speakers are all the rage right now, with global sales growing at a rate reminiscent of the smartphone industry’s early days and two companies rising above the vendor pack to vie for the crown. In their battle for world domination, Amazon and Google are trying everything in their power to stand out and one-up each other, pretty much improving the capabilities of their already impressive virtual assistants every day and constantly expanding their hardware portfolios as well.
One very delicate matter that hasn’t received a lot of media attention involves the vetting process of so-called Alexa “skills” and Google Home “actions.” These features that can be added to the two companies’ smart home devices via official stores are developed by third parties, which makes them susceptible to outside attacks.
Be careful what your voice assistant asks you to do
Security Research Labs, which specializes in, well, security research developed such an app with the intent to reveal the shocking vulnerabilities of both Alexa and Google Assistant-enabled devices. As it turns out, it was indeed extremely easy to plant basic Alexa skills and Google Home actions that could “vish” (voice phish) users’ passwords by pretending to be legitimate “lucky horoscope” services with regional restrictions of some sort.
Said apps could trick users into believing their devices and not the actual apps in question were requesting vocal password verification to install software updates. Of course, that’s not how updates work on smart speakers, but this is just one example of a security breach that could be exploited in a very serious way. Other examples include asking for an email address corresponding to said password or even financial information.
Eavesdropping is also incredibly easy
Another test that SRLabs ran to verify the strength of Google and Amazon’s app approval mechanisms consisted of manipulating the same innocent-looking horoscope “skill” and “action” into listening in on conversations when theoretically deactivated. Predictably enough, simply asking a third-party app to “stop” giving you previously requested information may not stop the listening process as well.
In this case, hacking an Echo and a Google Home work a little differently, but the end result is equally scary. Anything you say around these devices can be used against you in a number of ways that we honestly don’t want to talk about in a lot of detail.
Google and Amazon are putting “mechanisms in place” to improve security
By the way, what SRLabs did was not purely theoretical, publishing malicious apps that passed Google and Amazon’s approval process while obviously not exploiting the potential of the skills and actions before contacting the two tech giants and removing the phishing and eavesdropping apps.
Based on what Google and Amazon are saying, it should no longer be possible for such blatant security vulnerabilities to go unnoticed in the future, but it’s probably better to be careful what you download anyway. And remember, don’t do anything sketchy that your voice assistant might ask.