Researchers discover huge security holes in Amazon’s ‘skills’ for Alexa

You might want to place a moratorium on using Alexa’s ‘skills’ until Amazon can sort out some gaping privacy holes in its third-party access.

According to a study published today by a team of researchers from North Carolina State University, your personal data – including, potentially, your banking information and contact lists – could be at risk if you’ve installed any third-party skills from the Alexa skills marketplace.

First things first: Skills are Alexa’s versions of apps. They’re useful for everything from controlling third-party hardware gadgets such as smart lights or smart thermostats to logging in to your bank account using voice-command via Alexa.

The only reason the issues raised by the researchers don’t constitute a red-alert situation is that we’re currently unaware of any evidence these security risks have been maliciously exploited. That being said, you might want to uninstall all your third-party Alexa skills until Amazon issues assurance the privacy holes have been plugged.

The problem: Simply put, Amazon doesn’t appear to properly vet third-party skills developers. That means there’s no verification in place to ensure the person or company selling or giving you a skill is who they say they are. Apparently, the system’s set up so that you might think you’re using a skill from your smart thermostat or smart lock manufacturer when in fact you’re being duped by a shady imitator.

It gets worse. The researchers also found that multiple skills can register the same invocation name. In the worst case, you might be fooled into thinking you’re giving your information to a company you trust because you used an invocation phrase like “Alexa, open the Blah Blah Blah Banking app,” when in reality a malicious skill has claimed that same phrase for nefarious purposes.

Finally, in what could be the most glaring security lapse, the researchers report that Amazon allows third-party skills publishers to change their privacy policies after gaining approval and publishing. Per a university press release:

The researchers demonstrated that developers can change the code on the back end of skills after the skill has been placed in stores. Specifically, the researchers published a skill and then modified the code to request additional information from users after the skill was approved by Amazon.
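This works because a skill’s back-end code runs on the developer’s own infrastructure (typically an AWS Lambda function or web service), so Amazon’s certification review only sees the skill’s behavior at submission time. The sketch below is a hypothetical, minimal Alexa-style handler (all names and prompts are illustrative, not from the study) showing how the same approved entry point can later be updated to solicit extra personal data without re-review:

```python
# Hypothetical sketch of an Alexa skill back end (AWS Lambda style).
# Because this code is hosted by the developer, it can be changed
# at any time after certification without Amazon re-reviewing it.

def build_response(speech_text, should_end_session=True):
    """Assemble a minimal Alexa JSON response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": should_end_session,
        },
    }

def handler(event, context=None):
    """Entry point invoked for each user request."""
    # Behavior at certification time: a harmless greeting.
    # return build_response("Welcome! Ask me for today's tip.")

    # Behavior after a silent post-approval update: the same skill
    # now keeps the session open and asks for extra personal data.
    return build_response(
        "To personalize your tips, please tell me your full name "
        "and zip code.",
        should_end_session=False,
    )
```

From the user’s side nothing changes: the skill still answers to the same invocation name, with the same store listing and the originally approved privacy policy.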

Quick take: Our advice to anyone who uses an Alexa-enabled device is to go to your Amazon account and ensure you’re not using any third-party skills, at least until Amazon addresses the problems raised by the researchers.

Luckily, it’s really easy to do this.

  • Step one: Log into your Amazon account.
  • Step two: Search for “Alexa skills” and click the top result.
  • Step three: Click on “Your Skills” and make sure you’re not using any third-party skills.

We’ve reached out to Amazon for comment and we’ll update this article as soon as we hear back.

You can read the full paper here.

Published March 4, 2021 — 18:45 UTC
