2018 IEEE Conference on Communications and Network Security (CNS)
DOI: 10.1109/cns.2018.8433167
The Insecurity of Home Digital Voice Assistants - Vulnerabilities, Attacks and Countermeasures

Abstract: Home Digital Voice Assistants (HDVAs) have become popular in recent years. Users can control smart devices and get living assistance through these HDVAs (e.g., Amazon Alexa, Google Home) using voice. In this work, we study the insecurity of HDVA services using Amazon Alexa as a case study. We disclose three security vulnerabilities rooted in the insecure access control of Alexa services. We then exploit them to devise two proof-of-concept attacks, home burglary and fake order, where the adversary can rem…


Cited by 84 publications (78 citation statements)
References 14 publications
“…With the rapidly growing popularity and capabilities of voice-driven IoT systems, the likelihood and potential damage of voice-based attacks also grow very quickly. As discussed in [2], [13], [11], an attack may lead to severe consequences; e.g., a burglar could enter a house by tricking a voice-based smart lock, or an attacker could make unauthorized purchases and credit card charges via a compromised voice-based system. Such attacks can be very simple, yet still very difficult or even impossible for humans to detect.…”
Section: Attacks on Voice-Controlled Systems
confidence: 99%
“…Voice replay attacks: an attacker makes a VCS perform a specific malicious action by replaying a previously recorded voice sample [1], [10], [11]. This attack can be executed remotely, e.g., via the Internet.…”
Section: B. Basic Voice Replay Attack
confidence: 99%
“…Currently, we have already stepped from the digital era into the era of artificial intelligence. Apps with AI systems can be seen everywhere in our daily life, such as Amazon Alexa [1], DeepMind's Atari agents [2], and AlphaGo [3]. Nowadays, with the development of edge computing, 5G technology, etc., AI technologies are becoming more and more mature.…”
Section: Introduction
confidence: 99%
“…CAGFuzz iteratively selects test examples from the processing pool and generates adversarial examples through the pre-trained adversarial example generator (see Section 3 for details) to guide the DL systems to expose incorrect behaviors. During the process of generating adversarial examples, CAGFuzz retains adversarial examples that provide a certain improvement in neuron coverage for subsequent fuzzing, and limits perturbations to minor changes invisible to human eyes, ensuring that the original example and the adversarial example keep the same meaning. (1. https://github.com/QXL4515/CAGFuzz)…”
Section: Introduction
confidence: 99%
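The coverage-guided loop described in the CAGFuzz excerpt above can be illustrated with a minimal sketch. This is not CAGFuzz's actual implementation: `generate_adversarial` (a stand-in for its pre-trained adversarial example generator) and `neuron_coverage` (a stand-in for its real coverage metric over a deep network) are hypothetical placeholders; only the select-perturb-keep-if-coverage-improves structure mirrors the description.

```python
import random

def generate_adversarial(example, epsilon=0.1):
    # Hypothetical stand-in for CAGFuzz's pre-trained adversarial
    # example generator: apply a small bounded perturbation, which
    # loosely models the "minor perturbations invisible to human eyes".
    return [x + random.uniform(-epsilon, epsilon) for x in example]

def neuron_coverage(example, threshold=0.5):
    # Hypothetical coverage proxy: fraction of feature values above a
    # threshold. A real system would measure activated neurons in a DNN.
    return sum(1 for x in example if x > threshold) / len(example)

def coverage_guided_fuzz(seed_pool, iterations=100):
    # Iteratively select examples from the pool, perturb them, and keep
    # only adversarial examples that improve coverage, mirroring the
    # coverage-guided selection described in the excerpt.
    best_coverage = max(neuron_coverage(e) for e in seed_pool)
    kept = []
    for _ in range(iterations):
        seed = random.choice(seed_pool)
        adv = generate_adversarial(seed)
        cov = neuron_coverage(adv)
        if cov > best_coverage:
            best_coverage = cov
            kept.append(adv)
            seed_pool.append(adv)  # feed back for subsequent fuzzing
    return kept, best_coverage
```

The feedback step (appending coverage-improving examples back into the pool) is what makes the loop "guided" rather than random mutation alone.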