Personal voice assistants (PVAs) are increasingly used as interfaces to digital environments. Voice commands are used to interact with phones, smart homes, or cars. In the United States alone, the number of smart speakers, such as Amazon's Echo and Google Home, has grown by 78% to 118.5 million, and 21% of the U.S. population owns at least one device. Given society's increasing dependency on PVAs, their security and privacy have become a major concern of users, manufacturers, and policy makers. Consequently, a steep increase in research efforts addressing the security and privacy of PVAs can be observed in recent years. While some security and privacy research applicable to the PVA domain predates their recent rise in popularity, many new research strands have emerged. This article provides a survey of the state of the art in PVA security and privacy. The focus of this work is on the security and privacy challenges arising from the use of the acoustic channel. Work that describes both attacks and countermeasures is discussed. We highlight established areas such as voice authentication (VA) and new areas such as acoustic Denial of Service (DoS) that deserve more attention. This survey also describes research areas where the threat is relatively well understood but countermeasures are lacking, for example, in the area of hidden voice commands. Finally, we discuss work that examines privacy implications, for example, work on the management of recordings.