Security Stop-Press: Warning Over Amazon’s Human Voice Mimicking Plans For Alexa

A Global Cybersecurity Advisor at ESET has warned that Amazon’s plans to enable the Alexa voice assistant to mimic the voices of real people (living or dead) could be exploited to launch deepfake audio attacks against some voice authentication security systems. The advice from some security experts is that, if Amazon goes ahead with voice mimicking for Alexa, it may be wise to switch from voice authentication, e.g. for bank accounts, to another verification method such as online banking via a smartphone.