Most AI voice cloning services lack adequate protections against nonconsensual voice impersonation, according to a Consumer Reports investigation. The study examined six leading publicly available tools and found that five had safeguards that could be easily bypassed. As reported by NBC News, four services (ElevenLabs, Speechify, PlayHT, and Lovo) merely require users to check a box confirming they are authorized to clone the voice, while Resemble AI’s real-time recording requirement can be circumvented by playing back an existing recording. Only Descript offered somewhat effective protection, requiring users to read a specific consent statement aloud. The technology has legitimate uses, including accessibility applications, but experts warn of potential misuse for fraud, scams, and disinformation. Most of the services are free; only ElevenLabs and Resemble AI charge nominal fees to create a custom voice clone.