Several popular voice cloning tools on the market don't have "meaningful" safeguards to prevent fraud or abuse, according to a new study from Consumer Reports.

Consumer Reports examined voice cloning products from six companies (Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify) for mechanisms that might make it more difficult for malicious users to clone someone's voice without their permission. The publication found that only two, Descript and Resemble AI, took steps to combat misuse. The others required only that users check a box confirming they had the legal right to clone a voice, or make a similar self-attestation.
Grace Gedye, policy analyst at Consumer Reports, said that AI voice cloning tools have the potential to "supercharge" impersonation scams if adequate safety measures aren't put in place.

"Our assessment shows that there are basic steps companies can take to make it harder to clone someone's voice without their knowledge, but some companies aren't taking them," Gedye said in a statement.