Fri. Jul 5th, 2024

This Free Tool Stops Deepfake Grifters From Stealing Your Voice

Photo Illustration by Luis G. Rendon/The Daily Beast/Getty

It’s a nightmare scenario: You receive a phone call from an unknown number and, when you pick it up, it’s the voice of your loved one frantically telling you they’ve been kidnapped. Although your first instinct is to call the police, they tell you that their kidnappers will kill them soon unless you immediately wire a large sum of money to a bank account. There’s no time to think. What do you do?

While scary, the person on the other side of that call might not be your loved one at all. In fact, it may not even be a person, but rather an artificial intelligence clone of your loved one's voice, created to trick you into sending money.

In the past year, these voice scams have exploded as deepfake technology becomes more powerful and widespread. It’s staggeringly simple to pull off too: All a scammer needs is a short clip of someone’s voice. From there, the AI can replicate it to say practically anything.

Read more at The Daily Beast.
