
New AI tools can copy almost anyone's voice, and sadly, scammers are taking advantage of it. Cloning a voice has never been easier: record a short sample, run it through the software, and make it say whatever you want in that person's voice. The technology has existed since 2018, but today's tools are faster, more accurate, and far simpler to use.
Don't believe me? This year, OpenAI, the company behind ChatGPT, previewed Voice Engine, a model that can clone a voice from just a 15-second recording. Their tool isn't available to the public yet and includes safeguards against misuse, but ElevenLabs already offers a similar service: for just $6, you can clone a voice from a one-minute sample, and it's open to everyone.
You can see how this could go wrong. Scammers can grab voice samples from phone calls or social media videos and use them to trick people. One common scheme is the grandparent scam: a scammer calls an older person, mimics their grandchild's voice, and begs for money, claiming to have been in an accident or arrested. The caller usually insists on keeping it a secret, especially from the parents, precisely so the victim won't check in with the real grandchild and discover they're perfectly fine. That demand for secrecy is itself a red flag.
If you don't know voice cloning exists, it's easy to fall for. Even if you do, the fake is hard to spot unless you're listening closely. And if you did suspect something, would you really accuse your own grandchild of being an impostor? The scammer would simply deny it anyway.
It's a convincing scam, and people fall for it every day. As AI voice cloning becomes more accessible, expect it to become even more widespread.
Here’s how to protect yourself: Sit down with your family and pick a secret code word for emergencies. Then, if someone calls claiming to be a relative needing cash, ask them for the code word. If they can’t give it, they’re likely a scammer.
What do you think about this scam trend? I’d love to hear your thoughts!