
AI Voice Cloning Scams: The New Frontier in Fraud

AI is bringing genuine benefits to everyday life, but scammers can also use it to manipulate us out of crypto or money. Here's what to look out for.

In this article...

  • AI voice scams are on the rise
  • Scammers clone the voice of a loved one and try to manipulate you out of crypto or money
  • Here's what to look out for

Artificial intelligence is transforming the way scammers operate, with AI-powered voice cloning emerging as one of the most worrying new tactics. These scams cut across sectors, targeting customers of cryptocurrency exchanges and banks as well as everyday individuals.

Using short audio samples, fraudsters can now create convincing replicas of someone’s voice and use them to trick friends, family, or colleagues into sending money or revealing sensitive information.

How AI Voice Cloning Scams Work

Voice cloning tools (many of them free or very low cost) can recreate a person’s voice with just a few seconds of recorded audio. Scammers often source these samples from social media videos, podcasts, or other online recordings. Once the voice is cloned, they’ll typically call the target and play a short, pre-recorded message designed to create panic or urgency.

These scams are often an evolution of older cons, such as the “Hi Mum” text scam, but now they add the realism of hearing a familiar voice in distress.

Victims may be told a loved one has been in an accident, is in jail, or needs urgent bail money. The scammer insists that funds must be sent immediately via gift cards, cryptocurrency, or bank transfer, and offers excuses for why the person can’t speak further, limiting the victim’s chance to notice anything suspicious.

Real-World Examples

AI voice scams have already caused significant harm overseas and are starting to surface in Australia. In the US, victims have paid hundreds of thousands of dollars in fake ransom demands after hearing what they believed were their kidnapped relatives.

One US-based man lost $25,000 after scammers used a cloned version of his son’s voice to request bail money.

Closer to home, an Australian reported receiving a call from someone using a cloned voice of former Queensland Premier Steven Miles to promote a fake Bitcoin investment.

While the voice was slightly robotic, it was convincing enough to cause alarm. National Australia Bank now lists AI voice cloning among its top scam threats.

How to Spot an AI Voice Scam

Spotting these scams can be challenging, but experts recommend focusing on what is being said, not just who you think is saying it. Warning signs include:

- Urgency and pressure to send money quickly.
- Unusual greetings or missing social cues.
- Accents slipping in and out.
- Short, pre-prepared messages that avoid back-and-forth conversation.
- Reluctance to answer personal questions or explain details.
- Calls from unknown or blocked numbers.

Protecting Yourself and Your Family

While it’s impossible to stop scammers from accessing public audio entirely, you can make it harder for them to succeed:

- Agree on a family codeword that must be used during any emergency call.
- Hang up and call back using a known, verified number.
- Avoid giving out sensitive information over the phone.
- Educate vulnerable relatives, especially older family members, about how these scams work.
- Review your privacy settings on social media to limit access to videos and voice recordings.
- Let unknown calls go to voicemail to buy time for verification.

Why This Matters Now

Experts warn that AI voice cloning is only going to get more accessible and convincing. Some tools can create realistic voices for less than $2 a month, and adding more audio samples increases their accuracy.

While not every scam currently involves voice cloning, the technology’s rapid adoption makes it a growing risk.

The best defense is a combination of awareness, healthy skepticism, and clear verification processes. If you receive a call, even from someone you think you know, asking for urgent money or personal information, pause, verify, and act only when you’re certain it’s legitimate.

