AI is entering our lives in plenty of positive ways, but it can also be used by scammers to manipulate us out of crypto or money. Here's what to look out for.
Fraud is on the rise, and in 2025 Australians could lose billions of dollars in money and cryptocurrency to scams. As fraudulent schemes grow more advanced, staying alert is essential.
CoinJar is excited to back Scams Awareness Week 2025, collaborating with the ACCC to equip you with the knowledge to spot and steer clear of scams. Put your scam-detection skills to the test and safeguard your finances!
For Scams Awareness Week, we’re encouraging you to STOP, CHECK, and PROTECT.
Artificial intelligence is transforming the way scammers operate, with AI-powered voice cloning emerging as one of the most worrying new tactics. These scams cut across every sector, targeting customers of cryptocurrency exchanges and banks as well as everyday individuals.
Using short audio samples, fraudsters can now create convincing replicas of someone’s voice and use them to trick friends, family, or colleagues into sending money or revealing sensitive information.
Voice cloning tools (many of them free or very low cost) can recreate a person’s voice with just a few seconds of recorded audio. Scammers often source these samples from social media videos, podcasts, or other online recordings. Once the voice is cloned, they’ll typically call the target and play a short, pre-recorded message designed to create panic or urgency.
These scams are often an evolution of older cons, such as the “Hi Mum” text scam, but now they add the realism of hearing a familiar voice in distress.
Victims may be told a loved one has been in an accident, is in jail, or needs urgent bail money. The caller insists that funds must be sent immediately via gift cards, cryptocurrency, or bank transfer, and offers excuses for why the person can’t speak further, limiting the victim’s chance to notice anything suspicious.
AI voice scams have already caused significant harm overseas and are starting to surface in Australia. In the US, victims have paid hundreds of thousands of dollars in fake ransom demands after hearing what they believed were their kidnapped relatives.
One US-based man lost $25,000 after scammers used a cloned version of his son’s voice to request bail money.
Closer to home, an Australian reported receiving a call from someone using a cloned voice of former Queensland Premier Steven Miles to promote a fake Bitcoin investment.
While the voice sounded slightly robotic, it was convincing enough to cause alarm. National Australia Bank now lists AI voice cloning among its top scam threats.
Spotting these scams can be challenging, but experts recommend focusing on what is being said, not just who you think is saying it. Warning signs include:
- Urgency and pressure to send money quickly.
- Unusual greetings or missing social cues.
- Accents slipping in and out.
- Short, pre-prepared messages that avoid back-and-forth conversation.
- Reluctance to answer personal questions or explain details.
- Calls from unknown or blocked numbers.
While it’s impossible to stop scammers from accessing public audio entirely, you can make it harder for them to succeed:
- Agree on a family codeword that must be used during any emergency call.
- Hang up and call back using a known, verified number.
- Avoid giving out sensitive information over the phone.
- Educate vulnerable relatives, especially older family members, about how these scams work.
- Review your privacy settings on social media to limit access to videos and voice recordings.
- Let unknown calls go to voicemail to buy time for verification.
Experts warn that AI voice cloning is only going to get more accessible and convincing. Some tools can create realistic voices for less than $2 a month, and adding more audio samples increases their accuracy.
While not every scam currently involves voice cloning, the technology’s rapid adoption makes it a growing risk.
The best defence is a combination of awareness, healthy scepticism, and clear verification processes. If you receive a call asking for urgent money or personal information, even from someone you think you know, pause, verify, and act only when you’re certain it’s legitimate.