Guide · March 2026 · 11 min read

AI Deepfakes and Phishing: New Scams Targeting Aussie SMEs

[Image: cybersecurity lock. Photo by Towfiqu barbhuiya on Pexels]

SmartCompany reported in early 2026 that deepfake scams have arrived in Australia with force. “Deepfake bosses to perfect emails” was the headline, and it was not an exaggeration. Security Brief AU followed up with its analysis of AI’s 2026 security fallout, confirming that small businesses are now the primary target.

The scams have changed. The Nigerian prince email with broken English is gone. In its place: a perfectly written email from your “accountant” asking you to update banking details. A phone call that sounds exactly like your business partner asking for an urgent transfer. A video call with a deepfake of your CEO instructing the finance team to pay an invoice.

These are not theoretical scenarios. They are happening to Australian businesses right now. And small businesses, with fewer security layers and less rigorous verification processes, are the easiest targets.

The Three AI Scams Hitting Australian SMEs

- 50% of deepfake scam attempts target small businesses via email
- 3 seconds of audio is enough to create a convincing voice clone of any person
- $3.1B lost to scams by Australians in 2022 (ACCC), and AI is accelerating the trend

1. AI-Generated Phishing Emails

Traditional phishing emails were easy to spot: bad grammar, generic greetings, obvious urgency. AI has eliminated all of these tells. Modern phishing emails are grammatically perfect, personally addressed, and written in a style that matches the sender they are impersonating.

An AI can scrape your website, LinkedIn profiles, and social media to learn your team’s names, roles, writing style, and current projects. It then generates an email that reads exactly like a message from your colleague, supplier, or accountant. “Hi Sarah, following up on the invoice discussion from yesterday. The client has changed their bank details. Can you update the payment to this BSB and account number?”

That email takes less than a second to generate and costs nothing. At scale, scammers send thousands of these daily. Even a 1% success rate is profitable.

2. Voice Cloning and Phone Scams

Voice cloning is the one that catches people off guard. AI can now clone any voice from a short audio sample. Your voicemail greeting, a YouTube video, a podcast appearance, even a snippet from a phone call. The cloned voice is then used to call your team and impersonate you.

Imagine your bookkeeper receiving a phone call that sounds exactly like you saying, “I need you to pay this invoice urgently. I am in a meeting so I cannot email you the details, but the account number is...” The voice is yours. The cadence is yours. The urgency feels real.

For a small business where everyone knows the boss’s voice, this is devastatingly effective. The technology is commercially available and improves every month.

3. Deepfake Video Calls

The most sophisticated variant uses deepfake video in real-time video calls. A scammer joins a Zoom or Teams call wearing a deepfake of a trusted person (your CEO, your client, your accountant). They discuss business, reference real details scraped from public sources, and then make a financial request.

In 2024, a Hong Kong finance worker was tricked into transferring US$25 million after a deepfake video call with what appeared to be the company’s CFO and other colleagues. The same technology is now accessible to small-time scammers and is being deployed against Australian businesses.

Why Small Businesses Are the Primary Target

Large corporations have multi-factor authentication, dedicated security teams, formal payment authorisation processes, and employee training programmes. Most small businesses have none of these.

In a 10-person business, the person who pays invoices is often the same person who answers the phone. There is no second approval step. There is no security team reviewing email headers. The boss says “pay it” and it gets paid. Scammers know this.

Cybersecurity Ventures reports that SMEs are now the fastest-growing target for AI-powered scams, precisely because the return on effort is highest. It takes the same effort to deepfake a call to a small business as to a large corporation, but the small business is far more likely to fall for it.

How to Protect Your Business

You do not need enterprise-level security. But you do need basic processes that AI cannot bypass.

Establish a verbal verification protocol. Any request involving money (payment changes, new invoices, bank detail updates) requires verbal confirmation via a known phone number. Not the number in the email. Not the number from the call. The number you have saved in your contacts. This single step defeats most AI scams.

Create a code word system. Agree on a code word with your team and key suppliers that must be used in any unusual financial request. AI cannot guess your internal code word.

Require dual authorisation for payments above a threshold. No single person should be able to authorise payments above a set amount (we suggest $2,000 for most SMEs) without a second approval. This is good financial hygiene regardless of AI scams.
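If your payables workflow is scripted or automated, the dual-approval rule can be encoded directly. A minimal sketch in Python (the $2,000 threshold and the function name are illustrative, not taken from any particular accounting package):

```python
# Illustrative dual-authorisation check: payments above a threshold
# require sign-off from two distinct people before release.

DUAL_APPROVAL_THRESHOLD = 2_000  # AUD; pick a figure that suits your business


def payment_approved(amount: float, approvers: set[str]) -> bool:
    """Return True if the payment may be released.

    Small payments need one approver; anything above the
    threshold needs two different approvers.
    """
    required = 2 if amount > DUAL_APPROVAL_THRESHOLD else 1
    return len(approvers) >= required


# A $450 invoice needs only one approval...
assert payment_approved(450, {"sarah"})
# ...but a $9,800 transfer is blocked until a second,
# independent person signs off.
assert not payment_approved(9_800, {"sarah"})
assert payment_approved(9_800, {"sarah", "tom"})
```

The point of the rule is that no single compromised channel (one spoofed email, one cloned voice) can move money on its own; a second human must act through a separate channel.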

Train your team. The most expensive scam protection is the one your team does not know about. Spend 30 minutes showing them examples of AI-generated phishing emails, playing voice clone demos, and explaining the verification process. Do this quarterly, not just once.

Be cautious with voice and video online. Limit the amount of audio and video of key personnel available publicly. Voicemail greetings, video testimonials, and podcast appearances all provide source material for voice cloning. Consider whether a text-based voicemail greeting is sufficient.

For a comprehensive cyber security checklist including AI-specific threats, see our detailed guide.

Is Your Business AI-Ready and AI-Safe?

AI can both protect your business and threaten it. Our Free AI Audit assesses your opportunity and your risk profile, so you can move forward with confidence.

Frequently Asked Questions

How can I spot an AI-generated phishing email?

AI-generated phishing emails no longer have the spelling mistakes and awkward grammar that used to be red flags. Instead, look for urgency (act now, time-limited), unusual requests (change payment details, send gift cards), and mismatched sender addresses. When in doubt, verify by calling the person on a known number, not the number in the email.

Can AI really clone someone's voice?

Yes. Modern voice cloning technology can create a convincing replica of someone's voice from as little as 3-10 seconds of audio. This audio can come from phone calls, voicemail messages, social media videos, or podcast appearances. The cloned voice can then be used in real-time phone calls or pre-recorded messages.

Are small businesses really being targeted?

Disproportionately so. SmartCompany reported that 50% of deepfake scam attempts hit small businesses via email. SMEs are targeted because they typically have weaker security, fewer verification processes, and less training. A scammer who might not fool a corporate finance team with a deepfake video call could easily fool a small business bookkeeper.

What should I do if my business has been scammed?

Act immediately. Contact your bank to freeze any affected accounts. Report to the Australian Cyber Security Centre (cyber.gov.au) and Scamwatch (scamwatch.gov.au). Document everything. Notify your insurance provider. Then review your verification processes to prevent repeat incidents. Speed matters: the sooner you report, the better your chance of recovering funds.

FlowWorks Team
AI Automation & Consulting · Melbourne, Australia

Find out what's costing your business the most.

A 30-minute conversation. No pitch. No obligation. We'll identify your highest-impact automation opportunities before you spend a dollar.

Get your AI Readiness Review
1300 484 044 · ops@flowworks.com.au · 470 St Kilda Rd, Melbourne VIC 3004