Why You Can't Trust Voice Anymore: Cybersecurity & Fraud

By Thato Brander — Technology Keynote Speaker  |  March 12, 2026  |  3 min read


AI VOICE CLONING & THE RISE OF DEEPFAKE FRAUD IN THE FINANCIAL SECTOR

You receive a phone call from a "bank official" while you're having dinner with your friends. The call sounds urgent, and it sounds like the person who usually calls you when there is suspicious activity in your bank account. They verify your name, ID number, and last transaction. They want to stop a fraudulent payment and are asking for a "verification code". What do you do?

What if I told you that you shouldn't send that verification code to the bank? Would you say I'm crazy?

Well, scammers can now use AI to clone your banker's voice. Before you say "I would be able to tell if it's a bot calling", I would say: you wouldn't. Scammers are becoming more sophisticated, and scams are becoming more personalised. They collect a great deal of information about you and use it to craft hyper-personalised attacks.

THE HK$200 MILLION DEEPFAKE VIDEO CALL THAT SHOULD CHANGE HOW YOU VERIFY EVERYTHING

So what do you do? Is a video call better to verify who you're speaking to? Well, let's see.

A multinational company in Hong Kong was scammed out of HK$200 million (about US$25 million). A finance employee was targeted by scammers who used AI to create a fake video call with colleagues the employee worked with.

On the video conference, a number of coworkers attended the virtual meeting, and they sounded and looked like the real deal. But they weren't people at all; they were deepfakes created by AI.

Ok, maybe now you believe me.

HOW FRAUDSTERS ARE USING THE SAME AI AS YOUR BANK AND WINNING

Banks are spending millions on AI to catch fraudsters, but the fraudsters are using the same technology to sound exactly like your CEO or CFO. The FSCA's latest report warns that identity is no longer just a security feature; it's a target.

The next call you get from your bank might be a lie.

Over the years, cybersecurity threats have become more advanced. Everything is getting more realistic and more convincing. Speed, which was once a feature of modern finance, has now become one of its biggest vulnerabilities.

THE FUTURE OF TRUST IN A WORLD WHERE IDENTITY CAN BE FAKED

Trust used to be built on a voice or a face. In 2026, trust is built on protocols, and the slower, the better.

In the financial sector, technology has made it possible for us to send money quickly and efficiently. But certain actions now need to pause, with far more hurdles before they can proceed. Urgency is one of the most powerful tools a scammer has; slowing down is your best defence.

THREE QUESTIONS EVERY PERSON AND ORGANISATION SHOULD ASK RIGHT NOW

If you're reading this article right now, I will leave you with three questions that I hope will prompt you to start thinking about solutions to the cybersecurity threats we face.

1. If your mother received a call that sounded exactly like you, in your tone, what is the one secret that only the two of you know that could save you?

2. If your CEO's voice can be cloned for the price of a cup of coffee, is your company's "urgent payment" procedure based on a person or a process?

3. How much of your digital identity (voice notes, videos, social media posts, podcasts) is currently public enough for a scammer to build a perfect replica by lunchtime?

In Closing

The era of trusting a voice or a face is over. What replaces it (protocols, verification processes, and a healthy dose of scepticism) will define how safe we are in the age of AI. The question is whether individuals and organisations will adapt before the next HK$200 million disappears.


About the Author

Thato Brander is a technology keynote speaker working at the intersection of AI, innovation, and the future of business. Thato helps organisations understand and navigate the impact of emerging technologies, from generative AI to digital identity, and translates complex tech trends into clear, actionable insight.

Connect on LinkedIn  |  Read more articles