WHAT if a criminal could take a tiny snippet of your voice from your social media, clone it using AI technology and then trick your friends and family into handing over cash?
It’s a chilling thought – and anyone can fall victim, with hoaxers requiring just a three-second clip of your voice to clone it.
Fabulous writer Miranda Knox sent out an AI-generated plea for money, which was a clone of her own voice
Almost a quarter of Brits have already experienced an AI voice scam or know someone who has, according to research by online protection company McAfee – and a staggering 78 per cent lost money as a result.
And scammers needing such a small voice clip is particularly eye-opening when you consider how much of our lives we now share on social media.
F-Secure Threat Intelligence Lead Laura Kankaala is an expert in the field of security threats. In just minutes, she revealed how incredibly easy it is for scammers to take on your identity and use it to extract money for criminal gain.
Using an AI audio generator tool, Laura cloned my voice. It sounded so identical to my own that even my close family and friends – those you’d think would know better – fell for it.
It’s tech anyone can use, and it’s readily available online through dozens of free and paid-for sites.
Laura says: “Voice cloning is happening all the time and people don’t often realise how easy it is to copy our likeness – our voice and pictures.
“Right now it’s a big issue as there have been a lot of new tools that have emerged over the last year or so to copy someone’s voice [and] they’re very easy to use.”
I’m obviously not a scamming pro, but once my two clips were ready, it was alarmingly quick and easy to fool those closest to me.
I sent two voice notes to close friends and family asking for money, loosely replicating a cruel hoax common with scammers.
“Hi, it’s Miranda! About to book flights but I’ve lost my credit card – really need to book now. Can you send me a grand and I’ll pay you back? I’ll really owe you one!”
A friend and I had chatted previously about going on holiday together, and we were at the point of booking flights when I sent the voice note.
“I can, but just in a meeting,” she quickly replied – making it obvious just how easy it would be to fall victim when distracted.
Needless to say, I came clean before she got to the point of asking for bank details.
And in a second note, which I sent to friends and family, the cloned voice said:
“Hi, it’s Miranda – having a nightmare and lost my bag and phone! Could I borrow some money so I can get home please?
“Promise I’ll pay you back!”
“Freaking out! Have you actually lost everything?” one pal replied, contacting our mutual friends to see if I was OK.
My brother-in-law did question it initially, but then asked: “How much do you need?” before sending me £30 – which I sent back immediately.
Thankfully there was no fooling my mum, although she admitted it did sound exactly like me and had I “been a better scammer” she could potentially have fallen for it.
It sounds far-fetched that you’d fall for a robot mimicking your pal’s voice – but it’s a lot more believable than people realise.
Scammers rely on emotional manipulation, high-stress scenarios and tight deadlines to apply pressure, so they essentially want all logic to go out the window.
McAfee Senior Security Researcher Oliver Devane explains: “Scammers are using AI voice cloning technology to dupe parents into handing over money, by replicating their children’s voices via call, voicemail or WhatsApp voice note.
“In most of the cases we’re hearing of, the scammer says they have been in a bad accident, such as a car crash, and need money urgently.
“The cybercriminal is betting on a parent becoming worried about their child, letting emotions take over, and sending money to help.”
Explaining the process of using voice cloning tech further, F-Secure’s Laura Kankaala says: “In the guidelines and terms it will always say to only use your own voice or a voice that you have permission to use [but] of course cyber criminals are not following any laws so that’s meaningless.
“Some only need three to 10 seconds of audio [to create a clone], and the one I used needs about one minute.
“On social media, a lot of us post videos talking to a camera, which would be one way to obtain the audio needed for this.
“The problem is not that we’re living our lives online and posting things online, as that’s just how the world operates.
“The problem is there are people who want to take advantage of this technology and use it against us to commit crime and scam people out of their money.
“[When it comes to scams], the sky is the limit really.”
F-Secure Threat Intelligence Lead Laura Kankaala wants to raise awareness about the prevalence of scams, and how people can keep themselves safe online
Scammers are able to take advantage of victims’ genuine concern, overriding all common sense
I fessed up quickly so it didn’t escalate, but what should my friends have done when they received my voice note?
Most, to their credit, did try calling me immediately – which is one way to find out if you’re being tricked, and several people called my husband too.
Laura says: “If you receive anything from anyone suddenly, out of the blue, it’s always good to step back, especially if they’re asking you to send money or click on a link.
“Sit back and try to contact that person directly through a different means, for example, calling them by phone to ask if they’re actually in trouble.”
Here, F-Secure Threat Intelligence Lead Laura Kankaala shares her top tips to spot a scam, and what you need to be aware of to avoid falling victim…
Laura Kankaala is the Threat Intelligence Lead at F-Secure, a cyber security company that helps protect more than 30 million people worldwide.
AI voice scams are an increasingly alarming issue, and one that’s sure to become more prevalent as the capabilities of AI technology continue to advance.
When mum-of-four Jennifer DeStefano picked up the phone in January 2023, her blood ran cold as her terrified teen daughter Briana sobbed and screamed for her help – only for it to turn out to be an elaborate AI voice scam.
It wasn’t Jennifer’s daughter at all, but AI perfectly mimicking her cries and voice as part of an elaborate ploy to scam Jennifer out of tens of thousands of pounds.
To this day, she has no idea how her daughter’s voice was cloned or who was responsible – and she had never even heard of AI scams before being targeted by the attempt to trick her into handing over thousands to save Briana from supposed kidnappers.
Mum Jennifer DeStefano was told by scammers that her daughter Briana had been kidnapped, and they demanded thousands of pounds for her release
While Jennifer didn’t hand over any money, it was a shocking experience.
Since sharing her story, she has called for greater legislation around AI technology.
She says: “I’ve had so many people reach out to say they’d experienced a similar thing, whether it’s a call about a kidnapping or an accident or they’re in trouble and in prison – there are loads of different scenarios, and the deepfake videos that are coming out now too are so scary.
“We need more legislation around AI, and AI being used to aid crime. There need to be penalties and consequences for misusing it.
“It’s a relief my daughter is safe and that situation wasn’t real, however it also sheds light on a whole new world and reality that I had no idea even existed – and that is terrifying.”
Hoping to raise awareness of this sort of scam to try and prevent people falling victim online, Laura adds: “These things sound scary, but I want to talk about this as it’s a tricky world we’re living in right now.
“We’re so dependent on the internet but there are so many ways our data can be weaponised against us – and this is just one example.”