AI technology is fuelling a rise in online voice scams, with just three seconds of audio needed to clone a person’s voice, according to McAfee.
A McAfee survey of 7,054 people found that a quarter of adults had some experience of an AI voice scam, with 1 in 10 having been targeted personally. Of those victims, 77% suffered financial loss as a result.
Using AI to Clone Voices
With the rise of artificial intelligence tools, it has become increasingly easy to manipulate photos, videos, and even the voices of friends and family members.
53% of adults share their voice online at least once a week, and 49% do so up to 10 times a week, whether through social media, voice notes, or other channels. That makes voice cloning a powerful weapon in an attacker’s arsenal.
Attackers clone a voice, often that of a friend or family member, then send a fake voicemail or call the victim’s contacts, typically feigning distress and requesting money while impersonating the stolen voice. McAfee’s survey showed that 70% of adults are not confident they could distinguish a real voice from a cloned version, and 45% of respondents said they would reply to a voicemail from the cloned voice of a friend or loved one in need of money.
“Messages most likely to elicit a response were those claiming that the sender had been involved in a car incident (48%), been robbed (47%), lost their phone or wallet (43%), or needed help while travelling abroad (41%).”
The Cost of Falling Victim
The cost of falling for one of these attacks can be significant. More than one-third of the victims who lost money said it cost them more than $1,000, while 7% had between $5,000 and $15,000 stolen.
“Artificial intelligence brings incredible opportunities, but with any technology, there is always the potential for it to be used maliciously in the wrong hands. This is what we’re seeing today with the access and ease of use of AI tools helping cybercriminals to scale their efforts in increasingly convincing ways,” said Steve Grobman, McAfee CTO.
Easier Than You Might Think
Over the course of three weeks, McAfee researchers found 12 freely available voice-cloning tools on the internet, many of which required only a basic level of experience to utilise. In one instance, just three seconds of audio was enough to produce an 85% voice match; with a small number of additional audio files, the researchers raised this to a 95% match.
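McAfee does not say how its “voice match” percentages were computed, but speaker-verification systems commonly score similarity as the cosine between speaker embeddings extracted from audio by an encoder model. A minimal sketch of that scoring step, using random vectors as stand-ins for real embeddings (the function name and 256-dimension size are illustrative assumptions, not McAfee’s method):

```python
import numpy as np

def voice_match_score(emb_a: np.ndarray, emb_b: np.ndarray) -> float:
    """Cosine similarity between two speaker embeddings, scaled to a percentage.

    In a real pipeline these vectors would come from a speaker-encoder model
    (e.g. a d-vector or x-vector network) run over audio clips; here they are
    placeholders.
    """
    cos = np.dot(emb_a, emb_b) / (np.linalg.norm(emb_a) * np.linalg.norm(emb_b))
    return float(cos) * 100.0

# Hypothetical example: a "cloned" embedding that is a slightly perturbed
# copy of the original, yielding a high match score.
rng = np.random.default_rng(0)
real_voice = rng.normal(size=256)
cloned_voice = real_voice + rng.normal(scale=0.3, size=256)
print(f"Voice match: {voice_match_score(real_voice, cloned_voice):.1f}%")
```

Under this (assumed) scoring scheme, a 95% match simply means the cloned audio’s embedding points almost the same way as the genuine speaker’s, which is what makes the fakes hard to detect by ear.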
“With these hoaxes based on exploiting the emotional vulnerabilities inherent in close relationships, a scammer could net thousands of dollars in just a few hours.”
AI Voice Cloning Protection
There are a few precautions you can take to protect yourself from these scams.
- Set a verbal ‘codeword’ – Agree on a codeword that only close friends or family members would know, and make a plan to always ask for it whenever they call or message requesting money.
- Always question the source – If it’s a call, text or email from an unknown sender, or even if it’s from a number you recognise, stop, pause and think. Does that really sound like them? Would they ask this of you? Hang up and call the person directly or try to verify the information before responding and certainly before sending money.
- Think before you click and share – Who is in your social media network? Do you really know and trust them? Be thoughtful about the friends and connections you have online. The wider your connections and the more you share, the more risk you open yourself up to of having your voice and identity cloned for malicious purposes.
The overriding conclusion of the McAfee research team was that artificial intelligence has already changed the game for cybercriminals: the barrier to entry has never been lower, which means it has never been easier to commit cybercrime.