If you have a reasonably active Instagram suggestion feed, chances are you’ve come across a mother narrating a harrowing incident: a scammer called to tell her that her 15-year-old daughter, who was away on a ski trip, had been kidnapped.
“I pick up the phone, and I hear my daughter’s voice, and it says ‘Mom’, and she’s sobbing. I asked what happened, and she said, ‘Mom, I…I messed up’, and she’s sobbing and crying. And then I hear a man’s voice saying, ‘Put your head back, lie down’, and then I’m like, wait, what’s going on? He gets on the phone, and he’s like, ‘Listen here, I’ve got your daughter; this is how it’s going to go down. If you call the police or anyone, I’m gonna pop her stomach so full of drugs, I’m gonna have my way with her and drop her off in Mexico,’” shared Jennifer DeStefano, an Arizona resident, with the Arizona News.
But was the voice really that realistic? “100 per cent her voice. It was never a question of who is this. It was clearly her voice.”
And no, such scams do not happen only outside India. Unfortunately, India leads the count. McAfee’s latest report, titled “Beware the Artificial Impostor”, surveyed over 7,000 individuals across the US, UK, France, Germany, Japan, Australia, and India, and found that the growing use of messaging platforms and video-based social media has driven the rise of this AI-powered twist on the impostor scam.
What is the AI Voice Scam?
These scams are designed to make victims part with their money by creating a sense of danger around their loved ones. And they do not require high-tech skills to pull off: applications and tools that produce such deepfake audio are just a click away.
“Targeted imposter scams are not new, but the availability and access to advanced artificial intelligence tools is, and that’s changing the game for cybercriminals. Instead of just making phone calls or sending emails or text messages, with very little effort, a cybercriminal can now impersonate someone using AI voice-cloning technology, which plays on your emotional connection and a sense of urgency to increase the likelihood of you falling for the scam,” shared Steve Grobman, CTO, McAfee.
We live in an age of social media and digital presence: WhatsApp is used by 2 billion people worldwide, Instagram has 1 billion monthly active users, Snapchat 293 million daily active users, and TikTok 1 billion monthly active users.
“While this may seem harmless, our digital footprint and what we share online can arm cybercriminals with the information they need to target your friends and family. With just a few seconds of audio taken from an Instagram Live video, a TikTok post, or even a voice note, fraudsters can create a believable clone that can be manipulated to suit their needs,” the report read.
India Most Vulnerable
Of all the surveyed countries, India is the most susceptible to voice scams: 43 per cent of Indian respondents said they share their voice online once or twice a week, and eight per cent share it more than 11 times a week. Forty-seven per cent said they had either been scammed themselves (20 per cent) or knew someone who had been (27 per cent).
The Circumstances
Forty-five per cent of respondents said they would send money if a loved one asked for it. Popular scenarios used by scammers include car crashes, robberies, and lost phones or wallets. Forty per cent said they would pay if the request came from their spouse or partner, followed by their mother at 24 per cent.