KEY POINTS

  • Researchers used a generative AI voice-cloning tool that produced an estimated 85% voice match from a few seconds of audio, and up to 95% with additional training
  • 77% of victims of AI voice scams said they lost money as a result
  • Researchers recommended having a verbal code word with loved ones to prevent scams

Cybercriminals now need as little as three seconds of audio to clone someone's voice and use it in scam calls, researchers found in a new study on the role generative artificial intelligence plays in cybercrime.

Generative AI has lowered the bar for voice-phishing criminals, who can now replicate a person's voice and use the clone in scams, alongside other AI-assisted cybercrimes such as romance scams, malware creation, and deceptive messages, researchers at threat security and intelligence firm McAfee Labs found in the study.

McAfee researchers used a voice-cloning tool to determine how long it would take cybercriminals to copy a person's voice.

"With just three to four seconds of voice recording, the free tool was able to create a convincing clone" that matched the voice of a researcher "at an estimated 85%," researchers said, adding, "With more investment and effort, more accurate clones are possible, with the researchers able to train the data models to create an estimated 95% voice match," the researchers wrote.

The researchers went on to note that even with "just a few seconds" of audio grabbed from a TikTok post, a voice note, or an Instagram Live video, fraudsters can "create a believable clone that can be manipulated to suit their needs."

In the study, 70% of surveyed individuals said they were unsure if they could tell the difference between a replicated voice and the real recorded voice.

The study also found that, among those who lost money, 36% lost between $500 and $3,000 to voice-cloning attacks, while around 7% lost between $5,000 and $15,000.

In all, 77% of victims of AI voice scams said they lost money as a result of calls enabled by generative AI cloning tools.

Amy Bunn, chief communications officer at McAfee, wrote in a blog post that one in four of the 7,000 people surveyed for the new study said they had experienced an AI voice-cloning scam or knew someone who had.

Generative AI tools also allow scammers to respond to victims in real time, making the scheme even more believable to unsuspecting targets.

"It's just a very easy-to-use medium, and the attacker doesn't have to have really any expertise in artificial intelligence," Steve Grobman, chief technology officer and senior vice president at McAfee, told Axios on Tuesday.

To make the call seem more legitimate, some scammers go as far as to research the personal information of the person whose voice they cloned.

McAfee researchers recommend that families set a verbal code word with relatives to avoid being scammed by cybercriminals using generative AI cloning tools. They also advise call recipients to always "question the source," hang up if in doubt, and call the person back directly to verify the request before responding.

Last week, Arizona Attorney General Kris Mayes warned of AI-generated voice clones that scammers use to rip consumers off.

"Scammers are using AI technology to personalize scams and mimic a loved one's voice – or to send similar personalized text messages – to trick people," Mayes said, according to a press release from her office.

Mayes said Arizonans should be especially wary of emergency calls asking for money "to be sent right away."

The Federal Trade Commission (FTC) also issued a consumer alert in March warning of how scammers use generative AI to "enhance their family emergency schemes."

"Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie. We're living with it, here and now," the regulator noted.

Observers noted that cheap voice-replication tools and the few seconds of audio needed to generate a clone have given voice-based fraudsters far more power to pull off believable scams and cost consumers money.

The new findings come after an Arizona mother, Jennifer DeStefano, received a call from a scammer who claimed to have kidnapped her 15-year-old daughter. "It was never a question of who is this? It was completely her voice," DeStefano said of the incident.

The supposed kidnapper demanded up to $1 million, but DeStefano was able to reach her daughter and confirm she was safe. Authorities are now investigating the case.

Voice scams are on the rise, and generative AI is making them even easier to pull off. Pixabay