
AI raises vocal verification concerns as scammers recreate voices

There has been extensive discussion in government, the media and private organisations of the cyber security risks that artificial intelligence (AI) tools present by lowering the barrier to entry for cyber criminals.

Daniel Croft
Mon, 05 Feb 2024

The latest of these concerns involves AI-powered voice cloning software, which can recreate a person's voice for use in phishing attacks, fraud and the bypassing of voice verification systems.

According to NordVPN, scammers are already collecting voice samples and using them to create realistic voice clones, which they then use to steal money from the victims' colleagues, friends and family members.

“A cheap and effective voice cloning software based on advanced machine-learning algorithms now can create highly convincing voice clones of individuals, including public figures,” said NordVPN cyber security expert Adrianus Warmenhoven.


“Scammers use these voice clones to impersonate someone trustworthy, such as a family member or a company executive, to gain the victim’s trust and extract sensitive information or money.”

Collecting enough voice samples of a single person is not difficult, particularly in such a social media-dependent world. Scammers can harvest vocal samples from videos or other media posted to Facebook, TikTok and Instagram, or source contact details such as phone numbers and call the victim to trick them into saying something.

For this example, let’s say the victim’s name is Geoff. A scammer could call and say, “Hi, am I speaking with Geoff?” to which it would not be uncommon to respond with “Yes”. Even before an AI tool is involved, a recording of an individual saying “yes” is a dangerous and valuable piece of data.

Even if you realise a scammer is on the other end of the call, saying anything could provide them with another vocal sample.

“Even if you notice that you are speaking with a scammer, it does not mean that you are safe,” added Warmenhoven.

“While you might feel like you have the upper hand against the scammer in that particular situation, don’t underestimate the technology at their disposal.

“If the call is recorded, and you give away a sample of what your voice sounds like, that could be used against you – your cloned voice is then used in a targeted attack to fool friends or family members.”

NordVPN said there are a number of steps that can be taken to avoid handing a threat actor vocal samples. Users should be cautious about what they post on social media, knowing that scammers can harvest vocal recordings from it.

NordVPN also said that if you notice that a call is a scam, you should hang up immediately, and if you receive a call from an unknown number, you should verify it before calling back.

Daniel Croft

Born in the heart of Western Sydney, Daniel Croft is a passionate journalist with an understanding of, and experience writing in, the technology space. Having studied at Macquarie University, he joined Momentum Media in 2022, writing across a number of publications, including Australian Aviation, Cyber Security Connect and Defence Connect. Outside of writing, Daniel has a keen interest in music and spends his time playing in bands around Sydney.
