Voice cloning cybercrime on the rise

2023-08-18 04:45:03

Table of Contents
  1. Voice cloning cybercrime is becoming increasingly prevalent.
    1. Voice Cloning Scams Becoming Increasingly Prevalent
    2. The Process of Voice Cloning Scams
    3. Authenticity Makes Voice Notes More Convincing
    4. Countermeasures and Strategies to Combat Voice Cloning Scams
    5. Voice Cloning Scams Targeting both Companies and Individuals
    6. Recreating Authentic Voices Poses Increased Concern

Voice cloning cybercrime is becoming increasingly prevalent.

JEREMY MAGGS: Let's end the programme with this story. Using artificial intelligence tools to clone voices has introduced a novel realm of risk for both companies and individuals. Take a listen to this: the immense potential of generative AI is being exploited by cybercriminals who have harnessed it for malicious purposes, creating convincing deepfakes and perpetrating what are termed unnervingly realistic voice scams. Are you worried?

Well, we should be. The scammers are incredibly clever, says Stephen Osler, co-founder and business development director at cybersecurity company Nclose. He joins us now on the programme. Stephen, how prevalent have voice cloning scams become?

Voice Cloning Scams Becoming Increasingly Prevalent

STEPHEN OSLER: Hi, Jeremy. Look, we've seen them happen a little bit from a consumer perspective, or shall I say, the numbers are definitely increasing. We have encountered one or two instances where corporates have been defrauded using voice scams, and I think those numbers are just going to gradually increase.


JEREMY MAGGS: So it’s something to be concerned about. How does it work? How do they manifest themselves?

The Process of Voice Cloning Scams

STEPHEN OSLER: Essentially what they're doing is using applications that are easily downloaded from a traditional app store. They take a voice recording, use it to clone a particular voice, and then use that clone as part of, for all intents and purposes, a fraudulent attack or cyberattack.

So what we're generally finding is that they're using the cloned voice to gain credibility with the person they are targeting.

What we see specifically in cyberattacks or fraudulent attacks is that all these threat actors are trying to do is gain additional credibility in order to make somebody either click on a link or pay a fraudulent transaction, and a voice note, as opposed to a WhatsApp message or an email, gives them that credibility.


JEREMY MAGGS: Ja, that's absolutely right. That's the point I want to make: a voice note always carries a degree more authenticity, doesn't it?

Authenticity Makes Voice Notes More Convincing

STEPHEN OSLER: That's absolutely it, it's authenticity. I think that's the problem that we see. Authenticity is probably the one thing that everybody questions when they're doing a transaction that seems slightly sketchy, and these fraudsters, these cybercriminals, are really trying to leverage generative AI and deepfake cloning technology to take that next step.

JEREMY MAGGS: Alright. So having made us all very concerned, you’ll also tell me that often this is targeted at C-suite executives. So what kind of countermeasures are immediately available? What strategies should companies be adopting?

Countermeasures and Strategies to Combat Voice Cloning Scams

STEPHEN OSLER: Jeremy, we saw a lot of fraudulent financial attacks against particular industries where email accounts were compromised, and we found the fraudsters were sending out emails with invoices on which the bank details had been changed. Obviously, we are now seeing that pivot because companies have introduced good governance.


I think what needs to happen, specifically for organisations, is that they really need to make sure those governance procedures are well bedded down, because in the grand scheme of things, that is the only thing with the necessary checks and balances in place that can mitigate this type of fraudulent attack.

But you should not be paying any invoices just on receiving a voice note from your C-level executive. I think that’s the big trick.

JEREMY MAGGS: But it’s not just aimed at companies, it’s individuals as well, and voices being extracted from platforms that we use every single day, like Facebook Messenger and WhatsApp.

Voice Cloning Scams Targeting both Companies and Individuals

STEPHEN OSLER: Absolutely. We've seen them happen in two ways. Firstly, people are using them to scam individuals by claiming that their relatives have been hijacked or are being held to ransom, and getting people to pay ransoms for their release. We've seen a few instances of that.


Secondly, we've seen instances where a user is trying to sell or buy something on Facebook Marketplace and the fraudster is using voice notes to gain credibility, either to get them to pay a deposit or an amount, or to get them to meet somewhere. So really these guys are being quite tricky in their approach.

I think the big thing we need to be careful of: if it seems too good to be true, then it generally is.

I think the one easy way to mitigate this type of attack is to just pick up the phone and call the individual you believe the voice note came from.

JEREMY MAGGS: But never underestimate our naivety in this space as well. Often we just do it because we do it, I guess. The current technology is also starting to recreate the tone and idiosyncrasies of an individual's voice, and that makes it more worrying, doesn't it?


Recreating Authentic Voices Poses Increased Concern

STEPHEN OSLER: It absolutely makes it more worrying, and not just the tone or the idiosyncrasies. Here in South Africa we have a multitude of different languages and ways of speaking, and these applications are able to mimic that. One thing is, if you listen very carefully, we find a lot of these voice notes are slightly distorted, or you'll find that the information the voice note is articulating back to you often isn't 100% correct. That's because the fraudsters are using AI or ChatGPT to create the actual conversation piece and the cloning technology to generate the audio, so sometimes what the AI is explaining is not 100% accurate.

JEREMY MAGGS: In any case, though, you’ve given us an enormous amount of food for thought. Stephen Osler, thank you very much indeed, co-founder and business development director at Nclose.
