The voice cloning AI scam you need to be aware of
Artificial Intelligence (AI) is giving scammers new ways to try to dupe you. One scam on the rise that you should be aware of is “voice cloning”.
Voice cloning uses AI technology to replicate the voice of a friend or family member. Fraudsters then use this to contact you to ask you to transfer funds or share sensitive information. It can be incredibly difficult to spot a voice cloning scam, particularly if the so-called friend or family member appears to be in distress.
Worryingly, a person’s voice could be replicated from as little as three seconds of audio, which may be easy to obtain if a loved one has uploaded a video to social media. As well as giving fraudsters a way to impersonate someone, social media could also help them choose who to target by revealing who interacts with that person’s posts.
28% of UK adults say they have been targeted by an AI voice cloning scam
According to a survey from Starling Bank, 28% of UK adults say they’ve been targeted by an AI voice cloning scam at least once in the last year. Yet, almost half (46%) of UK adults have never heard of AI voice cloning scams, so the true scale could be far larger.
Most people recognise how challenging it could be to detect a voice cloning scam. Indeed, just 30% say they would confidently know what to look for, and 79% say they’re concerned about being targeted.
One simple way to protect yourself is to agree a safe phrase with trusted family and friends. This gives you a quick and easy way to verify who you’re speaking to if you’re ever in doubt, especially before you transfer any money. Be sure to never share your safe phrase online.
📢 3 more AI scams that could affect you
Voice cloning isn’t the only way scammers are using AI.
In fact, according to MoneyWeek, AI scams left Brits £1 billion out of pocket in the first three months of 2024 alone. The latest technology can make it more difficult than ever to spot the red flags, so it’s perhaps unsurprising that almost half of Brits say they feel “more at risk of scams”.
Being aware of common scams and the signs to watch out for could mean you’re able to avoid falling victim should you be targeted. Here are three other types of AI scams that are on the rise.
1. Deepfakes
A deepfake is a video, sound, or image that has been digitally manipulated using AI.
A study from Santander found that more than a third of Brits have knowingly watched a deepfake. Yet, more than half of people said they had not heard the term or misunderstood what it means, so many more could have watched a deepfake without realising.
Scammers can use deepfakes in a range of scams. For example, they might create a fake profile filled with realistic media to carry out a romance scam, or use a deepfake to convince you that you’re speaking to a genuine investment manager as part of an investment scam.
Deepfakes are often circulated on social media platforms, so it’s important to be vigilant and verify the information you receive, even when it looks convincing. Many deepfakes are imperfect, so taking a closer look at a video or image could highlight red flags such as blurring around the mouth, odd reflections, or unnatural movements, like blinking less often than normal.
2. ChatGPT phishing
Phishing scams are nothing new, but AI means they could look far more trustworthy than previous attempts.
Phishing is when criminals send emails or texts encouraging you to visit a website that may download a virus onto your device, or to open attachments or share personal details. Often, phishing scams impersonate a genuine company or person to gain your trust.
In the past, you might have spotted a phishing email by noting spelling and grammar mistakes, an unusual sender, or the tone of voice changing from previous communications.
However, ChatGPT and similar tools make it simpler than ever for criminals to produce text and designs that closely mimic the organisations they’re impersonating and to remove the tell-tale signs of a scam. So, it’s important to remain cautious when responding to messages, especially if they arrive out of the blue.
3. Verification fraud
It’s not just your loved ones who could be affected by voice cloning and deepfakes; you could be too. Many phone and banking apps allow you to verify your identity by sending a video of yourself or saying a password out loud over the phone.
AI could mean these security checks no longer provide the protection they once did, allowing fraudsters to open accounts in your name, access your existing accounts, and more.
As a result, being careful about what you share online, including seemingly harmless photographs or videos, may help you avoid a scam.
We could help you spot a scam
If you’re contacted about a financial opportunity, whether by email, phone call, social media, or another channel, and you aren’t sure if it’s a scam, we could help. Sometimes another perspective can help you recognise red flags you’ve overlooked or give you the confidence to ignore the message.
ℹ️ You can also use Action Fraud to report a scam or seek additional information.
Please note: This blog is for general information only and does not constitute advice. The information is aimed at retail clients only.
Sources:
https://www.starlingbank.com/news/starling-bank-launches-safe-phrases-campaign/
https://moneyweek.com/personal-finance/ai-scams-to-be-aware-of
https://www.santander.co.uk/about-santander/media-centre/press-releases/santander-deploys-deepfakes-to-raise-awareness-of-ai