Artificial Intelligence (AI) is becoming increasingly prevalent in society. When used correctly, it can provide a helpful tool for researching, brainstorming, planning, editing, and much more.
But sadly, scammers are also using the technology for more nefarious intentions. Raisin reports that people in the UK lost £1 billion to scams in the first three months of 2024. Moreover, 48% of Brits feel more at risk of fraud as criminals adopt technological advances, including AI, in their methods.
Despite the rising prevalence of AI in financial scams, there are some steps you can take to protect yourself, including being more informed about what to look out for. Read on to discover some of the most common types of AI scams that have tricked victims in 2024 and some practical tips to help you protect your wealth.
3 of the most common Artificial Intelligence scams in 2024
1. Phishing emails using Artificial Intelligence to impersonate legitimate organisations
Scammers have long used phishing emails to gather personal information by impersonating your bank or other legitimate organisations. But large language models (LLMs) like ChatGPT have now made it much easier for criminals to produce text that is indistinguishable from the real thing.
By entering text from an organisation’s website or other marketing materials, a scammer can ask ChatGPT to produce an email that sounds exactly like it came from that organisation. Moreover, it can do so without any of the telltale grammar or spelling mistakes that often give phishing emails away as a scam.
2. Voice cloning
AI isn’t just used to produce convincing blocks of text; it can also clone your voice using just three seconds of audio or video recording. Scammers can then use the clone to gain access to accounts that employ voice recognition software as part of their security measures.
Scammers often use phone calls to capture audio recordings for this purpose, pretending to be a family member or friend who needs financial help.
3. Deepfakes
“Deepfake” is a term used for an AI-generated image or video of a person, usually a celebrity. Scammers sometimes use deepfakes to convince victims to put money into a fraudulent scheme or divulge personal information. One example is a fraudulent advert featuring Martin Lewis’s likeness, which showed the finance journalist apparently endorsing an investment opportunity.
Lewis quickly condemned the advert and continues to remind viewers that he doesn’t appear in adverts. But deepfakes can be very convincing, and the technology is continuing to evolve.
There are practical steps you can take to protect yourself
AI presents so many wonderful opportunities to improve our lives, but its use in financial scams is sadly becoming more common.
Fortunately, there are a few simple steps you can take to protect yourself.
Have a “safe phrase” with your family
A “safe phrase” is a unique word or phrase that you and your family agree to use to help you spot a scam. This is useful if you receive a phone call like the ones described above, in which someone claiming to be a family member asks you to send money.
By asking them “What is your safe phrase?”, you can quickly ascertain whether you are speaking to your relative or a scammer. If they can’t provide it, you can hang up before the scammer captures enough audio to clone your voice.
Double-check communications are legitimate by contacting the organisation using a method you trust
If you receive a convincing yet unexpected email or phone call from an organisation requesting personal details, it’s sensible to contact the organisation separately to confirm the message really came from them. For example, you could call your bank on the number printed on your credit or debit card.
By using a different but trusted method of communication, you can verify whether the original message was legitimate.
Consult your planner for guidance
If you receive an invitation or offer to invest in something that promises high returns, it’s sensible to consult your planner before you make any decisions. Even if the offer seems legitimate, your planner can investigate the source and confirm whether they agree that the investment is sensible based on your personal circumstances and goals.
But the old adage remains true: if it sounds too good to be true, it probably is.
Get in touch
To find out more about how we can help you protect your wealth from scammers, please get in touch.
Email enquiries@metiswealth.co.uk or call 0345 450 5670 today to find out what we can do for you.
Please note
This article is for general information only and does not constitute advice. The information is aimed at retail clients only.
Please do not act based on anything you might read in this article. All contents are based on our understanding of HMRC legislation, which is subject to change.
The value of your investments (and any income from them) can go down as well as up and you may not get back the full amount you invested. Past performance is not a reliable indicator of future performance.
Investments should be considered over the longer term and should fit in with your overall attitude to risk and financial circumstances.