
AI Voice Scams: How Do They Work?


As artificial intelligence becomes more advanced thanks to how quickly it can learn, AI voice scams are becoming more convincing and catching out more victims. Fraudsters can now find audio clips online, or via your voicemail, and use them to create clones of voices to scam people out of money or personal information.

Deep-learning algorithms now make it possible for these AI voice scams to sound very realistic. They can imitate accents, natural pauses and nuances that are specific to a person. Answering phone calls from unknown numbers has become a real cause for concern, as you can never be sure whether the caller is legitimate or a scammer.

In this blog, we will take a look at how AI voice scams work and how you can avoid them so that you don’t get scammed for your money or personal details.

AI Voice Scam Process

There is a specific formula that the majority of scammers use when conducting their voice scams with the help of AI. 

First, the fraudsters will conduct thorough online research, as the majority of AI scams are highly targeted. They scour social media for any information that can be used to support their scam, with TikTok and Instagram being the most popular sources. Scammers can clone voices from videos on these platforms and use them to extract large amounts of money from victims.

A call is then made to the friends and family of the person whose voice has been cloned, tricking them into believing they are talking to a loved one. One type of AI voice scam that follows this method is the kidnapping scam. The main goal is to make the call recipient panic.

Once the scammer believes they have gained the victim's trust, they will look to secure money or personal information. They will typically ask for the money to be sent via gift card, as it is less traceable and non-refundable.

How To Identify An AI Voice Scam

Nowadays, it can be very difficult to determine if these scam calls are legitimate or not. However, some telltale signs can help you identify if you are dealing with an AI voice scammer:

  • Briefly hearing a loved one's voice – In most cases, these scam calls will only briefly include the voice of a loved one, usually in a panicked tone. If you only hear the voice briefly, it is a sign that you are being scammed.
  • Hesitation when questioned – While it is getting easier to clone voices, fraudsters are unable to clone the personality or memories of an individual. An AI voice scammer will not be able to answer basic personal questions and will hesitate.
  • Unknown number – While not all unknown numbers are dangerous, you generally shouldn’t answer calls that have no number present. You never know if it could be a scam.
  • Gift card request – Scammers will look to receive less traceable means of payment, such as gift cards or crypto payments, as they believe the funds will not be recoverable.

AI Voice Scam Examples

There are several types of scam calls that can be carried out with AI, including fake kidnappings, grandparent targeting, fake celebrity endorsements, accessing private accounts and friend favour scams.

Fake kidnapping

Families can be targeted by fake kidnapping scams, especially those with a sizeable social media presence. A child's voice can be cloned and used to call their parents and fake a kidnapping, so that they panic and send money to the scammer.

Grandparent targeting

Grandparents are an easy target for AI voice scam calls as they tend to be unaware of the advancements in technology that make these scams possible. Scammers will often pose as a family member in trouble to receive money or personal information.

Fake celebrity endorsements

Celebrity voices are the easiest to clone, as there is so much audio material available online that full conversations can be synthesised from it. Cloned celebrity voices can be used to trick consumers into buying illegitimate products.

Accessing private accounts

Scammers can use an AI clone of your voice to contact financial institutions and attempt to fool bank employees into giving away private information. The risk increases if you have posts on social media that include your voice.

Friend favours

AI can also be used to clone your friends’ voices in an attempt to get you to send ‘urgent’ money to help them. Scammers can play on emotions during these calls to get what they want.

How To Protect Yourself And Your Family

As AI develops further and technology advances, these scams will become more convincing and more frequent. It is important to know what to look out for and how to protect yourself and your family from potential AI voice scams. Ways you can do this include:

  • Creating a family "safe word" to use on the phone.
  • Asking someone else to phone the police while you're on a suspicious call.
  • Contacting your loved ones to inform them of the AI scams out there.
  • Limiting social media posts that include your voice.
  • Using two-step authentication for log-ins.
  • Signing up for a digital security tool that protects against AI voice scams.

Get in touch with our experienced team at WRS if you believe you or your family have been the victim of an AI voice scam, or if you have any questions about other AI scams.
