Voice Cloning AI Scams Are on the Rise
Scammers can clone your family members' voices with AI to convince you to send money. To avoid falling victim to fraud, protect your digital audio footprint and set up family-only codewords or questions.
Takeaways
- Scammers can use AI-generated audio to impersonate a crying family member urgently requesting money.
- Criminals can use AI-generated audio clips to access accounts.
- To prevent voice cloning scams, limit public access to recordings of family members' voices on social media, and always call back to verify any emergency.
What Is an AI Voice Cloning Scam?
AI voice cloning scams involve using AI to generate a voice that sounds like your friend or family member in distress. The fake voice will ask you to send emergency financial help, claiming to need it while traveling in a foreign country, after being arrested, kidnapped or involved in a car accident.
While the family emergency scam isn't new, the use of AI voice cloning is a twist that makes these scams harder than ever to recognize.
AI Voice Cloning Examples: Fake Emergencies, Real Consequences
Jennifer DeStefano heard the panicked voice of her daughter on the other end of the line, saying she was in danger and needed a $1 million ransom paid to her kidnappers, according to CNN. In this case, Jennifer was able to confirm her daughter was safe and didn't hand over any money. However, some AI voice scam stories don't end as well. According to a 2025 WFLA news report, a woman lost $15,000 after receiving a call from her crying daughter. The woman withdrew cash and placed it in a box, which a driver picked up from her house. Another call and a larger money request soon followed. Luckily, the woman's grandson was eventually able to get her real daughter on the phone, preventing further cash withdrawals.
How Does AI Voice Cloning Work?
Scammers typically research your family on social media, looking for videos that contain a family member's voice. Then they will use AI tools to replicate your family member's voice using their own script, according to an FBI alert. They will call you and play this generated voice message, pairing it with a request for money by phone.
"Scammers can pull pieces of a person's real voice and have an AI tool use those voice patterns to create synthetic conversations, copying and manipulating your voice," said Sean Murphy, a Senior Vice President and the Chief Information Security Officer at BECU.
The synthesized AI voice can sound almost identical to the real person's unique voice, even when crying, and a scammer might need only a few seconds of sample audio to engineer it.
Other Types of AI Voice Scams
Murphy said AI voice cloning increases the efficiency and effectiveness of what digital security professionals call "social engineering" — manipulation that exploits people's trust in those they know rather than technical vulnerabilities.
Even top U.S. officials were targeted this year by social engineering tactics, according to a May FBI alert. Bad actors impersonated senior U.S. officials to target other federal or state government officials for information.
Vishing Scams
AI-generated voice messages, a tactic known as "vishing," can also be combined with malicious links. Vishing is short for "voice phishing" and is a cousin of the well-known scams called phishing (via email) and smishing (via SMS text messaging).
Voice Verification Scams
In another type of voice scam, criminals collect samples of your voice to get past the voice-recognition technology some financial institutions use for account access.
"The most likely way many people may be at risk is someone trying to get your credentials, like your password or challenge question information," Murphy said.
"Sometimes we just see feelers," Murphy said. "For example, the AI call you receive may ask you seemingly innocent questions, just getting you to say, 'Hello, who is there?' to gather voice patterns that the scammer will use later to impersonate you."
How to Avoid AI Voice Scams
Follow these tips to protect your personal data and finances from AI voice scams.
1. Create a Secret Word or Phrase With Your Family
Create and share a secret word or phrase with family members, including parents, grandparents, children, and grandchildren, and agree that it will be used only within the family and never shared with anyone else.
Another similar option is to set up security questions. Ensure these questions are only answerable by family members and can't be found through social media or internet searches.
2. Don't Trust Caller ID — Even for a Loved One
Scammers can spoof phone numbers to make it look like the call is coming from someone you know. Hang up and call the person back at a number you know is theirs, or contact someone else who can confirm or deny the caller's story.
Be cautious when any urgent requests arise, even if the voice sounds familiar, Murphy said.
Ask questions about personal, private details that can only be answered by your real loved one. This might be a childhood nickname, or your child's beloved first-grade teacher.
3. Limit Your Public Audio and Video
Set social media profiles, feeds and pages to private or restrict followers to people you know, the FBI suggested. Do this wherever you post sound clips online, and ask the same of loved ones and relatives. Anyone can fall victim to these scams.
4. Don't Give Sensitive Information To Someone Who Calls You
Even if the caller says they're from your financial institution, hang up and call back at a trusted number. The call could also be a recording designed to capture your voice pattern. When called out of the blue, never provide your:
- Social Security number
- Full bank account number
- Username or password
- Two-factor authentication codes
- Other personal information
"If they are claiming to be from BECU and ask for personal information that we should already know, hang up and contact us directly to be sure," Murphy said.
5. Pay Special Attention to Cash Requests
The caller will likely want money delivered in a way that's hard for you to recover. Once it leaves your account, you can't get it back.
This includes:
- Cash
- Cryptocurrency
- Gift cards
- Wire transfers
- Prepaid debit cards
6. Slow Down
A scammer's most potent tools are emotion and urgency — causing you to worry and panic, believing that unless you take immediate action, something terrible will happen.
Most importantly, stay calm — even if the situation sounds dire. The scammer hopes you won't have time to investigate the situation rationally, ask questions, or call others for verification. Few situations demand such an immediate response.
7. Report Fraud
If you suspect fraudulent activity, report it. Flag "vishing" messages to the social media platforms where they appear, and report scams online to the Federal Trade Commission (FTC), which explains how to help fight fraud on its website.
The Flip Side: Using AI To Improve Security
Online scams might make AI sound scary, but Murphy said to remember that legitimate organizations use AI for positive voice-related purposes, too.
"AI can be both good and bad," Murphy said. "From a cybersecurity perspective, BECU is using AI to help us develop tools and techniques to protect our own systems. We find patterns and create defenses that are more predictive and proactive, rather than being reactive."
Murphy said many organizations use AI to teach their computers to detect when other computers are trying to penetrate their cybersecurity measures.
FAQ
Are Voice Cloning Scams the Top Scams?
Although AI-driven scams are interesting and scary, Murphy said you're more likely to encounter basic attempts at fraud and theft. These include password phishing emails, text messages and spoofed websites, where scammers try to convince users to give up their login information.
What's the Best Way To Protect Yourself From AI Voice Scams?
Learn and practice the basics of online security. Limit your digital voice footprint and be cautious about what you share publicly on social media. Be skeptical when you receive an urgent phone call or message requesting money in response to an emergency.
"Limit the information you are willing to share," Murphy said. "A healthy dose of skepticism is good. Just take an extra second to look for red flags."
What is Vishing?
Vishing is short for "voice phishing." These scams use AI to generate audio from recorded voices of friends, family members or trusted organizations. Using the AI-generated voice, the scammer calls or leaves a voicemail to convince you to give up sensitive information.
The above article is intended to provide generalized financial information designed to educate a broad segment of the public; it does not give personalized financial, tax, investment, legal, or other business and professional advice. Before taking any action, you should always seek the assistance of a professional who knows your particular situation when making financial, legal, tax, investment, or any other business and professional decisions that affect you and/or your business.