A deepfake scam uses AI to create fake audio, video, or images that convincingly impersonate someone, so scammers can gain trust, create urgency, and pressure people to send money or share information.
Deepfake scams rely on AI-generated media, not just written messages or standard phone calls.
Using publicly available audio, video, or images—often pulled from social media, voicemails, or online videos—scammers can train AI tools to imitate how a person sounds or looks. The result can be audio or video that closely resembles a real person, even though that person never created it.
Unlike traditional impersonation scams, deepfake scams don’t just claim to be someone; they simulate recognizable traits, such as voice or appearance.

Not all deepfakes are the same, and some are far more common in scams than others.
Audio deepfakes use AI to replicate a person’s voice. These are currently the most common and most practical deepfakes used in scams.
They are often used in family emergency scams and workplace payment scams, where a cloned voice supplies the urgency and familiarity those schemes depend on. Audio deepfakes are effective because they are relatively easy to produce, work over ordinary phone calls and voicemails, and leave listeners little time to question what they are hearing.
Video deepfakes use AI to create fake videos that appear to show a real person speaking or acting.
In scams, video deepfakes are most often pre-recorded clips, such as fake celebrity endorsements shared on social media, rather than live conversations. While video deepfakes attract more attention in the media, they are currently less practical for direct, real-time scam interactions, though this may change as the technology continues to develop.
AI-generated images are often used to create convincing profile photos or visual “evidence” that appears to support a scammer’s story. These images may be combined with audio or messaging scams to reinforce trust.

While deepfake scams are still emerging, consumer protection agencies, including the FBI’s Internet Crime Complaint Center (IC3), have warned that AI-generated audio and video are increasingly being used to enhance impersonation scams, payment scams, and other schemes.
One of the most reported uses of deepfake technology involves AI-generated voice impersonation of family members. In these scams, a person receives a call or voicemail that sounds like a loved one claiming to be in trouble and asking for urgent financial help. The emotional realism of the voice can make it difficult to pause and verify before acting.
Deepfake audio has also been reported in workplace scams, where employees receive phone calls that sound like senior executives or managers requesting urgent payments or account access. Because the voice appears familiar and authoritative, these requests may be acted on quickly before internal verification occurs.
Some scammers use AI-generated or manipulated videos of celebrities, influencers, or public figures to promote fake investments, giveaways, or causes. These videos are often shared on social media or messaging platforms and are designed to create credibility through visual recognition.
In some cases, deepfake audio is used alongside traditional impersonation tactics. Scammers pose as customer support or security teams, using AI-generated voices to sound professional and authoritative while requesting verification steps, payments, or account access.
These examples show that deepfakes are most often used to enhance existing scam types, rather than create entirely new ones.
Deepfake scams often rely on familiarity rather than obvious warning signs. While some are very convincing, there are still indicators that audio or video may not be genuine.
A message may involve a deepfake if:
- it creates sudden urgency or pressure to act before you can think
- it asks you to send money, make a payment, or share account access
- it arrives from an unexpected number, account, or platform
- the voice or video seems slightly off in tone, pacing, or movement
- you are discouraged from hanging up and verifying through a known contact method
Because deepfakes are designed to look and sound real, verification matters more than realism.
Deepfake scams are dangerous because they exploit recognition, not just trust.
Hearing a familiar voice or seeing a familiar face can lower natural skepticism, trigger strong emotions, and prompt action before any verification takes place. This can result in financial loss, account compromise, or emotional distress, especially when scammers impersonate family members or authority figures.
When realism increases, verification becomes even more important.
What is a deepfake scam?
A deepfake scam uses AI-generated audio, video, or images to impersonate a real person and trick someone into taking action.
Are deepfake scams usually videos?
No. While deepfakes are often associated with fake videos, audio deepfakes (voice cloning) are currently more common in real-world scams.
Can deepfake scams really sound or look real?
Yes. AI-generated voices and videos can closely resemble real people, which is why independent verification is essential.
How is a deepfake scam different from impersonation?
Impersonation involves claiming to be someone. Deepfake scams use AI to simulate how that person sounds or looks.