The different types of impersonation scams
Types of impersonation scams: government, business, financial, celebrity, family, employment, and crypto-native fraud.
There are numerous types of impersonation scams, part of what makes them so difficult to spot. Some lean on institutional authority, such as government agencies, banks, brands, and financial professionals. Others weaponize personal familiarity, using a celebrity, a family member, or a fake employer to lower the victim's guard.
There is also a newer category that is more native to crypto and modern internet infrastructure. In those cases, the scam is built around fake exchanges, spoofed websites, manipulated wallet activity, and fabricated online communities. The categories overlap, but breaking them apart makes the warning signs easier to spot.
Institutional and authority-based impersonation scams
Government impersonation scams
Government impersonation scams remain one of the most damaging forms of fraud. The FTC reported that government imposter scam losses reached $789 million in 2024, up $171 million from the year before.
Common versions include:
- IRS callers claiming unpaid taxes and threatening arrest.
- Social Security impersonators warning that benefits will be suspended.
- Police or court impersonators claiming there is a warrant, a fine, or missed jury duty that must be resolved right away.
- FTC impersonators who perversely tell victims to move money "to protect it" from another scam.
- Fake court or law enforcement officials demanding bail money, fines, or Bitcoin ATM deposits.
- Fake toll operators, DMVs, or other ticket issuers threatening to suspend licenses or affect credit scores.
A particularly sophisticated version of the government impersonation scam is the Phantom Hacker scam. The goal is to convince the victim that their bank account has been hacked and that they must move their funds into government-protected escrow. The innovation is that multiple impersonators contact the victim in sequence. One pretends to be tech support. Another pretends to be the victim's bank. A third pretends to be a government official investigating the case.
Each handoff reinforces the others, making the fraud feel independently verified. By the end, the victim believes their case is so urgent it's been escalated to the very top. The ask only happens once the victim has invested an inordinate amount of time and resources into speaking with authority figures, making it psychologically difficult to pull out at the last minute.
[Infographic: The Phantom Hacker Scam. Scammers impersonate three different authorities in sequence.]
Be aware that no legitimate government agency will demand payment in cryptocurrency, threaten immediate arrest over the phone, ask you to keep the interaction secret, or tell you to move your money to "protect" it. These are immediate and unmistakable signs that you are being scammed.
Business and brand impersonation scams
Business impersonation scams are effective because they exploit routine consumer behavior. People are used to getting bank alerts, package delivery notifications, refund emails, and account-security messages. Many scammers insert themselves into that deluge of inbound communication. According to FTC reporting on older consumers, business impersonation generated $377 million in losses among older adults in 2024.
The most common variants include:
- Bank impersonation texts and calls that appear in the same thread as genuine bank communications.
- Tech support scams involving pop-ups, remote access software, and fake refund procedures.
- Fake invoices or renewals involving brands like Geek Squad, Amazon, and PayPal.
- Delivery scams involving carriers such as FedEx, UPS, or USPS.
- Business email compromise, where a criminal impersonates an executive, vendor, or finance employee.
Business email compromise is one of the clearest examples of how impersonation scales into serious financial loss. Abnormal AI's summary of the FBI IC3 data reported $2.77 billion in BEC losses across 21,442 incidents in 2024. Over the last decade, losses have reached roughly $17.1 billion. Traditionally, these attacks relied on email spoofing and urgent executive language. Increasingly, they now include live voice cloning and deepfake video confirmation, which makes the fake approval much harder to challenge in the moment.
As with all scam tactics, scammers are becoming more comfortable with cryptocurrency and its infrastructure. Fraudulent "safe account" transfers are migrating from wire transfers to digital assets.
Financial professional impersonation scams
Financial professional impersonation scams occur when a criminal pretends to be a broker, advisor, firm representative, regulator, or recovery professional. Scammers leverage their perceived expertise and the victim's relative unfamiliarity with financial topics to convince them to make fraudulent investments.
The SEC's Investor.gov guidance warns that scammers create websites and profiles that mimic legitimate registered firms, sometimes using actual employee names, logos, and links to the real firm's website to make the fake version appear authentic. These criminals may also hijack comment threads or social posts from real professionals to redirect victims to imposter pages.
Common variants include:
- Fake brokers pitching crypto investment opportunities.
- Impersonated registered investment advisors using lookalike websites.
- SEC impersonators contacting retail investors or company officers.
- Fake law firms, recovery firms, or "asset tracing" services that target existing fraud victims.
This last category deserves special attention. Many scam survivors get targeted twice. The FBI warns that cryptocurrency recovery services charging upfront fees are frequently scams themselves.
If someone claims to be a financial professional, verify them outside the conversation. Use FINRA BrokerCheck, SEC Investor.gov, the relevant state securities regulator, and the real company's published contact information.
When considering a crypto recovery lawyer, check their state bar licenses, LinkedIn, case history via PACER or another case docket directory, and contact information. Never work with a lawyer who demands payment upfront in crypto or who is unwilling to meet via a secure video platform like Zoom. A legitimate professional will welcome independent verification and will make no guarantees about recovery before taking your money.
Personal and relationship-based impersonation scams
Celebrity and influencer impersonation scams
Celebrity and influencer impersonation scams occur when a criminal contacts their victim using a famous person's identity, likeness, or online persona. Scammers often choose to impersonate older celebrities who are private and considered trustworthy, giving them more latitude to invent details about the celebrity's personal life.
AI has made the scams dramatically more convincing, as scammers can now easily generate photos and even chat live with victims using deepfake technology.
According to Keepnet Labs' 2026 deepfake trend summary, 48% of deepfake incidents in the United States in 2025 used celebrity likenesses. Norton documented repeated deepfake videos of Elon Musk promoting fraudulent crypto giveaways, and the California DFPI tracker warns that giveaway scams now routinely use compromised accounts, fake livestreams, and AI-generated endorsements.
[Infographic: The Celebrities Scammers Impersonate Most. Stolen identities chosen for maximum emotional leverage.]
Common forms include:
- Fake social profiles using a celebrity's image and branding.
- Deepfake videos promoting a token, exchange, or giveaway.
- Fake interviews or news articles claiming a public figure uses a trading platform.
- AI-generated livestreams that direct viewers to send crypto first in order to "unlock" a reward.
If crypto is involved, the scammer will often request a small amount first before making the bigger ask. If you think you may be a victim of this type of scam, see our guide to celebrity romance scams for more details.
Family and personal impersonation scams
In a family or personal impersonation scam, a criminal pretends to be someone emotionally close to the victim, such as a child, grandchild, spouse, coworker, or close friend. Scammers will often use a fake family emergency to create a synthetic sense of urgency, triggering panic, sympathy, or protective instinct before the victim stops to verify the story.
The American Bar Association described a 2025 case involving Sharon Brightwell of Florida, who received a call that sounded like her daughter crying after a car accident and legal emergency. She sent $15,000 before learning the voice had been cloned with AI from publicly available audio.
This is not a unique case. Voice cloning has lowered the technical barrier for this category dramatically. Deepstrike and Brightside AI reported that scammers can create an 85% voice match from as little as three seconds of audio. McAfee found that one in four adults has encountered an AI voice scam, and about 70% of people do not feel confident they could tell a cloned voice from a real one.
The same pattern is now appearing on video. In the widely reported Arup case, a finance employee wired $25 million after joining a video conference in which every other participant, including the CFO, was allegedly an AI-generated deepfake.
Common family and personal impersonation variants include:
- Grandparent scams involving arrests, accidents, or medical emergencies.
- AI voice cloning calls that recreate a child, spouse, or sibling.
- Fake video meetings that impersonate coworkers or company leadership.
- Messenger or SMS impersonation where the "family member" claims to have a new phone number and urgent financial need.
Alongside AI, crypto has made these scams more effective by compressing the victim's response window. Instead of mailing cash or buying gift cards, the victim is told to visit a Bitcoin ATM or transfer USDT immediately.
Employment impersonation scams
Employment impersonation scams occur when a criminal pretends to be a recruiter, hiring manager, staffing agency, or employer to extract money, personal data, or labor from the victim. In these schemes, scammers will sometimes use online platforms that require victims to complete menial tasks. The victim's account displays fake earnings for each task they complete, but in time the victim is required to deposit their own funds to "unlock" more opportunities.
The promise of a job lowers skepticism and gives the scammer a ready-made reason to ask for forms, fees, or account access.
Crypto-native impersonation scams
Crypto-native impersonation scams often reuse the same core trick: the scammer borrows the identity of a trusted exchange, wallet, token, project, or community, then uses that borrowed trust to redirect funds or credentials.
Fake customer support
- What it looks like: A scammer pretending to be an exchange, wallet provider, or trading platform representative contacts you about a fake account breach or urgent security problem.
- What they want: They want login credentials, one-time codes, or a transfer to a so-called secure wallet they control.
- Example: In a Brooklyn criminal case, prosecutors alleged that a 23-year-old posed as Coinbase customer support and stole about $16 million from roughly 100 users. The scheme was especially effective because a former Coinbase support agent was later arrested in India for allegedly leaking customer data to the scammer.
Fake platforms
- What it looks like: The victim lands on a website or app that imitates a real exchange, broker, or investment platform and appears professional enough to trust.
- What they want: They want deposits sent through scammer-controlled bank accounts or crypto wallets while the fake interface displays false balances and profits.
- Why it works: These sites have become more convincing as "scam as a service" vendors sell ready-made clone kits. Fake platforms are central to pig butchering scams, where the interface is part of the deception rather than a real trading venue.
Token impersonation and address poisoning
- What it looks like: The scammer manipulates wallet history or token names so a fake address or asset appears familiar at a glance.
- What they want: They want you to copy and reuse fraudulent transaction details, believing you are sending funds to a trusted destination.
- Example: Because many wallets display only the first and last few characters of an address, victims may copy a poisoned address from their own transaction history and send funds to the attacker. In one reported case, a Colorado victim lost $2.1 million this way.
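The mechanics of address poisoning come down to how wallet interfaces shorten addresses. A minimal Python sketch (using made-up example addresses, not real ones) shows why two different addresses can look identical in the truncated view, and why the reliable defense is comparing the full string against a saved address book rather than against transaction history:

```python
# Illustrative sketch: why truncated address display enables address poisoning.
# The addresses below are fabricated for demonstration only.

def truncated(addr: str, head: int = 6, tail: int = 4) -> str:
    """Mimic the shortened form many wallet UIs show, e.g. 0x9a1f...12Cd."""
    return f"{addr[:head]}...{addr[-tail:]}"

# A legitimate address the victim has sent funds to before.
real_addr = "0x9a1fB2c3d4E5f60718293a4B5c6D7e8F90aB12Cd"

# An attacker-generated "vanity" address sharing the same first and last
# characters, planted in the victim's history via a tiny dust transaction.
poisoned = "0x9a1f000000000000000000000000000000012Cd"

# In the truncated view the two are indistinguishable:
assert truncated(real_addr) == truncated(poisoned)

# Defense: check the full address against an explicitly saved address book
# entry before signing, never against what transaction history displays.
def is_known(addr: str, address_book: set[str]) -> bool:
    return addr in address_book

book = {real_addr}
assert is_known(real_addr, book)
assert not is_known(poisoned, book)
```

The key point is that visual comparison of a shortened address proves nothing; only a full-string match against a destination you saved yourself does.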
Fake DeFi communities, airdrops, and group chats
- What it looks like: Criminals impersonate projects, moderators, giveaway posts, investor chats, and support accounts across Telegram, Discord, X, and other crypto-native communities.
- What they want: They want wallet approvals, seed phrases, clicks to drainer sites, or direct transfers to a fake project, moderator, or public figure.
- Example: Cointelegraph's scam explainer and the California DFPI Crypto Scam Tracker both describe fake Telegram and Discord groups, bogus airdrops, wallet drainer sites, and fake "gurus" claiming to represent trusted projects or public figures.
A legitimate exchange or wallet provider will never ask you to move funds to a new wallet for security, reveal a seed phrase, or hand over a one-time code to someone who contacted you first.
| Scenario | Legitimate exchange behavior | Impersonation scam behavior |
|---|---|---|
| First contact | Support begins in-app or through official channels you initiated | Unsolicited call, text, or DM |
| Request for seed phrase or private key | Will never request | Will say it's necessary to protect your assets |
| "Secure wallet" transfer | Will never request | Will say your account is under attack |
| Injecting urgency | Will allow you time to verify their identity | Immediate action demanded due to emergency |
| Providing ID | Official domain and app workflow | Spoofed number, lookalike domain, or social profile |
FAQ
Can scammers really clone a family member's voice?
Yes. Modern AI tools can clone a voice from only a few seconds of audio taken from social media, podcasts, or videos. That is why family emergency calls are more convincing than they used to be. A pre-agreed code word or callback to a known number is one of the best ways to verify a real emergency.
What is the Phantom Hacker scam?
The Phantom Hacker scam is a multi-stage government impersonation scheme. First, a fake tech support agent contacts you and claims your accounts are compromised. Then a second caller poses as your bank or brokerage and confirms the threat. A third pretends to be a government official and instructs you to move funds to a "safe" government-protected account. The handoffs reinforce each other so the fraud feels independently verified. The FBI and FTC warn that this sequence has caused billions in losses, especially among older adults.
How do crypto-native impersonation scams work?
Crypto-native impersonation scams use fake customer support, fake trading platforms, lookalike wallet addresses, and fake DeFi communities to steal funds or credentials. Scammers impersonate exchanges, wallet providers, or trusted projects so you hand over login details, seed phrases, or send crypto to their wallets. These scams often rely on clone websites and "scam as a service" kits that make fake platforms look nearly identical to real ones.
