How were victims identified and groomed for confidence scams versus tech-support or IRS-style scams?

Checked on December 5, 2025

Executive summary

Confidence scams recruit victims by building emotional or financial trust over days to months, using relationship tactics and fabricated returns; romance/“pig butchering” schemes drove reported relationship-investment losses above $650 million in FBI- and FinCEN-tracked data for 2023 and were singled out by both agencies for sophisticated, long-duration grooming [1] [2]. Tech‑support and IRS‑impersonation scams instead rely on fear, urgency, and spoofing to force quick compliance: common tactics include fake pop‑ups, spoofed caller ID, remote‑access requests, threats of arrest, and demands for unusual payment methods such as gift cards or wire transfers, as documented by Microsoft, the FTC, and the IRS [3] [4] [5] [6].

1. Victim selection: who they pick and why

Confidence-scam operators select targets for emotional vulnerability, loneliness, financial hope, or perceived wealth; dating apps, social media, and "wrong-number" texts are explicit recruitment channels named by FinCEN and consumer advisors, because those channels give scammers cover to fabricate a relationship and the time to groom victims [1] [7]. Tech‑support scammers target people who happen to be online when a bogus error appears, or who are less comfortable with technology; pop‑ups, ads, and unsolicited calls create the immediate opportunity [8] [9]. IRS‑style scams instead exploit fear of legal consequences and rely on broad sweeps: spoofed numbers and purchased lists of names let impersonators hit many people quickly [10] [6].

2. Grooming pace and narrative: slow seduction versus instant crisis

Confidence scams often unfold slowly: operators "fatten" victims with conversation, fake performance reports and staged small wins over days, weeks or months—FinCEN’s “pig butchering” alerts and FBI indictments describe elaborate storylines and fake account dashboards that encourage more investment [2] [1]. By contrast, tech‑support and IRS‑impersonation scams manufacture an acute crisis—a locked screen, an “infected” computer, or an imminent arrest—so victims have to act immediately, undermining deliberation and increasing compliance with demands [8] [6].

3. Psychological levers: trust, authority and fear

Confidence scams leverage reciprocity, intimacy and social proof—scammers mirror victims, create sympathetic backstories, and show fabricated profits to normalize continued investment [11] [12]. Tech‑support scams use fear and technical jargon to create confusion and urgency; they often ask for remote access under the pretense of repair [13] [14]. IRS impersonators wield the power of official authority and legal threat—threats of arrest or license revocation and use of caller‑ID spoofing make victims more likely to comply [10] [6].

4. Technical and operational tools the scammers use

Modern confidence-scam operations increasingly deploy deepfakes, fake investment platforms, and multichannel contact (social media, SMS, dating apps) to appear legitimate and sustain relationships, as noted in industry and government alerts [15] [2]. Tech‑support fraud relies on pop‑up malvertising, fake help‑lines, remote‑access software, and spoofed numbers; victims are asked to install software or hand over credentials [3] [16]. IRS‑style impostors exploit caller‑ID spoofing and stolen personally identifiable information (such as partial Social Security numbers) to add realism and press for unconventional payments like gift cards or wires [10] [6].

5. Money flows and the exit: how victims lose funds

Confidence-scam victims are coaxed into transferring real cryptocurrency or wiring funds into fake investment platforms; operators then "clear out" the accounts after showing fabricated returns [2] [12]. Tech‑support scams extract money via one‑time fees, subscriptions, or malware that steals banking credentials, with payment pressed through immediate, hard‑to‑reverse channels [3] [17]. IRS impostors demand payment on the spot and threaten enforcement, steering victims toward irreversible channels such as prepaid cards or wire transfers [6] [10].

6. Why victims don’t report (and why that matters)

Underreporting is systemic: F‑Secure and consumer groups note that only a fraction of scams are reported—shame, fear of embarrassment, or belief that nothing can be recovered keep cases hidden, which in turn protects criminal playbooks and funding streams [18] [15]. FinCEN and the FBI stress that not reporting hampers detection of large, transnational schemes such as scam compounds tied to “pig butchering” operations [2] [1].

7. Competing viewpoints and limits of current reporting

Sources agree on the core tactics: relationship building for confidence scams, urgency and spoofing for tech‑support and IRS scams. But coverage is uneven: cryptocurrency investment fraud and pig butchering receive heavy attention, while available sources do not provide detailed demographic breakdowns beyond noting that older adults are frequent targets, nor do they clarify the precise role of trafficked workers within every scheme [2] [19] [11]. Government guidance centers on behavioral red flags and reporting pathways rather than exhaustive profiles of individual perpetrators [4] [20].

8. Practical takeaways: what to watch for and report

If contact is friendly and slow, treat investment promises and pressure to send crypto with skepticism; verify any investment through independent channels, and be wary of "proof" supplied inside a platform the other party controls [12] [2]. If contact triggers fear, such as an urgent virus alert, a threatened arrest, or a demand to pay by gift card or wire, hang up, verify through official lines, and report to consumer or tax authorities [3] [6]. Report suspicious contacts to the FTC, the FBI's IC3, company support pages, and the IRS as applicable to help dismantle these networks [1] [21] [22].

Want to dive deeper?
What psychological traits make someone vulnerable to confidence scams compared to tech-support scams?
How do scammers research and select targets for IRS-style impersonation schemes?
What grooming tactics are unique to long-term romance or trust-based confidence scams?
How do scammers adapt approaches for elderly victims versus younger, tech-savvy victims?
What digital footprints or signals do fraudsters use to tailor tech-support and IRS scam scripts?