Has YouTube removed channels impersonating political commentators in 2024–2025?
Executive summary
YouTube carried out large-scale removals of channels tied to state-backed propaganda campaigns in 2025 and, in 2024, terminated specific creator-run networks linked to alleged foreign funding, while platforms more broadly announced mass takedowns of impersonating accounts. The publicly available reporting, however, does not provide a clear, named list of YouTube removals explicitly described as “impersonating political commentators” across 2024–2025 [1] [2] [3].
1. What YouTube explicitly removed: state-linked networks and creator channels
Google reported sweeping enforcement in 2025: it removed more than 23,000 accounts in Q1 and nearly 11,000 channels in Q2 that it described as tied to state-backed propaganda operations originating mainly in China and Russia, for roughly 34,000 removals in total since the start of 2025, according to consolidated reporting [1] [4] [5]. Separately, in September 2024 YouTube terminated channels operated by Tenet Media and by Lauren Chen after U.S. DOJ allegations that those operations had accepted Russian government money to pay influencers. That was a discrete enforcement action targeting channels tied to an alleged foreign influence scheme, not a generic impersonation sweep [2] [6].
2. What “impersonation” means on platforms and what other companies did
Major platforms framed parts of their 2025 enforcement as addressing impersonation and “spammy” accounts: Meta said it removed roughly 10 million profiles in H1 2025 for impersonating large content producers as part of anti‑spam efforts, a metric media outlets repeatedly cited alongside Google’s removals [3] [4] [7]. Those announcements show industry-wide attention to impersonation as a problem, but they conflate several behaviors — fake accounts, coordinated inauthentic networks, and content-amplifying spam — which complicates a neat one‑to‑one mapping to the narrower claim that YouTube removed channels impersonating named political commentators [3] [4].
3. Policy shifts and technical framing: YouTube’s impersonation rules and AI
YouTube maintains an impersonation policy that allows removal of channels that adopt nearly identical names or logos, or that repost another creator’s content while posing as that creator, and industry guides indicate Google expanded its focus on AI-generated impersonations in 2024–2025. Those changes give the company the policy hooks to remove impersonator channels and AI deepfake impostors [8]. YouTube’s transparency and TAG bulletins cite removals tied to influence operations and state actors, and the company framed many of the 2025 takedowns as routine, ongoing work rather than ad hoc responses [4].
4. What the reporting does not show — limits in the public record
None of the cited pieces provides a named catalog of channels explicitly removed for impersonating individual political commentators between 2024 and 2025; the public reporting instead documents broad removals of state-linked networks and a high-profile termination of channels tied to alleged foreign funding [1] [2]. That gap matters: the scale of the takedowns and a strengthened impersonation policy mean YouTube had both motive and mechanism to remove impersonator channels, but the sources do not demonstrate a systematic, publicized campaign in 2024–2025 that targeted channels solely for impersonating named political commentators.
5. Competing narratives and potential hidden agendas
Tech platforms and political actors both manage narratives: companies emphasize routine enforcement and public-safety rationales, while political actors may frame removals as censorship or vindication depending on partisan stakes, as seen in later pushback and reinstatement debates in 2025 reporting [9] [10]. Independent outlets and aggregated bulletins present the removals as counter-propaganda work [1] [5], while conservative media have highlighted reinstatement proposals and questioned past enforcement choices, a dynamic that underscores how enforcement can be spun to serve political narratives even when the underlying takedowns address coordinated or state-linked activity [9] [11].
Conclusion: a qualified yes, but not the simple story asked for
In short: YouTube did remove channels in 2024–2025 tied to foreign influence operations, it terminated specific creator networks implicated in alleged foreign funding [1] [2], and the platform’s impersonation policy was broadened in the face of AI risks [8]. However, the public record assembled here does not document a clear, named list of channels that YouTube removed specifically for impersonating individual political commentators during 2024–2025, and other platforms’ mass “impersonation” removals (notably Meta’s) should not be conflated with YouTube’s distinct enforcement actions unless directly corroborated [3] [4].