What is the average dropout rate for ICE agent training programs?
Executive summary
The supplied reporting contains no authoritative, public "average dropout rate" for ICE agent training programs. The sources describe changes to course length, isolated accounts of trainees leaving over failed background checks or academic and fitness failures, and mounting oversight scrutiny, but they do not publish a numerical dropout percentage that could be averaged across ICE's assorted pipelines (ERO deportation officers, special agents, FLETC programs) [1] [2] [3]. Any precise average stated from these materials would be speculation rather than reporting.
1. What the reporting actually documents about training and attrition
Recent reporting shows ICE runs multiple, distinct training tracks with different standards and assessments, which complicates any single "dropout rate" calculation [4] [3] [5]. Examples include FLETC's 12‑week Criminal Investigator Training Program plus a 15‑week follow‑on for special agents (roughly 27 weeks in total), shorter Enforcement and Removal Operations (ERO)/deportation‑officer programs, and a roughly five‑week Spanish Language Training Program (SLTP). Journalistic accounts and agency FAQs note that trainees must meet academic, firearms, fitness, and background‑check standards and can fail out on any of them; SLTP, for instance, requires a 70% cumulative score [5] [3]. Public reporting likewise says some recruits have left training over failed vetting or academic or fitness standards [1].
2. Training shortened, creating ambiguity in comparability
Multiple outlets report that ICE dramatically shortened some training tracks during the 2025 hiring surge, with descriptions ranging from "around six weeks" to a 47‑day curriculum for some recruits. The agency and its critics frame the change differently: DHS/ICE argues new hires receive on‑the‑job mentoring after baseline FLETC instruction, while critics warn that compressed pre‑deployment training can affect readiness and attrition patterns [2] [6] [7]. Because training length and content were altered mid‑surge, a dropout rate measured in one period may not be comparable to earlier metrics [2] [6].
3. Oversight and critics flag increased dropout drivers but provide no aggregate rate
Oversight bodies and press coverage describe increased scrutiny, including a DHS inspector general review, of whether rapid hiring and shortened training produced more candidates failing vetting or standards; PBS reporting explicitly notes trainees have dropped out for failing background checks, academic requirements, or fitness standards [2] [1]. Those accounts identify causes of attrition but stop short of publishing an agency‑wide percentage of trainees who fail to complete training; the reporting documents the phenomena, not a consolidated statistical average [1] [8].
4. Why available sources cannot produce a reliable average dropout rate
The supplied materials cover multiple training pipelines (special agents at FLETC, ERO basic courses, language training), changes in curriculum length, and anecdotal or qualitative reporting about dropouts. None of them provides the comprehensive, standardized dataset of enrollments versus completions that would be needed to compute an "average dropout rate" across programs and time periods, so the question cannot be answered numerically from these sources alone [4] [3] [1] [2]. Public ICE career pages and reporting discuss requirements and training structure but do not publish completion or attrition statistics in the supplied documents [9] [7].
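To make the problem concrete, the sketch below uses entirely invented enrollment and completion figures (nothing here comes from the cited reporting) to show how an unweighted mean of per‑program dropout rates and a pooled, enrollment‑weighted rate can diverge sharply; without a standardized dataset, "the average" is not even a well‑defined number.

```python
# Hypothetical illustration: all figures are invented, not drawn from
# the cited reporting. Each pipeline maps to (enrolled, completed).
pipelines = {
    "FLETC special-agent track": (100, 90),
    "ERO basic course": (1000, 800),
    "Spanish Language Training Program": (50, 45),
}

# Unweighted mean of per-program dropout rates.
per_program = [1 - completed / enrolled
               for enrolled, completed in pipelines.values()]
simple_mean = sum(per_program) / len(per_program)

# Pooled rate: total dropouts over total enrollments.
total_enrolled = sum(enrolled for enrolled, _ in pipelines.values())
total_completed = sum(completed for _, completed in pipelines.values())
pooled_rate = 1 - total_completed / total_enrolled

print(f"unweighted mean of program rates: {simple_mean:.1%}")  # 13.3%
print(f"pooled, enrollment-weighted rate: {pooled_rate:.1%}")  # 18.7%
```

On these made‑up figures the two summaries differ by more than five percentage points, before even accounting for mid‑surge curriculum changes; which one counts as "the average" depends entirely on a methodological choice the sources never make.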
5. What a rigorous answer would require and where to look next
A defensible average would require program‑level data (enrolled versus completed) for each ICE pipeline across the period of interest, ideally broken down by cohort and cause of separation. That data would most likely come from ICE's Human Capital reports, DHS Inspector General analyses, or a Freedom of Information Act (FOIA) request for official training attrition metrics, none of which are contained in the provided reporting [2] [1]. Until such data is released or cited in reporting, the responsible conclusion is that the supplied sources describe instances and drivers of dropout but do not furnish a verifiable average dropout rate.
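For illustration only, here is a minimal sketch of how such a calculation could run once cohort‑level records existed. The pipeline names, years, enrollment counts, and separation causes below are all hypothetical placeholders for what a FOIA response or Human Capital report might contain.

```python
from collections import Counter

# Hypothetical cohort records: (pipeline, year, enrolled, separations
# by cause). Every figure is invented for illustration.
cohorts = [
    ("ERO basic course", 2024, 300,
     {"background check": 5, "academic": 10, "fitness": 5}),
    ("ERO basic course", 2025, 500,
     {"background check": 20, "academic": 15, "fitness": 10}),
    ("FLETC special-agent track", 2025, 120,
     {"academic": 8, "fitness": 4}),
]

total_enrolled = sum(enrolled for _, _, enrolled, _ in cohorts)

# Aggregate separations across cohorts, keyed by cause.
causes = Counter()
for _, _, _, separations in cohorts:
    causes.update(separations)

total_dropouts = sum(causes.values())
print(f"overall dropout rate: {total_dropouts / total_enrolled:.1%}")  # 8.4%
for cause, count in causes.most_common():
    print(f"  {cause}: {count / total_enrolled:.1%} of enrollees")
```

Even this simple layout underscores why the current reporting cannot support the computation: every field in those records is precisely what the supplied sources do not provide.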