What databases track police killings by race and armed status, and how do they differ?

Checked on January 25, 2026

Executive summary

Several independent, open-source projects and a few government systems now track people killed by police, and most record both race and whether the person was armed. The best-known are The Washington Post database, The Guardian's The Counted (and successor projects), Mapping Police Violence, and aggregators such as Killed by Police and Fatal Encounters; official systems like the National Violent Death Reporting System and the U.S. National Vital Statistics System are patchier and undercount incidents [1] [2] [3] [4].

1. The Washington Post — systematic media-driven counts with detailed incident fields

The Washington Post began logging every person shot and killed by on-duty police in 2015 and compiles more than a dozen details for each killing, explicitly including the race of the deceased and whether the person was armed. To populate cases, it draws on local news, law-enforcement sites, social media, and independent databases such as Killed by Police and Fatal Encounters [1].

2. The Guardian / The Counted and successor datasets — early comprehensive public tallies with classification

The Guardian's The Counted project (2015–2016) gave early public visibility to comprehensive tallies of killings by police, classifying victims by race, weapon type, and whether they were armed. It counted roughly 1,100 deaths in 2015 and offered public breakdowns by state, gender, race/ethnicity, age, and armed status that later projects emulated [2].

3. Mapping Police Violence and other research-led projects — researcher-oriented reconciliations and broader counts

Mapping Police Violence and academic projects aggregate media reports and public records, sometimes producing higher yearly totals than single media outlets because they reconcile duplicates and add context. These projects also provide armed/unarmed flags and emphasize cross-checking to reduce the omission bias that plagues official statistics [3].

4. Killed by Police, Fatal Encounters and similar aggregators — grassroots collections feeding larger datasets

Independent sites such as Killed by Police and Fatal Encounters operated as grassroots aggregators of media and public records that many larger projects, including The Post, used as inputs; these sources supply early leads and raw entries but vary in verification rigor and metadata completeness [1].

5. Official systems (NVSS, NVDRS) — partial coverage, inconsistent variables, documented undercounting

Federal and state systems exist, notably the National Violent Death Reporting System (NVDRS) and the National Vital Statistics System (NVSS), but they undercount police killings relative to open-source tallies. They lack uniform federal reporting requirements for police-use-of-force demographics and provide uneven geographic and temporal coverage, though NVDRS shows improved coverage versus NVSS [4] [3].

6. How databases differ — sources, verification, scope, and coding of “armed” or race

The key differences are methodological. Media- and researcher-driven databases (the Post, the Guardian, MPV) proactively compile and verify incidents from multiple public sources and explicitly code armed status and race, whereas official records rely on death certificates or agency reports that often omit or misclassify police homicides and lack standardized armed-status coding. Independent projects typically report higher counts and richer incident detail because they reconcile multiple sources [1] [2] [4] [3].

7. Analytical consequences — benchmarking, denominators, and debates over interpretation

Differences in data collection matter for analysis. Researchers warn that choices about denominators (population versus crime or encounter rates) and about the independence of race and armed status affect conclusions about disparity. Several studies using Washington Post or combined open-source data find racial disparities in unarmed killings and interactions between race, age, and armed status, while others argue that benchmarking on crime rates can change disparity estimates, a debate reflected across the literature [5] [6] [7] [8].
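The denominator point can be shown with a few lines of arithmetic. The numbers below are entirely hypothetical, chosen only to illustrate how the same counts yield different disparity ratios under different benchmarks; they are not real statistics from any of the databases discussed.

```python
# Hypothetical, illustrative figures only.
killings   = {"group_a": 30,          "group_b": 60}
population = {"group_a": 40_000_000,  "group_b": 200_000_000}
encounters = {"group_a": 2_000_000,   "group_b": 8_000_000}  # e.g. police stops

def rate_per_million(counts, denom):
    """Rate per million units of whatever denominator is chosen."""
    return {g: counts[g] / denom[g] * 1_000_000 for g in counts}

pop_rates = rate_per_million(killings, population)   # per million residents
enc_rates = rate_per_million(killings, encounters)   # per million encounters

# Disparity ratio for group_a relative to group_b under each benchmark.
pop_disparity = pop_rates["group_a"] / pop_rates["group_b"]   # 2.5
enc_disparity = enc_rates["group_a"] / enc_rates["group_b"]   # 2.0
```

With these made-up inputs the population benchmark yields a 2.5x disparity while the encounter benchmark yields 2.0x, from identical killing counts; which denominator is appropriate is exactly the point contested in the literature cited above.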

8. What the limits of these sources mean for readers and researchers

Open-source databases provide the most complete public picture of who is killed by police and whether victims were armed, but they still depend on media and public records and cannot resolve all causal questions. Official systems remain incomplete and undercount police violence. Rigorous research must therefore combine datasets, document coding decisions, and transparently account for measurement bias when comparing race and armed status [4] [3] [9].

Want to dive deeper?
How do researchers define and code “armed” versus “unarmed” in police-use-of-force databases?
What are the strengths and weaknesses of benchmarking police killings to crime rates versus population rates?
How has underreporting in federal statistics on police killings changed after 2015, and which states still lack reliable coverage?