What were the key findings of Dominion Voting Systems' 2024 security audits?
Executive summary
Available sources show that multiple 2024 security reviews and audits of Dominion systems produced mixed findings: federal and vendor testing reports largely found that systems met certification requirements and produced expected results [1], while independent researchers identified vulnerabilities, including ballot-ordering (privacy) flaws and software weaknesses that Dominion patched after disclosure [2] [3]. Coverage includes vendor statements stressing that audits and recounts have confirmed accuracy [4] [5], alongside independent lab and academic reports highlighting issues that become exploitable if physical or operational protections fail [2] [3].
1. Certification testing: formal tests largely met expectations
Federal Voting System Test Laboratory reports for Dominion’s D-Suite (e.g., D-Suite 5.20) record that “all actual results obtained during test execution match[ed] the expected results,” indicating that the security and functionality test cases in the certification process returned expected outcomes [1]. County-level testing in places such as Maricopa County likewise emphasized that independent Voting System Test Laboratories analyzed the equipment, found the systems to be air-gapped, and found no evidence of backdoor access in their formal audits [6]. These documents form the official baseline that vendors and many election officials cite when asserting that certified equipment passed required security checks [1] [6].
2. Vendor and government statements: no evidence of manipulated outcomes
Dominion’s public messaging and legal filings reiterate broader government findings that there is no evidence voting systems deleted, changed, or lost votes in recent elections; company materials cite CISA and other official statements to that effect [4]. Dominion also notes thousands of audits and recounts since 2020 that it says confirm accuracy and reliability of its certified systems [5] [4]. Those statements are part of the company’s response to post‑2020 controversies and are used to argue that tabulation results have been repeatedly validated [4] [5].
3. Independent researchers: technical vulnerabilities and privacy risks uncovered
Academic and independent security researchers reported concrete technical vulnerabilities in Dominion equipment in 2024. J. Alex Halderman’s team documented flaws, including weak default credentials, buffer overflows, and encryption gaps, that could be exploited by someone with brief physical access; Dominion developed patches after disclosure [2]. Separately, a USENIX Security paper found a ballot-randomization flaw in precinct scanners that could make ballot ordering predictable, risking the linkage of ballots to individual voters and undermining secret-ballot protections used across multiple states [2] [7]. These findings underscore that real-world attack feasibility often depends on the attacker gaining some level of physical or procedural access [2].
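To make the privacy risk concrete, the sketch below illustrates the general failure mode behind ballot-ordering flaws of this kind. It is a minimal, hypothetical example: the LCG constants, function names, and record handling are assumptions chosen for illustration, not Dominion’s actual design or the paper’s exact analysis. What it demonstrates is the core problem the researchers describe: “random” identifiers drawn from a predictable generator let anyone who knows the algorithm re-sequence published records into the order ballots were cast.

```python
# Hypothetical sketch of a ballot-ordering (privacy) weakness. The
# constants and record handling are illustrative assumptions, NOT
# Dominion's actual design. Ballot records are tagged with "random" IDs
# drawn from a linear congruential generator (LCG); because each ID
# deterministically yields the next, an observer who knows the constants
# can rebuild the chain and recover the order in which ballots were cast.

M, A, C = 2**31 - 1, 48271, 0  # MINSTD LCG constants, assumed for illustration

def next_id(x: int) -> int:
    """Advance the LCG by one step: the entire weakness in one line."""
    return (A * x + C) % M

def assign_ids(n_ballots: int, seed: int) -> list[int]:
    """Scanner side: tag ballots with successive LCG outputs in cast order."""
    ids, x = [], seed
    for _ in range(n_ballots):
        x = next_id(x)
        ids.append(x)
    return ids

def recover_order(published_ids: list[int]) -> list[int]:
    """Attacker side: given the IDs with their order scrubbed (e.g., sorted
    for publication), find the unique ID with no predecessor in the set,
    then walk the LCG chain forward to reconstruct the cast order."""
    id_set = set(published_ids)
    have_pred = {next_id(x) for x in id_set if next_id(x) in id_set}
    x = (id_set - have_pred).pop()       # chain start: the first ballot cast
    order = [x]
    while next_id(x) in id_set:
        x = next_id(x)
        order.append(x)
    return order

if __name__ == "__main__":
    cast_order = assign_ids(25, seed=123456)
    published = sorted(cast_order)       # publication hides the cast order...
    assert recover_order(published) == cast_order
    print("cast order fully recovered from 'randomized' IDs")
```

The mirror image of the sketch is the remedy researchers point to: draw identifiers from a cryptographically secure source and sanitize published ballot data so that no per-record value encodes generation order, one of the operational controls (ballot-data sanitization) discussed later in this section.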
4. Patching and mitigation: vendor responses and persistent operational choices
Dominion responded to disclosed vulnerabilities by producing software patches and recommending mitigation steps; CISA advisories likewise listed configuration and procedural mitigations for ImageCast X systems and warned of residual risk if upgrades leave devices in insecure states [3] [2]. Implementation, however, depended on individual jurisdictions: Georgia officials, for instance, deferred some updates until after the 2024 election, with state leaders publicly characterizing some risks as theoretical even as researchers warned that the Coffee County incident showed unauthorized physical access can occur [2]. This illustrates the tension between immediate patching and election administrators’ choice to avoid mid-election software changes [2] [3].
5. What these audits and reports agree and where they diverge
There is agreement that certified systems functioned as tested and that audits and recounts generally confirmed tabulation results [1] [6] [4]. The divergence concerns real-world exploitability: certification and county test labs emphasize closed, air-gapped designs and passing test cases [1] [6], while security researchers argue that specific software flaws and predictable ballot ordering could be exploited when operational controls fail, for example lapses in physical security, timely patching, or ballot-data sanitization [2] [3]. Both perspectives are present in the record, and each stresses a different facet of election security: formal compliance versus adversarial threat modeling.
6. Limitations of available reporting and unanswered questions
Available sources do not provide a single, consolidated “2024 security audit” narrative listing all findings across jurisdictions; instead, the record consists of vendor reports, EAC test reports, CISA advisories, county audits, and independent academic analyses [1] [3] [6] [2]. The sources neither enumerate every patched issue nor assess whether every jurisdiction implemented the recommended mitigations; some jurisdictions delayed updates [2]. For claims beyond those covered here, such as specific exploit demonstrations beyond the reported lab work or verification that patches were applied across all deployments, available sources do not provide details.
7. Bottom line for readers: accuracy affirmed, vulnerabilities documented, context matters
The official testing and recounts cited by Dominion and the testing labs support the conclusion that tabulation results matched expectations and that certified systems met test criteria [1] [6] [4]. Independent researchers documented meaningful vulnerabilities, particularly around the privacy of ballot ordering and the risks arising from physical access and misconfiguration, and those findings led to patches and advisories [2] [3]. Voters and officials should therefore treat the record as one in which audit and recount evidence supports result accuracy, while work continues to address and mitigate the technical and operational vulnerabilities identified by researchers [4] [2] [3].