What categories of technical evidence have EU regulators previously demanded under the Digital Services Act?
Executive summary
EU regulators under the Digital Services Act (DSA) have sought a range of technical evidence from platforms to verify compliance: structured data access for independent researchers, transparency reports and algorithmic disclosures, formal third‑party audit reports and systemic risk assessments, and operational evidence around content‑moderation and advertising systems [1] [2] [3] [4]. These categories are being operationalised through delegated acts, the European Centre for Algorithmic Transparency (ECAT) and reporting obligations tied to Very Large Online Platforms and Search Engines (VLOPs/VLOSEs) [1] [5] [6].
1. Data access for independent researchers and the delegated regulation
One of the clearest and most consistently signalled demands is programmatic access to platform data for vetted researchers under Article 40: the Commission ran a call for evidence to design a delegated regulation laying out the technical and procedural requirements for data access, explicitly to enable external auditing of how platforms handle illegal content and societal risks such as disinformation and mental‑health harms [1]. The summary of contributions shows that regulators intend to impose concrete technical conditions and safeguards against abuse, rather than grant abstract rights of access, and the delegated act was scheduled for adoption to make those mechanics operational [1].
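To make that concrete, the sketch below shows what vetted‑researcher access could look like in practice, assuming a hypothetical REST endpoint and access token; the delegated regulation had not fixed an actual interface, so every name and parameter here is an illustrative assumption rather than a real platform API.

```python
# Hypothetical sketch only: the delegated regulation under Article 40 had not
# fixed a concrete API at the time of writing, so every endpoint, parameter
# and token name below is an assumption, not a real platform interface.
import requests

API_BASE = "https://platform.example/dsa/researcher-api/v1"   # hypothetical
ACCESS_TOKEN = "issued-after-vetting-by-digital-services-coordinator"  # hypothetical

def fetch_moderation_sample(topic: str, since: str, page_size: int = 100) -> list[dict]:
    """Pull a page of moderation decisions related to a research topic."""
    response = requests.get(
        f"{API_BASE}/moderation-decisions",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"topic": topic, "created_after": since, "limit": page_size},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["results"]  # "results" key is assumed

if __name__ == "__main__":
    # e.g. a vetted researcher studying disinformation around an election
    decisions = fetch_moderation_sample(topic="disinformation", since="2024-01-01")
    print(f"retrieved {len(decisions)} decision records")
```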
2. Transparency reports and operational logs of moderation and recommendation systems
The DSA requires platforms to publish transparency reports documenting content‑moderation activity and the tools used to make those decisions, including explanations of removals and flagging mechanisms; EU bodies have used these disclosures as evidentiary material to assess compliance and to trigger further technical probes [2] [7]. The Commission's enforcement actions and fines, including a high‑profile fine under the transparency rules, demonstrate that regulators treat those published disclosures as a baseline of technical evidence to audit against statutory obligations [6].
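As an illustration of why such reports count as technical evidence, the hedged sketch below shows a simplified, hypothetical record shape for individual moderation decisions and the kind of aggregate tally a transparency report is derived from; the field names are assumptions, not the schema of the DSA Transparency Database.

```python
# Illustrative sketch: a simplified, hypothetical record shape for a single
# moderation decision and the kind of aggregate a transparency report derives
# from it. Field names are assumptions, not any official schema.
from collections import Counter
from dataclasses import dataclass

@dataclass
class ModerationDecision:
    decision_id: str
    ground: str            # e.g. "illegal_content" or "terms_of_service"
    action: str            # e.g. "removal", "demotion", "account_suspension"
    automated_detection: bool
    automated_decision: bool

def transparency_tally(decisions: list[ModerationDecision]) -> dict:
    """Aggregate raw decisions into the counts a transparency report publishes."""
    return {
        "actions": Counter(d.action for d in decisions),
        "grounds": Counter(d.ground for d in decisions),
        "fully_automated": sum(d.automated_detection and d.automated_decision
                               for d in decisions),
    }

sample = [
    ModerationDecision("a1", "illegal_content", "removal", True, True),
    ModerationDecision("a2", "terms_of_service", "demotion", True, False),
]
print(transparency_tally(sample))
```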
3. Systemic risk assessments and internal algorithmic documentation
VLOPs/VLOSEs must produce systemic risk assessments and mitigation plans addressing harms such as disinformation, electoral manipulation or risks to minors; regulators have demanded the underlying technical documentation showing how recommendation, ranking and targeting algorithms operate, so they can judge whether promised mitigations are actually implemented [4] [8]. The creation of ECAT signals an intent to perform direct technical tests and to analyse transparency reports, risk assessments and independent audits; in other words, regulators will pair documentary evidence with hands‑on testing of algorithms [5].
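The following sketch illustrates one form such a hands‑on test could take, under the assumption that a platform has disclosed a simple weighted ranking formula: recompute scores from the documented formula and check whether the feed actually served is ordered consistently. The weights and item features are invented for illustration and do not describe any real ECAT procedure.

```python
# Hypothetical sketch of one kind of hands-on test a technical body such as
# ECAT could run: recompute scores from a platform's *documented* ranking
# formula and check that the observed feed ordering matches. The weights and
# item features are invented for illustration; no real ECAT test is implied.

DOCUMENTED_WEIGHTS = {"relevance": 0.6, "recency": 0.3, "engagement": 0.1}  # assumed disclosure

def documented_score(item: dict) -> float:
    """Score an item exactly as the platform's technical documentation describes."""
    return sum(weight * item[feature] for feature, weight in DOCUMENTED_WEIGHTS.items())

def ordering_matches_documentation(observed_feed: list[dict]) -> bool:
    """True if the feed the platform actually served is sorted by the documented score."""
    scores = [documented_score(item) for item in observed_feed]
    return scores == sorted(scores, reverse=True)

observed = [
    {"id": "p1", "relevance": 0.9, "recency": 0.2, "engagement": 0.5},
    {"id": "p2", "relevance": 0.4, "recency": 0.9, "engagement": 0.9},
]
print("consistent with documentation:", ordering_matches_documentation(observed))
```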
4. Independent third‑party audits and certification evidence
Article 37's requirement that VLOPs submit annual independent audits compels platforms to produce formal audit reports and proof of auditor independence as examinable evidence; regulators use those audits to certify compliance with the risk‑assessment and mitigation obligations (Articles 34–35) and to validate the substance of reported mitigation measures and controls [3]. Industry and compliance advisors frame these audits as central to enforcement and as the mechanism by which subjective managerial claims become examinable technical evidence [3].
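A regulator's first pass over such a report is often a completeness check. The sketch below illustrates that idea against an assumed subset of the elements an audit report is expected to contain; it is not an authoritative schema of the audit obligation.

```python
# Minimal sketch of a completeness check a regulator or compliance team might
# run over a submitted audit report before assessing its substance. The field
# list is an illustrative, assumed subset, not an exhaustive or authoritative schema.
REQUIRED_FIELDS = {
    "audited_provider",
    "auditing_organisation",
    "declaration_of_interests",   # supports auditor-independence checks
    "audited_obligations",
    "methodology",
    "audit_opinion",              # e.g. positive / positive-with-comments / negative
    "operational_recommendations",
}

def missing_audit_fields(report: dict) -> set[str]:
    """Return the required elements absent from a submitted audit report."""
    return {field for field in REQUIRED_FIELDS if not report.get(field)}

submitted = {
    "audited_provider": "ExamplePlatform BV",
    "auditing_organisation": "Example Audit LLP",
    "audit_opinion": "positive-with-comments",
}
print("missing:", sorted(missing_audit_fields(submitted)))
```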
5. Advertising transparency, trader verification and metadata
The DSA's ad‑transparency and trader‑identity rules require platforms to disclose campaign metadata (why a user saw an ad) and to verify merchant contact details, producing machine‑readable records and verification artifacts that regulators can demand as technical evidence when probing illicit ads or opaque targeting practices [7] [9]. Platforms' obligations to prevent targeting based on sensitive characteristics and to label advertisements create discrete data sets that regulators can request, or compel production of, when investigating compliance [7] [9].
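To illustrate the kind of artifact involved, the sketch below models a hypothetical machine‑readable ad‑repository record and a simple screen for targeting parameters that fall into prohibited sensitive categories; the field and category names are assumptions for illustration, not any platform's actual schema.

```python
# Hypothetical sketch: the kind of machine-readable ad record an ad repository
# might expose, and a simple screen for targeting parameters based on special
# categories of personal data. Field and category names are assumptions.
SENSITIVE_CATEGORIES = {"health", "religion", "sexual_orientation",
                        "political_opinion", "ethnicity"}

ad_record = {
    "ad_id": "ad-123",
    "sponsor": "Example Trader Ltd",          # verified trader identity
    "shown_period": ("2024-05-01", "2024-05-14"),
    "targeting_parameters": {"age_range": "25-34", "interest": "cycling"},
    "why_user_saw_it": "matched interest 'cycling' and region 'NL'",
}

def flagged_sensitive_targeting(record: dict) -> set[str]:
    """Return any targeting parameters that fall into a prohibited sensitive category."""
    return set(record["targeting_parameters"]) & SENSITIVE_CATEGORIES

print("sensitive parameters used:", flagged_sensitive_targeting(ad_record) or "none")
```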
6. Competing perspectives, enforcement appetite and political undercurrents
Some civil‑society actors push for even broader technical access and algorithmic transparency as a check on platform power, while industry voices warn about commercial sensitivity and cybersecurity risks tied to sharing operational data; the Commission's phased approach of delegated acts, ECAT tests and independent audits reflects a compromise that balances research needs against concerns about misuse and security [4] [1] [8]. Enforcement to date, including fines and VLOP designations, indicates that regulators are willing to convert transparency obligations into compulsory evidence collection, but the exact technical specifications remain in flux pending delegated acts and implementing guidance [6] [1].