What precise text in the Online Safety Act mentions client‑side scanning or Ofcom technical notices and how have courts interpreted those clauses?
Executive summary
The Online Safety Act does not use the phrase “client‑side scanning”. It does, however, create a mechanism (centred on Clause 104 and surrounding provisions) allowing Ofcom to issue “technology” or content‑scanning notices that require service providers to deploy “accredited technology”, a formulation widely read as potentially encompassing client‑side scanning (CSS) or other measures that could weaken end‑to‑end encryption (E2EE) [1] [2] [3]. Judicial treatment to date is limited and procedural: litigation such as 4chan’s challenge probes how Ofcom exercises those notice powers, but there is not yet an authoritative body of case law definitively construing the Act’s reach to mandate CSS or encryption‑breaking backdoors [4] [5].
1. The statutory hooks: “accredited technology”, Clause 104 and related provisions
The Act frames Ofcom’s specific technical powers through Chapter 5 of Part 7 (often referenced in public commentary as Clause 104), which enables Ofcom to issue notices requiring in‑scope providers to identify, remove or mitigate illegal content using specified “accredited technology”, a deliberately open formulation that the regulator fleshes out in guidance and in the annexes to its consultation documents [2] [3]. Commentators also point to other provisions as interpretive levers: Clause 105 (safeguards on technology notices), Clause 122 (powers relating to providers of encrypted communications, as summarised in guidance), and definitional or interpretive clauses such as 188 and 192, which address whether content is communicated “publicly or privately”, language that brings private messaging directly within the regime’s potential ambit [1] [6] [7].
2. What the Act actually says about technical notices, in practice
Ofcom’s technology‑notice framework, as set out in its consultation and draft guidance, repeatedly uses the terms “accredited technology” and “content moderation technology”, and sets out procedures for accreditation, minimum accuracy and impact assessment rather than enumerating named technical architectures [2]. The regulator’s published guidance and annexes supply the operational detail that would make a Clause 104 notice actionable in practice, including technical accreditation reports and minimum‑accuracy proposals, a delegation of technical substance from primary legislation to regulatory instruments [2] [8].
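The “minimum accuracy” requirement in the accreditation framework is, at bottom, a statistical threshold on a detection tool’s error rates. As a purely illustrative sketch (the thresholds, function name and figures below are assumptions, not values from Ofcom’s guidance), accuracy for a content‑detection tool is typically assessed with standard metrics such as precision, recall and false‑positive rate:

```python
def detection_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard accuracy metrics for a content-detection tool, computed from
    an evaluation set: true positives, false positives, false negatives,
    true negatives. Purely illustrative; not Ofcom's actual methodology."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    false_positive_rate = fp / (fp + tn) if (fp + tn) else 0.0
    return {
        "precision": precision,
        "recall": recall,
        "false_positive_rate": false_positive_rate,
    }

# Hypothetical evaluation run: 90 correct detections, 10 false alarms,
# 5 missed items, 9,895 correctly ignored items.
m = detection_metrics(tp=90, fp=10, fn=5, tn=9895)
```

Even a very low false‑positive rate matters at messaging scale: a tool flagging 0.1% of benign content would misreport thousands of messages per day on a large platform, which is why commentators focus on accuracy floors as much as on the scanning architecture itself.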
3. Why commentators read the law as implicating client‑side scanning and E2EE
Because the Act can be used to require platforms to detect illegal material even when it is communicated “privately”, and because Ofcom’s powers contemplate “accredited” content‑detection tools, expert commentators and civil‑society groups have concluded that the practical enforcement route on E2EE services is client‑side scanning (CSS) or technical measures achieving equivalent access, even though the statute never names CSS explicitly [1] [3] [6]. Rights groups pressed amendments demanding judicial approval for technology notices and explicit human‑rights duties for Ofcom, signalling concern that the statutory wording leaves room for intrusive designs such as on‑device hashing, reporting pipelines or even encryption workarounds [9] [10].
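The on‑device hashing design that commentators describe can be made concrete with a schematic sketch. Everything below is hypothetical and drawn from neither the Act nor Ofcom guidance: the hash list, function name and blocking behaviour are illustrative assumptions, and real CSS proposals use perceptual hashes (robust to re‑encoding) rather than the exact‑match SHA‑256 used here for simplicity:

```python
import hashlib

# Hypothetical blocklist of hashes of known illegal content (illustrative only).
KNOWN_HASHES = {
    # SHA-256 of the bytes b"foo", standing in for a real hash-list entry.
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def scan_before_encrypt(attachment: bytes) -> bool:
    """Client-side check run on the user's device BEFORE E2EE encryption.

    Returns True on a blocklist match, at which point a hypothetical CSS
    design would block the send and/or generate a report to the provider.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_HASHES
```

The sketch shows why CSS is controversial despite leaving the cryptography untouched: the match runs on plaintext before encryption, so E2EE’s mathematical guarantees survive while the confidentiality of the message against the provider, and against whoever controls the hash list, does not.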
4. How courts have so far interpreted — and what litigation shows
Judicial development remains embryonic. Ofcom’s first enforcement decisions and related litigation (notably 4chan’s suit) have illuminated the procedural pathways, namely the regulator’s confirmation decisions, information‑gathering powers and enforcement escalation, but have not yet produced a definitive judicial ruling interpreting Clause 104 as authorising a forced deployment of client‑side scanning that would lawfully require breaking E2EE [4] [11]. Some cases and regulatory enforcement actions have been contested across jurisdictions (including U.S. litigation by entities affected by Ofcom’s actions), raising constitutional and extraterritorial questions, but the sources record no final appellate judgment resolving whether the Act compels CSS or encryption backdoors [4] [5].
5. Where the law leaves uncertainty and why courts matter
The statute’s deliberately broad, technology‑neutral language, combined with Ofcom’s delegated power to make technical accreditation rules, creates legal uncertainty that invites both administrative decisions and judicial review; human‑rights advocates accordingly pressed for judicial oversight in parliamentary debates and consultation responses, and the government itself has acknowledged technological limits on safe CSS implementation [9] [12]. Existing commentary and regulatory documents make clear the pathway from Clause 104 notices to potential CSS mandates, but absent a controlling judicial construction, the ultimate limits of Ofcom’s power, and whether courts will constrain notices that require de facto weakening of encryption, remain unresolved in the jurisprudence cited here [2] [13].