Which countries explicitly prohibit the intentional viewing of CSAM?

Checked on January 14, 2026
Disclaimer: Factually can make mistakes. Please verify important information or breaking news.

Executive summary

A small but growing set of jurisdictions explicitly criminalizes knowingly or intentionally accessing or viewing child sexual abuse material (CSAM). The European Union’s 2011 Directive requires member states to punish knowingly accessing CSAM [1], and national laws explicitly criminalize “access with intent to view” in jurisdictions such as the United States (at the federal level) [2] and in individual EU members including the Netherlands and Greece, as identified in comparative studies [1]. Reporting from INHOPE and ICMEC shows uneven national approaches and clear gaps in global coverage, meaning any list built from current public summaries is necessarily partial [3] [4].

1. EU-level rule: a baseline saying “knowingly access” is punishable

The EU Directive on combating the sexual abuse and sexual exploitation of children and child pornography (2011/93/EU) includes Article 5, which obliges member states to criminalize knowingly accessing child sexual abuse material through information and communication technologies. This sets a regional floor for prohibiting intentional viewing [1].

2. Specific national examples cited in legislative reviews

Comparative overviews by INHOPE and academic reports identify countries that have gone beyond generic possession and distribution offenses to criminalize access itself. The Netherlands and Greece are specifically referenced in the legislative review as jurisdictions that criminalize access to CSAM, while several other EU states list “prescribed penalties” that can cover access-related offenses [1] [5].

3. United States federal approach includes “access with intent to view”

U.S. federal law explicitly lists “access with intent to view” among the prohibited acts relating to CSAM, alongside production, distribution, and possession (18 U.S.C. §§ 2252 and 2252A). Intentional viewing is therefore a federal offense in the United States under the statutes summarized by the Department of Justice [2].

4. Other jurisdictions and case-by-case actions — evidence of prohibition but not a definitive list

Recent news reports that Malaysia and Indonesia blocked the Grok app after significant misuse for generating AI-produced CSAM indicate those states are willing to act against platforms facilitating such material and imply legal or regulatory prohibitions against it. However, these reports describe operational measures rather than legislative overviews and do not present full statutory language [6].

5. What the global reviews say — patchwork laws and major data gaps

Global reviews from ICMEC and INHOPE show that while many countries criminalize production and distribution, fewer expressly define or criminalize simple possession or access. ICMEC’s inventories show that many countries with some CSAM legislation did not define CSAM sufficiently or criminalize simple possession, and INHOPE’s 61-country overview highlights inconsistent terminology and coverage across states. Together these reports underscore that a comprehensive enumeration of explicit prohibitions on intentional viewing is not available from such high-level summaries alone [4] [3].

6. Caveats, competing priorities and hidden agendas in reporting

Sources vary in focus: INHOPE centers hotlines and removal practices, which can emphasize enforcement needs [3]; ICMEC provides model-law advocacy that highlights deficiencies to spur reform [4]; and news pieces often report enforcement actions that may reflect political will rather than statutory wording [6]. Claims that a country “explicitly prohibits intentional viewing” must therefore be grounded in statutory language or authoritative national legal summaries, which the provided documents only partially supply [4] [3] [6].

7. Bottom line and recommendations for further verification

Based on the supplied reporting, the EU Directive requires criminalization of knowingly accessing CSAM [1]; the Netherlands and Greece are singled out in comparative reports as criminalizing access [1]; U.S. federal law explicitly criminalizes “access with intent to view” [2]; and operational actions in Malaysia and Indonesia show state-level moves against platforms facilitating AI-generated CSAM [6]. Beyond these examples, the global picture is fragmented, and the available summaries do not permit a comprehensive, definitive global list without consulting national statutes or the fuller INHOPE/ICMEC country-by-country annexes [3] [4].

Want to dive deeper?
Which EU member states have implemented Article 5 of the CSAM Directive into national law, and what do their statutes say about knowingly accessing CSAM?
How do different countries legally distinguish between possession, distribution, and access/viewing of CSAM in statutory language and penalties?
What are the most authoritative sources (government statutes or legal databases) to consult for country-by-country CSAM laws and definitions?