Can viewing CSAM get you arrested by UK police?

Checked on February 1, 2026
Disclaimer: Factually can make mistakes. Please verify important information or breaking news.

Executive summary

Yes. In the UK, viewing child sexual abuse material (CSAM) can lead to arrest because the law treats “making” and “possession” of indecent images of children broadly: opening, downloading or otherwise causing an image to be created on a device can satisfy those offences, and prosecutors and police use grading systems and databases to classify and investigate such material [1] [2] [3]. However, prosecutions turn on legal definitions, intent and context. Accidental, unsolicited or immediately deleted views may attract different investigative and prosecutorial responses, and there are limited statutory defences and some nuance in how “making” and “possession” are applied [1] [4] [5].

1. The legal bottom line: viewing can be criminal because the law is framed broadly

UK legislation and prosecutorial guidance make it criminal to make, possess or distribute indecent images of children. The Protection of Children Act 1978, together with the Sexual Offences Act 2003, forms the backbone of these offences, which apply to anyone who takes, makes, shows or possesses indecent images of people under 18 [2] [6] [7]. Crucially, Crown Prosecution Service guidance and legal commentary explain that the offence of “making” an image has been interpreted to cover activities such as opening an attachment, downloading a file or otherwise causing an image to be stored, meaning that mere viewing can amount to the actus reus of an offence in the eyes of law enforcement [1].

2. How police and prosecutors decide whether to arrest or charge

Police investigation pathways are shaped by classification schemes, databases and multi-agency protocols: law enforcement uses the Child Abuse Image Database (CAID) and sentencing guidelines to grade material and assess seriousness, and the IWF and NPCC have memoranda of understanding with prosecutors to coordinate removal and investigation [3] [1]. In deciding whether to arrest and charge, the CPS and police will consider the category of the images, whether they were intentionally sought, whether they were downloaded or merely streamed, and aggravating factors such as distribution or links to contact offences [3] [1].

3. Accidental viewing, intent and statutory defences: there is nuance

Legal commentary and defence-firm guidance note that context and intent matter: if material was genuinely unsolicited, was deleted promptly and there is a credible absence of intent to possess or distribute, that can be relevant at the investigation and prosecution stages and may engage one of the narrowly drawn statutory defences, such as the defence for unsolicited images that were not kept for an unreasonable time [1] [4] [5]. That said, those mitigating circumstances do not guarantee immunity: prosecutors can still investigate, and courts will weigh the evidence, because the law is deliberately protective and wide-ranging [1] [4].

4. Penalties and seriousness: why enforcement is robust

Sentencing guidance categorises images by harm and prescribes ranges that reflect the gravity of the abuse depicted; the most serious offences involving indecent images carry maximum sentences of up to 10 years’ imprisonment, which signals why police and the CPS take viewing and possession allegations seriously [3] [8]. Public bodies and child-protection organisations emphasise that CSAM is a form of child sexual abuse and that repeated viewing, sharing or creation perpetuates harm to victims, a policy stance that underpins enforcement priorities [7].

5. New technologies and expanding criminalisation

Recent policy moves and industry campaigning have produced proposals and rules that would criminalise possession and distribution of AI-generated child sexual abuse material, and of tools optimised to create it, reflecting a trend to close technological loopholes and to treat synthetic images as harmful and illegal [9]. This broadening creates additional risk for anyone who knowingly or recklessly accesses such content and may further lower the practical threshold for investigation in future cases [9].

6. Practical takeaways and limits of available reporting

The available sources establish that viewing can trigger criminal investigation and possible arrest because of how “making” and “possession” are interpreted, and because authorities use graded assessment and databases to pursue offences [1] [3] [2]; defence guidance shows that context and intent are central to outcomes but do not negate liability in every case [4] [5]. The sources do not set out a definitive, case-by-case prosecution threshold or up-to-the-minute police arrest practice, so it is not possible to say exactly how any particular officer or force will act in every circumstance.

Want to dive deeper?
What legal defences exist in the UK for accidentally viewing CSAM and how have courts treated them?
How do UK police use the Child Abuse Image Database (CAID) when investigating alleged viewers of CSAM?
How will new UK rules on AI-generated child sexual abuse material change criminal liability for viewing synthetic images?