What legal reforms have been proposed to Section 127 and the Malicious Communications Act, and what arguments have supporters and critics made?
Executive summary
Reforms have aimed to replace or modernise the overlapping, vaguely worded offences in section 127 of the Communications Act 2003 and section 1 of the Malicious Communications Act 1988 with a clearer, harm‑based suite of offences targeted at serious online harms while protecting free expression. That package was developed and promoted chiefly by the Law Commission and is reflected in government proposals and in parts of the Online Safety Act 2023 (Law Commission recommendations; government final report) [1] [2]. Supporters say the changes focus the criminal law on real harms and reduce the chilling effect of vague drafting; critics argue the proposals either do not go far enough or go too far, with some urging outright repeal or different statutory approaches to preserve speech and legal clarity (Open Rights Group and other commentators; parliamentary scrutiny) [3] [4].
1. Why reform? The problem the Law Commission identified
The Law Commission argued that the communications offences in section 127 and section 1 are overlapping, ambiguously worded and poorly adapted to modern, multi‑audience online communications. The result, it said, is uncertainty for users, platforms and police, and contentious prosecutions such as the “Twitter joke” case and cases arising from WhatsApp group messages (summary of Law Commission project and case history) [1] [5] [6].
2. The Law Commission’s headline reform: a harm‑based offence
The central Law Commission recommendation was to create a new “harm‑based” communications offence to replace section 127 and the parallel provisions in the Malicious Communications Act, reframing liability around intended or likely serious harm rather than vague terms such as “grossly offensive,” “indecent” or “causing annoyance” [1] [7].
3. New, targeted offences and examples proposed
Beyond the single harm‑based offence, the Commission proposed specialist offences to capture specific modern harms—cyberflashing, knowingly sending flashing images to people with epilepsy, encouraging or assisting serious self‑harm, and sending false communications or threats—so that conduct is criminalised because it is plainly harmful, not merely because it is offensive [1] [7] [8].
4. Government action: partial adoption via the Online Safety Act and legislative movement
Some recommendations have been taken up: the Online Safety Act 2023 repealed section 1 of the Malicious Communications Act 1988 and the false‑messages limb of section 127 (s.127(2)(a) and (b)), replacing them with new statutory offences for false and threatening communications, while leaving the “grossly offensive” limb in s.127(1) intact, meaning reform is partial rather than wholesale [5] [9].
5. Arguments of supporters and proponents
Supporters — including the Law Commission, government advisers and many legal commentators — argue reform will “future‑proof” the law, focus criminal sanctions on real harms (pile‑on abuse, cyberflashing, self‑harm facilitation), provide clearer drafting for prosecutors and protect freedom of expression by removing broad, catch‑all offences that can chill speech (Law Commission report; government statements) [1] [2] [10].
6. Criticisms from civil liberties and free‑speech advocates
Critics such as the Open Rights Group argue that section 127 is conceptually unsuited to social media and that attempts to amend it are insufficient; some call for outright repeal and for public order law to be adapted instead, warning that harm‑based offences can still be vague in application and that risks of over‑criminalisation will remain unless drafting and prosecutorial guidance are tight (Open Rights Group submission; ORG analysis) [3] [6].
7. Political and evidentiary fault lines: data, policing and prosecutorial discretion
Parliamentary scrutiny and independent briefings have highlighted uneven enforcement and a lack of centralised arrest data for these offences, with concerns that police use varies by force and that reforms need accompanying guidance, data transparency and safeguards to prevent disproportionate policing of certain communities or speech (House of Lords Library summary; government consultation context) [4] [2].
8. Bottom line: a work in progress with competing priorities
Reform efforts have shifted the law away from vague, transmission‑focused offences toward targeted, harm‑based crimes and specialist offences for online‑specific harms. That direction has been welcomed by many but resisted by civil liberties groups who want repeal or different statutory vehicles; the law has been partially updated by the Online Safety Act, but significant choices about drafting, prosecutorial practice and data transparency remain unresolved (Law Commission recommendations; Online Safety Act changes; civil society critiques) [1] [9] [3].