What are the main arguments for and against Bill C-63 from political parties and advocacy groups?
Executive summary
Bill C-63, the Online Harms Act, sought to create a regulatory framework and a new Digital Safety Commission to address seven categories of harmful online content — from child sexual exploitation to content that foments hatred or incites violence [1] [2]. Supporters (including the Liberal government and some police and victim groups) argued it strengthened child protections and gave law enforcement new tools [3] [1], while critics (civil liberties groups, some academics, and media commentators) warned it risked free‑speech overreach, was poorly targeted, and contained provisions that could be weaponized or have limited practical effect [4] [5] [6].
1. What the bill actually proposes — a regulator, duties and preserved content
Bill C-63 would have enacted the Online Harms Act, imposing duties on “regulated services,” creating a new Digital Safety Commission empowered to issue orders, and setting preservation requirements (for example, requiring platforms to preserve content that incites violence for one year), alongside amendments to the Criminal Code and the Canadian Human Rights Act aimed at seven harm categories [7] [1] [8].
2. Arguments from proponents — protecting kids and giving law enforcement tools
The Liberal government framed C-63 as stronger protections for children and broader safeguards against online hate, arguing the bill gives “much‑needed tools” to police and prosecutors to make communities safer and to better combat online drivers of real‑world violence [1] [3]. Government statements and supporters emphasized the bill’s focus on child sexual exploitation, non‑consensual intimate images, bullying of children, and content that incites violence or extremism [2] [1].
3. Police and victim advocates: pragmatic safety claims
Statements logged in parliamentary debate and summaries show some police services and families affected by hate crimes backing the bill on the grounds that online speech can be an early vector for violent acts, and that platform accountability and preservation obligations would aid investigations [3] [7]. These stakeholders presented the bill as filling investigative gaps left by current rules [3].
4. Civil‑liberties concerns — free speech and overbroad powers
Amnesty International Canada and other civil‑society groups welcomed splitting the bill but warned parts could disproportionately affect free expression without effectively reducing online hate [4]. OpenMedia and other advocates argued some enforcement mechanisms — likened to punitive “peace bond” style measures for speech — are poorly designed and risk criminalizing hypothetical harms rather than imminent threats [5] [4].
5. Academic and policy critiques — light touch or empty policy?
Analysts at the Centre for International Governance Innovation and TechPolicy.Press offered competing critiques: CIGI argued C-63 was “a light touch” that largely left platforms’ structural incentives intact and therefore might not meaningfully curb harms [6], while TechPolicy.Press acknowledged the bill’s ambition but noted its practical fragility: it died on the order paper after prorogation and may yet be split into narrower bills, reflecting the political and technical difficulty of passing comprehensive online‑harms regulation [2] [8].
6. Concerns about scope, consultation and possible weaponization
Human‑rights and gender‑violence commentators pointed to limited consultation with some affected communities and warned that vaguely defined or broad offences could be weaponized against vulnerable groups or dissenting voices; this partly motivated calls to split the bill to preserve child‑protection measures while reworking hate‑speech and criminal‑law changes [9] [4].
7. Political responses and the bill’s fate — split, stalled, then died on the order paper
The government responded to criticism by proposing to split C-63 into separate laws (e.g., prioritizing child‑protection elements) and by revisiting contentious parts [8] [9]. The prorogation of Parliament in January 2025 caused C-63 to die on the order paper, and some hate‑speech components were later reintroduced in other bills, showing the issue remains contested and incremental rather than settled [8] [2].
8. How supporters and critics disagree — tradeoffs and implicit agendas
Supporters prioritize immediate investigatory gains and child safety [3] [1]; critics prioritize civil liberties and precision in law‑making, suspecting political incentives to appear tough on online harms may have led to rushed or overbroad measures [4] [5]. Policy analysts worry the bill either doesn’t go far enough (if you treat platforms as powerful commercial actors) or goes too far (if you view it as a threat to free expression), exposing an ideological and practical split in assessing regulatory tradeoffs [6] [5].
9. Limits of current reporting and what’s missing
Available sources do not report detailed vote counts by party on specific clauses, nor exhaustive technical assessments of how the Digital Safety Commission would operate day‑to‑day; parliamentary records and advocacy submissions suggest broader debates but leave implementation specifics unresolved [10] [7].
Conclusion: Bill C-63 combined child‑safety ambitions with free‑speech fears in a contentious mix; supporters argued it filled enforcement gaps and protected vulnerable people, while critics warned of overreach, poor drafting, and political haste — a divide that produced the government’s decision to split parts of the bill and the bill’s eventual demise on the order paper [1] [4] [8].