
How might Bill C-63 impact online platforms in Canada?

Checked on November 18, 2025
Disclaimer: Factually can make mistakes. Please verify important info or breaking news.

Executive summary

Bill C-63 (the Online Harms Act) would have created a new regulatory regime for large social media platforms in Canada, including a Digital Safety Commission and obligations to assess risks, remove or make certain harmful content inaccessible, and increase transparency; the bill died on the order paper when Parliament was prorogued in January 2025 [1] [2]. Supporters framed it as child-protection and accountability legislation, while critics warned of overreach, weak enforcement design, and risks to free expression; commentators disagreed on whether it was a “light touch” or a major expansion of hate-speech law [3] [4] [5].

1. What the bill would require of platforms: new duties and a regulator

Bill C-63 would have required operators of regulated social media services to assess users’ exposure to seven types of harmful content, adopt measures to reduce those risks, implement reporting and record-keeping systems, provide user complaint channels and designate contact persons, and preserve certain removed content for investigatory purposes. For operator non-compliance, it proposed monetary penalties and civil enforcement through the Federal Court rather than criminal imprisonment [6] [1] [7] [8].

2. A powerful new agency — the Digital Safety Commission

The bill proposed to create a Digital Safety Commission of Canada, supported by a Digital Safety Ombudsperson and Office, to set standards, investigate complaints, hold hearings, and enforce the Act. That administrative architecture would shift significant oversight from platforms and courts to a statutory regulator that could develop evolving baseline safety requirements [8] [1].

3. What content the law targeted — seven categories and criminal-code tweaks

C-63 aimed at seven categories of online harms: sexual victimization of children and revictimization of survivors, non‑consensual intimate images, bullying of children, content inducing self‑harm in children, content that foments hatred, content that incites violence, and content that incites violent extremism or terrorism. The bill also proposed amendments to the Criminal Code and Canadian Human Rights Act to strengthen penalties for hate‑motivated offences [5] [9] [7].

4. Practical changes platforms would likely make if C-63 became law

Platforms would likely expand content-detection tools, develop and publish digital-safety plans, strengthen takedown and complaint mechanisms, label content amplified by bots, and retain and preserve records for certain content for longer periods; legal and industry analyses outline these changes as necessary to comply with the proposed duties [10] [6] [7].
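
To make those duties concrete, here is a minimal illustrative sketch, in Python, of how a platform might model a complaint record supporting the reporting, record-keeping, and preservation obligations described above. Everything here is hypothetical: the class and field names, the category labels, and the one-year preservation window are assumptions for illustration, not terms taken from the bill.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from enum import Enum
from typing import Optional

class HarmCategory(Enum):
    # Illustrative labels for the seven categories the bill targeted;
    # these are not the statutory names.
    CHILD_SEXUAL_VICTIMIZATION = 1
    NON_CONSENSUAL_INTIMATE_IMAGE = 2
    CHILD_BULLYING = 3
    CHILD_SELF_HARM_INDUCEMENT = 4
    FOMENTING_HATRED = 5
    INCITING_VIOLENCE = 6
    VIOLENT_EXTREMISM_OR_TERRORISM = 7

@dataclass
class ComplaintRecord:
    """One user complaint, retained for audits and regulator reporting."""
    complaint_id: str
    content_id: str
    category: HarmCategory
    received_at: datetime
    action_taken: Optional[str] = None    # e.g. "removed", "made inaccessible"
    resolved_at: Optional[datetime] = None
    # The one-year preservation window is a placeholder; C-63 left such
    # operational details to the Commission and regulations.
    preservation_deadline: datetime = field(init=False)

    def __post_init__(self) -> None:
        self.preservation_deadline = self.received_at + timedelta(days=365)
```

A real compliance system would layer detection pipelines, review queues, and regulator-facing transparency reports on top of records like this; the sketch shows only the record-keeping core.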

5. Enforcement, penalties and legal pathway — civil, not criminal, for operators

Enforcement would be primarily administrative and civil: the Commission could impose penalties on corporate actors capped at the greater of a percentage of global revenue or a fixed amount, and enforcement actions would proceed in the Federal Court; the bill specified that non-payment of fines would not lead to imprisonment [7] [1].
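
As a purely arithmetic illustration of how a “greater of a revenue share or a fixed amount” penalty cap works, consider this short sketch; the 5% rate and $10 million floor are hypothetical placeholders, not figures asserted by the sources.

```python
def penalty_cap(gross_global_revenue: float, rate: float, floor: float) -> float:
    """Maximum penalty under a 'greater of a share of global revenue or a
    fixed amount' formula. The rate and floor are hypothetical placeholders;
    C-63's actual numbers would be set by the Act and its regulations."""
    return max(rate * gross_global_revenue, floor)

# A platform with $2B in global revenue, an assumed 5% rate and $10M floor:
print(penalty_cap(2_000_000_000, 0.05, 10_000_000))  # -> 100000000.0
```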

6. Supporters’ framing: child protection and accountability

The government and some legal commentators presented C-63 as a targeted response to protect children and vulnerable people online and to hold platforms accountable where current regimes leave gaps. Proponents highlighted the need for baseline safety requirements, more transparency about platform practices, and better tools for victims and law enforcement [1] [5].

7. Critics’ concerns: censorship risk, regulatory overreach, and design weaknesses

Opponents voiced multiple concerns. Some argued the Digital Safety Commission could have sweeping powers that threaten free expression and create chilling effects; others, including institutional analysts, said the bill was either overbroad or, conversely, too modest: one law firm analysis called it “one of North America’s most rigid regulatory environments,” while a think tank said it was in practice a “light touch” that did little [4] [3] [11]. Academic and media critiques also warned about the risk of well-intended rules being weaponized, and about the technical and legal complexity of defining and enforcing “harm” online [12] [3].

8. Political fate and what that means for platforms now

C-63 did not become law: it stalled amid political crises and “died on the order paper” when Parliament was prorogued in January 2025. Some elements, notably the child-protection provisions and hate-speech components, were discussed for reintroduction as separate bills afterward, indicating the policy debate will continue even though this specific package failed [2] [9] [5].

9. What to watch next — regulation vs. platform self‑governance

Future developments to watch include whether Parliament reintroduces child‑safety or hate‑speech elements as narrower bills, how the government balances enforcement powers with Charter and free‑speech concerns, and whether platforms preemptively upgrade safety measures to avoid future regulation. Analysts also note the international context: U.S. and other jurisdictions’ reactions and political framings may influence how Canadian proposals evolve [5] [3].

Limitations and sources: This analysis summarizes the proposed bill and public commentary available in the supplied documents; it does not assert outcomes beyond those sources. Statements above are drawn from the Government of Canada and parliamentary texts, legal firm analyses, policy think‑tank pieces and media summaries provided [1] [8] [6] [7] [4] [3] [5] [2].
