
What is the main purpose of Bill C-63 in Canada?

Checked on November 19, 2025
Disclaimer: Factually can make mistakes. Please verify important info or breaking news.

Executive summary

Bill C-63 was the federal government’s proposed Online Harms Act, aimed chiefly at promoting online safety in Canada by reducing specified harmful content on major social media platforms and by creating new institutions to enforce the rules, including a Digital Safety Commission and a Digital Safety Ombudsperson [1] [2]. The bill targeted seven categories of harmful content (children’s sexual exploitation, non‑consensual intimate images, bullying of children, induced self‑harm, fomenting hatred, incitement to violence, and violent extremism/terrorism) and also proposed changes to the Criminal Code and the Canadian Human Rights Act; it did not become law before Parliament was prorogued [3] [4] [5].

1. What the bill says it seeks to do: legislate online safety and accountability

Bill C-63 would have enacted an “Online Harms Act” whose stated purposes include promoting the online safety of people in Canada, reducing harms caused by harmful content online, and ensuring operators of social media services are transparent and accountable for duties under the Act; it proposed creating a Digital Safety Commission of Canada, a Digital Safety Ombudsperson, and a supporting Digital Safety Office to administer that framework [2] [1] [6].

2. The specific harms the bill targeted

The government framed C-63 around seven types of harmful content: content that sexually victimizes a child or revictimizes a survivor; intimate content shared without consent (revenge porn); content used to bully a child; content that induces a child to self-harm; content that foments hatred; content that incites violence; and content that incites violent extremism or terrorism [3] [5].

3. Criminal‑law and human‑rights changes bundled into the bill

Beyond platform regulation, C-63 included amendments to the Criminal Code (a new offence of crime “motivated by hatred,” new peace-bond powers related to hate crimes, and increased maximum penalties for hate propaganda), changes to the Canadian Human Rights Act, and updated mandatory-reporting rules for online child sexual abuse content, signalling the government’s dual focus on online safety and hate/harm prevention [1] [7].

4. Enforcement design: a regulatory body with broad powers

The bill proposed a regulatory architecture that would place substantial authority in a new Digital Safety Commission and associated officials to set standards, enforce duties, and oversee social-media operators, a structure civil liberties groups warned could centralize interpretive and enforcement power in government-appointed bodies [2] [8].

5. Supporters’ rationale: protect children and reduce online hate

Government and some advocates argued C-63 would deliver stronger protections for children and better safeguard Canadians from online hate, filling gaps left by platform self-regulation and offering victims routes to redress [5] [3]. Proponents also cited expert advisory input used to shape the bill’s measures [6].

6. Critics’ concerns: free expression, scope, and institutional power

Civil-liberties groups (e.g., the Canadian Civil Liberties Association) and other critics cautioned that the bill’s criminal and human‑rights amendments, together with its concentrated regulatory authority, could chill legitimate speech, stifle journalism and activism, and grant an unelected body sweeping enforcement discretion [8] [9]. Amnesty Canada and others welcomed splitting the bill to prioritize child protection while addressing rights concerns separately [10].

7. How significant was the change proposed? Competing expert takes

Some commentators saw C-63 as a careful, measured regulatory approach focused on large platforms (government framing) while others said it was one of the most significant expansions of hate‑speech regulation in North America and could create a rigid environment for media and platforms [6] [11]. Policy analysts argued the platform-regulation portion was relatively “light touch” in practice, even as other elements were legally consequential [12].

8. Political fate and next steps in the record

C-63 stalled in Parliament and “died on the order paper” after prorogation in January 2025; officials signalled an intention to split the bill (prioritizing child-protection measures) and reintroduce components later, and some hate‑speech elements subsequently appeared in later legislation [4] [3]. Available sources do not indicate that any successor bill restoring C-63’s full package has become law (not found in current reporting).

9. Bottom line for readers: purpose versus controversy

The main purpose of Bill C-63 was to create a statutory Online Harms Act that would hold platforms explicitly accountable for specified online harms and strengthen protections for children and targets of online hate, backed by new institutions and criminal‑law changes [2] [5]. However, the package combined platform regulation with far‑reaching criminal and human‑rights amendments that prompted concerns about free expression and concentrated regulatory power, producing significant debate about trade‑offs between safety and rights [8] [9] [11].

Want to dive deeper?
What seven categories of harmful content did Bill C-63 target, and how were they defined?
How would the Digital Safety Commission and Ombudsperson have enforced platform duties under Bill C-63?
What are the main arguments for and against Bill C-63 from political parties and advocacy groups?
What Criminal Code and Canadian Human Rights Act amendments did Bill C-63 include, and what were the free-expression implications?
What happened to Bill C-63 after Parliament was prorogued, and which elements have been reintroduced?