Online Harassment Protections Expand Under New Law

What this means for victims and platforms


With the digital world becoming increasingly central to Canadian lives, the federal government has passed sweeping updates to its online safety laws, aimed specifically at combating cyber harassment and digital abuse.

A Long-Awaited Reform

On July 1, 2025, the Online Safety and Accountability Act officially came into force, marking one of the most significant digital policy updates in Canada in over a decade. This legislation introduces stricter obligations for platforms to detect, remove, and report abusive content targeting individuals online — especially threats of violence, doxxing, and non-consensual image sharing.

“We can no longer ignore the serious harms that occur in digital spaces,” said a senior official during the bill’s unveiling in Ottawa. “These protections are overdue, and they place Canada among leading global democracies addressing online abuse.”

How the Law Works

Under the new regulations, digital platforms that operate in Canada — including social media, forums, and streaming platforms — must:

  • Remove reported abusive content within 24 hours
  • Establish internal systems for content moderation and transparency
  • Offer clear reporting pathways for users, available in both English and French
  • Share anonymized data with regulators to track compliance

Failure to comply may result in fines of up to $10 million or 3% of global revenues, whichever is higher.

Support for Victims

Perhaps the most significant component of the new law is its expanded victim-support mechanisms. A federal Digital Harm Support Centre is now live, offering:

  • Real-time help lines for those experiencing cyber harassment
  • Legal assistance for filing protection orders
  • Help navigating content removal with platforms

“Victims no longer need to suffer in silence or navigate bureaucracy alone,” noted a Toronto-based victim advocate.

Balancing Free Speech and Accountability

The law does raise ongoing questions about balancing free expression with safety. Civil liberties groups have voiced concerns about how certain speech might be unfairly flagged or suppressed.

In response, the legislation includes a provision for independent oversight: an arm’s-length tribunal that will review appeals from users who believe their content was removed or flagged unjustly.

Impact on Platforms

Major platforms are already responding. Meta and TikTok have updated their Canadian moderation protocols. Reddit has announced the opening of a moderation support office in Vancouver. Smaller platforms, however, may struggle with the administrative burden — a potential concern for innovation and digital entrepreneurship.

The Canadian Radio-television and Telecommunications Commission (CRTC), the designated enforcement body, has emphasized a “grace period” of six months for companies to become fully compliant.

A Cultural Shift

Beyond the policy changes, experts see this moment as a cultural turning point. Cyberbullying, revenge porn, and mass harassment campaigns have driven Canadians offline, especially women, LGBTQ+ users, and racialized individuals. This legislation, while not perfect, aims to signal that online safety is a right — not a luxury.

While the true effectiveness of the law will only be visible over time, its passage indicates a clear and urgent political will: to make Canada's digital landscape not just free and open, but fair and safe as well.

“It’s not about censoring expression,” said one lawmaker. “It’s about drawing a line between critique and cruelty — and protecting people on the right side of that line.”
