
Protecting children online: How the new federal legislation aims to tackle online abuse

A Sea to Sky MP and an expert weigh in on Bill C-63, the federal government's legislation to reduce online harms, including sextortion and online abuse targeting children.
The Online Harms Act is intended to shield children and youths from abuse on the internet.

Can government intervention protect our kids online? That question is being hotly debated across the country.

The Online Harms Act is meant to be a dam in the river of abuse that is currently happening online. The federal legislation is intended to hold social media platforms accountable for addressing harmful content on their platforms and for creating a safer online space, especially for children.

On Feb. 26, federal Minister of Justice Arif Virani introduced Bill C-63 to create the new act.

The problem

It is no secret that abuse happens online.

An increasing concern has been the abuse of children.

One sobering statistic is that there has been a 150% increase in reports of sextortion to Cybertip.ca, a Canadian tipline for online child sexual abuse and exploitation.

Sextortion, just one form of online abuse targeting children, is blackmail: someone threatens to distribute naked pictures or videos of a child unless money or more images are sent.

According to Cybertip, often young men are being tricked into thinking they are speaking to a girl their own age. The supposed love interest online sends a photo or video, asks for one in return and as soon as it is sent, the blackmail begins.

"Immediately after receiving the sexual content, the sextorter makes their demands. If a young girl is victimized, the sextorter typically demands additional sexual photos and videos. If the sextorter targets a boy, they almost always demand money instead," reads Cybertip’s website.

Patrick Weiler, Member of Parliament for West Vancouver—Sunshine Coast—Sea to Sky Country, is championing the bill and says it shifts the onus of responsibility onto the companies, rather than leaving it up to youth or their parents to navigate an obstacle course of abuse.

"What we see is social media and other online services, really not taking steps to have rigorous measures to protect against this, and it's having a very real-world impact that, in the worst of cases, is making it deadly, and really offloading this responsibility onto parents," he said.

"Instead, putting it on the social media companies and other web companies to come up with a plan to mitigate those risks, and then ensuring that they're reporting on that over time to make sure we're holding them accountable to ensure that they produce a safe space."

What does the bill say?

It is a complex bill with many moving parts, but, at its core, the proposed legislation aims to reduce seven types of harmful content:

• Content that sexually victimizes a child or revictimizes a survivor;

• Intimate content shared without consent;

• Content used to bully a child;

• Content that induces a child to harm themselves;

• Content that foments hatred;

• Content that incites violence; and

• Content that incites violent extremism or terrorism.

Once the bill is passed into law, a new Digital Safety Commission would be created to enforce it, and a Digital Safety Ombudsperson would provide support for online users and victims.

It would also require the mandatory reporting of internet child pornography by those who provide an internet service.

The act would also bring changes to the Criminal Code "to better address hate crime and hate propaganda," according to the federal government.

In addition, there would be changes to the Canadian Human Rights Act that would allow individuals and groups to file complaints against those who post hate speech.

How will it work?

Companies like Meta and X (formerly Twitter) would have to offer "clear and accessible ways" to flag abusive content and block users.

The companies would need to put in place "special protections for children; to take action to address child sexual exploitation and the nonconsensual posting of intimate content, including deepfake sexual images; and to publish transparency reports," the federal government states.

Weiler noted that if the companies failed to comply, they would be hit in their profit margin.

"There's the ability to levy administrative monetary penalties of up to 6% of global revenues, which is a massive amount when you're talking about companies like [X], like Meta, and so that will have a sufficient deterrent effect to really hold them accountable to meeting this," he said, adding that initial responses from at least some companies have been positive.

"They want to work with the government to make these safer spaces. So, we hope it doesn't come to that [levy], but it's important that there is that heavy stick in case they are not willing to comply."

Free speech

Would such a bill mean online users could be punished for simply saying things the government doesn't like?

Weiler says no.

He said the aim is to remove child sexual abuse material or intimate images posted without consent, and the like.

"I think it's very hard to argue that that content should not be immediately removed," he said.

In terms of hate speech, the bill proposes creating an offence for advocating or promoting genocide against an identifiable group.

"There's content and speech that is awful, but still lawful," Weiler said. "And so, it needs to reach a very, very high standard and that standard that's been set by the Supreme Court of Canada to be able to qualify as hate speech."

An expert weighs in

Heidi Tworek is a Canada Research Chair and associate professor of International History and Public Policy at UBC.

She was also a member of the expert advisory group on online safety convened by the Heritage Ministry in the summer of 2022, during the early stages of the act's development.

Looking at the issue of online abuse in a historical context, Tworek said there has been a "waxing and waning of different kinds of speech” over time.

"What we're seeing at this moment is, sadly, a waxing of lots of speech that is really problematic, or causing a lot of, in this case, real-world harms in different ways."

One of the most widely known cases of online abuse turning deadly was that of 15-year-old Amanda Todd, who died by suicide in 2012 after relentless cyberbullying.

Most recently, in November of last year, a 12-year-old boy in Prince George died by suicide shortly after being sextorted online.

"There is no silver bullet to solve any of these things," Tworek added. "But ... in a democratic society that values freedom of expression, there are also still types of expression that are illegal. And in this case, it's to do with children or intimate images, and can we reduce the amount of that we see? I see this bill as trying to do that while also ensuring freedom of expression."

Tworek noted history has also shown that governments can go too far in their restrictions on speech.

"That's one of the historical cautions is that the state can also go much too far, and we need to find ways to balance those two things," she said.

"There's a real attempt to try and have that balance of freedom of expression and then things that are illegal put up without consent. And so that's how I see this bill as trying to sort of thread that needle."

Asked if she is concerned that companies like X and Meta won't comply with the rules, Tworek said the bill is not reinventing the wheel.

What it proposes is already in place, and being complied with, in other jurisdictions, such as the U.K. and Australia, she said.

"There's nothing in this bill where you would say, 'Gosh, this is such an outlier from Australia, or the European Union, or the UK, that is something that [a] company is going to step away from.'"

She said another important aspect of the bill is that it includes planned research and data collection into how it is working—whether companies are complying and if it is reducing real-world harms.

While some of the details will be worked out in the coming months, at its core, the bill isn't trying to do something very radical, at least in terms of protecting children from harm, Tworek said.

"I think we're quite familiar with the idea that other products for children should be safe. In a way, you know, it's not so super radical to say that, just like a highchair should be safe, [other] products children use should be safe; and then, it's a question of what does that look like in the digital realm."

What is next?

Next will be further debate about the bill.

“As important as it is to pass this sooner, it's important that it's not rushed,” said Weiler. “Because we want to make sure that we get it exactly right. So that we ensure that we're not violating aspects of the [Canadian Charter of Rights and Freedoms], while we make sure that we are protecting against harm online. So I'm sure it will take some time.”

In Canada, for a federal bill to become law, it must be approved by both the House of Commons and the Senate. The legislative process involves debate, review and voting.

Once a bill is passed by both chambers, the Governor General grants Royal Assent, making it law.

This process can take months or years to complete.

Both Weiler and Tworek noted that in the meantime, it is positive that the bill is prompting more discussion both of the abuses that occur online and what we as a society want to do about them.

“And I think that's really important,” Weiler said.

Read much more about the bill on the Government of Canada website.
