Fighting online hate speech

In 5 seconds

A UdeM law professor is helping lawmakers tackle toxic online content.

An expert advisory group on online safety was set up on March 30, 2022 to advise the Minister of Canadian Heritage on the government’s proposed legislative and regulatory framework for combatting harmful content online. We sat down with the group’s co-chair, Pierre Trudel, professor in the Faculty of Law at the University of Montreal.

Why was this advisory group set up?

Last summer, the Canadian government announced it was taking steps to update laws to ensure that what is deemed hate speech in the real world is also banned in the digital world. Why is hate speech prohibited in a restaurant but not online? The government wanted to correct this anomaly and started by launching public consultations. Everyone agreed on the importance of fighting harmful speech, but many also stressed that if it wasn’t done right, we could end up with forms of censorship that would be worse than the problem we’re trying to fix. After analyzing the various points of view expressed during these consultations, the government set up an advisory group of 12 experts with diverse knowledge and experience. Its members are not all from academia; for example, some work in child protection, others in anti-racism.

The advisory group will support the Minister of Canadian Heritage in drafting legislation to combat not only online hate, harassment and threats, but also online distribution of child pornography and images of child abuse.

Canada already has laws to protect freedom of expression and laws against hate speech. How can we do more without opening the door to more censorship?

Since freedom of expression was enshrined in Canada’s constitution in 1982, the courts have ruled that a number of provisions in the Criminal Code prohibiting hate speech and harassment constitute reasonable limits on freedom of expression and are proportionate.

Many people confuse hate speech with speech they hate. We need to distinguish between what should be prohibited and what is protected by freedom of expression. Freedom of expression isn’t just the freedom to say and hear what you find acceptable; it also means tolerating comments you wholeheartedly disagree with. Striking a balance is tricky because what one person considers an acceptable limit, another does not. In short, the only predictable and legitimate limits are those prescribed by law.

What are the plans for regulating digital environments?

Digital environments come in many shapes and sizes. Broadly speaking, they can be categorized as public or private, and the two operate on different business models.

When people are communicating privately, for example in a phone conversation, the laws on hate propaganda don’t apply. In the private sphere, a person can express repugnant views about someone else and this is not considered a crime. However, it is illegal even in the private sphere to send a threatening message.

Digital technology is blurring the boundary between private and public. For example, a person can very easily move a message received privately on Twitter into the public domain. The question is: how do we draft acts and regulations suited to online environments, which are light years away from the world of posting letters in the mail?

Do we need to rethink the issue of territorial boundaries?

Definitely! In general, the courts tend to recognize that when a message is directed toward a specific population, the laws of that country apply. Just because a company is located beyond our borders doesn’t mean that it’s no longer bound by our laws. Someone who does business in Canada and posts hateful comments about a segment of our population should be sanctioned under Canadian law. Likewise, someone who non-consensually distributes intimate images using the services of a foreign-based company cannot evade our laws if those images are accessible in Canada.

What can the courts do about the alarming rise in the non-consensual distribution of intimate images?

It isn’t always easy to distinguish between consensual and non-consensual sharing of intimate images; verification mechanisms are needed. Non-consensual distribution of intimate images existed in the days of glossy photos, but it generally remained confined to a small circle. The big difference is that today such images can very quickly end up in digital environments with a global reach.

When we amend the laws to regulate digital environments effectively, we also have to make sure we can act quickly so that victims have appropriate remedies within an appropriate timeframe.

So new “cyberjustice” laws are needed?

Yes! And the University of Montreal is a pioneer in this regard. For years the Cyberjustice Lab led by my colleague Karim Benyekhlef has been working to develop legal processes that fit online realities. At first even the idea of “cyberjustice” was met with scepticism in legal circles. But it has become increasingly clear that we have to move fast when something’s online. Try telling someone whose intimate photos are flying around the planet, “we’ll go to court and maybe in 10 years we’ll know if you have the right to get them taken down…”

This gives you an idea of the challenges we’re up against. We have to establish rules of law adapted to the speed of cyberspace. We have to have rapid access to independent judges to sort out what can and cannot be distributed freely. If the mechanisms are too crude, there’s a risk of excessive censorship of content that doesn’t pose the danger against which we want to protect people. This could undermine the legitimacy of our laws.

Are there useful models from other countries?

Countries like France and Germany have enacted laws targeting the most harmful online content. Germany, for example, adopted a law, heavily criticized at first, that requires social media platforms to take down manifestly illegal hate speech within 24 hours of being notified.

When will the expert advisory group’s work wrap up?

The group is expected to complete its work in a few months. It was set up to provide feedback on the changes proposed by the Canadian government, so it won’t last forever. It’s basically a large-scale collaborative research project involving government policy-makers and researchers from different domains. It’s similar to collaborative research between universities, civil society groups and industry, except in this case, because it’s legal research, our “industry” partner is the Government of Canada.
