The Freedom of Invisibility: Regulating Online Speech
Writer: Suri Yilin Li
Editor: Cheryl Chen
We do not have liberty without the freedom to express ourselves. But what is the price of individual liberty?
With the rise of social media, our exposure to opinionated information is the highest it has ever been, yet the regulation of online speech remains uncertain. In a 2021 survey conducted by the Canadian Race Relations Foundation, 69% of Canadians worried more about the “impact of hate speech and racism” than the limits on freedom of expression or privacy. Yet, when asked if the Canadian government should be taking more action to prevent hate speech, 23% were unsure (Humphreys, 2021).
Today, anyone can be a self-publisher without proving to an editor that their speech is accurate or ethical (Cole, 2024). To minimize hate and discrimination, it is important to discuss the extent to which online speech should be regulated and who should hold the power to do so.
The Canadian Charter of Rights and Freedoms protects freedom of expression, but only against government interference (Canadian Charter, 1982, s 2(b)). In the private sector, social media companies' own Charter rights allow them to discriminate against content in their filtering processes. Platforms can “refuse to publish or distribute content simply because they find it offensive, distasteful, false, or unworthy for virtually any reason” (Cole, 2024). Nor are platforms obligated to publish protected speech: they routinely filter out pornography, hate speech, terrorism, spam, and irrelevant content (Canadian Charter, 1982, s 2(b)). Otherwise, user feeds would simply be filled with the most recent and most frequent posts (Cole, 2024).
While social media platforms need some degree of regulation to function, ex post content regulation (regulation that takes place after content is posted) is also a form of expression in itself (Samples, 2019). Where is the line between excessive and lax regulation? Should social media platforms even be the ones to regulate our speech?
From a political perspective, government regulation of online speech may not prove effective. The Conservative Party of Canada equates regulation of speech with “censorship” that encroaches on the right to freedom of expression (Conservative Party of Canada, n.d.). Similarly, Republicans in the United States see themselves as victims of platforms’ liberal bias and propose a “fairness doctrine” requiring equal representation of all viewpoints. Taken literally, such neutrality would mean pairing every suicide-awareness post with a pro-suicide post and forbidding platforms from removing hate speech. Conversely, liberals argue that the current regulation of speech is insufficient and threatens democracy: without it, women, racial minorities, and other already-marginalized groups are pushed further to the margins (Cole, 2024). This political divide calls into question the credibility of the government as a regulator of online speech, since policies may be crafted to further a political ideology.
However, given the social media oligopoly, private regulation may not be superior. “If there were fifty Facebooks, we would be less worried about the content moderation policies of any particular one” (Cole, 2024). With the top five social media platforms holding around 98% of the market share in Canada (Statista, 2024), each of these mega-platforms wields enormous market power. Given the right to regulate user speech, there is no guarantee that a platform’s own biases will not dominate its content.
Thus, the paradox of regulating online speech: “we are uncomfortable with the government doing it; we are uncomfortable with social media or media titans doing it. But we are also uncomfortable with nobody doing it at all” (Bazelon, 2022).
As social media users, it is equally important for us to be mindful of how we interact with content. In an interview, Havergal’s Executive Director of Equity, Diversity, Inclusion and Belonging, Nicole Cozier, points out that “The space is the space, [social media is] inherently neutral.” We have a responsibility as consumers of and contributors to social media to reflect on why there is even a need to regulate content in the first place.
Ms. Cozier explains that confirmation bias is “the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values while giving disproportionately less consideration to alternative possibilities.” When scrolling through social media, how often are we conscious of our biases, of where our information comes from, or of where our sources’ information stems from? This blind belief in what we see, facilitated by social media’s algorithms, creates a positive feedback loop: we are continuously fed information we already believe and, in turn, place even more credence in the information we are fed.
As content contributors, Ms. Cozier notes, we may find ourselves stuck in an echo chamber, reinforcing the intensity of our beliefs by talking to others who share them. These echo chambers occur in person but are “magnified online because it’s so easy to find spaces of people that believe the same thing”. In fact, Ms. Cozier states that in digital spaces it is much easier to “become entrenched in our beliefs and hostile to those that don’t share in our beliefs”. As polarization amplifies, we “lose a sense of humanity and human connection” in virtual spaces.
Social media has immense potential to benefit us as a communication and information platform: it can connect those who feel isolated, provide access to different perspectives, encourage pause and reflection before a response… Yet, Ms. Cozier reminds us that “The challenges of social media isn’t just because it exists, the challenges arise from how we are using it.”
References
Bazelon, E. (2022). The Disinformation Dilemma. In Social Media, Freedom of Speech, and the Future of our Democracy (pp. 41–52). Oxford Academic. Retrieved February 2, 2025, from https://doi.org/10.1093/oso/9780197621080.003.0003
Canadian Charter of Rights and Freedoms, s 2(b), Part I of the Constitution Act, 1982, being Schedule B to the Canada Act 1982 (UK), 1982, c 11.
Cole, D. (2024, March 21). Who Should Regulate Online Speech? The New York Review. https://www.nybooks.com/articles/2024/03/21/who-should-regulate-online-speech/
Conservative Party of Canada. (n.d.). Repeal Liberal Censorship. Conservative Party of Canada. https://www.conservative.ca/cpc/repeal-liberal-censorship/
Humphreys, A. (2021, January 25). Canadians want online hate and racism curbed, even at cost of freedom of speech, poll finds. National Post. https://nationalpost.com/news/canadians-want-online-hate-and-racism-curbed-even-at-cost-of-freedom-of-speech-poll-finds
Samples, J. (2019, April 9). Why the Government Should Not Regulate Content Moderation of Social Media. Cato Institute. https://www.cato.org/policy-analysis/why-government-should-not-regulate-content-moderation-social-media
Statista. (2024). Leading mobile social media websites in Canada in March 2024, based on share of visits. Statista. https://www.statista.com/statistics/696537/canada-share-social-mobile/