Introductory Session

What is a Worksheet?

Each advisory group session will be supported by a worksheet, like this one, made available to the group in advance of each session. The goal of these worksheets is to support the discussion and organize feedback and input received. These worksheets will be made public after each session.

Each worksheet will have a set of questions to which group members will be asked to submit written responses. A non-attributed summary of these submissions will be published weekly to help conduct the work in a transparent manner.

The proposed approach in each worksheet represents the Government’s preliminary ideas on a certain topic, based on feedback received during the July-September 2021 consultation. It is meant to be a tool to help discussion. The ideas and language shared are intended to represent a starting point for reaction and feedback. The advice received from these consultations will help the Government design an effective and proportionate legislative and regulatory framework for addressing harmful content online. Neither the group’s advice nor the preliminary views expressed in the worksheets constitutes the final views of the Department of Canadian Heritage or the Government of Canada.

Objectives

  1. Obtain your observations on the previous proposal and online content regulation in general. From July 29 to September 25, 2021, the Government of Canada ran a public consultation on its proposed approach to address harmful content online. Your feedback is now being sought to help identify what worked and what did not. Your views are also being sought on what you think should be included in any successful online content regulation framework – and what you think should be avoided.
  2. Introduce and gather feedback on the process for the sessions going forward. You will be provided with sufficient time and opportunity to be heard and voice your concerns, ideas, and recommendations. To that end, you will be asked for written submissions to the proposed discussion questions two days prior to each session. The discussion questions are included at the end of each worksheet. While your submissions will support and frame discussions, the sessions should not simply replicate your written responses. You will be able to revise your written submission and provide an updated draft after the sessions.
  3. Identify further themes that you want to explore in some depth, in addition to what is outlined in the itinerary. Each session will focus on a different key aspect of the Government’s proposal to regulate harmful content online. The topics reflect the Government’s understanding of the main issues raised in the July-September 2021 consultation. Any input on the selection of main topics to discuss is welcome.

Starting Points

Respondents to the Government’s consultation in summer 2021 generally expressed an appetite for a balanced, proportionate and practical framework to address harmful content online in Canada. Many voiced a preference for a regime in which any limits on Canadians’ rights would need to be proportionate to the Government’s objectives and stressed the importance of promoting an online environment in which all Canadians can exercise and enjoy their fundamental rights, including the freedom of expression, and the right to equality. Respondents brought practical considerations to light as well, emphasizing that the most effective way to address the plethora of harmful content online is by regulating and overseeing the processes and systems that social media platforms and other online services have in place.

In considering how to reframe the proposal shared for consultation, the policy approach put forward by the United Kingdom in its Online Safety Bill may be a helpful model to follow. The UK model would establish a new statutory ‘duty of care’ to make platforms more transparent and accountable for the safety of their users. It would give regulated entities the flexibility to tackle harms caused by content on their services in the most efficient and effective way. Similarly, Canada’s updated framework could focus on the systems, tools, and approaches that online services have in place to address harmful content. The framework outlined in the July-September 2021 consultation focused on what regulated services would be obligated to take down and how they would be required to do it, including a specific timeframe to abide by. An updated approach may pivot from a regime based on rigid moderation obligations to a general framework that compels platforms to assess the risk posed by harmful content on their services and provide details about how they will mitigate the risks identified and respond to instances of online harm on their platforms.

Platforms themselves have an informational and experience advantage when it comes to content moderation. Governments, researchers, experts and civil society are faced with a dearth of information on the prevalence and moderation of harmful content online. In contrast, online services have been making decisions – without transparency, accountability, or oversight – about what content should or should not be on their platforms for years. A legislative and regulatory framework for online safety must both confront this information and experience asymmetry and navigate its reality. Pivoting to a systems-based approach would aid on both fronts. It would require regulated services to generate much more information and data regarding the prevalence and moderation of harmful content. It would impose a set of general obligations to monitor, moderate and manage harmful content and its risks. Finally, it would give regulated services a measure of flexibility to determine how to moderate and monitor content on their platforms and decide on the best approach to meet their general obligations.

In light of the above, an updated legislative and regulatory framework should focus on three goals:

  1. Reduce the amount of harmful content online and the risk it poses to Canadians while operating within the parameters of the Canadian Charter of Rights and Freedoms. The legislative framework would aim to foster a safe and inclusive internet where people in Canada feel they can express themselves without being victimized or targeted by harmful content.
  2. Bring greater transparency, oversight and accountability to content moderation in Canada. A key intended function of the framework will be to set transparency and reporting obligations on platforms that will enable the Government to better understand the prevalence of harmful content and assess the efficacy of the platforms’ content-moderation policies and practices.
  3. Have an effective regulatory regime, supported by an emphasis on norm-setting, guidance, and collaboration. Legislative frameworks to regulate harmful content online in other jurisdictions such as Australia suggest that major social media platforms are almost always responsive to guidance and requests from governments when it comes to harmful content. Building a strong foundation for engagement, collaboration and partnership between regulated platforms and a new regulator would help bring about effective change. While transparency is limited and information and data are sparse at present, an effective regulatory regime can, over time, support a more robust information ecosystem and norms around the monitoring, moderation and management of harmful content online.

Overview of Proposed Approach

Major social media platforms and services most responsible for harmful content would be included in the regime. The framework would put forth a definition of “online communication service provider” that captures services that Canadians intuitively associate with the term social media platform (such as Facebook, YouTube, Instagram, Twitter, and TikTok), and services that, though not typically referred to as social media platforms, pose significant risk in terms of proliferating harmful content, like PornHub. Legislation would give the Government some discretion to specify services that would be excluded from the regime. This discretion would help to ensure a flexible and nimble framework.

The framework would seek to regulate the types of harmful content online that Canadians expect to be regulated. Like the previous proposed framework, the framework would regulate five categories of harmful content – child sexual exploitation content, the non-consensual sharing of intimate images, terrorist content, content that incites violence, and hate speech. The broad definitions for the content identified at inception would be drawn from the Criminal Code, jurisprudence, and other Canadian legislation. Other types of harmful content could be examined for possible inclusion in the regime.

A new regulator, the Digital Safety Commissioner, would be created to administer and enforce the framework. Legislation would grant the Digital Safety Commissioner the necessary authorities and powers to ensure it can effectively administer, oversee, and enforce the legislation and regulations. To ensure that regulated entities are held accountable for the commitments they make, as well as for the information they share publicly, the Commissioner would be equipped with audit and inspection authorities. The Commissioner would also be given the necessary tools to both deter and address non-compliance, including the ability to impose penalties and use harsher compliance tools as a measure of last resort.

A duty of care on platforms to take reasonable steps to identify and mitigate foreseeable harm arising from the operation and design of their services would be asserted. Regulated entities would be compelled to file Digital Safety Plans (DSPs) with the Digital Safety Commissioner. The DSPs would require that regulated entities conduct a risk assessment of the harmful content on their platforms, and detail their mitigation measures, systems and processes to address such risks. To help ensure accountability, the Digital Safety Commissioner would certify platforms’ DSPs, and issue guidance and orders where necessary to ensure the plans comply with legislative and regulatory requirements, or other rules set by the Commissioner. To bring about greater transparency, regulated entities would be required to outline information regarding their DSPs in publicly available terms of service and specify how their users are protected from harmful content on their platforms.

The framework would promote and protect the freedom of expression, equality rights, and privacy rights. A number of safeguards would be included in the proposal to help mitigate risks to Canadians’ fundamental rights. Legislation would require that in fulfilling their obligations, all regulated entities consider the impact of their safety policies and procedures on their users’ freedom of expression, equality rights, and privacy. Legislation would also introduce circumscribed definitions for regulated entities and regulated content that would mitigate the risk of the regime being overbroad. In addition, the framework would introduce effective and efficient procedural fairness mechanisms, including obligations on regulated entities to give notice to users, allow representations to be made, and grant the opportunity to request a reconsideration of specific content moderation decisions. Finally, legislation would mandate that users’ data and privacy be preserved through the mechanisms to protect privacy when making complaints and representations, including in camera hearings where appropriate, and necessary safeguards when preserving data and sharing information with other Government agencies, including law enforcement.

Complementary amendments to the Canadian Human Rights Act (CHRA) to combat online hate speech, as in the former Bill C-36, would be introduced. The CHRA would define as a “discriminatory practice” the communication by internet or other telecommunications of hate speech in contexts where it is likely to foment detestation or vilification on prohibited grounds of discrimination. The prohibition would apply directly to users who communicate or cause the communication of hate speech, including on platforms regulated by the Digital Safety Commissioner, and to operators of websites and other online forums that publish the communications of their users; it would not apply to the operators of platforms regulated by the Digital Safety Commissioner. Individuals and groups would be able to file complaints with the Canadian Human Rights Commission, which would decide whether to refer them to the Canadian Human Rights Tribunal for adjudication. The Tribunal would be empowered to order that respondents delete and cease communicating the hate speech; compensate any victim specifically identified in the hate speech in an amount not exceeding twenty thousand dollars, but only if the respondent is an author of or contributor to the hate speech and not a mere publisher or intermediary; and pay a penalty of not more than fifty thousand dollars if required to induce compliance.

Supporting questions for discussion

  1. Obtain your observations on the previous proposal and online content regulation in general.
    1. Is there important feedback that you believe was missing from the What We Heard Report?
    2. What struck you most about the legislative and regulatory frameworks coming from other jurisdictions? What can we learn from their successes?
    3. Other than a marked reduction of harmful content online, what would the main policy goals be for a successful revised framework? What authorities, tools, and obligations would you introduce to achieve those goals?
    4. What are the biggest mistakes that you think are being made by other governments trying to regulate harmful content online? What can we learn from their mistakes?
  2. Introduce and gather feedback on the process for the sessions going forward.
    1. Does the proposed process get us to a place where we can achieve our objectives of (a) engaging with you on our preliminary thinking and (b) gathering your insight and recommendations regarding the major elements of a regulatory framework?
    2. Are you being given adequate opportunity to voice your concerns and engage with fellow participants, and the Government?
  3. Identify further themes that you want to explore in some depth, in addition to what is outlined in our itinerary.
    1. Are there any additional broad thematic concerns that you would like to discuss?
    2. Are there any specific items that you would like to add to any of the existing sessions?

This publication is available upon request in alternative formats.

This publication is available on the Internet at:

Canada.ca/Canadian-Heritage

©Her Majesty the Queen in Right of Canada, 2021

Catalogue number: XX0-0/0000

ISBN: 0-000-00000-0
