Regulating Digital Platforms to Boost Trust: A Review of the UNESCO Guidelines

As global digital platforms increasingly shape the life of our societies, calls to regulate these firms and mitigate the associated risks are growing worldwide. Until recently, their global and virtual nature meant platforms were widely perceived as beyond the reach of regulation, primarily because of enforceability concerns.

International organisations such as UNESCO have recognised the harms associated with the intensifying network effects of platformisation and have seen an opportunity to direct the discussion. They have initiated a process towards the global governance of platforms by publishing Guidelines on platform regulation. This is a complex issue because geopolitical interests, the local context and regime type, as well as human rights, intersect at the question of regulation. 

UNESCO has shared its second draft of the Guidelines for Regulating Digital Platforms in preparation for the Internet for Trust conference, held from 21 to 23 February 2023 in Paris.

UNESCO’s Human Rights Approach

The Guidelines provide recommendations to help regulate online platforms – defined as social media, messaging apps, search engines, app stores and content-sharing platforms. UNESCO adopts a human rights-based approach and, in line with its global mandate, focuses on the rights to freedom of expression and access to information. The central question is how to treat content that is illegal under human rights law and poses harm to democracy, while protecting freedom of expression.

The Guidelines advocate for a co-regulation approach in which public, private, international and civil society actors each have a role. All four groups carry broad duties under UNESCO’s Guidelines. Civil society actors, for example, are expected to promote and enable a multistakeholder approach and to help develop and oversee the regulation of digital platforms. The Guidelines also outline the responsibility of intergovernmental organisations, highlighting their role in providing technical assistance and monitoring.

The Role of States

The UNESCO Guidelines identify two main duties for states. The first is to promote universal and meaningful access to the Internet. The second is to regulate content that compromises human rights while refraining from censoring legitimate content. States can fulfil these duties by creating an independent regulatory system that works towards a set of common principles:

  • The system and its decision-making should be independent.
  • The system should focus on systems and processes of digital platforms, not individual pieces of content.
  • There should be an external review mechanism of the regulatory system.
  • Any decision taken by the regulatory system must, where necessary, be open to review by an independent judicial system.

The regulatory system needs the power to investigate digital platforms in order to oversee their implementation of the five responsibilities outlined below.

The Role of Digital Platforms

A key aspect of the Guidelines is the duties of digital platforms, which are based on five responsibilities:

  • Platforms need to respect human rights in content moderation and curation, applied consistently across all regions, content types and users. To ensure a human rights approach in automated decision-making, the impact of such systems should be subject to independent assessment.
  • Platforms need to be transparent: they should report to the regulatory system how they fulfil the principles of transparency and explicability. They should also provide meaningful transparency in the form of practical information, such as the number of content moderators or how to access complaint procedures.
  • Platforms should empower users by providing mechanisms for reporting abuse, contributing to media and information literacy, and making their terms of service available in users’ primary languages.
  • Platforms must be accountable to relevant stakeholders and able to explain what actions were taken and why. This also encompasses effective user complaint mechanisms that meet the effectiveness criteria of the UN Guiding Principles on Business and Human Rights.
  • Platforms must conduct human rights due diligence. It is proposed that risk assessments be used to identify and address potential human rights harms.

Future Suggestions

UNESCO’s Guidelines provide an important first broad overview of how the regulation of digital platforms can be accomplished. Still, what constitutes a harm to human rights remains unclear, and the focus on risk assessments as a tool to prevent harm needs to be paired with more clearly specified enforcement mechanisms.

Moreover, the Guidelines could more centrally address inequalities. The Global North–South hierarchy, for example, is under-addressed, so the Guidelines cannot sufficiently account for which languages and local contexts are missing. When content moderation is discussed, the Guidelines do not address how content moderators, often situated in the Global South, do work for users based in the Global North. These same moderators are rarely tasked with reviewing content circulating in their own societies.

For further background research and recommendations on the issue of platform regulation, take a look at this Research ICT Africa three-part series:

Readers are invited to comment on the draft background papers and contribute to the discussion on Internet for Trust and platform regulation.

Here are some additional resources that Research ICT Africa has produced on the issues of content moderation and platform governance that can inform the ongoing discussion:

Theresa Schültken is a political scientist interested in the international governance of technologies such as AI or Digital Data. She is currently an artificial intelligence research intern at Research ICT Africa. She can be contacted at