RIA research enriches UNESCO debates

Options for regulating internet platforms to counter online harms to human rights were aired this week at a special session convened by Research ICT Africa within UNESCO’s Paris conference, “Internet for Trust – Towards Guidelines for Regulating Digital Platforms for Information as a Public Good”.

Moderating the session, RIA’s Dr Alison Gillwald summarised a research output commissioned by UNESCO and published as the conference’s working paper:

  • Part 1 covers why lies and hate proliferate online.
  • Part 2 covers problems in the platforms’ own policies for moderating content.
  • Part 3 covers a hybrid mix of regulatory arrangements, including multi-stakeholder roles.

“What is key for a multi-stakeholder role in digital governance – in rule-making, enforcement, and monitoring – is that it should be enabled through institutional arrangements that guarantee equitable, meaningful participation of all stakeholders – in many African countries that meant enabling the broader participation of civil society,” she said.

Maria Donde of UK regulator Ofcom welcomed the RIA research as providing an important perspective from the Global South and an evidence base to help devise regulatory solutions to complex problems. “It is also very helpful to map harms to human rights, and more work on this would be welcome,” she said.

She added that there was value in the RIA research’s coverage of “modularity” in regulation and in raising issues about co-operation across different regulators.

For South Africa’s information commissioner Pansy Tlakula, “we have to leverage existing regulators, not create new ones”.

She wondered whether the African Union could adapt the forthcoming UNESCO global principles for regulation. This would follow the successful pattern of the Guidelines for Access to Information and Elections, adopted by the African Commission on Human and Peoples’ Rights, which set out stakeholder roles.

Tlakula also raised the prospect of joint meetings of associations of African information regulators, communications regulators, election management bodies and information commissioners.

Another possibility she noted was whether right-to-information laws, such as South Africa’s, could be used to obtain information from big technology companies, enabling an assessment of their policies and plans.

Harvard academic Joan Donovan said that the RIA research showed that it is not just “bad actors” who are the problem, but also the platforms themselves. “These companies operate at huge scale – it’s like they build the planes, put customers in them, but without building airports for landing. We must address this issue of scale and who pays for the wreckage caused when these planes crash-land in our communities.”

Donovan proposed that companies be made to genuinely respect their own rules, with consumer privacy and data protection being a good start. This would dovetail with the insight that tech regulation should be about process, not products.

Regulation should be proportional to the problem and should set requirements for apps that collect data. Platform recommendations should include TALK – timely, accurate, local knowledge, she added.

Donde, who is also Senior Vice-Chairperson of the European Platform of Regulatory Authorities, flagged the importance of initiatives like the Global Online Safety Regulators Network.

She added: “National regulation becomes easier if we can minimise divergences around the world, and where we can have more in common and speak with one voice.”

“Not a single regulator, nor a single country, can be the solution. Cross-border collaboration is as important as cross-sectoral amongst regulators. We need to look at systemic risks of platforms, and at information gathering powers to get data from the platforms.”

Nighat Dad, from Pakistan’s Digital Rights Foundation, underlined the need to reform problematic laws, referencing legislation purportedly designed to protect women and children but which is being weaponised against journalists. She urged attention to creating mechanisms that hold governments and states accountable for ensuring regulation follows human rights frameworks.

Dad, who is also a member of the Meta “oversight board”, said regulatory bodies could learn from this institution’s experience of visibly adhering to human rights standards and of being open and transparent.

Speaking for the Inter-American Press Association, Jorge Canahuati Larach warned that “regulation of the platforms could censor news content on platforms”. He advised that any enforcement of rules against the platforms should be by fully independent regulators, and that platforms should remain the first source of moderation.

“For their part, platforms should have high standards of editorial responsibility, and should act quickly and efficiently. They must put human rights before political or commercial interests”.

Canahuati Larach said further that countries have sufficient laws to regulate illegal content, although he pointed out that there are also laws that limit legitimate content.

Canahuati Larach said the walled gardens and moderation practices of platforms are linked to their economic models, which have impacted negatively on professional journalism. “This has to be addressed as part of the solution, since journalism is a counterbalance to misinformation and disinformation,” he said.
