Content Moderation and Artificial Intelligence in the year of elections

Will 2024 be the big test of integrity for global tech platform companies?

More than 49% of the world's population will vote for their leaders this year. In Africa, 22 of 54 countries are scheduled to hold general elections. These elections are taking place amid rapid developments in artificial intelligence (AI), which means electorates will make their choices based on information that may have been created by AI, targeted at them by AI, or both. The development and use of AI in elections has not only complicated the challenges of information integrity but also raised intriguing questions about the role of AI in democracy. For example, is it ethical for politicians to use AI to tailor different political messages to different populations? What would accountability for campaign promises look like in such cases?

Last week, RIA research fellow Liz Orembo moderated a virtual event called "Content Moderation and Artificial Intelligence in the Year of Elections", organised by Paradigm Initiative as the host organisation for the Net Rights Coalition. The panel featured Bulanda Nkhowami from Digital Action, Sherylle Dass from the Legal Resources Centre (LRC), Khadija El Usman from Paradigm Initiative, Nadine Kampire and Nkhosikhona Dibiti from Community Podium, and Farai Morobane from Meta.

The regional diversity of the panel meant all the sub-regions of Africa were represented. As the conversations reflected, Africa is a continent of diverse contexts, and that diversity needs to be considered in any policy approach to content moderation. Countries have different levels of technological development and digital access, and a wide range of democratic systems. In some countries, such as Zimbabwe, offline activities shape how disinformation spreads online, while in Senegal, the citizenry still relies heavily on community radio for information. Techniques for tackling online disinformation therefore need to work at the community level.

An important point was made during the discussions about what constitutes a successful election in Africa: stakeholders mostly fear violence as a consequence of election misinformation and threats to election integrity, and when violence does not occur, elections are considered a success. But this is a cynical standard with serious consequences, as African citizens feel the gaps in accountability in how elections are managed. The result is a demoralised citizenry, low civic participation, and eroded public trust. The panellists agreed the bar needs to be raised regarding the quality of African elections.

Digital Action and the LRC called for more timely interventions and investments in information integrity in Africa by social media companies, noting that challenges faced and reported by African stakeholders are usually not addressed with the priority and urgency they deserve. The harmful effects of slow response are exacerbated by platforms' failure to enforce their own policies and community guidelines. For instance, while online gender-based violence violates many platform policies, platforms have not only left such harmful content up but have also approved it when submitted as advertisements.

Meta said that it is setting up election centres (some virtual) in African countries to oversee information integrity.

2024 will be an interesting year to reflect on and analyse how platform companies exercised accountability across different parts of the world.
