by Enrico Calandro
On 4 March 2015, the Film and Publications Board (FPB) of South Africa, an office of the Department of Communications, published a draft Online Regulation Policy (Notice 182 in Government Gazette 38531).
The draft Online Regulation Policy (Draft Regulation) proposes far-reaching regulation of online content. The Draft Regulation is controversial not only in principle, as it would restrict freedom of expression online, but also from a more practical perspective, with regard to implementation. This blog post discusses the potential impact of such regulation on users’ civil and political rights, particularly freedom of speech and expression, given that it extends well beyond the FPB’s remit of protecting children through a safe media environment. It also explores the unintended consequences of implementing such a regulatory system.
The classification of online content before it is published restricts freedom of expression, and any pre-publication review of content must comply with democratic and constitutional norms. This position is supported by eminent scholars of internet governance, who have expressed concern that filtering content through classification can significantly harm freedom of expression in democratic societies (see Brown, 2008; McIntyre, 2013).
As observed by Kreimer (2006, in McIntyre, 2013), online content classification may create a risk of invisible and unaccountable “censorship by proxy” as it combines three different regulatory trends to control information:
First, once a classification mechanism is established for a narrow, protective purpose, it can easily be extended to achieve different goals. In this respect, Mueller (2010) has argued that, under the aegis of child protection, internet content regulation can result in “networked censorship”. Mueller (2010:190) observes that “emotional appeals to ’the children’ have deliberately been exploited as the entering wedge for a broader reassertion of state control over internet content”.
Second, regulation focused on intermediaries such as internet service providers (ISPs) and social media platforms (Boyle, 1997; Swire, 1998) may result in the over-blocking of online content. Since these private players have greater technical capability to screen communications, they can act as “internet points of control” (Zittrain, 2003). The volume of content that does not comply with classification guidelines may be disproportionate, and online platforms and other intermediaries may favour an over-blocking system that protects them from government sanctions rather than one that protects users’ content from censorship (Kreimer, 2006).
Third, the use of self- or co-regulation in preference to legislation as a starting point for regulating information technology (Koops et al., 2006) offers governments the opportunity to outsource enforcement. It also minimises the accompanying costs and indemnifies them against claims, loss or damage arising from online content classification systems.
In the last decade, in an attempt to limit the online availability of content harmful to children, internet content regulation has become common practice in several democratic countries and has been implemented in numerous jurisdictions (McIntyre, 2013). In addition, many information and communications intermediaries have adopted their own systems to filter such content, often self-regulating to avoid more formal regulation.
In Australia, for instance, a classification regime is in place under the Australian Communications and Media Authority, which has the power to enforce restrictions on internet content hosted in Australia and to maintain a “blacklist” of overseas websites for use in filtering software. However, industry players and activists report anecdotally that the online classification scheme is almost wholly ineffective. The scheme is complaints-driven and imposes obligations on domestic hosts. It potentially includes obligations to block content of foreign hosts, but this aspect has never been implemented. A major push from 2008 to 2012 to introduce a more comprehensive website-blocking regime was defeated by civil society organisations (Suzor, informal conversation, May 2015).
Australia is, moreover, about to pass a bill introducing blocking of websites that facilitate copyright infringement. The term “facilitate” is not entirely clear, however, and is the subject of an ongoing inquiry (Suzor, informal conversation, May 2015).
In the UK, the response to child abuse images has since 1996 been led by industry. The Internet Watch Foundation (IWF), a private body funded by the internet industry and the EU, acts in collaboration with the police and government: it receives public complaints and determines whether webpages contain potentially illegal material. Although the system has developed independently, without any legislative basis, it has limited procedural safeguards and no judicial oversight (McIntyre, 2013).
Among content oversight initiatives from the private sector, YouTube has community guidelines in place to moderate the video platform at the backend and to guide users on which content is appropriate and allowed to be published. In addition, the platform has a reporting system that allows users to flag harmful or inappropriate content against YouTube’s community guidelines. Age and country restriction mechanisms have also been adopted.
Why the South African Draft Regulation could result in online “censorship by proxy”
From a constitutional and human rights perspective, the practice of pre-classifying online content, as envisaged by the Draft Regulation, goes against rights protected by the South African Constitution and its Bill of Rights, as well as by the Universal Declaration of Human Rights (UDHR).
The right to freedom of expression, as envisaged by the Constitution of the Republic of South Africa (Bill of Rights), is inspired by Article 19 of the UDHR, which states that “everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers”.
Section 16 of Chapter 2 of the South African Constitution (the Bill of Rights) grants everyone the right to freedom of expression, including not only “freedom of the press and other media” but also “freedom to receive or impart information or ideas”.
Users’ capacity to produce and distribute content is enhanced by social media, which not only provide platforms to receive or impart information or ideas but have also been driving mobile internet connectivity in African countries, including South Africa (Stork et al., 2013). Nevertheless, the Draft Regulation imposes severe restrictions primarily on users’ freedom to impart information and ideas through social media, as it requires that “as at 31st of March 2016, no online distributor shall be allowed to distribute digital content in the Republic of South Africa unless such content is classified in terms of the Board’s Classification Guidelines” (Art 5.4.2, Draft Online Regulation Policy, FPB). As a consequence, users’ freedom both to receive and to impart content is compromised by the content filtering and blocking that the proposed classification and pre-approval system would put in place. These preventive measures may put a brake on a positive trend of increasing internet access and use via social media, and indeed on the hallmark promotion of freedom of expression in South Africa.
While the South African Constitution grants this right, it excludes from protection “a. propaganda for war; b. incitement of imminent violence; or c. advocacy of hatred that is based on race, ethnicity, gender or religion, and that constitutes incitement to cause harm”.
However, the Board’s pre-publication authorisation of content is not limited to the specific categories of expression restricted by the South African Constitution; it appears to apply to all digital content produced. For this reason, the Draft Regulation may result in a system of “networked censorship”, since the Board’s Classification Guidelines can cover any kind of content that someone may find objectionable but that is not harmful to children and does not fall within the specific limitations on freedom of expression provided for in section 16 of the Constitution. In addition, the 1996 Films and Publications Act, which established the Film and Publication Board, its mandate and its objectives, requires only distributors of films and games to register as distributors. The Draft Regulation therefore lacks the authority to require any online content platform or online content distributor to register with the Board (Limpitlaw, 2015). On this basis, the Draft Regulation appears to extend beyond the FPB’s remit, whose main aim is to protect children from exposure to disturbing and harmful material, and from premature exposure to adult material, by classifying films, games and ‘certain publications’. It has been observed (Jørgensen, 2013) that measures to prevent the use of the internet to violate the rights of children must be narrowly targeted and proportionate, and that due consideration must be given to their effect on the free flow of information online.
In relation to the protection of children’s rights, article 25 of the UDHR protects childhood by entitling children to “special care and assistance”. In terms of child protection on the internet, this translates, on the one hand, into giving children the freedom to use the internet and, on the other, into protecting them from the dangers associated with it (Jørgensen, 2013).
From a practical perspective, considering the amount of digital content produced and distributed across the internet, classifying all online content is unfeasible. Several measures included in the Draft Regulation place both financial and procedural burdens on content and platform providers. The Draft Regulation requires any online platform or online content provider to apply “for registration as film or game and publications distributor” (Art 5.1.1) and for “an online distribution agreement” in order to “classify its online content on behalf of the Board, using the Board’s classification Guidelines and the Act” (Art 5.1.2). It further requires the “payment of the fee prescribed from time to time by the Minister” (Art 5.1.2) “for each title submitted” (Art 5.1.3), the provision of “facilities to store all classified content for audit and related purposes” (Art 5.1.5), and the display of the “Film and Publication Board classification rating and logo” (Art 5.1.9). This draconian mechanism of co-regulation may create a system in which it is easier for intermediaries to over-block online content than to deal with complaints, and which gives the government and the FPB the opportunity to monetise networked censorship.
Conclusions
In its current form, the Draft Regulation is too broad, generally unworkable, open to abuse, and unconstitutional. Its likely negative impact on the public’s ability to produce and access content online, and on online media freedom, shows it to be overly restrictive.
Government’s ability to control which content is available online has profound implications for free expression and censorship. Beyond the more obvious negative impact on users’ civil and political rights, there is a need to understand the unintended consequences of online content classification and its links to censorship and freedom of expression within the polity. In particular, we need to understand better how this form of regulation may affect internet use and access in low- and middle-income countries such as South Africa. This must be assessed before putting in place a system that may hamper a positive process of internet uptake driven by user-generated content and social media. Regulating online content through classification may limit users’ ability to create and post local content on social media, and may therefore have severe implications for the demand-side stimulation strategies acknowledged in the National Broadband Policy, SA Connect, as essential to driving broadband adoption.
The main objective of government should be to preserve the internet as an engine of social and economic development, and therefore to create the conditions for political, social and economic innovation, enabling all stakeholders – governments, businesses and users – to contribute to the maintenance and growth of the network.
An internet policy should focus on creating a favourable investment environment that allows the ICT sector to grow, with content production as a key enabler of that growth.
An internet policy should encourage and facilitate an open and competitive online landscape, reaffirming users’ and platform providers’ rights to free speech and expression.
Systems for reporting child pornography or harmful content should place no further restrictions on digital content production and distribution than those envisaged in the Bill of Rights and, for offline content, in the 1996 Films and Publications Act. The identification of the most appropriate mechanism for enforcing the rule of law in the online space should result from a consultative process between intermediaries, online users, the government, civil society organisations and academia. The FPB’s role in this process should be proportionate to its authority over specific categories of content and should remain within the objectives and mandate of the 1996 Films and Publications Act.
User- and private sector-driven initiatives may support the FPB’s process of identifying content that is potentially disturbing or harmful to children in particular age groups, or that falls under the restrictions imposed by the South African Constitution. Since community guidelines have already been implemented by platform providers such as YouTube, it is recommended that the FPB draw on the work done on these guidelines and collaborate with intermediaries and civil society organisations to institutionalise an open system for reporting harmful content and users who do not respect the constitutional limits on freedom of expression or who distribute child pornography.
Take Action!
Many civil society organisations have already expressed their concern about, and dissent from, the Draft Regulation, described by the Electronic Frontier Foundation as “Africa’s Worst New Internet Censorship Law”. On Friday 22 May, the Association for Progressive Communications, the SOS Coalition, Right2Know and the Freedom of Expression Institute convened a roundtable in Johannesburg to unpack and understand how the FPB’s Draft Regulation will affect users’ civil and political rights and the internet sector more broadly. Participants in the meeting, who included representatives from civil society organisations, the private sector, media groups, library associations, and internet and telecommunications industry associations, all agreed that the Draft Regulation must be scrapped.
The Film and Publication Board has opened a public consultation running until 15 July 2015. Submissions should be emailed to policy.submissions@fpb.org.za or hand-delivered to the FPB head office at ECO Glade 2, 420 Witch Hazel Street, ECO Park, Centurion, 0169, marked for the attention of Ms Tholoana Ncheke.
Another way to stop the FPB’s Draft Regulation is to support the #HandsOffOurInternet petition and social media campaign launched by the South African coalition Right2Know.
References
Boyle, J. (1997). “Foucault in Cyberspace: Surveillance, Sovereignty and Hardwired Censors”, University of Cincinnati Law Review, 66, 177.
Brown, I. (2008). “Internet Filtering: Be Careful What You Ask for”, in S. K. Schroeder and L. Hanson (eds), Freedom and Prejudice: Approaches to Media and Culture, Istanbul: Bahcesehir University Press.
Jørgensen, R. F. (2013). “An Internet Bill of Rights?”, in I. Brown (ed.), Research Handbook on Governance of the Internet, Cheltenham: Edward Elgar Publishing.
Koops, B.-J. et al. (2006). “Should Self-Regulation Be the Starting Point?”, in B.-J. Koops et al. (eds), Starting Points for ICT Regulation: Deconstructing Prevalent Policy One-Liners, The Hague: T.M.C. Asser Press.
Kreimer, S. (2006). “Censorship by Proxy: The First Amendment, Internet Intermediaries, and the Problem of the Weakest Link”, University of Pennsylvania Law Review, 155, 11.
Lessig, L. (1999). Code: And Other Laws of Cyberspace, New York, NY: Basic Books.
Limpitlaw, J. (2015). Film and Publication Board Draft Internet Regulation Policy – Framing Discussion. Presentation, 20 May 2015.
McIntyre, T. J. (2013). “Child Abuse Images and Clean Feeds: Assessing Internet Blocking Systems”, in I. Brown (ed.), Research Handbook on Governance of the Internet, Cheltenham: Edward Elgar Publishing.
Mueller, M. (2010). Networks and States: The Global Politics of Internet Governance, Cambridge, MA: MIT Press.
Stork, C., Calandro, E., and Gillwald, A. (2013). “Internet Going Mobile: Internet Access and Use in 11 African Countries”, info, 15(5), 34–51.
Suzor, N. (2015). Informal conversation on online content classification, 13 May 2015.
Swire, P. P. (1998). “Of Elephants, Mice, and Privacy: International Choice of Law and the Internet”, The International Lawyer, 32, 991.
Zittrain, J. (2003). “Internet Points of Control”, Boston College Law Review, 44, 653.