Daniel Motaung: how a man from the Free State came to take on Facebook

In 2018, as a fresh graduate from Rhodes University, I was brimming with potential and had a very clear mission: find and follow a clear career path, attain financial freedom, and lift myself and my family out of poverty.

The dream

It felt then like a stroke of providence when I came across a company called Samasource (now Sama), which promised a bright future for individuals exactly like me. The company claimed to focus on upskilling the poor and lifting them out of poverty. The career path on offer was that of a content moderator. At the time, I had no idea what that was, but I applied.

Soon I was on the journey of a lifetime to Nairobi, Kenya, to start my exciting new career.

It was March 2019, and I felt like I had achieved it all. All the more so because I had a reputation in my family and village as a record breaker: I was the first one to go to a so-called prestigious university, the first to travel on an aeroplane, and the first to work abroad.

I wish I had known then that what waited for me was the destruction of my own mental stability and physical health.

The nightmare

The job of content moderators is to try to make Facebook (and other platforms) safer for everyone who uses them. Every day, around the world, people post truly horrific things online, and it is the job of content moderators to sift through these posts and take them down, so that ordinary people don't have to see them.

The very first video I watched in my new role was a live video of someone being beheaded. The work of content moderators involves long shifts of watching a constant stream of graphic violence, sexual abuse, animal torture, and sexual exploitation of children.

Nobody survives this work unscathed. Several of my co-workers were diagnosed with post-traumatic stress disorder (PTSD). I feel like I am now living in a horror movie. The trauma I was subjected to keeps replaying in my mind. In the flashbacks, I am often the victim of the violent content I had to watch.

Before I joined Sama I had a pre-existing condition of epilepsy, which was inactive in 2018. In other words, it did not affect my life or need treatment. But during and after my time as a Facebook moderator, the seizures returned. I believe this was a result of the trauma and ill-treatment.

Response by Sama and Facebook

Facebook refuses to employ moderators like us directly, because then our problems would be their problems. They would have to explain why employees are getting PTSD at work and are paid so much less than other staff, or why we are forced to sign NDAs that try to ban us from discussing even the basic details of our jobs. So the work gets outsourced to companies like Sama.

Sama claims to be an ethical AI company that offers "dignified digital work" and pays living wages. This does not at all reflect my or my colleagues' experience.

First, when faced with work like this, one would imagine that psychological support would be on hand and working hours strictly limited to mitigate the harm. Neither was the case. The wages are also grossly exploitative.

Sama’s founder, the late Leila Janah, had previously said that giving staff in Kenya a substantial rise would risk distorting local labour markets. Some employees were paid as little as $1.50 an hour. To put this amount into perspective, a fast food meal in Nairobi costs about $5.

Even today, in the face of public criticism, Sama continues to defend these low wages, noting that it pays three times the minimum wage. But in many African countries, the minimum wage does not equate to a living wage. It is a useless standard: you cannot survive on it.

A bitter fight for rights

As the working conditions became increasingly unbearable, I organised my co-workers into an Alliance of Workers to negotiate with management on our conditions. Our attempts to engage in good faith were met with intimidation, bullying and coercion. It quickly became clear that Sama’s managers were not interested in economic empowerment or poverty upliftment. They were in the business of making a profit.

The Alliance threatened to strike unless Sama committed to increased pay and better working conditions. 

In response, they brought in two highly paid executives from San Francisco to crush our budding union and the strike. I was fired, lost my visa, and had to leave Kenya.

This treatment of content moderators in Africa continues, and so too does Sama's rhetoric of being an ethical company empowering Africans. This is a blatant lie that covers up deeply damaging exploitation on a continent where they believe they can get away with it.

Mark Zuckerberg, Facebook, Sama, and other exploitative corporations in this industry will not be absolved by history. My life, and the lives of other content moderators in the industry, should come before profits. I see it as my role to lift the lid on these terrible practices on a continent where too many corporations, tech companies among them, have been able to trample on human rights with impunity. It is up to us to draw a line to stop the abuse.

The future

I am really excited about the prospect of developing a feasible alternative to current workplace practices for content moderators. I will be investigating the adoption of minimum International Labour Organisation (ILO) labour rights, including the right to organise and safe working conditions, as well as the additional safeguards required for new forms of digital labour. It is also necessary to provide training and psychological support for content moderators, and to look to professionalising the industry.