Film shows the unlikely people tasked with cleaning up social media and the ugly consequences

Source: News Corp Australia Network
October 13, 2018 at 13:35
The Cleaners shows the ugly world of those tasked with the bizarre and awful job of cleaning up social media. Source: Supplied
BEHIND the Facebook feed you see are people tasked with the unthinkable. This is the job nobody wants to acknowledge exists.

“IF they find out I’m talking to you, I will be in trouble.”

Those worried words were written by a so-called content moderator in the Philippines, a troubled member of a small army of people quietly hired by social media companies to sift through the endless stream of pictures and videos that users have flagged as objectionable.

It includes some of the nastiest stuff imaginable. Think snuff films, self-harm images, bestiality clips, videos of people being decapitated and child porn.

They see themselves as policemen or bodyguards, protecting social media users and “keeping the platform healthy”. But the corrosive nature of the job means it is their own mental health that soon suffers.

A new documentary due to hit select Australian cinemas next weekend shines a light on the grim reality of this strange new job that has emerged in the age of social media.

The companies that hire them, such as Facebook, Google and Twitter, do so at arm’s length, and much of the operational work is shrouded in secrecy. All three companies declined to comment in the film.

The film, called The Cleaners, follows a handful of young, often naive, Filipino men and women who are happy to have a job that pays a decent wage in a country where many face poverty.

They deal with content coming primarily from the US and Europe. Some specialise in certain types of content such as “live self-harm videos”, and many are required to look through 25,000 pictures a day, choosing either to delete each image or ignore it, allowing it to stay up on the site.

These often religious, often sheltered, young workers are tasked with deleting some of the worst stuff online. Source: Supplied

The cultural divide between the moderators and the users who post and flag the content they must judge causes all sorts of issues.

When Facebook cops backlash for pulling down iconic photos of the Vietnam War or artworks featuring nudity, it is very possible that a 20-something in an office on the other side of the world, who is just following the rule book, is responsible.

“In the beginning we needed training. We were introduced to all sorts of words about sex. Words like pussy or tits. I didn’t know those terms. I was kind of innocent then,” one young female content moderator says in the film.

“Then there were sex toys, I didn’t know what they were either. Terms like butt plugs.”

To become more acquainted with the hidden depravity of the western internet users she was policing, she would go home and watch different kinds of pornography. Coupled with the day-to-day demands of the job, it had a profound effect on her.

“I wasn’t used to seeing penises, so whenever I went to sleep I would dream of seeing different kinds of penises. That’s all I saw. Penises everywhere.”

Not that she minded all that much. In a way it became her “guilty pleasure”, she told filmmakers, displaying a resilience that was characteristic of many of the content moderators portrayed in the film, at least in the early stages of the job.

Facebook, Google and Twitter all declined to comment in the film. Source: Supplied

Others have a much harder time dealing with the psychological effects of being a content moderator.

“I’ve seen hundreds of beheadings,” says one whose job it is to pull down pictures and videos related to terrorism.

As part of the work, moderators have to memorise the flags and slogans of terrorist groups around the world. They have seen so many beheadings they have become experts in the grisly act. The worst ones are when the knife is not that sharp, “kind of like a kitchen knife,” they say.

According to one moderator, if they pull down a live video of someone threatening self-harm before the person actually hurts themselves, they are hit with a strike. Moderators are allowed only three strikes a month for wrongly deleted content.

One moderator who specialised in livestreamed self-harm videos went on to hang himself.

The exact number of such workers around the world is hard to pin down, but Facebook is thought to have about 7500 moderators globally who sift through 10 million potentially rule-breaking posts per week. Google also has many thousands doing such work.

While filtering technology is improving, the subtlety, context and nuance of some images make it difficult to rely on artificial intelligence to catch all the problematic content.

“New technology like machine learning, computer vision and artificial intelligence helps us find more bad content, more quickly,” Antonia Sanda, Head of Communications for Facebook Australia told news.com.au recently.

But as one content moderator told filmmakers in an email: “Algorithms can’t do what we do.”

The release of Facebook’s content moderation guidelines earlier this year drew attention to the difficult work of those whose job it is to review such material all day long.

Last month, a former content moderator in California sued Facebook, claiming she developed PTSD after being “exposed to highly toxic, unsafe and injurious content during her employment as a content moderator” for the billion-dollar company.

Selena Scola was a content moderator at Facebook’s Menlo Park headquarters in California from June 2017 to March of this year. She has filed a damages claim against the tech giant, revealing she and others like her were bombarded with “thousands of videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder,” according to court papers.

In response, Facebook said in a statement: “We recognise that this work can often be difficult.

“That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources.”

Mark Zuckerberg, chief executive officer and founder of Facebook, has been under a lot of pressure lately. Picture: David Paul Morris. Source: Bloomberg

The documentary does a great job of juxtaposing a moderator grappling with whether to delete a particular image with the originator of the content — sometimes an artist or photographer — to highlight the grey areas of such work.

The film also delves into geopolitics and the effect social media is having on public discourse and democracy.

It paints a worrying picture of how tech companies have become the gatekeepers of politically sensitive information as they strike secret deals with different governments to block certain content on their platforms.

“I think democracy becomes a little impossible,” says former Facebook employee and author of Chaos Monkeys, Antonio García Martínez, about the rise of social media giants that operate on metrics based around user engagement and growth.

The Cleaners is a fascinating look at a world shaped by these digital platforms largely built in a small corner of the globe and the bizarre and awful job of the people trusted to scrub away the really bad stuff.

- The Cleaners will have an exclusive Melbourne season at ACMI Cinemas from October 19, is also screening at Golden Age in Sydney and Brisbane, and will show at the Adelaide Film Festival this month. For more information visit the website.
