Facebook’s disturbing moderator secrets

Source: News Corp Australia Network
August 6, 2018 at 12:13
Inside Facebook: Four Corners aired the investigation into how and why moderators make decisions on what you see. Picture: AFP/Christophe Simon. Source: AFP
“MARK as disturbing”: Images of self-harm, child abuse and hate speech are deliberately allowed on Facebook, a documentary reveals.

AN UNDERCOVER reporter has lifted the lid on how Facebook decides what you see, secretly training as a Facebook moderator only to be told to leave offensive content on the site.

The documentary, aired on the ABC’s Four Corners, revealed graphic videos of child abuse and school bullying, along with posts and videos showing self-harm and hate speech, were being left on the site by moderators who were seemingly encouraged to mark the toxic content as “disturbing” rather than delete it entirely.

Inside Facebook: Secrets of the Social Network saw a British reporter pose as an employee of UK-based CPL Resources, undergoing training for the Facebook contractor.

The moderators review content reported for possible breaches of Facebook’s community standards, and are given three options: ignore, delete, or mark as disturbing.

Via interviews and secretly recorded footage, he found “serious problems” with how Facebook’s guidelines were applied to the content published on its platform.

He also found some pages were subject to “shielding”, which allowed offensive content to remain on the site and be referred to another level of review, even if guidelines had already been breached.
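For readers who want a concrete picture of the workflow described above, the sketch below models the three frontline outcomes and the alleged “shielding” escalation. It is a minimal, hypothetical illustration based only on the documentary’s account; the names (ReportedPost, SHIELDED_PAGES, review) are invented for this sketch and do not reflect Facebook’s or CPL’s actual systems.

    # Hypothetical illustration only: names and data structures are invented
    # for this sketch and are not Facebook's or CPL's real tooling.
    from dataclasses import dataclass

    @dataclass
    class ReportedPost:
        page: str                    # page the content was posted to
        breaches_guidelines: bool    # would ordinarily violate community standards
        condemning_caption: bool     # e.g. violence shared with a condemning caption

    SHIELDED_PAGES = {"example_high_follower_page"}  # placeholder for "shielded" pages

    def review(post: ReportedPost) -> str:
        """Return one of the moderator outcomes described in the documentary."""
        if post.page in SHIELDED_PAGES:
            # Shielded pages are referred up to another level of review
            # instead of being actioned by the frontline moderator.
            return "escalate_to_second_review"
        if not post.breaches_guidelines:
            return "ignore"
        if post.condemning_caption:
            # The documentary alleges such content was kept behind a warning.
            return "mark_as_disturbing"
        return "delete"

    print(review(ReportedPost("ordinary_page", True, True)))   # mark_as_disturbing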

When the documentary aired in the UK, Facebook released a statement saying the report revealed practices “that do not reflect Facebook’s policies or values and fall short of its high standards”.

Toxic content is “essentially the crack cocaine” of Facebook’s product, Roger McNamee claims. Picture: ABC. Source: ABC

“We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention.”

Among the “mistakes” highlighted in the documentary was the undercover reporter being shown video of a man kicking and beating a boy, and told it should be marked as “disturbing”, not deleted.

‘YOU’VE JUST SEEN A MAN BEATING A TINY BOY’

That video had been reported several years earlier by child abuse campaigner Nicci Astin, but Facebook told her it did not breach its guidelines, so it remained on the site.

“Initially you see a little tiny boy, must be about two or three in the video, with a man talking to him and shouting at him and then he was hitting him and punching him,” Ms Astin told Four Corners.

“He was throwing him about and then he was stamping and kicking on him and then obviously the video cut. So, you’re left with knowing absolutely nothing apart from a sickening feeling that you’ve just … you’ve seen some man beating up a tiny little boy.

“You know yourself from watching that video that that child’s not just got up and skipped off out to play. You know he’s hurt.”

But she said when she reported the video in 2012 “we received a message back saying while it was disturbing, it did not have a celebratory caption, so it was not removed.”

Richard Allan, Facebook’s vice-president of public policy, told Four Corners it “should have been taken down”.

Asked why Facebook allowed content like that on the site, Mr Allan said “in order to aid in the possible identification and rescue of victims of physical child abuse, we may not immediately remove this content from Facebook.”

He said companies like CPL were “frontline reviewers, but behind them sits a team of child safety experts, they’re actually Facebook full-time staff - they will make an assessment of whether the child is at risk, they will make a decision about what to do with the content, including referring it to law enforcement agencies where that’s appropriate”.

Back at CPL, when the journalist asked colleagues and trainers why graphic violence would be marked as disturbing rather than removed, replies ranged from “we’d just mark it as disturbing so you can still share them” to deleting being “too much like censorship” and, in the case of teens fighting, “if a young kid sees another kid getting the shit kicked out of them, it’s for their safety”.

Facebook’s Richard Allan rejected claims toxic content was left up for the site to make money. Picture: ABC. Source: ABC

THE ‘CRACK COCAINE’ OF FACEBOOK

Facebook says it treads a fine line between unacceptable content and freedom of speech, but Roger McNamee, a former mentor to Facebook founder Mark Zuckerberg, said toxic content was the “crack cocaine” of Facebook.

“When they say freedom of speech, what they’re really saying is: ‘We really want to permit people to do whatever they want on this platform, and we will do the bare minimum to make that socially acceptable’,” he said.

“From Facebook’s point of view this is, essentially … the crack cocaine of their product.

“It’s the really extreme, really dangerous form of content that … attracts the most highly-engaged people on the platform.

“If you’re going to have an advertising-based business, you need them to see the ads, so you want them to spend more time on the site.

“And what Facebook has learned is that the people on the extremes are the really valuable ones, because one person on either extreme can often provoke 50 or 100 other people, and so they want as much extreme content as they can get.”

One CPL staffer told the reporter violent content was left on Facebook because “if you start censoring too much, then people lose interest”.

But Mr Allan said shocking content “does not make us more money — that’s just a misunderstanding of how the system works.

“That’s not our experience of the people who use our service round the world,” he said.

“There is a minority who are prepared to abuse our systems and other internet platforms to share the most offensive kind of material.

“But I just don’t agree that that is the experience that most people want and that’s not the experience we’re trying to deliver.”

‘SHE LOOKS LIKE A WILD ANIMAL’

Video of two teenage girls fighting, shared more than 1000 times, wasn’t removed because it had a caption condemning the violence.

“The other girl gets up and basically just goes to town on my daughter, and just repeatedly knees and kicks her in the head,” the mother, who tried to have the video removed, told Four Corners.

“She looks like a wild animal. To wake up the next day and find out that literally the whole world is watching … it was humiliating for her.”

“You see the images and it’s horrible, it’s disgusting. Why was it a discussion whether to take that down? I don’t get it. You know, that’s someone’s child being battered in the park. It’s not Facebook entertainment.

This footage of one girl beating another was not removed because it carried a caption condemning the violence. Picture: ABC. Source: ABC

“There are other ways to spread awareness without putting a video out there with someone’s daughter being battered. If they were watching a video of their own daughter, what decision would they make about that video?”

Mr Allan said if a parent or guardian saw a video of their child in circumstances that they object to, “they do have the right to insist that we take it down and we do take it down where we’re made aware”.

“If the content is shared in a way that praises or encourages that violence, it’s going to come down. But where people are highlighting an issue and condemning the issue, even if the issue is painful, there are a lot of circumstances where people will say to us, ‘Look, Facebook, you should not interfere with my ability to highlight a problem that’s occurred’.”

SELF HARM, SHIELDS AND HATE SPEECH

The documentary also covered the issues of moderating images of self-harm, underage users, and alleged “shielding” of far-right pages with large followings.

Asked why any such images of self-harm would be left on the site, Mr Allan said: “There’s actually a very strong valid interest from that person, if they’re expressing distress, to be able to express their distress to their family and friends through Facebook and then get help”.

“If we took it down, the family and friends would not know that that individual was at risk.”

The undercover reporter was also told racially abusive comments against ethnic or religious communities were allowed provided their targets were described as immigrants.

“Shielded” pages received special protection because they had large followings, it was claimed: deletions of content or complaints were referred up to another moderator for review rather than the content being removed outright.

The far-right Britain First Facebook page had more than two million followers when it was deleted in March after its leaders were convicted of racially aggravated harassment.

“They had 8 or 9 violations and you’re only allowed five, but they had a lot of followers so were obviously making a lot of money for Facebook,” one CPL worker said.

“To reach the actual violation you have to jump through a lot of hoops to get there,” was one trainer’s explanation.

Mr Allan said it was not a “discussion about money, this is a discussion about political speech. And I think people would expect us to be careful and cautious before we take down their political speech.”

Both Facebook and CPL Resources said they were reviewing policies and retraining staff in the wake of the report.

Four Corners can be seen on ABC iview

 

Advice for an example of self-harm on the site recommends the post stays, but moderators respond with offers of help and links to services such as counselling and crisis support. Picture: ABC. Source: ABC

 
