Child porn is cropping up on Facebook again.
For the second time in the past month, Mark Zuckerberg’s social network is scrambling to remove filthy posts ridden with child pornography — as well as posts promoting ISIS — after media reports flagged them, reports the New York Post.
In the latest incident, the Times of London said it had used a dummy Facebook profile to alert the site to images of an allegedly violent sexual assault on a child, as well as cartoons of child abuse.
Facebook, which failed to remove the pictures until it was finally contacted directly by reporters, blamed the mess on human error.
The company said it sorts through about a million flagged posts a day, with human moderators giving priority to child abuse and suicide risks.
The offending pictures were finally taken down because they “violate our policies and have no place on Facebook,” Justin Osofsky, Facebook’s vice president of global operations, said.
He added that the social-networking giant was “sorry that this occurred”.
“It is clear that we can do better, and we’ll continue to work hard to live up to the high standards people rightly expect of Facebook.”
Last month, Osofsky landed in a mess when the BBC sent Facebook images of child porn it had found on the site, and the social network responded by reporting the BBC to law enforcement for distributing illegal images.
Using Facebook’s policing tools, the BBC had attempted to report 100 sexualised images of children, and found that Facebook eventually removed just 18 of them.
The BBC's report also said Facebook took no action when it was notified that five convicted paedophiles had active Facebook accounts, explicitly violating the company's rules.
This article originally appeared on the New York Post.