Several major companies have pulled their advertising from YouTube after their ads were shown on videos of young children that had attracted scores of comments from pedophiles.
The development comes just two days after YouTube announced a campaign to prevent inappropriate content and comments on its kids programming.
An investigation reported Friday by the U.K.'s The Times found comments from hundreds of pedophiles posted on YouTube videos of scantily clad children. Among the videos, most of which appeared to have been uploaded by the children themselves, was a clip of a pre-teen girl in a nightgown that had drawn 6.5 million views, the Times said. YouTube's algorithm would then suggest other, similar videos, such as clips of other children in bed or in the bath.
Several big-name brands, including food companies Mars (M&Ms, Snickers) and Mondelez (Oreo, Cadbury), drinks maker Diageo (Guinness, Smirnoff vodka, Johnnie Walker Scotch whisky), and German retail chain Lidl, pulled their advertising from YouTube after learning their ads had run alongside the videos, The Times first reported.
"We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content," said Mars, the McLean, Va.-headquartered food maker said in a statement to USA TODAY. "We have stringent guidelines and processes in place and are working with Google and our media buying agencies to understand what went wrong. Until we have confidence that appropriate safeguards are in place, we will not advertise on YouTube and Google.”
Similarly, New Jersey-based Mondelez said it was "deeply concerned" and had also suspended its advertising on YouTube. "We are actively working with Google and agency partners on an ongoing basis to ensure brand safety, but recognise there is more to be done by all parties."
YouTube pointed to its recently toughened guidelines for kids programming, noting that in the past week it had disabled comments on "thousands of videos" that could be targeted by predators and shut down "hundreds of accounts" belonging to users who posted predatory comments.
"Content that endangers children is abhorrent and unacceptable to us," the Google-owned company said in a statement. "There shouldn't be any ads running on this content and we are working urgently to fix this."
This is just the latest situation in which YouTube has had to address advertiser concerns about user-generated content and user comments.
In March, the video site faced advertiser pullouts when ads were found running on extremist content. In response, YouTube began requiring creators to reach 10,000 views before they could earn ad revenue through its YouTube Partner Program. It also moved to add warnings to extremist videos and to disable comments on them as a way to make the videos harder to find.
In August, YouTube said a combination of improved machine learning and a bolstered staff of human experts had helped the site remove extremist and terrorist content more quickly.
But a series of news articles in recent weeks showed how poorly its filters kept scary or adult-themed content out of its kids' app, and how content that exploited children was able to attract a wide following, earning money for its creators.
More: YouTube boots 50 channels in attempt to clean up kids' programming
More: The $80B business that linked all-American brands with hate content
Follow USA TODAY reporter Mike Snider on Twitter: @MikeSnider.