Technology and more human eyeballs are helping YouTube more quickly find and remove terrorist content from the video-sharing site.
Last month, YouTube said it was initiating a multipronged strategy to combat how extremist groups including ISIS use video on the site to recruit and radicalize prospective terrorists.
Machine learning has helped YouTube detect and remove controversial content more quickly, the company said Tuesday in a blog post. Over the past month, three-fourths of the violent or extremist videos removed were taken down before a person on YouTube flagged them as inappropriate. "Systems have proven more accurate than humans at flagging videos that need to be removed," the YouTube team says in the post.
As YouTube deployed machine learning technology over the past month, the number of videos removed has more than doubled, as has the rate of removal. "With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge," YouTube says.
The site's program of “Trusted Flaggers,” human experts who help spot problem videos, has been bolstered by cooperation from 15 more non-governmental organizations including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue.
New standards will be applied to videos that are not illegal but have been flagged by users "as potential violations of our policies on hate speech and violent extremism," YouTube says. Those videos may remain on the site, but they will not be recommended, will not earn money from ads and will not have comments. "We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter," the company says.
Less than two weeks ago, YouTube began redirecting searches for extremist and terrorist words to a playlist of antiterrorist content. YouTube's Creators for Change program last week hosted a two-day workshop in the U.K. for teens to "help them find a positive sense of belonging online and learn skills on how to participate safely and responsibly on the internet," YouTube says. It also plans to expand that program to reach 20,000 more teens across the U.K.
Online extremism has been a problem for sites including YouTube and parent company Google. Many advertisers began pulling their business from YouTube in March after finding their ads played on videos promoting terrorism and extremist content. Subsequently, YouTube set a 10,000-view requirement for its creators to earn revenue on their videos.
Extremist content on YouTube re-emerged as an issue when it was revealed that one of the three attackers in the June 3 London Bridge terror attack had been influenced by extremist videos on the site.
"Altogether, we have taken significant steps over the last month in our fight against online terrorism. But this is not the end. We know there is always more work to be done," the company said. "With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community we are confident we can continue to make progress against this ever-changing threat. We look forward to sharing more with you in the months ahead."
MORE:
YouTube redirects ISIS recruits to anti-terrorist videos
After London attacks, Facebook, Twitter pledge to continue anti-terror help
Extremist videos on YouTube will now carry warnings
Follow USA TODAY reporter Mike Snider on Twitter: @MikeSnider.