People doing stupid stuff on the Internet is hardly news. To wit: The Tide Pod Challenge, in which YouTubers have been filming themselves eating -- or, we really hope, pretending to eat -- laundry detergent pods.
Why? Uh, because they're brightly colored?? We guess???????
Obviously this is Darwin Award levels of idiocy -- given that detergent is, y'know, not at all edible, toxic to biological life and a potent skin irritant. It would also literally taste of soap. Truly, one wonders what social historians will make of the 21st century.
But while eating Tide Pods appears to have started as a silly meme -- which now has its own long and rich history -- once YouTubers got hold of it, well, things started to turn from funny fantasy to toxic reality.
So now YouTube appears to be trying to get ahead of any wider societal outcry over (yet more) algorithmically accelerated idiocy on its platform -- i.e. when sane people realize kids have been filming themselves eating detergent just to try to go viral on YouTube -- and is removing Tide Pod Challenge videos.
At least when they have been reported.
A YouTube spokesperson sent us the following statement on this: "YouTube’s Community Guidelines prohibit content that's intended to encourage dangerous activities that have an inherent risk of physical harm. We work to quickly remove flagged videos that violate our policies."
Under YouTube's policy, channels that have a video removed on such grounds get a strike -- and channels that rack up too many strikes could face suspension.
At the time of writing it's still possible to find Tide Pod Challenge videos on YouTube, though most of the videos being surfaced seem to be denouncing the stupidity of the 'challenge' (even if they have clickbait-y titles that claim they're going to eat the pods -- hey, savvy YouTubers know a good viral backlash bandwagon to jump on when they see one!).
Other videos that we found -- still critical of the challenge but which include actual footage of people biting into Tide Pods -- require signing in for age verification and are also gated behind a warning message that the content "may be inappropriate for some users".
As we understand it, videos that discuss the Tide Pod Challenge in a news, educational or documentary fashion are still allowed -- although it's not clear where exactly YouTube moderators are drawing the tonal line. (For example, this YouTube creator's satirical video denouncing the stupidity of the Tide Pod Challenge was apparently removed on safety grounds.)
Fast Company reports that YouTube clamping down on Tide Pod Challenge videos is in response to pressure from the detergent brand's parent company, Procter & Gamble -- which has said it is working with “leading social media sites” to encourage the removal of videos that violate their policies.
Because, strangely enough, Procter & Gamble is not ecstatic that people have been trying to eat its laundry pods...
What should Tide PODs be used for? DOING LAUNDRY. Nothing else. Eating a Tide POD is a BAD IDEA, and we asked our friend @robgronkowski to help explain. pic.twitter.com/0JnFdhnsWZ
— Tide (@tide) January 12, 2018
And while removal of videos that encourage dangerous activities is not a new policy on YouTube's part, YouTube taking a more pro-active approach to enforcement of its own policies is clearly the name of the game for the platform these days.
That's because a series of YouTube content scandals blew up last year -- triggering advertisers to start pulling their dollars off the platform, including after marketing messages were shown alongside hateful and/or obscene content.
YouTube responded to the ad boycott by saying it would give brands more control over where their ads appeared. It also started demonetizing certain types of videos.
There was also a spike in concern last year about the kinds of videos children were being exposed to on YouTube -- and indeed the kinds of activities YouTubers were exposing their children to in their efforts to catch the algorithm's eye -- which also led the company to tighten its rules and enforcement.
YouTube is also increasingly in politicians' crosshairs for algorithmically accelerating extremism -- and it made a policy shift last year to also remove non-violent content made by listed terrorists.
It remains under rising political pressure to come up with technical solutions for limiting the spread of hate speech and other illegal content -- with European Union lawmakers warning platforms last month they could look to legislate if tech giants don't get better at moderating content themselves.
At the end of last year YouTube said it would be increasing its content moderation and other enforcement staff to 10,000 in 2018, as it sought to get on top of all the content criticism.
The long and short of all this is that user-generated content is increasingly under the spotlight, and some of the things YouTubers have been showing and doing to gain views by 'pleasing the algorithm' have turned out to be rather less pleasing for YouTube the company.
As one YouTuber abruptly facing demonetization of his channel -- which included videos of his children doing things like being terrified at flu jabs or crying over dead pets -- told BuzzFeed last year: "The [YouTube] algorithm is the thing we had a relationship with since the beginning. That's what got us out there and popular. We learned to fuel it and do whatever it took to please the algorithm."
Another truly terrible example of the YouTuber quest for viral views occurred at the start of this year, when YouTube 'star' Logan Paul -- whose influencer status had earned him a position in Google's Preferred ad program -- filmed himself laughing beside the dead body of a suicide victim in Japan.
It gets worse: This video had actually been manually approved by YouTube moderators, going on to rack up millions of views and appearing in the top trending section on the platform -- before Paul himself took it down in the face of widespread outrage.
In response to that, earlier this week YouTube announced yet another tightening of its rules, around creator monetization and partnerships -- saying content on its Preferred Program would be "the most vetted".
Earlier this month it also dropped Paul from the Preferred program.
Compared to that YouTube-specific scandal, the Tide Pod Challenge looks like a mere irritant.