A damning Bloomberg report published Tuesday revealed that top YouTube executives spent years debating whether extremist viral videos on the platform were really a problem, often rejecting proposed solutions, in an effort to maximize growth and profits.
Why it matters: Tech companies have long been criticized for harboring hate, but as the consequences of their inaction unfold more visibly in the real world, companies like YouTube are facing mounting pressure over whether that willful ignorance amounted to malpractice.
Driving the news: The most striking aspect of the Bloomberg report is a narrative familiar from reporting on Facebook's handling of Russian misinformation: top executives were repeatedly briefed on the problem and chose to downplay it in favor of business goals.
The big picture: The report lands as Facebook scrambles to manage hateful content and misinformation on its platforms ahead of elections in India and the next round of U.S. presidential primaries.
Be smart: Calls for change have picked up in the wake of real-world harms caused by people radicalized by hateful or conspiracy-minded content, as Axios has previously noted.
Bottom line: Two years after the 2016 election, it has become increasingly apparent that Google and Facebook, despite warnings about the ways their platforms' algorithms let bad content flourish, shied away from doing much about it for business reasons. Now, facing elections and misinformation crises around the world, they are being forced to reckon with those decisions.