For a few weeks in July, an uncanny phrase permeated the air, one that mildly pained some of the people who typed it: "Hot Zuck Summer."
There was that photo of Mark Zuckerberg shirtless and eerily ripped. The successful launch of Threads. The whole embarrassing business where Elon Musk challenged Zuckerberg to a physical fight.
In comparison with Musk, who was busy destroying both Twitter and his reputation, the Meta CEO seemed to be a reasonable adult and measured executive.
For one brief summer, things finally seemed rosy for Zuckerberg and Meta: The 2016 election and all the mess that came from it were far enough back in the rear-view mirror. The explosive revelations in 2021 from Frances Haugen and the "Facebook Papers" weren't at the forefront of everyone's minds. Twitter, FTX, and AI occupied the headlines.
It was as if Meta actually got to add a few numbers to a big sign that read "Days without a major scandal."
But that counter just got set back to "0." And although "Hot Zuck Summer" may have been a lighthearted take on Zuckerberg, the latest scandal is anything but.
On Monday, The Wall Street Journal reported the disturbing way that sexualized content of children was served to adults through Instagram's Reels.
The Journal created test accounts that followed only teen and tween gymnastics and cheerleading influencers. Those accounts were then recommended Reels for adult sexual content and sexualized child content, the Journal reported. The Canadian Centre for Child Protection ran a similar test with similar results, the report said.
From the Journal:
Instagram's system served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos — and ads for some of the biggest U.S. brands. The Journal set up the test accounts after observing that the thousands of followers of such young people's accounts often include large numbers of adult men, and that many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults.
Alarmingly, sexually suggestive content was also shown between ads for big companies.
From the report:
The tests showed that following only the young girls triggered Instagram to begin serving videos from accounts promoting adult sex content alongside ads for major consumer brands, such as one for Walmart that ran after a video of a woman exposing her crotch.
And perhaps most depressing of all:
An ad for Lean In Girls, the young women's empowerment nonprofit run by former Meta Chief Operating Officer Sheryl Sandberg, ran directly before a promotion for an adult sex-content creator who often appears in schoolgirl attire. Sandberg declined to comment.
A Meta spokesperson told the Journal that the company recently launched new brand-safety tools and has a task force for detecting suspicious users. Walmart also declined to comment to the Journal.
In a statement to Business Insider, Meta said: "We don't want this kind of content on our platforms and brands don't want their ads to appear next to it. We continue to invest aggressively to stop it — and report every quarter on the prevalence of such content, which remains very low." It also said that the Journal's test was a "manufactured experience" that didn't represent what real users see each day.
Earlier this fall, 33 states filed a lawsuit against Meta, accusing it of ignoring warnings about potential harm to young girls. The lawsuit also claims Meta knew about millions of accounts that were opened by kids under 13 but didn't shut them down.
A Massachusetts lawsuit claims Meta ignored efforts to improve teen well-being on its apps.
A recently unsealed complaint in the lawsuit filed by the 33 states appears to show Instagram executives were well aware of a phenomenon that seems fairly intuitive to anyone who has used Instagram: If you see all your friends having fun and living their best lives, plus tons of pictures of extremely hot people, that can make you feel bad.
(Liza Crenshaw, a Meta spokeswoman, told BI the complaint "mischaracterizes our work using selective quotes and cherry-picked documents." She said: "We want teens to have safe, age-appropriate experiences online, and we have over 30 tools to support them and their parents.")
From the complaint, emphasis mine:
Meta's senior leadership admits that social comparison is a critical issue with serious consequences for its users, particularly for Instagram. [Adam] Mosseri wrote in an internal email, "I see social comparison as the existential question Instagram faces within the broader question of whether or not social media is good or bad for people." Because of Instagram's "focus on young people and visual communication," its emphasis on beauty and fashion content, and a "marketing look and feel often biasing too polished," Mosseri reasoned that "social comparison is to Instagram [what] election interference is to Facebook."
I think Mosseri was spot on. This is the existential question that is now being debated about Instagram.
(The interpretation of Mosseri's line about election interference isn't totally clear to me. I think a lot of Meta employees believe that the 2016 election-interference narrative was overblown. Perhaps what he means is that this will be Instagram's moment to have a massive scandal where public opinion is rapidly turned against it, cable news people yell about it, people delete their accounts, and someone gets hauled in front of Congress to be grilled about it.)
Either way — Mosseri is right. This is a big public scandal. Enough shoes have dropped. Big advertisers such as Match and Bumble are canceling ads over the Journal's report, the publication said.
"Hot Zuck Summer" has turned into "Instagram Nightmare Fall."