'Facebook needs an editor': media experts urge change after photo dispute

Source: The Guardian
September 11, 2016 at 12:31
Controversy over a censored Vietnam war photo highlights concerns over the social network’s vital – if reluctant – role as users’ primary news source

Tensions between Facebook and the news industry boiled over this week when the social media corporation censored a Pulitzer-winning Vietnam war photo because it featured a naked child and so violated the site’s “community standards”.

The dispute over the “napalm girl” image, which a Norwegian writer published in a post about historic warfare photography, ended Friday when Facebook reversed its decision, acknowledging the “global importance of this image in documenting a particular moment in time”.

But the spat has exposed what journalists and ethicists say are fundamental flaws in the way Facebook controls and spreads news. Critics say the company’s decisions were driven by PR concerns and should serve as a wake-up call to free speech advocates about how powerful Facebook has become – and how ill-equipped the corporation is for its role, however unwilling, in journalism.

Some hope the scandal will be a turning point for CEO Mark Zuckerberg, who critics say has a moral obligation to recognize his role as the “world’s most powerful editor” and take meaningful steps to make Facebook accountable for what it distributes.

“What Facebook has to do now is think very hard about what it really means to be a publisher,” said Emily Bell, director of the Tow Center for Digital Journalism at Columbia University. “If they don’t,” she warned, “this is going to happen to them over and over again.”

‘We need more than just algorithms’

Whether or not Facebook and media executives like to admit it, the social media site now plays a vital role in how people consume news, carrying an influence that is difficult to overstate. Studies have repeatedly found that Facebook has become the primary news source for many people, and that publishers’ revenues have been hit hard as a result.

To some journalists, the decline in advertising revenue is less alarming than what the row over the war photo revealed: Facebook’s control over the news has become so vast that it could censor stories in ways that deeply threaten the free press.

Whether Facebook knows what to do with that power is an open question.

“It’s almost like these are kids playing with a toy above their age bracket,” said Jane Kirtley, professor of media ethics and law at the University of Minnesota. “We surely need something more than just algorithms. We need people who are sentient beings who will actually stop and analyze these things.”

Zuckerberg has refused to acknowledge that Facebook is a publisher or media company, instead sticking to the labels of “tech company” and “platform”. But the idea of an unbiased platform was recently called into question when it was revealed that Facebook staff curated news articles in its “trending” section.

The company then fired the team that managed the trending pieces and let an algorithm take over, and Facebook immediately began pushing fake and vulgar stories into users’ feeds.

“Facebook wants to have the responsibility of a publisher but also to be seen as a neutral carrier of information that is not in the position of making news judgments,” said Jim Newton, a former Los Angeles Times editor who teaches journalism ethics at the University of California, Los Angeles. “I don’t know how they are going to be able to navigate that in the long term.”

Bell said Facebook was a “spectacularly well resourced and brilliant organization from a technological perspective” – and that its editorial efforts should start to reflect that rigor and dedication. Some have called for Facebook to hire someone to take responsibility for tough newsroom decisions: an editor in both duties and title.

“Facebook needs an editor – to stop Facebook from editing,” Jeff Jarvis, journalism professor at the City University of New York, wrote on Friday. “It needs someone to save Facebook from itself by bringing principles to the discussion of rules.”

Jarvis argued that Facebook should allow editors of reputable news organizations to make key decisions related to how they use the platform – such as publishing a war photo that may technically violate policy. A Facebook editor could intervene when there are disagreements, he said.

Edward Wasserman, dean of the University of California, Berkeley journalism school, said Facebook should create a team of journalists to address these complex issues – instead of reacting on reflex when a mistake goes viral.

“If they are in the news business, which they are, then they have to get into the world of editorial judgement.”

‘Black boxes’ that no one understands

To some in media, a key first step to making Facebook a respectable news publisher is increased transparency.

Aside from leaked documents, the public knows little about how Facebook’s algorithms work and what role employees play. Both algorithms and editors can be heavily biased, and until Facebook is more open about how and why certain news stories are permitted and promoted, the company will continue to earn the ire of journalists, whose stories can live or die by Facebook.

“Facebook and Google are like black boxes. Nobody understands the algorithms except the people inside,” said Jonathan Taplin, former director of the University of Southern California’s Annenberg Innovation Lab.

Without knowledge of how Facebook determines a war photo is pornographic, there is an assumption that a machine or an ignorant low-level staffer made the call – and there is no one to hold accountable.

“It was probably a human who is 22 years old and has no fucking idea about the importance of the [napalm] picture,” Taplin said.

While news organizations like the Guardian operate as a public trust, Facebook has a different agenda, he added. He noted that the Facebook board continues to include Peter Thiel, the tech billionaire who secretly funded a lawsuit that bankrupted Gawker.

Still, it’s in Facebook’s best interest in the long run to become a reliable and reputable system for news, said Kathleen Culver, director of the Center for Journalism Ethics at the University of Wisconsin-Madison.

“Facebook is trying to deliver a platform that people want to advertise on,” she said, “and [one that] users want to be on.” Culver compared the network to MySpace, which had failed to attract ad revenue because it was so unregulated and chaotic.

A Vietnam war image would not discourage advertisers or users, she said, but a constant stream of fake or obscene stories can be a legitimate turnoff.

“If I go right now to trending topics, I guarantee I can find you conspiracy theories [and] misinformation,” said Culver.

On Friday, when Facebook was still censoring the photo of nine-year-old Kim Phúc running away from a napalm attack, the site’s trending bar began posting something indisputably offensive – a story suggesting that the September 11 terrorist attacks were caused by a “controlled demolition”.

While Facebook has vowed to discuss the matter with publishers, journalists said the case highlighted the importance of media pressure on the social media giant to rethink its practices.

The blistering open letter by Espen Egil Hansen, editor of Aftenposten, the Norwegian paper that launched the controversy, offered a strong blueprint for how media should push back against Facebook censorship, Culver said.

“It’s very important for journalists and news organizations to be fully defending free expression.”

News outlets including the Guardian, the Washington Post, Time and NBC posted articles with the napalm photo on their Facebook pages after Hansen’s front-page rebuke of Zuckerberg began to spread. “More and more people have to stand up and say this is not the role that Facebook should have,” Taplin said.

But although the Aftenposten letter pushed Facebook to back down, Wasserman remained perturbed by the company’s intransigence. “It shouldn’t take a front-page expression of disgust,” he said, “to get their attention.”
