
Wide Angle

Aftenposten versus Facebook: triggering a crucial debate

Norwegian illustrator Inge Grodum’s take on the iconic photograph, “The Terror of War”, condemning Facebook’s decision to censor it as “inappropriate content”.
© Nick Ut / Sipa Press / Inge Grodum
The increasing role that social media sites play in news distribution raises several concerns. Espen Egil Hansen of Aftenposten (Norway) and Richard Allan of Facebook come from different worlds, yet face a similar challenge.

By Marina Yaloyan 

It is an icon of war photography: the black-and-white image shows a naked nine-year-old girl fleeing an explosion, screaming, her face distorted by pain. Taken by the Vietnamese-American photographer Nick Ut during a napalm strike on a Vietnamese village in 1972, the Pulitzer Prize-winning photo, “The Terror of War”, raised controversy in 2016, when Facebook banned it for “inappropriate content”.

“I wrote to Mark Zuckerberg telling him that I wouldn’t comply,” remembers Espen Egil Hansen, the editor-in-chief of Aftenposten, Norway’s largest newspaper, who shared the post on Facebook and got threatened with a permanent ban. Hansen’s bold letter, splashed across the front page of Aftenposten, condemned Facebook for creating rules that first “cannot distinguish between child pornography and famous war photographs” and then “exclude every possible debate.” The letter drew massive support and became the starting point for a heated discussion around Facebook’s intricate censorship rules and the control of content through newsfeed algorithms.

With two billion users worldwide, and driving more traffic to news sites than Google, Facebook has emerged as a major player in news distribution, even though it still evades formal responsibility by positioning itself as a “technical platform”. Nevertheless, it has arguably become the largest global media site, which has turned Mark Zuckerberg into “the most powerful editor-in-chief in the world,” according to Hansen.

“I reminded Zuckerberg that this title comes with responsibility. He doesn’t just own a tech company, he owns a media company.” This is precisely why Hansen considers that censoring an iconic image of photojournalism because of nudity was a bad editorial decision. “Disturbing images may not always be comfortable to look at, but they are the ones that help promote awareness in a democratic society,” he says.

Millions of people post content on Facebook’s pages every day, which makes selecting information case by case an obvious challenge. Richard Allan, Facebook’s Vice President, Public Policy, EMEA (Europe, the Middle East and Africa), defends the site’s general guidelines, which require that photographs containing nudity of children under 18 be tracked and taken down. Yet he admits that with “The Terror of War”, this policy fell short. “Just to be clear, we question ourselves all the time. When we face a new situation we haven’t anticipated before, we ask ourselves, what should we do now? Should we change our rules?” he stated during the colloquium, Journalism under Fire, at UNESCO in March 2017.

Two months later, Facebook announced that it would add 3,000 more people to its 4,500-strong community operations team. The newly adopted, more flexible approach requires content reviewers to treat news as an exception. “There are occasional photos of naked children where the public interest of the publication of that photo and, in this case, the consent of the person involved, outweighs the regular policy,” Allan says.

Algorithm – the world’s new editor-in-chief

When it comes to editorial choice, Hansen sees little difference between Facebook and traditional news outlets. “In the same way in which an editor-in-chief of Fox News is responsible for the editorial content of Fox News, Mark Zuckerberg is responsible for the editorial content of Facebook,” he insists.

The only real difference between the two is the largely misunderstood and controversial newsfeed algorithm that traditional media editors do not use. “We want to keep our essence. You are your own editor and you choose what you want to see,” states Richard Allan, regarding Facebook’s policy.

Meanwhile, algorithms continue to shape the reading habits of 1.28 billion daily active Facebook users (March 2017), roughly one-sixth of the world’s population. Facebook scans and analyzes all the information a given user has posted in the previous week, taking into account every page they have liked, every group they belong to and everyone they follow. Then, according to a closely-guarded and constantly evolving formula, the algorithm ranks the posts in the precise order it predicts the user will find most worthwhile.
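The real formula is proprietary and constantly changing, but the general idea described here, scoring each candidate post against a user’s past signals and sorting by that score, can be illustrated with a minimal, purely hypothetical Python sketch. All the names, weights and signals below are assumptions for illustration, not Facebook’s actual system.

from dataclasses import dataclass, field

@dataclass
class Post:
    author: str           # the page, group or person that published the post
    source_type: str      # "page", "group" or "person"
    age_hours: float      # how long ago the post was published

@dataclass
class UserProfile:
    liked_pages: set = field(default_factory=set)
    groups: set = field(default_factory=set)
    followed_people: set = field(default_factory=set)

def score(post: Post, user: UserProfile) -> float:
    # Toy relevance score: boost posts from sources the user already
    # engages with, and slightly favour newer posts. The weights are
    # arbitrary placeholders, not Facebook's real (secret) formula.
    affinity = 0.0
    if post.source_type == "page" and post.author in user.liked_pages:
        affinity += 2.0
    elif post.source_type == "group" and post.author in user.groups:
        affinity += 1.5
    elif post.source_type == "person" and post.author in user.followed_people:
        affinity += 1.0
    freshness = 1.0 / (1.0 + post.age_hours / 24.0)
    return affinity + freshness

def rank_feed(candidates: list, user: UserProfile) -> list:
    # Order candidate posts so the highest-scoring ones appear first.
    return sorted(candidates, key=lambda p: score(p, user), reverse=True)

Even in this toy version, the consequence Hansen warns about below is visible: whatever signals the score rewards are the ones that fill the top of the feed.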

However, the very nature of algorithms can turn them into controversial, and even dangerous, tools. “Algorithms may create the so-called filter bubbles, which reinforce a negative trend of our time – one that leads to more polarized communities,” says Hansen. “More and more people live in bubbles, where they only get the information they want, and communicate only with like-minded people.” From this perspective, the criteria of selection used by algorithms to classify information become crucially important.

Allan, however, compares the newsfeed to a periodical subscription and denies imposing any content on Facebook’s readers. According to him, the algorithms merely arrange these subscriptions in the way that is most convenient for the reader. The challenge lies in the sheer number of feeds available. “What we find is that people sign up for a thousand different feeds when they only have time to read twenty of them,” he says. “The thousand feeds are still there but this obviously creates a selection process, as we pick those that are going to appear on the top.”

Favouring the information that readers prefer can be a slippery road. According to Hansen, it is a “convenient strategy when watching Netflix [the United States-based streaming service]” but remains a “questionable principle for the free flow of information in a society.”

Fake news – real solutions?

 

On a positive note, social media sites do break down barriers and make it easier for people to express themselves. “When I wrote my letter to Mark Zuckerberg, I published it in a small paper in a small country, but the story immediately went viral. Ironically, I think it was Facebook itself that made the story so popular,” recalls Hansen, whose own newspaper has more than 340,000 followers on Facebook. However, he is quick to admit that allowing everyone to publish information is a double-edged sword that can lead to disinformation. “It is obviously easier today to mislead very large parts of populations. I wonder if, as a society, we are actually prepared for the alarming trends that we are witnessing,” he says.

In a series of fake-news scandals that shook Facebook in 2016, the company was accused of influencing the United States presidential election by spreading false stories and creating filter bubbles that isolated voters from other opinions. According to one analysis, fake news about US politics alone accounted for 10.6 million of the 21.5 million shares, reactions and comments that these English-language stories generated on Facebook that year. A hoax about former US President Barack Obama generated more than 2.1 million comments, reactions and shares in just two months.

Small wonder, then, that to curb the criticism, Facebook has introduced a corrective fact-checking programme. Starting in May 2017, stories that users flag as unreliable are verified by independent fact-checking experts and labelled “disputed”.
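The workflow described here, user flags routed to outside fact-checkers, with a label applied rather than the story removed, can be sketched roughly as follows. The threshold, names and data structures are hypothetical, chosen only to make the flow concrete.

from enum import Enum

class Verdict(Enum):
    UNREVIEWED = "unreviewed"
    DISPUTED = "disputed"          # shown with a warning label, not removed
    NOT_DISPUTED = "not_disputed"

class Story:
    def __init__(self, url: str):
        self.url = url
        self.user_flags = 0                 # readers who reported the story
        self.verdict = Verdict.UNREVIEWED

def flag_as_unreliable(story: Story) -> None:
    # A reader signals that the story may be false or misleading.
    story.user_flags += 1

def review(story: Story, fact_checkers_dispute: bool, threshold: int = 10) -> None:
    # Once enough readers have flagged the story, it goes to independent
    # fact-checkers. If they dispute it, the story is labelled rather than deleted.
    if story.user_flags >= threshold:
        story.verdict = Verdict.DISPUTED if fact_checkers_dispute else Verdict.NOT_DISPUTED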

“We are not going to remove these,” stresses Richard Allan. “On one hand, we don’t want to be the arbiters of truth and edit content. On the other, we would like to build an informed community, as we have a responsibility to society.”

Hansen views the acknowledgement of this responsibility as key. He welcomes the improvements Facebook has made since his letter in Aftenposten first came into the spotlight. “Mark Zuckerberg gave an interview to the New York Times, where he said that the controversy around this letter was an eye-opener and made him realize that he needed to change the way Facebook functions.”

This realization, and the change that followed, are crucial, given social media’s far-reaching impact on traditional media and its ever-growing presence in our daily lives.


© Ruben Oppenheimer

An open letter that prompts change

“…Listen, Mark, this is serious! First you create rules that don’t distinguish between child pornography and famous war photographs. Then you practise these rules without allowing space for good judgement. Finally you even censor criticism against and a discussion about the decision – and you punish the person who dares to voice criticism…

The free and independent media have an important task in bringing information, even including pictures, which sometimes may be unpleasant, and which the ruling elite and maybe even ordinary citizens cannot bear to see or hear, but which might be important precisely for that reason…

The media have a responsibility to consider publication in every single case. This may be a heavy responsibility. Each editor must weigh the pros and cons. This right and duty, which all editors in the world have, should not be undermined by algorithms encoded in your office in California.

Facebook’s Mission Statement states that your objective is to “make the world more open and connected”. In reality you are doing this in a totally superficial sense. If you will not distinguish between child pornography and documentary photographs from a war, this will simply promote stupidity and fail to bring human beings closer to each other.

To pretend that it is possible to create common, global rules for what may and what may not be published, only throws dust into people’s eyes…”

(Extracts from an open letter from Espen Egil Hansen to Mark Zuckerberg, published on the front page of Aftenposten on 8 September 2016.)