
Facebook must confront its media responsibility

Sticking to his script: Mark Zuckerberg, Facebook CEO, keeps denying responsibility for what is posted on Facebook

Since the presidential election last week, Facebook’s role in policing fake news on its site has become a very hot topic.

And it should be. Throughout the election, Facebook’s behaviour has exposed what seems to be a great contradiction at its heart. Even as the social network has pushed hard to dominate new forms of media, it has bent over backwards to deny that it is a media company — and to deny the responsibility that comes with that label.

The truth is that Facebook has already taken on one of the functions of a media company: to act as a gatekeeper. It has labelled satire. It takes down “click bait” articles that, in its own words, have headlines that “intentionally leave out crucial information or mislead people”. Its algorithms clearly have some standards for content quality.

But Facebook will not apply those same standards to its fake-news problem. In fact, Facebook chief executive Mark Zuckerberg took to his own profile to explain why and to reject the much-discussed idea that false news articles on the network could have affected the election. Zuckerberg said the same thing at a conference last week, but the idea has persisted so strongly that he decided to address it again.

I don’t know if we can lay credit or blame for the outcome of the election at social media’s doorstep. Finding that out would take a lot of research, an army of sociologists and access to a lot of Facebook data that I don’t have.

But what is troubling about Zuckerberg’s post is his explanation for why Facebook is not tagging or penalising false news:

“Identifying the ‘truth’ is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.”

I agree it is not easy for Facebook to tackle this problem. Worries of a politicised Facebook have dogged its steps before, and those accusations still haunt the network: in a statement on Monday, Facebook denied a Gizmodo article claiming it had developed a solution to its fake-news problem but shelved it, fearing backlash from the Right.

It is true that Facebook alone should not define what is the truth. But its prominence as a source of news gives it the responsibility to flag what is false.

In his post, Zuckerberg essentially falls back on an old excuse for the fake-news issue: that Facebook is just a technology firm and a platform ... an aggregator not up to the task of policing its users.

That excuse started out thin and is only getting thinner.

Sure, social-media companies may want to deny that they are in the content business. But they are in it — and only getting deeper. Look at where these companies are investing: more photo-sharing, video, virtual reality. Those are all new media products that they say they need to survive.

Yet Zuckerberg is sticking to his tech company script, denying responsibility for what is posted on Facebook — even when it does not make sense.

For example, to prove his point that fake news has little influence on Facebook, Zuckerberg pointed out that there is not that much of it on the network. “Of all the content on Facebook, more than 99 per cent of what people see is authentic,” he said in his post. “Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”

That may be true, but it does not reflect how Facebook works. A piece of writing crafted to generate clicks, likes and shares gets, by design, outsized attention. Zuckerberg himself said in his post’s comments that specific users may see more false content in their feeds depending on how they and their friends use the site.

He also rebutted the idea that slanted news could sway individual voters by citing Facebook’s own research, which shows the network exposes people to a broader set of ideas. Here, again, the issue is more nuanced. Yes, a recent Pew Research Centre study showed social-media posts have changed some people’s minds. But the same study showed that people often mute, block or otherwise filter out social-media opinions they do not want to hear.

“We see people engaging in a broader range of viewpoints while simultaneously taking steps to avoid content that is against their views,” said Aaron Smith, associate research director at Pew. “If there’s one consistency, it’s that you can really see both sides of that coin depending on how you ask things.”

Smith added that individuals do not control everything they see on Facebook; Facebook’s algorithms do. So, again, Facebook is the only one with the power to control what is on its site. And for a company that crows about its ability to persuade people to buy products, play games and even vote, it rings false for Facebook to claim that the content its algorithms pick could not possibly influence those votes.

Facebook is not alone in wrestling with this problem. On Monday, the first Google result that appeared when searching for popular vote tallies linked to a false news report. That prompted both Google and Facebook to say they will not accept advertisements from fake news sites any more — a small step that will not address the larger problem, but at least makes an attempt.

Killing fake news online is a shared responsibility. Social-media users should, of course, be critical of the things they read. But Facebook and other social-media sites are not picking up their share of the work. Otherwise, such obviously false information would not score so well with those so-called neutral algorithms.

And if these companies continue to push their media efforts, they cannot then shy away from the responsibilities that come with the business. If quality and truth are what matter when it comes to content, that has to matter all of the time — not just when it is uncontroversial.

Hayley Tsukayama covers consumer technology for The Washington Post