Facebook’s Walled Wonderland Is Inherently Incompatible With News

Monday Note · Dec 5, 2016

Mark’s Dilemma

by Frederic Filloux

Setting aside the need to fix its current PR nightmare, Facebook has no objective interest in fixing its fake stories problem.

In the end, it all boils down to this:

  1. Facebook is above all an advertising machine. A fantastic one. I encourage everyone to explore its spectacular advertising interface and, even better, to spend a few bucks to boost a post, or build an ad. Its power, reach, granularity and overall efficiency are dizzying.
  2. Facebook’s revenue system depends on a single parameter: page views. Page views come from sharing. Which page criteria lead to the best sharing volumes?
  • Emotions. Preferably positive ones. The little one’s smile, the cat watching a horror movie, etc. Or the human story loaded with sentiment. Facebook is plainly honest about emotion being a dominant factor: I have often heard its people tell social media editors: “Go for emotion. It gets the best engagement.”
  • Fun, entertaining stuff. Again, cat videos, listicles, cartoons.
  • Proximity. Things emanating from friends and family. Facebook has to severely edit its huge content firehose to determine what is eligible to show up in one’s newsfeed. In doing so, the company chooses to give more weight to content originating from friends and family.
  • Affinities. Content that comforts users in their opinions and feelings about society or politics. On Facebook, you’ll never be alone in thinking or believing what you hold dear.

So, sharing is key because it leads to higher page consumption which, in turn, leads to multiple bespoke advertising exposures.

How does Facebook tweak its system in order to favor sharing? It does so by becoming the ultimate filter bubble.

On Facebook, what you click on and what you share with your “friends” shapes your profile, preferences, affinities, political opinions and your vision of the world. The last thing Facebook wants is to contradict you in any way. The sanction would be immediate: you’d click and share much less; even worse, you might cut your session short. Therefore, Facebook has no choice but to keep you in the warm comfort of the cosy environment you have created, click after click. In the United States, Facebook does this for 40 minutes per user per day.
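To make that feedback loop concrete, here is a deliberately naive sketch of an engagement-maximizing feed ranker, written in Python. Every signal name and weight is invented for the illustration; this is a caricature of the mechanism described above, not Facebook’s actual code:

```python
# Caricature of an engagement-driven feed ranker. All signals and
# weights are invented; nothing here reflects Facebook's real system.

def score(post, user):
    s = 0.0
    s += 3.0 * post["from_friend_or_family"]              # proximity weighs heavily
    s += 2.0 * post["positive_emotion"]                   # feel-good content spreads
    s += 2.0 * user["affinity"].get(post["stance"], 0.0)  # content that agrees with you
    s -= 4.0 * post["challenges_user_views"]              # contradiction cuts sessions short
    return s

def build_feed(posts, user, k=10):
    # The ranker never asks "is this true?" or "is this important?",
    # only "will this user click, like, and share?"
    return sorted(posts, key=lambda p: score(p, user), reverse=True)[:k]

def update_affinity(user, clicked_post, lr=0.1):
    # Every click reinforces the profile, so tomorrow's feed resembles
    # today's clicks even more: the filter bubble tightens itself.
    stance = clicked_post["stance"]
    user["affinity"][stance] = user["affinity"].get(stance, 0.0) + lr
```

The loop is the point: build_feed optimizes for engagement, update_affinity turns each click into a stronger prior, and every session’s feed becomes a little more agreeable than the last.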

In a recent piece from the New York Times Magazine, writer Jenna Wortham explained it perfectly:

I’ve spent nearly 10 years coaching Facebook — and Instagram and Twitter — on what kinds of news and photos I don’t want to see, and they all behaved accordingly. Each time I liked an article, or clicked on a link, or hid another, the algorithms that curate my streams took notice and showed me only what they thought I wanted to see. That meant I didn’t realize that most of my family members, who live in rural Virginia, were voicing their support for Trump online, and I didn’t see any of the pro-Trump memes that were in heavy circulation before the election. I never saw a Trump hat or a sign or a shirt in my feeds, and the only Election Day selfies I saw were of people declaring their support for Hillary Clinton.

Here is an important question: How does news fit in Facebook’s walled Wonderland? Short answer: it doesn’t.

Unfiltered news doesn’t share well, not at all:
• It can be emotional, but in the worst sense; no one is willing to spread a gruesome account from Mosul among their peers.
• Most likely, unfiltered news will convey a negative aspect of society. Again, another revelation from The Intercept or ProPublica won’t get many clicks.
• Unfiltered news can upset users’ views, beliefs, or opinions.

Hence the importance of strongly filtering what comes from the news media. This, the social network candidly acknowledged last June when justifying a change in its algorithm. Here is what Adam Mosseri, VP of Product Management for News Feed, had to say on June 29 (edits and emphasis mine):

People expect the stories in their feed to be meaningful to them — and we have learned over time that people value stories that they consider informative. Something that one person finds informative or interesting may be different from what another person finds informative or interesting. (…) We’ve also found that people enjoy their feeds as a source of entertainment.

We are not in the business of picking which issues the world should read about. We are in the business of connecting people and ideas — and matching people with the stories they find most meaningful. Our integrity depends on being inclusive of all perspectives and viewpoints, and using ranking to connect people with the stories and sources they find the most meaningful and engaging.

We don’t favor specific kinds of sources — or ideas. Our aim is to deliver the types of stories we’ve gotten feedback that an individual person most wants to see.

Here is where it gets interesting: at one level, Mosseri’s statement is plainly true; at another, it is a blatant lie. He’s truthful when he says Facebook doesn’t select sources across the immense spectrum of its audience. But he fails to mention that, at the individual level, the algorithm makes a strict selection of the sources that show up in a user’s newsfeed.

At least, he’s honest when he adds:

We do this not only because we believe it’s the right thing but also because it’s good for our business. When people see content they are interested in, they are more likely to spend time on News Feed and enjoy their experience. (…)

On the technical side, Lars Backstrom, Facebook’s Engineering Director, alluded to the consequences for Pages maintained by people or news organizations — sometimes at a significant cost:

Overall, we anticipate that this update may cause reach and referral traffic to decline for some Pages. The specific impact on your Page’s distribution and other metrics may vary depending on the composition of your audience. (…) We encourage Pages to post things that their audience are likely to share with their friends.

In other words: publishers and organizations who maintain an FB Page that is not saturated with stuff fit to be shared among friends can expect some pain. (Actually, the impact of the algorithm change is disputed. According to Chartbeat, when considered broadly — across hundreds of publishers — the impact was minimal. However, at the New York Times and among the European publishers I regularly speak with, referral volume from Facebook has dropped sharply.)

Forgive me for filling this post with so many quotes but… if those who keep whining about Facebook’s negative impact on the 2016 election had read what Facebook repeatedly and plainly stated, they would have been less surprised. As we say in French, “c’est écrit sur l’emballage” (it’s written on the package).

If Facebook can’t be criticized for not warning its stakeholders in the news media, it still misled them in two major ways.

The first was luring scores of media outlets into FB’s in-house news products — Instant Articles, Facebook Live — and then squeezing them through its algorithmic black box (not to mention a revenue share that amounts to a miserly trickle).

Second, when he reiterated Facebook’s mission statement at the F8 conference last April, this is what Mark Zuckerberg had to say:

We stand for connecting every person. For a global community. For bringing people together. For giving all people a voice. For a free flow of ideas and culture across nations. (…) We’ve gone from a world of isolated communities to one global community, and we’re all better off for it.

Well. No. That is a cool mental construct, but it simply is not true:

Facebook might have created a “global community” but its components are utterly segregated and fragmented.

Facebook is made up of tens of millions of groups, carefully designed to share the same views and opinions. Each group is protected against ideological infiltration from other cohorts. Maintaining the integrity of these walls is the primary mission of Facebook’s algorithm.

We must face the fact that Facebook doesn’t care about news in the journalistic sense. News represents about 10% of the average user’s newsfeed, and it can be cut overnight if circumstances dictate, with no significant impact on the platform. (Actually, someone with good inside knowledge of the social network told me that news would be removed from users’ feeds should the European Union move against Facebook the way it has attacked Google on editorial issues.)

In that broad context, the fake news situation is just one part of Facebook’s system, a bad apple in a large basket. It is impossible to believe that one of the best engineering companies in the world did not see it coming; fake news was simply considered an unpleasant parasite, the wine lees at the bottom of the barrel… until the Trump campaign made such heavy use of it that the issue blew up.

There is no doubt that Facebook will address the issue sooner than many expect. It is largely doable, as proven by the spontaneous crowdsourcing efforts of smart people building B.S. detectors (read here in Quartz, as but one example). Even Yann LeCun, the Artificial Intelligence chief at Facebook, acknowledged it:

“The technology either exists or can be developed. But then the question is how does it make sense to deploy it? And this isn’t my department.”
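To give a sense of how commodity the underlying technology already is, here is a minimal, purely illustrative sketch of such a B.S. detector in Python. It assumes scikit-learn is available, and the training headlines and labels below are invented placeholders; a real system would rely on far richer signals (source reputation, propagation patterns, fact-checking databases) and vastly more data:

```python
# Toy "B.S. detector": a bag-of-words text classifier over headlines.
# Assumes scikit-learn; all headlines and labels are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

headlines = [
    "Pope endorses presidential candidate, shocks world",  # fabricated
    "FBI agent tied to email leaks found dead",            # fabricated
    "Senate passes annual appropriations bill",            # genuine
    "Fed raises interest rates by a quarter point",        # genuine
]
labels = [1, 1, 0, 0]  # 1 = fake, 0 = genuine

# TF-IDF turns each headline into a weighted word-count vector;
# logistic regression then learns which words correlate with "fake".
detector = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                         LogisticRegression())
detector.fit(headlines, labels)

# Probability that a new headline is fake, per this (tiny) model.
print(detector.predict_proba(["Celebrity secretly endorses candidate"])[0, 1])
```

Four training examples prove nothing, of course; the point is that the classification machinery is off-the-shelf, which is precisely LeCun’s point: the hard part is deciding how to deploy it, not building it.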

Facebook will solve just enough of the fake news problem to put out the raging PR fire. But the company will never jeopardize its money machine for the sake of mere societal considerations.

frederic.filloux@mondaynote.com
