Mark Zuckerberg Thinks We’re Idiots.

Monday Note
Mar 25, 2018

by Jean-Louis Gassée

Surprise: Thanks to the Cambridge Analytica revelations, we’re finding out that Facebook allowed a much broader and deeper prostitution of our private data than it had previously admitted. Facebook’s disingenuous explanations call for more questions and even less trust.

Reacting to the Cambridge Analytica scandal, Facebook’s founder and CEO Mark Zuckerberg hasn’t been as confrontational as Steve Jobs (“You’re holding it wrong”) or Sun Microsystems’ CEO Scott McNealy (“Get over it. You have no privacy”). Far from it: Zuckerberg’s apologies have been well-rehearsed in their embarrassment and clumsy phrasing: “I think we let the community down, and I feel really bad and I’m sorry about that…” (from Recode’s interview with Zuckerberg).

As Facebook’s leader, Zuckerberg resolves to get things straightened out in the future (“it’s my job, right?”) while delivering a call-center-style, broken-record reassurance: “Your privacy is important to us”. Yes, of course, our privacy is important to you; you made billions by surveilling and mining our private lives. One wonders how aware Zuckerberg is of the double entendre.

This wasn’t Zuckerberg’s first apology. Last September, he confessed that he regretted his previous dismissal of the notion that Facebook might have played a role in the outcome of the November 2016 election [as always, edits and emphasis mine]:

“After the election, I made a comment that I thought the idea [that] misinformation on Facebook changed the outcome of the election was a crazy idea. Calling that crazy was dismissive and I regret it,” the founder wrote, offering his first public admission that his initial remarks were off-base. “This is too important an issue to be dismissive.”

With the Guardian’s new exposé, things are different for Facebook: They’re much worse. The company is in a new kind of defensive stance as it tries to explain how the data of as many as 50 million users was siphoned off by a researcher named Aleksandr Kogan and handed over to Cambridge Analytica, a political influence peddler.

After a few days of silence, Zuckerberg talked to the New York Times, to CNN, to Wired and Recode to explain what happened and, more important, to tell us what his company intends to do to put things right.

Carefully reading and re-reading Zuckerberg’s words leaves me ill at ease. Of course, simply complaining that Facebook’s CEO sounds well-rehearsed won’t do; he’s a pro at managing a major crisis. Persphinctery statements are part of the fare (from the NYT interview):

“Privacy issues have always been incredibly important to people. One of our biggest responsibilities is to protect data.”

But we quickly get to the misrepresentations.

“… someone’s data gets passed to someone who the rules of the system shouldn’t have allowed it to, that’s rightfully a big issue and deserves to be a big uproar.”

Here, Zuckerberg glosses over the pivotal fact that researcher Aleksandr Kogan accessed data in a manner that was fully compatible with Facebook’s own rules (see below). It appears that the rule-breaking started after he put his mitts on the data and made a deal with Cambridge Analytica.

Next, we’re treated to the resolute statements. Facebook now realizes what transpired and will make sure it won’t happen in the future:

“So the actions here that we’re going to do involve first, dramatically reducing the amount of data that developers have access to, so that apps and developers can’t do what Kogan did here. The most important actions there we actually took three or four years ago, in 2014. But when we examined the systems this week, there were certainly other things we felt we should lock down, too.”

Three rich sentences, here. And a problem with each one…

First, an admission that Facebook’s own rules allowed developers overly broad access to our personal data. Thanks to Ben Thompson, we have a picture of the bewildering breadth of user data developers could access.

(Thompson’s Stratechery Newsletter is a valuable source of insights, of useful agreements and disagreements.)

Of course, developers have to request the user’s permission to make use of their data — even for something as seemingly “innocent” as a game or psychological quiz — but this isn’t properly informed consent. Facebook users aren’t legal eagles trained in the parsing of deliberately obscure sentences and networks of references and footnotes.

Second, Mark Zuckerberg claims that it wasn’t until 2014 that the company became aware of Cambridge Analytica’s abuse of Facebook’s Open Graph (introduced in 2010). This, to be polite, strains credulity. Facebook is a surveillance machine, its business is knowing what’s happening on its network, on its social graph. More damning is the evidence that Facebook was warned about app permissions abuses in 2011:

“… in August 2011 [European privacy campaigner and lawyer Max] Schrems filed a complaint with the Irish Data Protection Commission exactly flagging the app permissions data sinkhole (Ireland being the focal point for the complaint because that’s where Facebook’s European HQ is based).”

Finally, Zuckerberg tells us that upon closer examination Facebook realizes that it still has problematic data leaks that need to be attended to (“So we’re going ahead and doing that,” he reassures us).

The message is clear: Zuckerberg thinks we’re idiots. How are we to believe Facebook didn’t know about, and didn’t benefit from, the widespread abuse of user data by its developers? We just became aware of the Cambridge Analytica cockroach… how many more are under the sink? In more lawyerly terms: “What did you know, and when did you know it?”

A company’s culture emanates from the top and it starts early. In 2004, the man who was in the process of creating Facebook allegedly called Harvard people who entrusted him with their emails, text messages, pictures, and addresses “dumb fucks”. Should we charitably assume he was joking, or ponder the revelatory power of such cracks?

I have no idea how much trust Facebook has lost in the current scandal, or how much will be lost or regained as the company revisits past misdeeds and writes new, more intelligible rules. Zuckerberg, who’s very well read, certainly knows that everything runs on trust, even dictatorships.

The next weeks and months will be unusually interesting, especially with the new European privacy regulation (GDPR) becoming enforceable in May 2018.

JLG@mondaynote.com
