After 10 hours of Capitol Hill hearings and five hours in Westminster in London, Facebook Inc. executives’ main pitch to avoid draconian regulation has been, “We dropped the ball, you’ve just got to believe us when we say we’ll do better.” In the company’s narrative, Facebook was created by a bunch of naive college kids who had little notion that their platform might be used to malicious ends.
Here’s Chief Technology Officer Mike Schroepfer on Thursday in the U.K. reflecting on when he started at Facebook in 2007:
“I did not have any idea that a decade later I would be here talking to you about Russia interfering with elections using products that I built.”
And CEO Mark Zuckerberg in the Senate earlier this month:
“I think it’s pretty much impossible, I believe, to start a company in your dorm room and then grow it to be at the scale that we’re at now without making some mistakes.”
The naivete excuse may have held water a decade ago. But we now know that Facebook executives did discuss the risks internally before the U.S. presidential election, as a BuzzFeed scoop from March showed, and seemingly did little to address them.
Schroepfer moved on to how the company is now tackling the matter:
“The best we can do now is spend all of our time and energy asking what are all the possible vectors of abuse, what are the trade-offs of safety and security.”
So Facebook is asking regulators (and the general public) to trust it. In Schroepfer’s words:
“If you’re asking me a question of intent, I can only tell you what’s in my heart, which is that we do really care about these things.”
That’s where the defense gets particularly dubious, a theme to which the House of Commons select committee quizzing Schroepfer returned again and again after breaking for lunch. Members of Parliament cited repeated failures to act.
Here’s Ian Lucas:
“Mr Schroepfer, I remain to be convinced that your company has integrity.”
Then Brendan O’Hara:
“Isn’t it the case that the only difference as to why you’re acting now is because the rest of us know, and had it not exploded in the public forum the way it has, nothing would have changed, would it?”
And the chairman, Damian Collins:
“What’s been problematic for us in this investigation is a pattern of behavior … At the beginning of this, Facebook did not want to engage with any of our questions, it would only engage with the electoral commission.”
Then there is the obfuscation. Zuckerberg repeatedly conflated data and content in his testimony to U.S. lawmakers. When confronted with the same topic on Wednesday, as it pertained to the collection of cookies from third-party websites, Schroepfer hastily claimed (misleadingly) that such data’s sole value was in identifying fake accounts. The reality is that it allows for better ad targeting.
The company is now making an effort to appear more transparent. In a blog post outlining how user data is used for ads, Facebook maintained that the user is not the product, insisting that “the core product is reading the news or finding information.” It’s another convenient conflation. The product is really ads, as Zuckerberg repeatedly says on earnings calls, and they are dependent on user data.
When Facebook unveiled its new privacy settings in the context of Europe’s General Data Protection Regulation, it invited a select group of journalists, excluding many of the usual beat reporters. TechCrunch’s Josh Constine wrote a long analysis of the new agreements, concluding that “it seems like Facebook is complying with the letter of GDPR law, but with questionable spirit.”
Of course, there is good regulation and bad regulation. But Zuckerberg and his fellow Facebook executives are asking the company’s users and the public to trust them. So far, they’ve given scant reason to do so.
This column does not necessarily reflect the opinion of Bloomberg LP and its owners.