It’s past time for Facebook to come clean about how it is handling user data. After the latest Cambridge Analytica news broke the dam on over a decade of Facebook privacy concerns, Mark Zuckerberg is heading to Washington, D.C. this week for two days of Congressional testimony. On Tuesday, he’ll appear before the Senate Judiciary and Commerce Committees, and on Wednesday the House Energy & Commerce Committee.
The last thing we need from Zuckerberg at these hearings is more apologies. What we want is information and accountability, and not just in connection with discrete scandals like Cambridge Analytica. Congress should be asking Facebook for a detailed account of what data Facebook has shared with third parties, what it has done to prevent misuse of that data, what it told users about how it would handle their information, and what steps it will take in the future to respect users' privacy rights.
And because this is about more than just Facebook, Congress should also be asking whether Facebook will serve as an industry leader by publicly embracing key privacy-protective principles.
Beyond nailing down the details of specific cases like the Cambridge Analytica mess and the revelation of paid Russian propaganda on the social media giant's platform, we hope lawmakers will also keep in mind the larger tension at the core of each one of Facebook's privacy missteps and scandals: A company ethos of connection and growth at all costs cannot co-exist with users' privacy rights. Facebook operates by collecting, storing, and making it easy for third parties to use unprecedented amounts of user data. Until that changes, the privacy and integrity concerns that spurred these hearings are here to stay.
Getting to the Bottom of Facebook’s Word Games
There is no shortage of questions and angles that Congress members can grill Zuckerberg on—from the Cambridge Analytica fiasco that exposed the data of approximately 1 in 5 Americans, to paid propaganda’s effect on the 2016 election, to private censorship, to the role of AI technologies in detecting and mitigating all of the above.
And Zuckerberg will no doubt lean on word games and roundabout language in his answers to distract from the real problem.
Here is some language to watch out for—and for lawmakers to drill down on when they hear it:
“Bad actors”
In responding to Cambridge Analytica, Zuckerberg and other Facebook executives have focused on bad actors, malicious third parties, and hackers of great scale and sophistication. But in the vast majority of cases, these parties did not have to “hack” into Facebook's systems to violate user privacy or manipulate user attention. Instead, they simply scooped up the user data that Facebook made available to them, using the tools that Facebook provided. The problem is not the actions of any “bad actors.” The problem is with Facebook. The company collects and retains unprecedented amounts of user information, fails to provide meaningful transparency, and makes it easy for third parties to find, analyze, and even abuse this information without users’ knowledge or consent.
Facebook’s recently announced API changes—which we can expect this week’s hearings to delve into—do a lot to lock away user information from third-party developers, but little to protect user information from Facebook itself. With one notable exception (limiting the retention of Android users’ call and SMS logging data, which received international media scrutiny), the company has made no move toward collecting or storing less user data for its own purposes. By locking down its APIs, Facebook is saying that it and it alone can be trusted with user data. Now users have even less power to use third-party tools that they do trust to explore the data held by Facebook and hold the company accountable.
Selling or not selling user data
Zuckerberg has repeatedly insisted that Facebook does not, and never will, sell user data to advertisers. While it is true that Facebook does not sell user data directly to advertisers, that point distracts from the indisputable fact that the company does sell access to user data and attention in the form of targeted advertising spots. No matter how Zuckerberg slices it, Facebook’s business model revolves around the monetization of user data.
What Facebook knows about you
Last week, while on the record with reporters, Zuckerberg claimed, “The vast majority of data that Facebook knows about you is because you chose to share it.” But that simply does not square with several aspects of how Facebook collects and analyzes user data, including (but not limited to) third-party tracking in the form of Facebook’s “like” buttons across the web, shadow profiles, call and text logs, and computational inferences that can deduce characteristics and preferences a user never told Facebook about.
Zuckerberg’s language here misses the critical distinction between the information a person actively shares, like photos or status updates, and information that Facebook takes from users without their knowledge or consent.
“Idealistic and optimistic”
In the prepared statement Zuckerberg will deliver at Wednesday’s House hearing, he begins by describing Facebook as an “idealistic and optimistic company” that was focused on fostering personal connections and got caught off-guard by abuse on its platform.
But that posture can only go so far. If Facebook didn’t see this coming, it’s because it wasn’t listening. EFF, other civil liberties groups, and the press have been sounding the alarm on aspects of today’s Cambridge Analytica scandal since as early as 2009. On top of that, the abusive consequences of prolific data collection are predictable and well-documented. For a company engaged in that kind of data collection, failing to do its homework is willful ignorance.
Back to the Big Question
We can expect Zuckerberg to apologize for past mistakes, explain the challenges his platform faces, and outline the fixes Facebook is ready to roll out. But the big question is: Will he be able to convince users and members of Congress that any of those fixes is substantial enough to amount to real change?
The CEO’s testimonies are an opportunity for Congress to shed light not only on Facebook’s “black box” algorithms but also on its data operation as a whole. That means confronting the hard, fundamental questions about how an advertising-powered, surveillance-based platform can provide adequate user privacy protections.
In particular, Congress should beware of offers from Zuckerberg to better control the misuse of user data and expression by granting his company and other tech giants an even greater, more exclusive role as the opaque guardians of that data. Facebook has a long history of saying, “Trust us. We know what we’re doing,” without offering its users much transparency or accountability. Congress should take this opportunity to trust Zuckerberg a little less, and Facebook’s users—who are also, coincidentally, their constituents—a little more.