Another data point to flesh out the Facebook data misuse scandal: The company has informed the European Commission that a total of 2.7 million EU citizens had their information improperly shared with the controversial political consultancy, Cambridge Analytica (via Reuters).
Facebook had already revealed a breakdown of the top ten markets of affected users. But the only EU nation in the list of countries it published was the UK — which it said could have almost 1.08M affected users. So up to a further 1.6 million or so EU citizens could also have had their data swiped as a result of the scandal, without their knowledge or consent.
Privacy is a fundamental right under the bloc’s legal regime so the improper sharing of millions of EU citizens’ data could have legal consequences for the company.
“Facebook confirmed to us that the data of overall up to 2.7 million Europeans, or people in the EU to be more precise, may have been improperly shared with Cambridge Analytica. The letter also explains the steps Facebook has taken in response since,” an EC spokesman told Reuters.
Facebook could not immediately be reached for comment.
The company is a signatory to the EU-US Privacy Shield framework: a mechanism which came into force in mid-2016 — replacing the invalidated Safe Harbor arrangement which had stood for 15 years — and which is intended to simplify the process of authorizing transfers of EU citizens’ personal data across the Atlantic.
Companies on the Privacy Shield list self-certify to adhere to a set of privacy principles. However, they can be removed if they are determined to have violated their obligations — with the US FTC acting as the enforcement authority.
The same federal watchdog is now investigating Facebook as a result of the Cambridge Analytica data misuse scandal. Nor is this the first time the FTC has probed Facebook’s actions in relation to user privacy. In 2011 it charged the company over deceptive privacy claims.
In the subsequent FTC settlement Facebook committed to giving users “clear and prominent notice” and to obtaining their consent before sharing their information beyond their privacy settings.
Facebook will now need to explain to the FTC how its actions in 2013-2015 mesh with that earlier consent agreement.
In mid-2015 the company finally tightened app permission settings for all developers on its platform. But prior to that these had been lax enough for vast amounts of personal data to be sucked out without most users being aware — because the data sharing was being ‘authorized’ by their Facebook friends (who also likely weren’t aware of what they were agreeing to).
So, for example, just 558 Filipino Facebook users installed the personality quiz app that passed data to Cambridge Analytica — yet the company was able to grab personal data on up to 1,175,312 more users in that country as a result of how Facebook allowed people’s data to be shared with developers on its platform.
Yesterday Facebook admitted as many as 87 million users in total could have had their personal info shared with Cambridge Analytica after 270k people downloaded the quiz app on its platform. (Though CA has disputed the 87M figure, claiming it only licensed data from the quiz app developer for 30M Facebook users.)
Writing about the data misuse scandal in the Harvard Law Review, David Vladeck, the FTC’s former director, argues there are now only two interpretations of Facebook’s actions vis-à-vis data protection and user privacy: Cluelessness or venality.
“Facebook now has three strikes against it: Beacon, the privacy modifications it made in 2009 to force private user information public, and now the Kogan/Cambridge Analytica revelation,” he writes. “Facebook can’t claim to be clueless about how this happened. The FTC consent decree put Facebook on notice. All of Facebook’s actions were calculated and deliberate, integral to the company’s business model, and at odds with the company’s claims about privacy and its corporate values. So many of the signs of venality are present.”
“[V]ague and unenforceable promises are not enough,” he adds. “The better approach would be for Facebook to acknowledge that it violated the consent decree and to come to the FTC with specific proposals for serious and enduring reform.”
In terms of specific proposals to reform privacy rules, Vladeck suggests Facebook needs to create systems that ensure third parties do not have access to user data “without safeguards that are effective, easy to use, and verifiable”.
“When third party access is sought, users must be given clear notice and an opportunity to say yes or no – that is, the gateway must be notice and the affirmative express consent required by the 2011 decree,” he adds. “Facebook also must develop accountability systems that prove that consumers have in fact consented to each use of their data by Facebook or by third parties. And Facebook must agree to refrain from using blanket consents; after all, blanket consents are the enemy of informed consent.”
In his view the company also needs to create systems to audit third party data collection and sharing “on an ongoing basis” — and thereby “hold third parties to their promises by engineering controls and contractual lockups” — including “effective remedies when third parties break the rules – including enforceable rights to audit, retrieve, delete and destroy data improperly acquired or used, and liquidated and actual damages for violations”. The point being that Facebook should not simply take it on trust that developers given access to masses of user data will do the right thing.
“Facebook must also be accountable to the public,” he adds. “There must be far more robust reporting to the FTC, but those reports are non-public. To re-establish trust with its users, Facebook should consider appointing a data ombudsperson and establishing a group outside the company that has unfettered access to Facebook data and employees to ensure that Facebook is now, finally, honoring its commitments to users, and this group should periodically report its findings on Facebook’s compliance.”