What Facebook’s troubles teach us
On Monday, April 9th 2018, individual Facebook users learnt whether their own personal data was caught up in the haul of 87 million accounts harvested by Cambridge Analytica.
Aleksandr Kogan’s thisisyourdigitallife app, at the centre of the scandal, was downloaded by ‘just’ 270,000 people, but – by exploiting Facebook’s T&Cs and API policies for third-party developers at the time – it extended its reach into the profile information of users’ friends, who hadn’t downloaded it themselves and hadn’t agreed to their data being swept up. Initially the massive dataset was described as containing information pertaining to “mostly US citizens”, but since Facebook’s network effects and its users’ Friends lists rarely respect national borders, it was never going to remain a North American problem for long: current estimates are that around 1 million UK users have been affected, with an unknown number elsewhere in the EU.
Was the Cambridge Analytica moment a ‘breach’ in the traditional sense? It wasn’t, strictly speaking, a hack, insofar as nobody needed to break into Facebook’s systems in order to access the data. The 270,000 thisisyourdigitallife users willingly (if unknowingly) signed up to give the app access to their personal profile information (although it’s unlikely any of them considered for a moment the use to which it would eventually be put). Their combined near-87 million (estimated) friends at the time, however, didn’t consent to the app accessing their profile data directly – that happened because they were happy for their friends to see it, and Facebook (at the time) allowed third-party apps to exploit that exposure and scoop up whatever the primary user themselves could see, as if acting as their proxy.
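The proxy mechanism described above can be sketched in a few lines. To be clear, this is an illustrative model only – the names, data structures, and function are mine, not Facebook’s actual Graph API:

```python
# Illustrative model of the friend-data exposure described above.
# All names and structures here are hypothetical, not Facebook's real API.

profiles = {
    "alice": {"likes": ["politics"], "hometown": "Leeds"},
    "bob":   {"likes": ["football"], "hometown": "Cardiff"},
    "carol": {"likes": ["music"],    "hometown": "Dublin"},
}
friends = {"alice": ["bob", "carol"]}  # bob and carol never installed the app


def harvest(app_user):
    """Collect the consenting user's profile plus everything their
    friends exposed to them -- the app acts as the user's proxy."""
    data = {app_user: profiles[app_user]}
    for friend in friends.get(app_user, []):
        data[friend] = profiles[friend]  # no consent sought from the friend
    return data


# One consenting install yields three profiles:
print(len(harvest("alice")))  # -> 3
```

One install multiplying into hundreds of profiles is how 270,000 downloads scaled to an estimated 87 million accounts.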
However, the episode certainly doesn’t exhibit the ‘privacy-by-default’ ethos espoused by the EU’s General Data Protection Regulation (GDPR); nor does it show Facebook in a particularly flattering light, or demonstrate that it handles its users’ personal data with the transparency needed to earn their trust. Whether the EU would have found Facebook in breach of the GDPR, had all this happened after May 25th 2018, would be a matter for the courts.
If the courts found against Facebook, the company would be looking at a potential fine of over $1.6B (i.e. 4% of 2017’s global revenue). Whilst that wouldn’t be the largest corporate penalty in history – not even enough to make the Top Ten (most of those accolades go to financial institutions, topped off by BP’s $20.8B payout as a result of the Deepwater Horizon disaster in 2010) – the hit that the company’s valuation has taken since the story broke into mainstream media is real enough: its market capitalisation now stands around $100B down from its peak towards the end of 2017. Plus the reputational damage to the Facebook brand shows no sign of diminishing quickly, as Cambridge Analytica revelations continue to emerge almost daily, governments and regulators take testimony from the main characters involved, and more whistleblowers emerge from the woodwork.
The fallout that Facebook’s experiencing couldn’t be a more faithful realisation of the “damage [to] both public reputation and bank balance” which UK Information Commissioner Elizabeth Denham warned of in her address to businesses a year ago. Whatever your technical position on the letter (if not the spirit) of GDPR compliance come May 25th 2018, bad data governance is bad for business. It’s time to go beyond what regulation mandates, build an ongoing culture of positive data ethics, and reap the benefits of a transparency premium when it comes to demonstrating good stewardship of your customers’, partners’, and employees’ personal data.
In an era of burgeoning privacy awareness – with the rise of the data self-sovereignty movement and the smarter, data-savvy customer – if both sides of the data producer and consumer transaction better understand the affordances of their relationship (the rights, responsibilities, and trade-offs), then companies can develop a more sustainable data platform upon which to deliver the seamless, hyper-personalised experiences their customers are growing to expect.
It’s often been said that if you’re not paying with cash, then you’re paying with data. Regulations like the GDPR are giving customers the tools with which to drive a harder bargain for theirs; and service providers will need to adjust their economics and data literacy culture to suit.
Just because you can do something… that doesn’t necessarily mean you should.