Surveillance capitalism and liberal democracy

Nichola Cooper's fourth post in our Emerging Fellows program explores privacy, trust, and social media. The views expressed are those of the author and not necessarily those of the APF or its other members.

When Mark Zuckerberg was designing The Facebook in 2004, he was staggered that people were willing to share their personal data: “they just submitted it, I don’t know why, they just trust me – dumb f**ks”.

Well, those “dumb” people have just realised, through the Cambridge Analytica shenanigans, what Facebook harvests and monetises personal data for. Zuckerberg is contrite, of course, apologising for a “breach of trust.” But users don’t really trust social platforms anyway; their upset is more that they didn’t see this coming. It’s the advertisers Zuckerberg is really apologising to. Facebook’s business model relies on users sharing content and being receptive to advertisers’ messages. As the mantra goes: “if the service is free, you are the product”. Judging by the #deletefacebook and #faceblock campaigns, people don’t want to be a product.

Of course, widespread abandonment of social media is unlikely. In a globalised world we need a way of connecting, so regulators are starting to act. On 25 May, the EU General Data Protection Regulation (GDPR) comes into effect, shifting the balance of power from the company to the user; Facebook will need to watch its step. Article 25 – privacy by design – spells out how privacy protections should be approached: proactively, built in from the outset. Yet, given Facebook’s history of asking forgiveness rather than permission, users might be forgiven for expecting future privacy breaches despite regulatory controls.

Looking forward, there remain serious concerns about the future of democracy. The UK Information Commissioner’s Office is investigating 30 organisations – including Facebook – as part of its inquiry into the use of personal data and analytics for political purposes. The UK’s final European Commissioner for security, Sir Julian King, wrote to the European Commission warning that Cambridge Analytica’s psychometric profiling during the Vote Leave Brexit campaign was “a preview of the profoundly disturbing effects such disinformation could have on the functioning of liberal democracies”, and asked for plans to manage social media companies ahead of the European political campaigns of 2019. Emmanuel Macron supports Sir Julian, promising to ban fake news during election campaigns. This follows Malaysia, one of the first countries in the world to put an Anti-Fake News bill before parliament, which would penalise those who create or circulate fake news with up to 10 years’ imprisonment or a fine of up to 500,000 ringgit (£90,000).

Then there are Comcast and Verizon to worry about. After the US Congress extended the same data-gathering practices to internet service providers like Comcast, AT&T, and Verizon, US consumers are concerned about the role these companies play as internet gatekeepers. The risks of exploited personal data reach far beyond unsolicited book recommendations: a handful of companies now hold the length and breadth of your entire digital footprint. Take China as an example: from May, Chinese citizens with poor “social credit” may be banned from public transport for up to a year based on what they buy, say and do.

We are at a critical juncture, where the sociopolitical consequences of surveillance capitalism could become much better, or much worse. Can we afford not to pay attention?

© Nichola Cooper 2018