We are seeing a watershed moment in the history of the use of private data to stealthily subvert the democratic process, and Facebook and its tech friends may face a serious backlash.
Whether this will finally persuade lethargic regulators in India to put in place data-protection regulations remains to be seen.
There has been a lot of concern about how the cavalier use of customer data by various technology platforms can lead to unforeseen consequences, and there have been calls to restrict the ways data is captured, analysed and stored. I have been particularly concerned about the way Indian customer data is flowing to Chinese and American entities, as I mentioned in these pages earlier (Tech giants and data: Is India giving away a treasure trove of data about its citizens?)
However, until the Cambridge Analytica case blew up so spectacularly, there had not been a concrete instance of data affecting something politicians actually care about: losing elections. Now the calls for regulation will be positively deafening.
The allegation about Cambridge Analytica, based on information from a whistleblower named Chris Wylie, is that it has (possibly illegally, and almost certainly unethically) used deep information about 50 million voters (some say it is almost all 230 million US voters), gathered from their Facebook feeds and their friends’ Facebook feeds, to influence them in subtle ways. The reason this is causing an uproar is that Steve Bannon, a former Trump aide, may well have influenced these voters to plump for President Donald Trump.
But that partisan point has resonance in India as well, because the same Cambridge Analytica has been talked about by the Indian National Congress as its potential marketing partner. Though there is nothing definitive about an agreement between them, many news stories suggest they were in late-stage discussions.
An intriguing suggestion from a reader, Anand, was as follows: a key goal in the US campaign was “voter disengagement”, persuading Democratic Party voters to “stay at home”. In the wake of the large number of NOTA votes (none of the above) in Gujarat, and the very low turnout in Gorakhpur, is a similar ‘voter disengagement’ strategy being rolled out in India as well? Is there perhaps a stealth campaign with Cambridge Analytica already in place in India? Is this the preferred Congress tactic for 2019?
There was always something a little sinister about Cambridge Analytica, and people have mumbled darkly about its role in the Brexit and Trump campaigns. And now it appears as though the suspicions had merit: Wylie, as reported in a detailed profile in The Guardian (part of the Cambridge Analytica Files, 17 March), claims he “made Steve Bannon’s psychological warfare mindf*** tool”.
Wylie was employed by SCL Elections, the parent company of Cambridge Analytica; the latter is a joint venture between SCL Elections and Robert Mercer, a hedge-fund billionaire and an active supporter of Republican causes in the US, including Donald Trump. It turns out that (quoted from the above Guardian story):
…[SCL Elections’] expertise was in “psychological operations” - or psyops - changing people’s minds not through persuasion but through “informational dominance”, a set of techniques that includes rumour, disinformation and fake news.
SCL Elections had used a similar suite of tools in more than 200 elections around the world, mostly in undeveloped democracies that Wylie would come to recognise were unequipped to defend themselves. [emphasis mine]
India being one of those “undeveloped democracies” they toyed with makes me queasy. Yes, they used a standard military psy-war operation to turn the data from those millions of Americans into a potent information warfare weapon. In essence, by gaining an understanding of the individual peculiarities of voters (for instance, through correlations such as the one that says leftists are more likely to buy Nike shoes), they were able to tailor messages to individuals that would work on their particular psychologies. By deeply analysing the Facebook data, they were able to extrapolate and deduce many things about the users that are not obvious.
This sort of staggering extrapolation is not new: there was a story recently about how, by simply analysing Uber traffic data, it was possible to figure out who was having an affair with whom! Thus our ‘digital footprint’ betrays us in all sorts of ways. The dictum “data is the new oil” is not entirely vacuous, it turns out. Also, a widely-reported study in Science showed that fake news travels about six times faster than real news on Twitter.
These revelations have already had an impact on Facebook, already embattled over accusations about fake news. Its stock fell 4 per cent in trading before exchanges opened; and this is likely to amplify the calls for accountability on the part of several dominant technology companies, especially Facebook, Google and Twitter. They have generally been able to claim that they are merely platforms and are therefore not responsible for the content that is posted on them.
That argument may not hold water anymore; increasingly alarmed regulators may impose stiff conditions on privacy protection, as envisaged in the European data protection regulations that go into effect in May. China, Vietnam, Singapore and others are considering regulations requiring that data about their citizens be maintained in-country. It is high time India imposed similar laws: the Chinese cyber-security law, for instance, insists that “personal information and other important data gathered or produced by critical information infrastructure operators” be stored in mainland China, according to The Economist (Technopolitics, 15 March).
In any case, I think we are seeing a watershed moment in the history of the use of private data to stealthily subvert the democratic process, and Facebook and its tech friends may face a serious backlash. Whether this will finally persuade lethargic regulators in India to put in place data-protection regulations remains to be seen. We can only hope.