The Guardian reports this morning that:
Facebook is being investigated to assess whether an experiment in which it manipulated users’ news feeds to study the effect it had on moods might have broken data protection laws, it has been reported.
The Information Commissioner’s Office is said to be looking into the experiment carried out by the social network and two US universities in which almost 700,000 users had their news feeds secretly altered to study the impact of “emotional contagion”.
Meanwhile, the original Cornell press release which revealed the experiment has also been altered. Where it originally asserted the Army had co-funded the adventure, it now says (scroll down to the bottom of the page):
Correction: An earlier version of this story reported that the study was funded in part by the James S. McDonnell Foundation and the Army Research Office. In fact, the study received no external funding.
Perhaps, in the event, it would be churlish of us to complain. As Paul has ironically pointed out, the T&Cs we signed up to on becoming Facebook users (more and more the 21st-century equivalent of passing over to the dark side – a club membership you can never leave once entered) are pretty broad-ranging and may allow for such abuse.
Even so, the situation is sufficiently serious for institutions like Cornell to follow up with this kind of assertion (the bold is mine):
ITHACA, N.Y. – Cornell University Professor of Communication and Information Science Jeffrey Hancock and Jamie Guillory, a Cornell doctoral student at the time (now at University of California San Francisco) analyzed results from previously conducted research by Facebook into emotional contagion among its users. **Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data.** Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” published online June 2 in Proceedings of the National Academy of Science-Social Science.
Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any individual, identifiable data at any time – Cornell University’s Institutional Review Board concluded that **he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required**.
So, then, it’s OK to use research that has been obtained without permission from any source whatsoever, as long as one cannot identify the ~~victims~~ ~~unwilling participants~~ social network users in question – creatures, incidentally, who occupy the lowest of all low strata in the 21st century litany of unobserved rights and excessive obligations.
Which doesn’t half remind me of another constituency out there. Indeed, it would probably be rather unfair to criticise Facebook for following on from where certain British political representatives have gone before.
Before #FacebookExperiment, surely we had #CoalitionExperiment – a deliberate process of emotional manipulation of both the most defenceless in society and, in the event, those most determined not to see their rights trampled on.
And if the ICO feels that data protection laws may have been broken when Facebook experimented on the way that people reacted to negative and positive stories, without asking their permission first and even though they’d signed up to a wide-ranging set of T&Cs, who is to say this Coalition government didn’t similarly break human rights laws when they decided to experiment on how a nation might react to a barrage of false stories about immigrants “nicking” jobs, the “scrounging” poor, the “feckless” disabled and a well-packaged myriad of other lies, distortions and half-truths?
If Facebook is to be investigated by the ICO, or perhaps even a select committee which feels particularly (and rightly) aggrieved about the situation, who will have the guts to investigate entire governments such as ours? And given the close ties between the aforementioned social network and the security arms of governments everywhere, doesn’t it make you wonder whether in fact this story is little more than a softening-up of public opinion as we await ultimate revelations from the Snowden cache of documents?
Is the #FacebookExperiment an isolated example of an always-slightly-maverick social network going out on a limb – or, more likely, does Facebook simply reflect what others, less visible, are now doing all the time? And does Facebook do what it needs to sustain a business model – or is it more a question of doing the bidding of those who most need to structure people’s feelings in times of unrelenting crisis?
That is to say, our unrepresentative, undemocratic, inefficient and incompetent political leaders various … excellent reasons all, in the light of the above, to investigate much more profoundly how our body politic is doing a Facebook.