The creeping invasion of Big Data

FACEBOOK isn't just a platform for catching up on what your friends are doing. It is also a research set-up in which any user may, unbeknown to him or her, become a test subject.

A Facebook researcher recently published a study in a top journal, the Proceedings of the National Academy of Sciences (PNAS). His co-authors are two Cornell dons. The study was an experiment to alter the moods of Facebook users, and it upset some people because no one was explicitly asked for consent.

The fear was that vulnerable populations such as the depressed or the young (kids aged 13 and above can have Facebook accounts) may have been included.

The experiment on 689,003 users was conducted over a week in January 2012. Facebook tweaked their news feeds so that one group saw more happy posts and stories while another saw more unhappy ones.

It found that feeding users happy posts inspired them to post happy ones themselves. Conversely, feeding them unhappy posts led them to make unhappy ones themselves. So user moods were contagious online, Facebook concluded.

As this was human experimentation done without consent, Cornell's ethics panel should not have approved it, some argue.

They ask: What if the group fed with unhappy posts included users with major depressive illness, who might have been contemplating suicide that week?

Apparently, Cornell's ethics panel exempted the study from its oversight as Facebook had collected the data before its professors got on board. But the paper itself says the Cornell dons were involved in designing the experiment, which must have come before any data collection.

While Facebook's private-sector research is not subject to any ethics panel oversight, the PNAS journal's own editorial policy requires informed consent for any human experimentation study it publishes. Yet the journal, inexcusably, exempted Facebook from that requirement too.

Facebook argues it had subject consent as users must click "agree" on its 9,405-word terms of service page when signing up for an account. Still, most people don't read the fine print when clicking on "agree" to use almost any online service. So that claim may well be ethically untenable.

Perhaps the consent was not even legally valid since, at the material time, "research" was not listed among the purposes for which clicking "agree" permits Facebook to use one's data. The term was added only four months after the experiment, as Forbes.com reported.

At any rate, informed consent always includes the right to withdraw from a study.

But no Facebook subject could have withdrawn since no one knew they were being studied.

A consent form must also explain how a subject can opt out of a study, something Facebook's terms of service page does not do.

Moreover, barely two months before the experiment, in November 2011, the US Federal Trade Commission (FTC) had slapped Facebook on the wrist for its "unfair and deceptive" practices with regard to user data. Facebook had agreed to abide by the detailed practices the regulator imposed, so it should already have been sensitised to the ethical issues such an experiment could raise.

The reactions to the Facebook study fall into four groups. First are the nonchalant who ask "What's the big deal?" They just don't care.

The second group thinks that Facebook did nothing wrong at all. After all, it manipulates what it feeds its users all the time anyway.

This practice is called "interface A/B testing", and online firms do it routinely.

This involves sending half of the users to one version of a page and the other half to another, to see which version users click on, respond to or share more.
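
In code, such a split is typically done by hashing each user's ID so that every user lands in the same group on every visit. The short Python sketch below illustrates the idea; the function name, experiment label and user IDs are invented for illustration and do not represent Facebook's actual implementation.

    # A minimal sketch of a deterministic A/B split (illustrative only;
    # the names and IDs here are invented, not Facebook's actual code).
    import hashlib

    def assign_variant(user_id: str, experiment: str) -> str:
        # Hash the user ID together with the experiment name so each
        # user always lands in the same group for a given experiment.
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        # An even hash value puts the user in group A, an odd one in
        # group B, giving a roughly even 50/50 split with no stored state.
        return "A" if int(digest, 16) % 2 == 0 else "B"

    # Example: three users assigned between two versions of a page.
    for uid in ("1001", "1002", "1003"):
        print(uid, assign_variant(uid, "news_feed_test"))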

This group argues that if the A/B test is acceptable in market research, it should be equally accepted in scientific research.

Every A/B test is a psychological experiment as it tries to create positive or surprising effects in the user to make him or her more likely to buy this or that product or service.

But the Facebook test went further: it deliberately set out to elicit an adverse effect in one group of users, to change not their consumer behaviour but their daily mood.

With over 1.3 billion users of every emotional disposition, there are bound to be some with major depression.

If the Facebook experiment did tip just one patient over into suicide, the study would have been unethical, if not criminal.

The third group comprises those who are upset because they fear that giant corporations like Facebook and Twitter are invading user privacy. Indeed, the advocacy group Electronic Privacy Information Center (EPIC) has asked the FTC to investigate Facebook over the study.

Lawmakers may take this view too: Senator Mark Warner of Virginia is pushing for the same.

The Information Commissioner's Office in Britain and the Office of the Data Protection Commissioner in Ireland are already studying whether Facebook broke their data protection laws.

Facebook argues that none of the user data it collected could be linked back to any specific account, so no privacy rights were breached.

The fourth group, comprising social scientists and medical researchers, is concerned about the lack of informed consent and potential harm.

This is because their training steeped them in the history of unchecked human experimentation and the need to be ever watchful of the well-being of research subjects.

This is why universities and other publicly funded research set-ups have ethics panels to review all proposed research to winnow out projects that may lead to the abuse of human subjects. But Facebook is not subject to such oversight.

Previously, in the offline world, only academics in research institutions - or industry researchers who came out of academe - could carry out psychological experiments on people. And they factored in subject welfare.

But now the huge digital networks which giant firms like Facebook run enable their employees who are not acculturated into academic research norms to carry out online psychological experiments on users without informed consent.

These academic research norms and rules that are meant to protect research subjects represent very costly lessons learned from the horrific human experimentation the Nazis carried out on Jews during World War II.

But in its own research, Big Data is ignoring these rules. Now that this has become known, society needs to debate and decide what rules it wants Big Data to play by in its psychological research into and manipulation of user mood and behaviour.

andyho@sph.com.sg

This article was first published in The Straits Times on July 24, 2014.