FACEBOOK isn't just a platform for catching up on what your friends are doing. It is also a research set-up where every user, unbeknownst to them, may potentially be a test subject.
A Facebook employee recently published a study in a top journal, the Proceedings of the National Academy of Sciences (PNAS). His co-authors are two Cornell dons. The study was an experiment to alter the mood of Facebook users, and it upset some people because no one was explicitly asked for consent.
The fear was that vulnerable populations such as the depressed or the young (kids aged 13 and above can have Facebook accounts) may have been included.
The experiment on 689,003 users was conducted for a week in January 2012. Facebook tweaked its news feeds to highlight happy posts and stories (for one group) versus unhappy ones (for another group).
It found that feeding users happy posts inspired them to post happy ones themselves. Conversely, feeding them unhappy posts led them to make unhappy ones themselves. So user moods were contagious online, Facebook concluded.
As this was human experimentation done without consent, Cornell's ethics panel should not have approved it, some argue.
They ask: What if the group fed with unhappy posts included users with major depressive illness, who might have been contemplating suicide that week?
Apparently, Cornell's ethics panel exempted the study from its oversight as Facebook had collected the data before its professors got on board. But the paper itself says the Cornell dons were involved in designing the experiment, which must have come before any data collection.
While Facebook's private sector research is not subject to any ethics panel oversight, PNAS' own editorial policy requires informed consent for any human experimentation study that it publishes. Yet the journal, inexcusably, exempted Facebook from this requirement.
Facebook argues it had subject consent as users must click "agree" on its 9,405-word terms of service page when signing up for an account. Still, most people don't read the fine print when clicking on "agree" to use almost any online service. So that claim may well be ethically untenable.
Perhaps it was not even legal since, at the material time, "research" was not among the purposes for which clicking "agree" permits Facebook to use one's data. The term "research" was added to the terms of service only four months afterwards, as Forbes.com research showed.
At any rate, informed consent always includes the right to withdraw from a study.
But no Facebook subject could have withdrawn since no one knew they were being studied.
A consent form must also always explain how to opt out of a study, which Facebook's terms of service page does not do.