A lot of web folk responded vehemently to the paper by Facebook scientists revealing that they'd manipulated people's timelines to show posts that were either more positive or more negative, to see if that in turn made those users more positive or more negative. (AV Club article breaking the story.)
The immediate backlash I saw in my Twitter timeline was intense, angry -- some horrified, some resigned. This morning, the chatter was becoming more nuanced. Some still quite angry ... others pointing out that emotional manipulation is what design is all about.
That's true up to a point. The purpose of a good design is often to make someone want to buy something, follow a link, interact, read more. Everyone in web development knows about the A/B test -- will more people click this link if it looks like this or if it looks like that? What about the microcopy? Should it say this or should it say that?
The thing is, in web design it's not necessarily flat-out manipulation; it's often simply creating a clearer context for the user. Is the page now more understandable or obvious? It rarely manipulates a person's whole state of being. Instead it's a "micro-manipulation" of a specific instance: buy this product, read this article (and look at these nifty ads), follow this link to another page.
A non-web example of "micro-manipulation" might be a sign posted in a parking lot: Secure Your Vehicle. Management Is Not Responsible For Stolen Items.
This is not necessarily a legal truth. It is a manipulation: perhaps people won't sue us if their car is broken into, even though this parking lot is completely dark, kind of hidden, and easy prey for burglars. We posted a sign, so we're free of any liability.
It's not necessarily true, but it might keep some people from suing. It's manipulation of a particular situation -- not a manipulation of a person's state of being.
Web A/B testing is the same. Manipulation of a situation, not a person's state of being.
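For readers who haven't run one, the A/B test described above can be sketched in a few lines. This is a minimal illustration, not any real product's system; the variant names, click rates, and traffic numbers are all made up:

```python
import random

# Minimal A/B test sketch: bucket each visitor into one of two link
# designs, record whether they clicked, and compare click-through rates.
# All names and numbers here are hypothetical.

def assign_variant(user_id: int) -> str:
    """Deterministically bucket a user into variant A or B."""
    return "A" if user_id % 2 == 0 else "B"

views = {"A": 0, "B": 0}
clicks = {"A": 0, "B": 0}

def record_view(user_id: int, clicked: bool) -> None:
    variant = assign_variant(user_id)
    views[variant] += 1
    if clicked:
        clicks[variant] += 1

# Simulated traffic: variant B's wording converts slightly better.
random.seed(42)
for user_id in range(10_000):
    rate = 0.10 if assign_variant(user_id) == "A" else 0.12
    record_view(user_id, random.random() < rate)

for v in ("A", "B"):
    print(f"Variant {v}: {clicks[v] / views[v]:.1%} click-through")
```

The point is that the experiment manipulates a single decision in a single situation -- which link wording works better -- not the emotional state of the people clicking.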
With the Facebook manipulation of people's timelines, however, we have multiple issues, and the first and most important is that psychological experimentation went on without the users' knowledge. Yes, Facebook claims they got that permission in their Terms of Service, but there was no explicit permission -- no informed consent -- given for such a blatant experiment. I truly hope they face a class-action suit for this, although I'm sure they won't.
They manipulated people's timelines to show a higher number of posts with negative terms, to see if that made them post more negative things themselves. What if they'd done that to someone suffering from depression? What if they'd made them worse?
This is tinkering far beyond the colours of buttons, beyond the feel-good of using nice pictures.
It is not actually hyperbole to say they were manipulating people's lives in this case. What if you were one of the people who got the negative timeline for three months and you couldn't figure out why you were in such a crappy mood? You snapped at people at home, at work. You just couldn't quite pull yourself out of a funk. Maybe there were repercussions, a demotion at work, a spouse insisting on counseling or even a separation.
Facebook manipulated people without thinking about the consequences on actual lives for those in the negative group.
There was no informed consent and these people were not volunteers for this particular experiment.
Don't get me wrong. As humans we are endlessly manipulating our environment and everyone around us in small and large ways. While many of us try to live authentically and be honest about those manipulations (i.e., bluntly saying "I want this" instead of hiding the desire and trying to bring about the outcome in a more indirect way), many of us are more indirect and even dishonest about what we want.
As Twitter user @ZLeeily stated, the "key issue is they moved beyond tests for product design & into psych research without adapting ethical procedures to suit context."
Relatedly, this was a formal experiment. Facebook didn't just kinda test something: they wrote a scientific paper about it afterward. The purpose, then, was the experiment in and of itself, not just making Facebook a better product (as ZLeeily also stated). There are strict rules governing such experimentation, and Facebook's treating their broad and sweeping Terms of Service as "informed consent" is simply ludicrous.
This was willful experimentation on people who did not know they were part of a psych test. It's unethical. It's shameful.
Update: Apparently, Cornell University was at least partially behind this experiment. The paper's last paragraph also states that the experiment was funded in part by the Army Research Office and the James S. McDonnell Foundation.