Facebook Experimented on Users Without Permission

Typically, when you participate in a scientific study, whether it’s a drug trial, a behavioral study, or a survey for someone’s intro to Psychology class, you give your permission to take part. Facebook, on the other hand, recently ran an experiment to see whether it could manipulate the type of status updates users make, nudging them to be either positive or negative. Essentially, Facebook was trying to alter the moods of its users. All of this was done without the users’ knowledge, and it was never reviewed by an independent ethics committee, which certainly would have had an issue with manipulating the emotions of unwitting participants. Facebook has seen no negative repercussions from this. Are you worried?

Facebook already collects a huge amount of information on users and uses it to predict the kinds of things they’d be interested in or could easily be convinced to buy. It uses this information to sell better-targeted ads to eager businesses at a premium. No marketing professional ever imagined a world in which potential customers would willingly hand over intimate details, but through companies like Google and Facebook, that’s now a reality. Is it unethical? Perhaps slightly, since most users don’t understand the extent of the information they’re handing over and are nudged into giving more and more. Whether experimenting on people without their knowledge or permission is ethical, though, is not a hard question.

The experiment itself was simple enough. Facebook used an algorithm to classify each post or status update as positive or negative. It then selected users, 689,003 to be exact, to participate in the study. These users were shown a disproportionate amount of positive or negative posts. Facebook’s default “Top Stories” feed makes it very easy to hide some posts from users (now you know why Facebook makes it so hard to switch to the chronological “Most Recent” feed). Facebook’s theory was that if users were shown more positive, happy posts, they themselves would post more happy status messages, meaning they were likely happier. The other side of the experiment was whether Facebook could do the opposite: make a user more negative by showing them more negative posts.
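To make the mechanics concrete, here is a minimal sketch of the kind of word-list sentiment check and feed filtering described above. It is purely illustrative: the word lists, function names, and drop rate are assumptions for the sake of the example, not Facebook’s actual code.

    # Hypothetical sketch of word-list sentiment classification and feed
    # filtering; the word lists, names, and 50% drop rate are illustrative
    # assumptions, not Facebook's implementation.
    import random

    POSITIVE_WORDS = {"happy", "great", "love", "awesome", "excited"}
    NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "depressed"}

    def classify_post(text):
        """Label a post positive, negative, or neutral by counting keywords."""
        words = [w.strip(".,!?").lower() for w in text.split()]
        pos = sum(w in POSITIVE_WORDS for w in words)
        neg = sum(w in NEGATIVE_WORDS for w in words)
        if pos > neg:
            return "positive"
        if neg > pos:
            return "negative"
        return "neutral"

    def filter_feed(posts, suppress, drop_rate=0.5):
        """Quietly omit a fraction of posts with the targeted sentiment."""
        return [p for p in posts
                if not (classify_post(p) == suppress and random.random() < drop_rate)]

    # A user in the "suppress positive" condition sees fewer upbeat posts.
    feed = ["I love this!", "Terrible day.", "Nothing much going on."]
    print(filter_feed(feed, suppress="positive"))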

Showing users two versions of a website, the normal version and a test version, is called A/B testing, and it’s done frequently in the tech world. That isn’t the issue here. Rarely is the goal of such testing to change a user’s perception or mood, and rarely does the company running it already hold so much personal information about the participants. Websites also frequently ask users whether they want to try a beta version; Facebook gave users no such warning. It was quite literally trying to manipulate the moods of unwitting users, in some cases to make them more negative: potentially sad, angry, or depressed. Fortunately, the theory was, for the most part, disproven.
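For comparison, ordinary A/B testing usually looks something like the sketch below: users are split deterministically into a control group and a test group, typically to compare something like click-through rates rather than mood. The hashing scheme and names here are a common pattern assumed for illustration, not any specific company’s system.

    # Illustrative A/B assignment: hash a user id into one of two arms so the
    # same user always sees the same variant. Details are assumptions.
    import hashlib

    def assign_variant(user_id, variants=("control", "test")):
        """Deterministically map a user id to one experiment arm."""
        digest = hashlib.md5(str(user_id).encode("utf-8")).hexdigest()
        return variants[int(digest, 16) % len(variants)]

    print(assign_variant(689003))  # deterministic: same arm every time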

Facebook would have needed a much larger sample to tell whether its hypothesis was correct, because the results of the study were not statistically significant. Those shown a more negative feed posted only 0.1% fewer positive words, and those shown a more upbeat feed used only 0.07% fewer negative words. In other words, people shown a more positive feed were a tiny bit more likely to post positive phrases, and those shown a more negative feed were slightly more likely to post negative ones, but the difference was so minimal that far more data would be needed to rule out a fluke.
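As a rough illustration of why such tiny differences are hard to interpret, here is a toy two-proportion z-test with invented rates and sample sizes; it only shows how the same small gap goes from statistically invisible to detectable as the number of observations grows. None of these numbers come from the actual study.

    # Toy calculation (invented numbers): a 0.1 percentage-point gap in the
    # share of positive words, evaluated with a two-proportion z-test.
    from math import sqrt, erf

    def two_prop_p_value(p1, p2, n1, n2):
        """Two-sided p-value for the difference between two proportions."""
        pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = abs(p1 - p2) / se
        return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

    # Suppose 5.2% of words are positive in one arm vs. 5.1% in the other.
    for n in (10_000, 1_000_000, 100_000_000):
        print(f"{n:>11,} observations per arm -> p = {two_prop_p_value(0.052, 0.051, n, n):.4f}")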

Facebook couldn’t have known going in that the study would fail. In fact, it probably saw mood manipulation as a way to increase the effectiveness of its ads, which means it was likely hoping to make some people happier and some people more negative. Perhaps it would even cross-reference the results with the users who click ads and buy products because of them, to figure out whether people nudged into posting positive or negative things would be more or less likely to buy. What if it had worked? What if Facebook had discovered that it was easy to control the emotions of its users, making them happier, more depressed, or angrier? Would we have found out about the study, or would Facebook have kept it a secret and quietly manipulated users for profit?

Surely you must be thinking, “How are they allowed to do this?” before realizing that it’s probably in Facebook’s terms and conditions. In fact, if you read through those lengthy terms, you will find a section allowing the social network to perform experiments on users. Viewers of South Park have a frame of reference for what happens when terms and conditions go unread and hide outrageous clauses, thanks to the episode “HUMANCENTiPAD” (NSFW). The writers were clearly exaggerating for effect, but the point they were driving home was about unethical sections buried in terms-and-conditions contracts. You’ve given Facebook permission to experiment on you, and you probably don’t even know it.

It gets worse, though. When this study was conducted, Facebook had not yet added the clause to its terms and conditions that allows such studies; it didn’t add anything about research until four months after the study concluded. Facebook simply decided it had a right to this data, including results from users under 18 years of age. It could be conducting other studies on its users, children included, and it has faced no repercussions for its actions. Thanks to the clause now in the terms and conditions, if you do not want to be part of future studies, you have exactly one option: stop using Facebook immediately. This particular study may have been done without legal backing from the ToS, but all future ones will have it. Any punishment for this first experiment, carried out without permission on users, minors among them, has yet to materialize. Perhaps it never will.

Sources: TechCrunch and Forbes
