"Facebook made users depressed in secret research," the Mail Online reports. The news comes from a controversial experiment where researchers used the social networking site Facebook to explore the effects of "emotional contagion".
Emotional contagion is when emotional states are transferred between people. For example, if everyone in your office is in a good mood, chances are your own mood will be lifted.
To study its effects, researchers reduced the amount of negative or positive content that appeared in users' newsfeeds to see if this changed their emotional posting behaviour.
The study found when positive emotional content was reduced, people subsequently produced fewer posts containing positive words and more posts containing negative words. The opposite pattern occurred when negative emotional content was reduced.
But the effect sizes in the study were very small – just a few percentage points in terms of changes in the positive or negative terms used by individual users.
The study was carried out by researchers from Facebook, the University of California, and Cornell University in the US. Funding sources were not reported, but it is reasonable to assume Facebook funded it.
The story was picked up widely in the UK media, with most focusing on the ethical aspects of the study.
Some of the reporting was a little over the top, such as the Mail Online's claim that, "Facebook made users depressed". Adding a few extra negative words to your status update is not the same as being clinically depressed.
In reaction to the widespread criticism of the study, Facebook issued a statement saying that the company "never meant to upset anyone".
This was an experimental study among a group of people who use the social networking site Facebook. The researchers were interested in seeing whether "emotional contagion" can occur outside of direct personal interactions.
They did this by reducing the amount of emotional content in Facebook's newsfeed, which contains posts from the people a user has agreed to become friends with on the site.
According to the researchers, what content is shown or omitted in the newsfeed is determined by a ranking algorithm Facebook uses to show, as the researchers put it, "the content they will find most relevant and engaging".
This experiment manipulated the extent to which 689,003 people were exposed to emotional content in their newsfeed on Facebook during one week in January 2012. This was designed to test whether exposure to other people's emotions through the newsfeed subsequently caused people to change their own posting behaviour.
The researchers were particularly interested in seeing whether exposure to certain tones of emotional content caused people to post similar emotional content – for example, whether people were more likely to post negative content if they had been exposed to negative emotional content.
According to the researchers, people who viewed Facebook in English were eligible for selection in the experiment, and participants were selected at random.
Two experiments were carried out: one in which exposure to positive emotional content in the newsfeed was reduced, and one in which exposure to negative emotional content was reduced.
The researchers report that each of these experiments had a control condition where a similar amount of posts in a person's newsfeed were omitted at random without respect to emotional content.
When a user loaded their newsfeed on Facebook, posts that contained positive or negative emotional content had a 10-90% chance of being omitted from that specific viewing, but the posts always remained visible on the poster's own profile.
Posts were classed as positive or negative if they contained at least one positive or negative word, as defined by word-counting software called Linguistic Inquiry and Word Count (LIWC).
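The classification and filtering steps described above can be sketched in Python. Note that the word lists, function names, and feed contents below are illustrative assumptions – the real LIWC dictionaries are proprietary and contain thousands of entries, and Facebook's actual implementation has not been published:

```python
import random

# Hypothetical stand-ins for the proprietary LIWC dictionaries
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def is_emotional(post, word_list):
    """A post counts as emotional if it contains at least one listed word."""
    words = (w.strip(".,!?") for w in post.lower().split())
    return any(w in word_list for w in words)

def filter_newsfeed(posts, word_list, omission_chance):
    """Omit each matching post with the given probability for this one viewing.
    In the study, the omission chance ranged from 10% to 90% per user."""
    return [p for p in posts
            if not (is_emotional(p, word_list) and random.random() < omission_chance)]

feed = ["Had a great day!", "Feeling awful today.", "Off to the shops."]
# A user in the reduced-positivity condition, with a 60% omission chance:
reduced_feed = filter_newsfeed(feed, POSITIVE_WORDS, 0.6)
```

Because omission is decided afresh on each call, the same post can be hidden on one viewing and shown on the next, which matches the per-viewing design described above.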
The researchers say use of this software was consistent with Facebook's data use policy, which all users agree to before creating an account on the site, and that this agreement constituted informed consent for the research.
They then looked at the percentage of positive or negative words in people's own status updates, and compared each emotional condition to its control group.
The researchers hypothesised that if emotional contagion has an effect through social networks, people in the positively reduced condition should be less positive compared with their control, and vice versa.
They also tested whether the opposite emotion was affected to see if people in the positively reduced condition expressed increased negativity, and vice versa.
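The outcome measure can be sketched as follows, assuming the same at-least-one-word convention and an illustrative positive word list – the names and data here are hypothetical, not taken from the study:

```python
# Hypothetical stand-in for the proprietary LIWC positive-word dictionary
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}

def percent_positive(status_updates):
    """Percentage of words across one user's posts that are in the word list."""
    words = [w.strip(".,!?") for post in status_updates
             for w in post.lower().split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in POSITIVE_WORDS)
    return 100.0 * hits / len(words)

def mean_percent_positive(users):
    """Average the per-user percentages for one experimental condition."""
    return sum(percent_positive(u) for u in users) / len(users)

# Toy data: users in the reduced-positivity condition vs their control group
reduced = [["Fine I guess."], ["Feeling flat today."]]
control = [["What a great day!"], ["Love this weather."]]

# The contagion hypothesis predicts the reduced-positivity group scores lower,
# so this gap should be positive
gap = mean_percent_positive(control) - mean_percent_positive(reduced)
```

The same calculation with a negative word list gives the negativity measure, and the study's reported effects on these percentages were of the order of a few hundredths of a percentage point.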
Of the posts manipulated, 22.4% contained negative words and 46.8% contained positive words. More than 3 million posts were analysed, containing more than 122 million words, of which 4 million were positive (3.6%) and 1.8 million were negative (1.6%).
The researchers say the emotional expression of the participants did not differ in the week prior to the experiment taking place.
The main findings from this study were that:
- when positive emotional content was reduced, people subsequently produced fewer posts containing positive words and more containing negative words, with the opposite pattern when negative content was reduced
- omitting positive and negative emotional content from a person's newsfeed significantly reduced the number of words they subsequently produced, an effect that was greater when positive words were omitted
The researchers concluded that this finding was a withdrawal effect, meaning that people who were exposed to fewer emotional posts (positive or negative) in their newsfeed were less expressive overall on the following days.
They say these results demonstrate emotional contagion, and that the emotions expressed by friends through online social networks can therefore influence our moods.
The researchers concluded that their results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion through social media.
They also say their work suggests that, in contrast to prevailing assumptions, in-person interaction and non-verbal cues are not strictly necessary for emotional contagion, and that the observation of other people's positive experiences constitutes a positive experience.
Overall, despite its interesting nature, this study provides limited evidence of associations between emotions expressed through the social networking site Facebook and the emotional tone of a person's subsequent posts on the same site.
There are some important limitations to consider when interpreting these findings, namely that the effect sizes in the study were very small (as the authors note). Also, the words people choose to use when they post a status update may not accurately reflect their general emotional state.
It is also possible that factors other than what people saw in their newsfeed influenced their subsequent posts, rather than the posts they had just seen being directly responsible.
Probably of greater interest is the subsequent controversy the study has generated. Many people have been shocked that Facebook can filter a person's newsfeed, although this has been common practice for years. As Facebook states, this is often done to show users "the content they will find most relevant and engaging".
It is important to remember that Facebook is not a charity or a public service – it is a commercial enterprise with the primary aim of making a profit.
While social networking can be a positive and engaging experience for some, connecting with other people in the real world has been shown to improve our wellbeing.