The academic journal Proceedings of the National Academy of Sciences has published what it describes as an editorial expression of concern following the public outcry over an article in its latest issue in which researchers manipulated users’ news feeds to explore emotional contagion in social networks.
The article, which has caused widespread controversy since its publication, sought to demonstrate that emotional states can be transferred through online networks by reducing the number of posts expressing either positive or negative emotions in some users’ news feeds. The study ultimately found that emotional contagion does occur in online networks: users exposed to more negative posts were more likely to express negative emotions in their own posts, and the same pattern held for those who saw more positive posts. Users exposed to fewer emotional posts of either kind were less likely to express emotion in their own posts.
Facebook, as the single largest online social network in the world, offers enormous potential for study in network analysis, and the company has an active research unit. Other networks studied in depth by network scientists include the Hollywood actor network, which examines how actors are linked by the films they have appeared in together, and the scientific collaboration network, which links researchers who have co-authored papers.
The problem with the latest Facebook research concerns the fact that users were not informed that they were being used as subjects in a psychological experiment; some would most likely have opted out of the study had they been informed. The researchers assert in the article that informed consent is implicit in the fact that every user has agreed to Facebook’s Data Use Policy, though at considerably more than 9,000 words, it is unlikely that many users have read it.
While the policy mentions that Facebook will use information for research, it does not mention that such research could involve the use of algorithms which could affect users’ moods.
According to PNAS, informed consent and an opportunity to opt out if desired are best practices for research involving human subjects; however, Facebook, as a private company, is not bound by the same ethical codes as academic institutions.
“Based on the information provided by the authors, PNAS editors deemed it appropriate to publish the paper,” the statement of concern reads. “It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out.”