Facebook’s emotional experiments on users aren’t all bad

Figure: A graph of Facebook's findings. An overall more positive News Feed results in more positive generated content; a more negative feed likewise results in more negative content.

Facebook scared some of its privacy-conscious users over the weekend by revealing that it had run a scientific study manipulating the emotional content of users' News Feeds. Since the study came to light, the company has been accused of acting unethically, and even illegally, by subjecting its users to an experiment without notice or consent. While the implications are a little frightening, conducting the study might actually have been a responsible thing to do.

The study in question monitored "emotional words" to see how the overall mood of a user's News Feed affected that user's status updates. It turned out that users who saw fewer positive sentiments in their feeds produced fewer positive status updates, and users who saw fewer negative sentiments in their feeds produced fewer negative updates. The effect was small, to the tune of one fewer positive or negative word generated per 1,000 emotional words in News Feeds, but it did exist.

Facebook's defense of the study hasn't been exactly deft, with its authors saying that the effects have been overstated and that the experiment was short and no one was permanently scarred. A few reporters claimed that, since Facebook appeared to have received federal funding from the James S. McDonnell Foundation and the Army Research Office for the study, it violated federal research ethics regulations, which require informed consent, by conducting experiments on its users without it. Facebook has since stated that the study did not receive federal funding, so it wasn't subject to those regulations.
