According to CTV News, Facebook spent a week in January 2012 experimenting with emotional contagion. “The participants in the study were randomly selected, and the company used a software program that analyzes language to determine if the posts that were being swept into the users’ feeds were positive or negative in tone.”
“After being exposed to the manipulated feeds, the software then analyzed the status updates of the users to determine the tone of their posts.”
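The tone analysis described above boils down to counting emotionally loaded words in each post. As an illustration only, here is a minimal sketch of that kind of word-list scoring in Python; the word lists and function name are placeholders for this example, not the lexicon or software Facebook actually used.

```python
# Illustrative word lists; the real study used a far larger lexicon.
POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "awful"}

def classify_tone(post: str) -> str:
    """Label a post positive, negative, or neutral by counting lexicon hits."""
    words = [w.strip(".,!?") for w in post.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_tone("What a wonderful, happy day!"))   # positive
print(classify_tone("I hate this terrible weather."))  # negative
```

Running the same kind of classifier over users’ later status updates is how the researchers could then measure whether the manipulated feeds shifted the tone of what people wrote.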
The study's findings were, admittedly, quite interesting: people's emotions are in fact influenced by those of the people around them. Individuals exposed to mostly positive news tended to post generally positive updates, whereas those who saw mostly negative comments leaned negative on their own pages.
But as interesting as these results may be, they've been overshadowed by ethical questions about whether Facebook had the right to manipulate people's news feeds without ever telling them what was going on.
As it turns out, yes, it did. When signing up, every user has to agree to Facebook's Data Use Policy, which explicitly states that collected data may be used for Facebook's internal operations.
What do you think of Facebook's actions? Should the company be more transparent?