As discussed in Beyond Smart: Lawyering with Emotional Intelligence, Facebook has figured in a number of projects touching on emotional intelligence. First, since Mark Zuckerberg evidently believes that "What [members] really want is the ability to express empathy," Facebook introduced a "dislike" button so that users could show their empathy by responding to sad news.

Facebook has also drawn on its vast membership to participate in research related to emotional intelligence. Working with the Yale Center for Emotional Intelligence and Lady Gaga's Born This Way Foundation, Facebook undertook a large-scale project surveying 22,000 students about their emotional well-being in order to design a program for preventing and reducing online bullying, a form of emotional aggression that indicates a lack of empathy. One of the findings: four of the top five emotions students experience at school are predominantly negative.

Further, in a study published in early June 2014 in the Proceedings of the National Academy of Sciences, researchers from Facebook and Cornell University randomly altered the news feeds of 689,003 Facebook users, filtering out either happy or sad words, to see what would happen. Those users' own posts were then found to contain fewer happy or sad words, in accordance with how their news feeds had been altered. While the effect of the manipulation appears to have been very small, a controversy arose at the time, both over the fear of potential mass emotional influence and because Facebook neither notified users of the experiment nor obtained their consent. What the manipulation did demonstrate, however, is that what we hear and see influences, at least to some degree, our own emotions, consistent with theories of emotional contagion, though probably diluted by the distance and sporadic nature of online interactions.

Much was made of whether the research had received ethics review board clearance. It had, according to Susan Fiske, the Princeton University psychology professor who edited the study for publication. The study's authors confirmed that their local institutional review board had approved it, but apparently on the grounds that "Facebook manipulates people's News Feeds all the time," hardly a reassuring conclusion.

The ongoing nationwide dialogue about the protection and use of our private data will no doubt continue, but let's also be aware of the ability of massive news and social platforms to actually shape our moods, and therefore our judgment and productivity.