Facebook’s ‘Social Experiment’ Is, Literally, Bad News


There’s an adage that’s particularly relevant in today’s world of “free” technology: “If you aren’t paying for a product, you are the product.” And nothing highlights this better than a recent Facebook study that manipulated users’ news feeds to show mostly bad news.

The study, conducted with Cornell and UCSF and published June 17, revealed that Facebook altered nearly 700,000 users’ news feeds to see how they would respond to a deluge of negativity. It was authored by members of Facebook’s core data science team, UCSF’s Center for Tobacco Control Research and Education and Cornell’s Communication and Information Science department.

The meat of the study can be found in these excerpts:
“For people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive.”



There are two things to take away from this. First, Facebook took away positive content in news feeds. Ever wonder why all you hear is bad news? It’s not just your imagination; those feeds were deliberately manipulated to achieve that result. Second, the less positivity users saw, the less they spread. Bad news breeds bad moods.
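For the curious, the measurement behind that excerpt is straightforward word counting: the share of a post’s words that land on a positive or negative word list. Here’s a minimal Python sketch of that kind of measure. (The published study used the LIWC word lists; the tiny lists below are made-up placeholders, and the function name is ours.)

    # A rough sketch of a LIWC-style word-count measure: what percentage
    # of a status update's words are "positive" or "negative"?
    POSITIVE = {"happy", "great", "love", "fun", "good"}
    NEGATIVE = {"sad", "angry", "awful", "hate", "bad"}

    def emotion_percentages(status):
        """Return (percent positive words, percent negative words) in a post."""
        words = [w.strip(".,!?") for w in status.lower().split()]
        words = [w for w in words if w]
        if not words:
            return 0.0, 0.0
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        return 100 * pos / len(words), 100 * neg / len(words)

    print(emotion_percentages("What an awful, sad day"))  # (0.0, 40.0)

Run across hundreds of thousands of users, shifts in those two percentages are the study’s entire outcome measure.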

“When negativity was reduced, the opposite pattern occurred.”

OK, that’s pretty self-explanatory, but still very telling: take away the bad news and people talk about more positive things.

“These results suggest that the emotions expressed by friends, via online social networks, influence our own moods…”

How many times have you experienced anger, sadness, happiness, nervousness or some other emotion based on something you saw on Facebook? This concept hardly needed a scientific paper to prove it, but, hey, here it is.

“…constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.”

The word “contagion” here nods to the idea of things “going viral” online. It’s not just videos, photos, websites and products that go viral; real human emotions do, too.

This is a telling study, and it will undoubtedly be used in some new-age guru’s next self-awareness book. But it’s also a little creepy and invasive for those who trusted the Menlo Park company to provide social interaction on their own terms, not the company’s. And it’s dangerous: when entire revolutions are built around social media, as was the case with the Arab Spring, what happens if social media companies push certain attitudes through selective filtering? What if negative posts containing the phrase “Muslim Brotherhood” were quietly deleted from just 5 percent of news feeds in a given region?

Of course, Facebook wouldn’t do that. But what if the U.S. government asked it to perform another “social experiment”? It’s a slippery slope.
