Facebook conducted a social experiment designed to manipulate users’ moods without their consent for a week in January 2012, prompting a wide-ranging discussion of online ethics.

The experiment, revealed in a newly published study, presented roughly 700,000 Facebook users who logged on to the site with one of two randomly assigned versions of their news feed, according to The Atlantic. One version featured mainly positive posts, while the other contained predominantly negative ones. The study found that users shown the positive feed were more likely to post positive statuses, while users shown the negative feed posted more negative things, The Atlantic wrote.

None of the participants knew this experiment was underway.

“Facebook essentially sought to manipulate people's mood,” wrote Robert Klitzman of CNN. “This is not a trivial undertaking. What if a depressed person became more depressed? Facebook says that the effect wasn't large, but it was large enough for the authors to publish the study in a major science journal.

“This experiment is scandalous and violates accepted research ethics,” he concluded.

While the experiment itself may be controversial, argued Tarun Wadhwa of Forbes, the real worry lies in the ease with which Facebook was able to conduct it.

“We trust companies to provide us relevant information by ‘personalizing’ our experience, but we give little thought to the downsides,” he wrote. “While we are aware that websites are ‘optimizing’ content for us, we don’t think about how that constrains our experience. And the enormous amount of personal information that has been collected about us over the last decade will continue to be acted upon — the Facebook experiment was just a glimpse at what is ahead.”

We have extended too much trust and divulged too much personal information online, Wadhwa wrote, which has opened us up to manipulation on several fronts. “Companies [can now] use your background, details, and emotional state to coerce you into buying products you don’t need or paying higher prices than you normally would,” he wrote.

Duncan Watts of The Guardian recognizes this everyday manipulation and argues that, because they have the potential to yield such a wealth of knowledge, social experiments like the one Facebook conducted should be accepted or even encouraged.

“We are being manipulated without our knowledge or consent all the time — by advertisers, marketers, politicians — and we all just accept that as a part of life,” he wrote. “The only difference between the Facebook study and everyday life is that the researchers were trying to understand the effect of that manipulation.”

We live in a world where a significant amount of personal information is now public, Watts wrote, and since we have already entered this arena, we should take advantage of the social knowledge it can yield. Public outrage at experiments like this one can only have negative effects, he wrote.

“If anything, we should insist that companies like Facebook — and governments for that matter — perform and publish research on the effects of the decisions they're already making on our behalf,” Watts wrote. “Now that it's possible, it would be unethical not to. And it would be disastrous if a poorly informed outcry over a single study had the effect of driving them in the opposite direction — either to willful ignorance or to secrecy.”

Farhad Manjoo of The New York Times also believes these experiments can be beneficial. While many commentators argued that the Facebook study endangered and hurt people, Manjoo believes such experiments can actually be used to protect the public.

“It is only by understanding the power of social media that we can begin to defend against its worst potential abuses,” he wrote. “Facebook’s latest study proved it can influence people’s emotional states; aren’t you glad you know that? Critics who have long argued that Facebook is too powerful and that it needs to be regulated or monitored can now point to Facebook’s own study as evidence.”

As social sites disclose what they have learned about their users, the users learn as well, he argued.

“If every study showing Facebook’s power is greeted with an outcry over its power, Facebook and other sites won’t disclose any research into how they work,” he wrote. “And isn’t it better to know their strength, and try to defend against it, than to never find out at all?”

Bethan Owen is a writer for the Deseret News Moneywise and Opinion sections. Twitter: BethanO2
