Facebook users have reacted angrily to a ‘creepy’ experiment carried out by the social network and two American universities to manipulate their emotions.

The US technology giant secretly altered almost 700,000 users’ news feeds to study the impact of “emotional contagion”.

In January 2012, it tinkered with the algorithm controlling users’ news feeds to find out what effect this had on their moods.

The aim of the government-sponsored study was to see whether positive or negative words in messages would lead to positive or negative content in status updates.

Its authors wrote: “Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.

“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
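The paper does not publish the code Facebook used, but the mechanism it describes, withholding a proportion of posts that a word-count sentiment tool (reportedly LIWC) flags as positive or negative depending on a user’s experimental condition, can be sketched in a few lines. The sketch below is purely illustrative: the function names, the toy word lists and the omission probability are assumptions for the example, not the study’s actual implementation.

```python
import random

# Toy word lists standing in for a proper sentiment lexicon such as LIWC,
# which the study reportedly used to label posts as positive or negative.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful"}

def classify(post: str) -> str:
    """Crude word-count sentiment label for a single post (illustrative only)."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, condition, omit_probability=0.5, rng=random):
    """Return a feed with posts of the targeted sentiment probabilistically
    withheld, depending on which experimental condition the user is in."""
    kept = []
    for post in posts:
        label = classify(post)
        if label == condition and rng.random() < omit_probability:
            continue  # withhold this post from the user's feed
        kept.append(post)
    return kept

# Example: a user assigned to a "reduced positive content" condition.
feed = ["I love this wonderful day", "Traffic was terrible", "Lunch at noon"]
print(filter_feed(feed, condition="positive"))
```

In this toy version, the same filter run with condition="negative" would instead suppress negative posts, mirroring the two arms of the experiment the researchers describe.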

Facebook carried out the research over one week in collaboration with Cornell University and the University of California, San Francisco. The study was partly funded by the US government through its Army Research Office, according to a post on Cornell’s website.

Many users reacted angrily following online reports of the findings, which were published in the June 17 edition of the prestigious Proceedings of the National Academy of Sciences.

Some referred to it as “creepy”, “evil”, “terrifying” and “super disturbing”.

One Twitter user, writing under the handle @susajul, said: “#Facebook manipulated user feeds for massive psych experiment... Yeah, time to close FB act!”

Clay Johnson tweeted: “In the wake of both the Snowden stuff and the Cuba twitter stuff, the Facebook ‘transmission of anger’ experiment is terrifying.”

Erin Kissane wrote: “Get off Facebook. Get your family off Facebook. If you work there, quit.”

Facebook responded to the criticism, telling The Atlantic magazine that it “carefully consider[s]” its research and has “a strong internal review process”.

A spokesman said: “This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account.

“We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible.

“A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow.

“There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”
