Social media: controlling your emotions and spreading them to your friends

#ResearchShare# [Social media: controlling your emotions and spreading them to your friends] A study that manipulated the News Feeds of 689,003 Facebook users found that emotional content in friends' posts — comments, videos, pictures, and web links — can affect users' moods. For example, when positive content in friends' posts was reduced, users posted fewer positive updates of their own. Yet this very study, billed as the first to provide large-scale experimental evidence of emotional contagion through social networks, is now mired in controversy over research ethics. Many users and academics have attacked it for manipulating users' feeds without permission, depriving them of the right to opt in to or out of the experiment; moreover, its explicit aim was to influence users' moods, potentially causing them distress or even harm.


Experimental evidence of massive-scale emotional contagion through social networks

  Adam D. I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock

Abstract

Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others. Data from a large real-world social network, collected over a 20-y period suggests that longer-lasting moods (e.g., depression, happiness) can be transferred through networks [Fowler JH, Christakis NA (2008) BMJ 337:a2338], although the results are controversial. In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.


Source: http://www.pnas.org/content/111/24/8788/F1.expansion.html

 

Facebook emotion study breached ethical guidelines, researchers say

Lack of 'informed consent' means that Facebook experiment on nearly 700,000 news feeds broke rules on tests on human subjects, say scientists


Researchers have roundly condemned Facebook's experiment in which it manipulated nearly 700,000 users' news feeds to see whether it would affect their emotions, saying it breaches ethical guidelines for "informed consent".

James Grimmelmann, professor of law at the University of Maryland, points out in an extensive blog post that "Facebook didn't give users informed consent" to let them decide whether to take part in the study, as required under US rules on human subjects research.

"The study harmed participants," because it changed their mood, Grimmelmann comments, adding "This is bad, even for Facebook."

But one of the researchers, Adam Kramer, posted a lengthy defence on Facebook, saying it was carried out "because we care about the emotional impact of Facebook and the people that use our product." He said that he and his colleagues "felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out."

The experiment hid certain elements from the news feeds of 689,003 people – about 0.04% of users, or 1 in 2,500 – over the course of one week in 2012. It hid "a small percentage" of emotional words from people's news feeds, without their knowledge, to test what effect that had on the statuses they then posted or the content they "Liked" or reacted to.

The results found that, contrary to expectation, people's emotions were reinforced by what they saw – what the researchers called "emotional contagion".

But the study has come in for severe criticism because, unlike the advertising that Facebook shows – which arguably aims to alter people's behaviour by making them buy products or services from those advertisers – the changes to the news feeds were made without users' knowledge or explicit consent.

Max Masnick, a researcher with a doctorate in epidemiology who says of his work that "I do human-subjects research every day", says that the structure of the experiment means there was no informed consent - a key element of any studies on humans.

"As a researcher, you don’t get an ethical free pass because a user checked a box next to a link to a website’s terms of use. The researcher is responsible for making sure all participants are properly consented. In many cases, study staff will verbally go through lengthy consent forms with potential participants, point by point. Researchers will even quiz participants after presenting the informed consent information to make sure they really understand.

"Based on the information in the PNAS paper, I don’t think these researchers met this ethical obligation."

Kramer does not address the topic of informed consent in his blog post. But he says that "my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."

When asked whether the study had had an ethical review before being approved for publication, the US National Academy of Sciences, which published the controversial paper in its Proceedings of the National Academy of Sciences (PNAS), told the Guardian that it was investigating the issue.

Author:

Article link: http://www.theguardian.com/technology/2014/jun/29/facebook-users-emotions-news-feeds

 

In case you missed it, a storm of controversy involving Facebook blew up over the weekend, after news emerged of a psychological study that was conducted on hundreds of thousands of users of the social network without their knowledge, in which researchers manipulated the emotional cues those users saw in their Facebook streams, and then tracked their subsequent behavior. Reactions to the news ranged from resignation or even acceptance to disgust at the company’s decision, and the unethical practice of altering the emotional status of thousands of users without permission.

What did the experiment involve?

As The Guardian explains, the study was done by a team of researchers that included a data scientist working for Facebook. It blocked specific kinds of emotional content from the news feeds of 689,003 people, or about 0.04 percent of Facebook's total user base, for a week in January of 2012. The study hid "a small percentage" of emotional words from people's streams — without their knowledge — in order to see if doing so had any effect on the statuses they posted or the content they "liked" or shared.
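For readers unfamiliar with how a post counts as "emotional" in studies like this: the classification was done by matching words against fixed dictionaries of positive and negative terms (the study relied on the proprietary LIWC word lists). A minimal sketch of the general technique, with invented toy word lists standing in for the real dictionary:

```python
# Toy word lists for illustration only; the actual study used the much
# larger, proprietary LIWC dictionary.
POSITIVE = {"happy", "love", "great", "awesome", "glad"}
NEGATIVE = {"sad", "angry", "hate", "awful", "lonely"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by counting
    how many of its words appear in each emotional word list."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"
```

A post only needed to contain at least one emotional word to be eligible for omission from a feed, which is why the researchers describe hiding "emotional words" rather than whole categories of content.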

The research was published in the June issue of a prominent scientific journal (the Proceedings of the National Academy of Sciences, or PNAS) and was written up in New Scientist magazine. The journal’s editor said the data analysis was approved by a review board at Cornell University, but that the actual collection of the data was only approved by an internal Facebook review. There were early reports that the study was partially funded by the research office of the U.S. Army, but that appears not to be the case.

Does everyone who works at Facebook just have the "this is creepy as hell" part of their brain missing?—
sarah jeong (@sarahjeong) June 28, 2014

What do the results of the study show?

According to the researchers, including Facebook data scientist Adam Kramer, the experiment showed that users’ emotions were in fact reinforced, at least to some extent, by what they saw in their Facebook news feed, a phenomenon the research team called “emotional contagion.” They said this provides support for the idea that emotional states can be transferred to others without their awareness, and that this process can occur “without direct interaction between people — exposure to a friend expressing an emotion is sufficient” and without non-verbal cues:

“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”


Why is this controversial?

Scientists and other experts have criticized the fact that Facebook conducted the research, which appears to have successfully manipulated the emotions of users, without giving those users any information that they were being used in an experiment, and without giving them the ability to opt out. Law professor James Grimmelmann says that this lack of “informed consent” is a breach of the ethical standards that typically govern such research, since the study arguably harmed participants (even in a minor way) by altering their mood, and did so without their permission. Grimmelmann added that “this is bad, even for Facebook.”

Impressive achievement by Facebook to snatch back the title of most dystopian nightmarish tech company.—
Tom Gara (@tomgara) June 29, 2014

Sociologist Zeynep Tufekci of the University of North Carolina, who specializes in the effects of social media, wrote in a post on Medium — and in a related research paper that was just accepted for publication — that what is at stake isn’t just the status of a single piece of research involving the Facebook news feed, but the potential for much more invasive and disturbing uses of the data that users are providing to such networks.

“These large corporations (and governments and political campaigns) now have new tools and stealth methods to quietly model our personality, our vulnerabilities, identify our networks, and effectively nudge and shape our ideas, desires and dreams. That is one of the biggest shifts in power between people and big institutions, perhaps the biggest one yet of 21st century. We should care that this data is proprietary, with little access to it by the user, little knowledge of who gets to purchase, use and manipulate us with this kind of data.”

In his post, entitled “As Flies to Wanton Boys….” James Grimmelmann says that the argument that “Facebook already advertises, personalizes, and manipulates is at heart a claim that our moral expectations for Facebook are already so debased that they can sink no lower. I beg to differ.” British journalist Laurie Penny said in a post at The New Statesman that:

“Nobody has ever had this sort of power before. No dictator in their wildest dreams has been able to subtly manipulate the daily emotions of more than a billion humans so effectively. There are no precedents for what Facebook is doing here. Facebook itself is the precedent.”

What is Facebook’s defense?

In a nutshell, Facebook has argued that it was entitled to conduct the study because its usage policies include a line that refers to the data supplied by users potentially being used for research (Note: According to Kashmir Hill at Forbes magazine, Facebook didn’t add the line about research until after the emotional contagion study was completed). The editor of the scientific journal in which it appeared said that the review board believed it was justified “on the grounds that Facebook apparently manipulates people’s News Feeds all the time.”

The FB furore ought to surprise no one. It manipulates its news algorithm all the time – part of a study or not : slate.com/articles/healt…
emily bell (@emilybell) June 29, 2014

Not everyone buys this explanation, however: Max Masnick, a researcher with a doctorate in epidemiology, says in a post of his own on Facebook that the structure of the experiment means there was no informed consent, something that is a crucial element of any study that involves research on humans. “As a researcher, you don’t get an ethical free pass because a user checked a box next to a link to a website’s terms of use,” he said.

A statement from a Facebook spokesperson said the research “was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process.”

What do the researchers say?

Kramer, the lead Facebook scientist involved in the research, posted a lengthy defence of the study on Facebook, saying it was done in part because “we care about the emotional impact of Facebook and the people that use our product.” He said that he and his colleagues believed it was important to investigate the theory that “seeing friends post positive content leads to people feeling negative or left out.” Kramer did say that he and his co-authors were sorry for any anxiety their paper may have caused, and admitted that “in hindsight, the research benefits of the paper may not have justified all of this anxiety.”

What do supporters of Facebook think?

Some argue that the research shouldn’t be that controversial, since Facebook manipulates the news feed of its users all the time — by tweaking the algorithms that highlight certain kinds of content, including trying to de-emphasize “low quality” content from viral-sharing mills and promote “high quality” content from news outlets. Others, including venture capitalist Marc Andreessen, say what Facebook did isn’t really that different from the kind of A/B testing that software companies and even media companies engage in all the time.

Run a web site, measure anything, make any changes based on measurements? Congratulations, you're running a psychology experiment!—
Marc Andreessen (@pmarca) June 28, 2014

Tal Yarkoni, a researcher in psychology at the University of Texas, noted in a post that the amount of fiddling that the Facebook research team engaged in was extremely small: “These effects, while highly statistically significant, are tiny. The largest effect size reported had a Cohen’s d of 0.02 — meaning that eliminating a substantial proportion of emotional content from a user’s feed had the monumental effect of shifting that user’s own emotional word use by two hundredths of a standard deviation. In other words, the manipulation had a negligible real-world impact on users’ behavior.”
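To make Yarkoni's point concrete, Cohen's d is simply the difference between two group means divided by their pooled standard deviation, so d = 0.02 means the groups' averages differ by two hundredths of one standard deviation. A minimal sketch of the computation (the sample data below is invented purely for illustration):

```python
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: difference of means divided by the pooled standard
    deviation of the two groups (using sample variances)."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    # Pool the variances, weighted by each group's degrees of freedom.
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd
```

With samples this small any d is unstable, but the formula makes the units clear: a d of 0.02 is a shift far smaller than the natural week-to-week variation in how emotionally people write, which is the substance of Yarkoni's criticism.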

Yarkoni also argues that just because some change in behavior was seen after the manipulations doesn’t necessarily mean that the emotional state of those users was actually altered: “The fact that users in the experimental conditions produced content with very slightly more positive or negative emotional content doesn’t mean that those users actually felt any differently. It’s entirely possible — and I would argue, even probable — that much of the effect was driven by changes in the expression of ideas or feelings that were already on users’ minds.”

@Asher_Wolf @mathewi I don't really accept the premise that words posted to Facebook equal someone's mood, is all I'm saying—
Rusty Foster (@rustyk5) June 29, 2014

Michelle Meyer, a fellow at the Health Law Policy Center at Harvard Law School, argues in a post that — contrary to what many academics have said — the Facebook study was not necessarily unethical, since it involved minimal harm, if any, to users, and that even if it had been reviewed by an academic research board before it was conducted, it might very well have been approved.

What should Facebook do?

In addition to possibly apologizing to users for not asking them for permission, Kashmir Hill — who writes about privacy for Forbes magazine — argues that the social network should have some kind of explicit opt-in process for such research, the way other services do:

“When I signed up for 23andMe — a genetic testing service — it asked if I was willing to be part of 23andWe, which would allow my genetic material to be part of research studies. I had to affirmatively check a box to say I was okay with that. I think Facebook should have something similar. While many users may already expect and be willing to have their behavior studied… they don’t expect that Facebook will actively manipulate their environment in order to see how they react. That’s a new level of experimentation, turning Facebook from a fishbowl into a petri dish.”

Paul Bernal, a lecturer in intellectual property and media law, suggests in a tongue-in-cheek post that Facebook should change its user policies to include language that requires users to agree that “by using Facebook, you consent to having your emotions and feelings manipulated, and those of all your friends (as defined by Facebook) and relatives, and those people that Facebook deems to be connected to you in any way. The feelings to be manipulated may include happiness, sadness, depression, fear, anger, hatred, lust and any other feelings that Facebook finds itself able to manipulate. Facebook confirms that it will only manipulate those emotions in order to benefit Facebook, its commercial or governmental partners and others.”

If Facebook does choose to apologize for its behavior, it will join a long list of apologies the social network has made during its relatively brief history, a list of which Mike Elgan has helpfully compiled.

Author:

Article link: http://gigaom.com/2014/06/30/heres-what-you-need-to-know-about-that-facebook-experiment-that-manipulated-your-emotions/

