[Half of posts in hate speech investigation mishandled; Facebook apologises to the public]

The non-profit group ProPublica put Facebook to the test, sending it a sample of 49 posts containing hate speech. Despite Facebook employing 7,500 content reviewers, its moderators made the wrong call on 22 of the posts, including examples of sexism, racial hatred and anti-Muslim abuse. Facebook says it removes around 66,000 hate-speech posts every week, but its content reviewers do not always follow the company's guidelines. To improve the service, it will double the number of content reviewers it employs.

 

Facebook apologises for its moderation 'mistakes' after investigation reveals it makes the wrong call on nearly HALF of posts reported as offensive

  • This is according to an investigation by non-profit group ProPublica
  • The group sent Facebook a sample of 49 items containing hate speech 
  • The social network admitted its reviewers made mistakes in 22 of the cases 
  • Facebook blamed users for not flagging the posts correctly in six cases
  • According to the investigation, content reviewers do not always abide by the company's guidelines and interpret similar content in different ways 

By PHOEBE WESTON FOR MAILONLINE

 

Facebook has apologised after an investigation revealed moderators make the wrong call on almost half of posts deemed offensive.

The site admitted it left up offensive posts that violate community guidelines despite being armed with 7,500 content reviewers.

One such post was a picture of a corpse with the words 'the only good Muslim is a f****** dead one' which was left up by moderators despite being flagged to them.

Another showed a woman passed out on a bed, with the caption 'what would you do?'.

According to the investigation, content reviewers do not always abide by the company's guidelines and interpret similar content in different ways.


Users reported both posts but received an automated message from Facebook saying the content was acceptable.

'We looked over the photo, and though it doesn't go against one of our specific Community Standards, we understand that it may still be offensive to you and others', the message read.

In the case of the woman passed out, Facebook defended its choice to leave the post up because the caption, written by a feminist activist, condemned sexual violence.

The caption read 'women don't make memes or comments like this #NameTheProblem'.

However, another anti-Muslim comment - 'Death to the Muslims' - was deemed offensive after users repeatedly reported it.

All comments on the reported posts were violations of Facebook's policies but only one was caught, according to the full investigation by independent, nonprofit newsroom ProPublica.

The non-profit sent Facebook a sample of 49 items, most containing hate speech and a few representing legitimate expression, drawn from its pool of 900 crowdsourced posts.

The social network admitted its reviewers made mistakes in 22 of the cases.

Facebook blamed users for not flagging the posts correctly in six cases.

In two incidents it said it did not have enough information to respond.

Overall, the company defended 19 of its decisions, which included sexist, racist, and anti-Muslim rhetoric.

For instance, one post showing a picture of a black man with a missing tooth and a Kentucky Fried Chicken bucket on his head was left up.

The caption read: 'Yeah, we needs to be spending dat money on food stamps wheres we can gets mo water melen an fried chicken'.

Facebook defended the decision, saying it did not attack a specific protected group.

'We're sorry for the mistakes we have made,' said Facebook VP Justin Osofsky in a statement. 'We must do better.'

Facebook says its hate speech rules only cover attacks on specific protected groups.

The company has claimed it will double the number of content reviewers in order to enforce rules better.


Mr Osofsky said the company deletes around 66,000 posts containing hate speech every week.

'Our policies allow content that may be controversial and at times even distasteful, but it does not cross the line into hate speech,' he said.

'This may include criticism of public figures, religions, professions, and political ideologies.'

However, it seems some groups of people are more protected than others.

More than a dozen people lodged complaints about a page called Jewish Ritual Murder, but it was only removed after Facebook received an inquiry from ProPublica.



However, a comment saying 'white people are the f------ most', referring to racism in the US, was removed immediately.

Musician Janis Ian had a post removed and was banned from using Facebook for days after violating community standards.

She had posted an image of a man with a swastika tattoo on his head, encouraging people to speak out against Nazi rallies.

An image of a woman in a trolley with her legs open alongside the caption 'Went to Wal-Mart... Picked up a brand new dishwasher' was deemed not to violate community standards.

When questioned by investigators, the company defended its decision to leave it on the site.

Another image, of a bloodied woman in a trolley with the caption 'Returned my defective sandwich-maker to Wal-Mart', was taken down.

Another image of a woman sleeping had the caption 'If you came home, walked into your room, and saw this in your bed... what would you do?'.

Annie Ramsey, a feminist and activist, tweeted the image with the caption: 'Women don't make memes or comments like this #NameTheProblem'.

Facebook defended its choice to leave it on the site as the caption condemned sexual violence, the spokesperson said.

There is currently no appeal process.

 

Original article: http://www.dailymail.co.uk/sciencetech/article-5221197/Facebook-apologises-leaving-hate-speech-stay-up.html




