【Lie to it: an algorithm that reads body language】

Researchers at the University of Modena and Reggio Emilia in Italy have developed an algorithm that predicts people's true inner attitudes and hidden biases by reading their body language. Reportedly, the researchers had white students converse separately with black and white partners and fill in questionnaires designed to reveal subconscious attitudes, while sensors recorded their body language; the program was then built by combining the body-language data with the questionnaire results.

 

Can a computer tell if you're RACIST? Algorithm can detect hidden prejudice from a person's body language

By SHIVALI BEST
While many people have prejudices against certain groups, it can often be easy to hide these in public.
But a new computer programme may soon be able to reveal these hidden biases.
Researchers have created a programme that scrutinises people's body language for signs of racial prejudice.
While programmes already exist that can accurately read emotions from facial expressions, a comparable programme to read body language had not been created until now.
Researchers from the University of Modena and Reggio Emilia in Italy wanted to see if an algorithm could accurately predict if someone was racist.
Their study involved 32 white university students, who were asked to fill in two questionnaires – one to uncover their open bias, and the other to reveal their subconscious bias.
Having filled in the questionnaires, the participants were then filmed having a conversation with a white person, and a conversation with a black person.
Each conversation involved the discussion of a neutral subject for three minutes, then a more sensitive subject, such as immigration, for another three minutes.
Using a GoPro camera and a Microsoft Kinect, the researchers captured the participants' movements, while sensors estimated their skin response and heart rate.
The algorithm searched for correlations between the participants' questionnaire answers and their body language during the conversations.
The results suggested that those who showed strong hidden racial biases tended to stand further away from the black conversation partner.
In contrast, those who showed no racial bias tended to pause more and use their hands as they spoke, suggesting they were more comfortable.
To test the system, the researchers then had the algorithm look back at the same data and try to predict who had scored high or low on the hidden-bias questionnaire.
The computer was correct 82 per cent of the time.
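The pipeline described above — measure body-language features, check how they correlate with questionnaire scores, then classify participants as high or low hidden bias — can be sketched in a few lines. The numbers below are invented toy data (the real study used richer features such as posture, gestures, heart rate, and skin response), and the simple distance threshold stands in for whatever model the researchers actually trained:

```python
import statistics

# Hypothetical per-participant data, invented for illustration only:
# mean interpersonal distance (metres) kept from the black conversation
# partner, and the hidden-bias questionnaire score (higher = more bias).
distances = [1.10, 0.85, 1.25, 0.90, 1.30, 0.80, 1.20, 0.95]
bias_scores = [0.72, 0.31, 0.88, 0.40, 0.91, 0.25, 0.79, 0.35]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Step 1: look for a correlation between a body-language feature and bias.
r = pearson(distances, bias_scores)
print(f"correlation between distance and hidden bias: r = {r:.2f}")

# Step 2: predict high vs. low hidden bias from the feature alone.
# A threshold on mean distance plays the role of the trained classifier.
median_bias = statistics.median(bias_scores)
threshold = statistics.fmean(distances)
predictions = [d > threshold for d in distances]
truth = [s > median_bias for s in bias_scores]
accuracy = sum(p == t for p, t in zip(predictions, truth)) / len(truth)
print(f"toy classification accuracy: {accuracy:.0%}")
```

In this contrived data the single feature separates the groups perfectly; the study's 82 per cent figure reflects the harder task of doing this on real recordings with many noisy features.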
Andrea Palazzi, who worked on the study, told MailOnline: 'With current technology, using relatively cheap hardware we can objectively measure a lot of aspects of nonverbal behaviour (e.g. posture, interpersonal distance, gestures etc.).
'If any other kind of inner prejudice leaks on the outside some information that can be measured, a machine is in theory able to learn to detect this behaviour.'
The researchers hope to now conduct further experiments to see if the computer programme could be used in real-life situations.
In their paper, presented at the International Joint Conference on Pervasive and Ubiquitous Computing in Heidelberg, Germany, the researchers write: '[While] this study has been tailored on prejudice, the same approach and set of technologies could be used in a large variety of application domains.
'For example, it could be used in schools to identify children with anxiety issues, in self-driving cars to assess driver attention level, [and at] border control to spot possibly dangerous behavioural outliers.'
http://www.dailymail.co.uk/sciencetech/article-3813425/Can-computer-tell-RACIST-Algorithm-detect-hidden-prejudice-person-s-body-language.html#comments
