【Humanoid emotional robot: the newest training tool for clinical medicine】

Researchers have recently designed a humanoid robot that can express a range of emotions, giving trainee doctors a lifelike subject on which to practise their skills. Besides breathing, bleeding and responding to medications, the robot can mimic a human patient's pained facial expressions whenever a procedure would cause pain, helping trainee doctors become familiar with a range of clinical operations more quickly.

Robotic head of sci-fi author Philip K Dick being used to teach doctors how to recognise pain in patients

  • The robots were designed to help medical professionals diagnose patients  
  • Researchers designed a virtual avatar and robot based on real facial expressions
  • The robot and avatar are based on Philip K Dick, a popular science fiction writer  
  • They tested both on clinicians and people without a medical background
  • People detected emotions more accurately in the virtual avatar than in the robot
  • Medical clinicians were not as good at detecting pain and anger as non-clinicians 
  • The researchers say the clinical community can benefit from their work, which is producing new training tools to improve patient-reading skills

Humanoid, facially expressive robots have been designed by researchers to help medical professionals improve their diagnosing skills.

While robotic patient simulators (RPSs) are already used to train doctors, their faces are static and cannot express emotions.

So researchers created a robot with rubber skin that can move its facial features to express real human emotions.  

Top row: Sample frames from the videos of people expressing three emotions. From left to right: pain, anger and disgust. Middle row: The Philip K Dick humanoid robot expressing the corresponding emotions. Bottom row: The Philip K Dick virtual avatar expressing the corresponding emotions

The research team, led by Dr Laurel Riek, an associate professor of computer science and engineering at UC San Diego, designed the robot to be able to express pain, disgust and anger.

They also created a virtual avatar for an alternative training option.

To design these, the researchers used face-tracking software to extract facial features from videos of people expressing three natural, non-acted emotions - pain, anger and disgust.

They converted these expressions into 66 moving points that were mapped onto a Hanson Robotics humanoid robot modelled on science fiction writer Philip K Dick.
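
The article doesn't name the tracking software used, but the pipeline it describes - detect a face in each video frame, then extract a fixed set of landmark coordinates to track over time - is what off-the-shelf landmark trackers provide. As a rough illustration only (not the study's actual code), here is a minimal Python sketch using dlib's freely available 68-point landmark model, a close relative of the 66-point scheme described; the model file name and frame path are assumptions:

```python
# Minimal sketch of per-frame facial landmark extraction, in the spirit of
# the pipeline described above. Assumptions: dlib is installed, and the
# pre-trained 68-point model file (downloadable from dlib.net) is on disk.
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def extract_landmarks(frame_path):
    """Return (x, y) landmark coordinates for the first face in an image."""
    img = dlib.load_rgb_image(frame_path)
    faces = detector(img, 1)  # upsample once to catch smaller faces
    if not faces:
        return None
    shape = predictor(img, faces[0])
    return [(shape.part(i).x, shape.part(i).y) for i in range(shape.num_parts)]

# Tracking these points frame by frame yields the moving-point trajectories
# that could then be retargeted onto a robot's facial actuators.
print(extract_landmarks("frame_0001.jpg"))
```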

The researchers then showed videos of the robot and the virtual avatar to 102 people, 51 of whom were clinicians, such as doctors, pharmacists and nurses, while the other 51 didn't have a background in medical research.

Left: The virtual avatar model of science fiction writer Philip K Dick. Right: The humanoid robot of Philip K Dick, created by Hanson Robotics 

The researchers found that clinicians were not as good at detecting pain and anger as people who had no background in medical research.

In fact, clinicians correctly detected pain in the virtual avatar only 54 per cent of the time, whereas non-clinicians correctly detected pain 83 per cent of the time.

This finding is supported by previous research showing that a significant decline in empathy occurs during the third year of medical school - ironically, just when the curriculum shifts towards patient-care activities.

Sample frames from the Binghamton Pittsburgh 4D Spontaneous Expression Database (BP4D). The BP4D is a fully labeled database of realistic expressions. From left to right: Disgust, happiness, pain and anger 

The researchers wrote that the clinical community can benefit from their work because it is producing new training tools that help students improve their skills in reading patients' faces.

The study also revealed that all participants were less accurate at detecting pain from the humanoid robot than from the virtual avatar.

It also found that disgust was the emotion that participants most often got wrong - in the avatar, clinicians correctly identified disgust only 20 per cent of the time, and non-clinicians just 12 per cent of the time.

The face-tracking software extracted 66 facial features to map three emotions: pain, anger and disgust

The researchers wrote that one of the limitations of their work was that the robot didn't have any range of motion or wrinkles around the nose or cheeks.

As such, they couldn't map certain points of facial expression that are important for expressing pain and disgust.

Another reason for the low detection accuracy of some facial expressions was that the videos the expressions came from were drawn from naturalistic data sets - meaning the emotions were genuinely elicited rather than acted out.

Clinicians practicing their bedside procedural skills using robotic patient simulator systems. These systems can breathe, bleed, respond to medications and speak - but their faces don't move

When the videos were made, real emotions such as sadness were triggered by showing participants a documentary video about an emergency situation, and pain was elicited by having participants submerge their hands in ice-cold water.

'It is well understood in the affective computing community that naturally evoked emotions overall have far lower intensities compared to acted datasets,' the researchers wrote in the study.

Despite these issues, later this year the researchers plan to test the robot with medical students at UC San Diego in simulated medical scenarios.

http://www.dailymail.co.uk/sciencetech/article-4302542/Researchers-create-robot-shows-PAIN-teach-doctors.html

