What if AI could sense your emotions?

Researchers hope to use deep learning to teach computers to read emotions from facial expressions, combined with body language, tone of voice, and other cues. If AI assistants such as Alexa and Siri had emotion recognition, they could apologize when a wrong answer annoys you. The technology could also be widely applied to mental-health monitoring, and self-driving cars could adapt their driving mode to the driver's emotional state and judge whether an occupant is fit to take the wheel. The researchers hope machines will eventually distinguish more complex emotions, such as excitement, jealousy, and pride.

http://spectrum.ieee.org/view-from-the-valley/robotics/artificial-intelligence/wouldnt-you-like-alexa-better-if-it-knew-when-it-was-annoying-you

Wouldn’t You Like Alexa Better if It Knew When It Was Annoying You?

An image of the Mona Lisa, with facial features labeled that can be decoded to determine emotion
Illustration: Affectiva

What could your computer, phone, or other gadget do differently if it knew how you were feeling?

Rana el Kaliouby, founder and CEO of Affectiva, is considering the possibilities of such a world. Speaking at the Computer History Museum last week, el Kaliouby said that she has been working to teach computers to read human faces since 2000, as a PhD student at Cambridge University.

“I remember being stressed,” she says. “I had a paper deadline, and ‘Clippy’ [that’s Microsoft’s ill-fated computer assistant] would pop up and do a little twirl and say ‘It looks like you are writing a letter.’ I would think, ‘No I’m not!’”

(“You may,” Computer History Museum CEO John Hollar interjected, “be one of the few advanced scientists inspired by Clippy.”)

That was a piece of what led her to think about making computers more intelligent. Well, that, plus the fact that she was homesick. And the realization that, because she was spending more time with her computer than any human being, she really wanted her computer to understand her better.

Photo: Tekla Perry
Computer History Museum CEO John Hollar interviews Affectiva founder Rana el Kaliouby

Since then, she’s been using machine learning, and more recently deep learning, to teach computers to read faces, spinning Affectiva out of the MIT Media Lab in 2009 to commercialize her work. The company’s early customers are not exactly changing the world—they are mostly advertisers looking to better craft their messages. But that, she says, is just the beginning. Soon, she says, “all of our devices will have emotional intelligence”—not just our phones, but “our refrigerators, our cars.”

Early on, el Kaliouby focused on building smart tools for individuals with autism. She still thinks emotional intelligence technology—or EI—will be a huge boon to this community, potentially providing a sort of emotional hearing aid.

It’ll also be a mental healthcare aid, el Kaliouby predicts. She sees smart phones with EI as potentially able to regularly check a person’s mental state, providing early warning of depression, anxiety, or other problems. “People check their phones 15 times an hour. That’s a chance to understand that you are deviating from your baseline.”
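The idea of "deviating from your baseline" can be sketched concretely. The following is a minimal, hypothetical illustration (not Affectiva's actual method): each phone check contributes a scalar mood score, a rolling window defines the personal baseline, and a check is flagged when it falls more than a few standard deviations from that baseline.

```python
from collections import deque

def make_baseline_monitor(window=100, threshold=2.0):
    """Track a rolling baseline of mood scores and flag large deviations.

    Hypothetical sketch: 'mood score' stands in for whatever scalar an
    emotion-recognition model might emit at each phone check.
    """
    scores = deque(maxlen=window)

    def observe(score):
        scores.append(score)
        n = len(scores)
        if n < 10:  # not enough history to define a baseline yet
            return False
        mean = sum(scores) / n
        var = sum((s - mean) ** 2 for s in scores) / n
        std = var ** 0.5 or 1e-9  # avoid division by zero
        return abs(score - mean) / std > threshold
    return observe
```

At 15 checks an hour, even a coarse score like this accumulates hundreds of samples a day, which is what makes the baseline-deviation framing plausible.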

Photo: Tekla Perry
Affectiva’s software decodes Computer History Museum CEO John Hollar’s expressions in real time

Cars, she said, will need to have emotional intelligence as they transition to being fully automated; in the interim period, they will sometimes need to hand control back to a human driver, and need to know if the driver is ready to take control.

Smart assistants like Siri and Alexa, she says, “need to know when [they] gave you the wrong answer and you are annoyed, and say ‘I’m sorry.’”

Online education desperately needs emotional intelligence, she indicated, to give it a sense of when students are confused or engaged or frustrated or bored.

And the killer app? It just might be dating. “We have worked with teenagers who just want to have a girlfriend, but couldn’t tell if girls were interested in them,” el Kaliouby says. A little computer help reading their expressions could help with that. (Pornography and sex robots will likely be a big market as well, el Kaliouby says, but her company doesn’t plan on developing tools for this application. Nor for security, because that violates Affectiva’s policy of not tracking emotions without consent.)

While Affectiva is focusing on the face for its clues about emotions, el Kaliouby admits that the face is just part of the puzzle—gestures, tone of voice, and other factors need to be considered before computers can be completely accurate in decoding emotions.
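A common way to combine such cues is late fusion: run a separate classifier per modality, then merge their per-emotion scores. This is a generic sketch, not Affectiva's pipeline; the modality names, scores, and weights are all hypothetical.

```python
def fuse_emotion_scores(modality_scores, weights):
    """Late fusion: weighted average of per-emotion scores across modalities.

    modality_scores: {modality: {emotion: score}} from hypothetical
    per-modality classifiers (face, voice, gesture, ...).
    weights: {modality: weight}, normalized over the modalities present.
    """
    fused = {}
    total_w = sum(weights[m] for m in modality_scores)
    for modality, scores in modality_scores.items():
        w = weights[modality] / total_w
        for emotion, score in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + w * score
    # Return the top emotion along with the full fused distribution
    return max(fused, key=fused.get), fused
```

For example, a face model leaning toward joy and a voice model leaning toward anger would be reconciled by the weights, which is one reason the face alone is "just part of the puzzle."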

And today’s emotional intelligence systems are still pretty dumb. “I liken the state of the technology to a toddler,” el Kaliouby says. “It can do basic emotions. But what do people look like when inspired, or jealous, or proud? I think this technology can answer these basic science questions—we’re not done.”

