
posted by janrinok on Thursday September 15 2016, @09:13AM   Printer-friendly
from the stay-calm-and-take-your-meds dept.

SRI International, the Silicon Valley research lab where Apple's virtual assistant Siri was born, is working on a new generation of virtual assistants that respond to users' emotions.

As artificial-intelligence systems such as those from Amazon, Google, and Facebook increasingly pervade our lives, there is an ever greater need for the machines to understand not only the words we speak, but what we mean as well—and emotional cues can be valuable here (see "AI's Language Problem").

"[Humans] change our behavior in reaction to how whoever we are talking to is feeling or what we think they're thinking," says William Mark, who leads SRI International's Information and Computing Sciences Division. "We want systems to be able to do the same thing."

[...] The system is designed to identify emotional state based on a variety of cues, including typing patterns, speech tone, facial expressions, and body movements.

SenSay could, for example, add intelligence to a pharmacy phone assistant. It might be able to tell from a patient's pattern of speech if he or she were becoming confused, then slow down.

The machine-learning-based technology is trained on different scenarios, depending on how it will be used. The new virtual assistants can also monitor for specific words that give away a person's mental state.

It works via text, over the phone, or in person. If someone pauses as he or she types, it could indicate confusion. In person, the system uses a camera and computer vision to pick up on facial characteristics, gaze direction, body position, gestures, and other physical signals of how a person is feeling.
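The article does not describe SRI's actual model, but the idea of fusing several weak cues (keyword hits, long pauses) into a single confidence score can be sketched in a few lines. Everything below is illustrative: the cue word list, the weights, and the 2-second pause threshold are invented for the example, not taken from SenSay.

```python
# Illustrative sketch only (not SRI's system): fuse a keyword cue and a
# typing/speech-pause cue into a rough 0..1 "confusion" score.

CONFUSION_WORDS = {"what", "huh", "confused", "repeat", "again"}  # hypothetical cue list

def confusion_score(words, pause_seconds):
    """Combine keyword hits and long pauses into a score in [0, 1]."""
    word_hits = sum(1 for w in words if w.lower() in CONFUSION_WORDS)
    keyword_cue = min(word_hits / 3.0, 1.0)                    # saturate at 3 hits
    pause_cue = min(max(pause_seconds - 2.0, 0.0) / 5.0, 1.0)  # pauses over 2 s count
    return 0.5 * keyword_cue + 0.5 * pause_cue                 # equal-weight fusion

def should_slow_down(words, pause_seconds, threshold=0.4):
    """Decision the pharmacy-assistant example implies: slow down when confused."""
    return confusion_score(words, pause_seconds) >= threshold

print(should_slow_down(["can", "you", "repeat", "that", "again"], 6.0))  # True
print(should_slow_down(["refill", "my", "prescription"], 0.5))           # False
```

A real system would replace the hand-set weights with a classifier trained per scenario, as the article notes, and would add the visual cues (gaze, posture) as further inputs to the same fusion step.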


Original Submission

Related Stories

Facial Expressions—Including Fear—May Not Be Universal 10 comments

As we've seen in a recent story ("Customer Service Bots Are Getting Better at Detecting Your Agitation"), facial recognition software has moved beyond matching faces to trying to infer the emotional state of the face. At the heart of this effort is the assumption that, generally, facial expressions convey the same emotional state across cultures. Recent research shows this might not be the case.

In the 1960s, psychologist Paul Ekman came up with the method that has become the standard way to test this: present a collection of pictures of Westerners with different facial expressions to people living in isolated cultures and ask them what emotion was being conveyed. His research showed universality in understanding facial expressions across cultures. This has become an accepted axiom of this field ever since. However, in 2011, psychologists Carlos Crivelli and José-Miguel Fernández-Dols investigated the assumptions and methodology of the Ekman experiments. They traveled to the Trobriand Islands off the coast of Papua New Guinea and performed their own experiment using pictures of facial expressions.

Crivelli found that they matched smiling with happiness almost every time. Results for the other combinations were mixed, though. For example, the Trobrianders just couldn’t widely agree on which emotion a scowling face corresponded with. Some said this and some said that. It was the same with the nose-scrunching, pouting, and a neutral expression. There was one facial expression, though, that many of them did agree on: a wide-eyed, lips-parted gasping face (similar to above [link]) that Western cultures almost universally associate with fear and submission. The Trobrianders said it looked “angry.”

The work is being well received in the field, such as by social psychologist Alan Fridlund who noted that the researchers did an excellent job immersing themselves in the Trobriander culture before conducting the experiment.

Despite agreeing broadly with the study’s conclusions, Fridlund doubts it will sway hardliners convinced that emotions bubble forth from a common font. Ekman’s school of thought, for example, arose in the post–World War II era when people were seeking ideas that reinforced our common humanity, Fridlund says. “I think it will not change people’s minds. People have very deep reasons for adhering to either universality or cultural diversity.”

An abstract is available: The fear gasping face as a threat display in a Melanesian society.

[How might this affect Unicode's emoticons (i.e. code points starting at \U0001F600)? -Ed.]
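For reference, the code points the editor mentions are the Unicode "Emoticons" block (U+1F600 through U+1F64F), which includes a FEARFUL FACE character resembling the wide-eyed gasping expression from the study. A quick check with Python's standard `unicodedata` module:

```python
# The Unicode "Emoticons" block starts at U+1F600 (GRINNING FACE).
import unicodedata

fearful = "\U0001F628"                 # U+1F628 FEARFUL FACE
print(hex(ord(fearful)))               # 0x1f628
print(unicodedata.name(fearful))       # FEARFUL FACE
print(unicodedata.name("\U0001F600"))  # GRINNING FACE
```

Unicode names describe the glyph's canonical Western reading; the study suggests that reading is not universal, but the character names themselves are fixed by the standard.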


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Informative) by Rosco P. Coltrane on Thursday September 15 2016, @09:42AM

    by Rosco P. Coltrane (4757) on Thursday September 15 2016, @09:42AM (#402205)

    Deploy it at a telemarketing company. If it fails to detect extreme customer annoyance 100% of the time, it's not working right.

  • (Score: 1) by gOnZo on Thursday September 15 2016, @09:53AM

    by gOnZo (5184) on Thursday September 15 2016, @09:53AM (#402206)
    I just finished watching Transcendence [imdb.com]. I'm immediately reminded of the indignation Evelyn expresses when she learns the AI is profiling her emotional state based on all kinds of physiological cues: "You haven't got the RIGHT! These are MY EMOTIONS!" So as long as no one KNOWS...
  • (Score: 2) by iwoloschin on Thursday September 15 2016, @10:06AM

    by iwoloschin (3863) on Thursday September 15 2016, @10:06AM (#402207)

    As long as I can still spoof it into dumping me right to a human, I'm ok with this. If it tries to get "smart" with me though and help me itself, then I'd get pretty agitated.

  • (Score: 2) by rts008 on Thursday September 15 2016, @11:41AM

    by rts008 (3001) on Thursday September 15 2016, @11:41AM (#402224)

    If all of your customers are robots, then there is a legitimate reason to use bots for customer service.
    If your customers are human, then so should the customer service be human.

    If you are calling customer service, you are already agitated; getting a bot to handle your problem will only increase that agitation.
    Using a bot for customer service shouts loud and clear that you only care about customers' money, not their satisfaction.

    It seems that when we went to a 'service economy', we became 'consumers', instead of 'customers'. And as a consumer, we are only a resource to be harvested for the corp.'s benefit.

    • (Score: 0) by Anonymous Coward on Thursday September 15 2016, @01:36PM

      by Anonymous Coward on Thursday September 15 2016, @01:36PM (#402257)

      Because it feels so much better cussing at a human who can't help you, calling their education into question, insulting their mother, and threatening them?

    • (Score: 0) by Anonymous Coward on Thursday September 15 2016, @03:09PM

      by Anonymous Coward on Thursday September 15 2016, @03:09PM (#402304)

      Except... look at all the places where we already strive for no or minimal human interaction with services (or service providers with their users): self-checkout and check-in systems and kiosks, online banking websites, so many mobile phone apps, NFC payment, etc.
      Even the drive-thru lines can be seen as such...

      • (Score: 0) by Anonymous Coward on Thursday September 15 2016, @09:42PM

        by Anonymous Coward on Thursday September 15 2016, @09:42PM (#402491)

        Self-checkouts are garbage. You can't buy alcohol or cigarettes or a number of other things without calling in a store employee, the machines frequently have bugs, and the machines run proprietary software. Online banking websites are riddled with exploits and unsafe. Many mobile phone apps spy on you and generally waste your battery.

        Yeah, great examples.

    • (Score: 3, Insightful) by Joe Desertrat on Thursday September 15 2016, @05:46PM

      by Joe Desertrat (2454) on Thursday September 15 2016, @05:46PM (#402380)

      If you are calling customer service, you are already agitated, getting a bot to handle your problem will only increase that agitation.

      It isn't that the bot is handling the issue that is the problem. If it handled it, there would be no problem. It is the same thing that makes using a FAQ page for "answers" so pointless and aggravating. These things are geared only towards the simplest and most common problems. Maybe, just maybe, some of us only call when all the normal things do not work. We have it plugged in, we have it turned on, we have done this, that, and everything else. We already know we have to try these things; we have been forced by circumstances to use crap for a long time and have run into all these problems many times already. Just give us an option that is a solution and we will be happy, bot or human.

  • (Score: 0) by Anonymous Coward on Thursday September 15 2016, @12:56PM

    by Anonymous Coward on Thursday September 15 2016, @12:56PM (#402236)

    ...Sirius Cybernetics Corporation...
    "A bunch of mindless jerks who'll be the first against the wall when the revolution comes."
    ...
    ...
    ...
    Profit??

    • (Score: 0) by Anonymous Coward on Thursday September 15 2016, @03:52PM

      by Anonymous Coward on Thursday September 15 2016, @03:52PM (#402326)

      What is the first "H" for?

  • (Score: 0) by Anonymous Coward on Thursday September 15 2016, @02:44PM

    by Anonymous Coward on Thursday September 15 2016, @02:44PM (#402294)

    If a telephone tree ever slows down to 'help' me, I swear I will drive in person to that company.. lol :P

  • (Score: 0) by Anonymous Coward on Thursday September 15 2016, @06:00PM

    by Anonymous Coward on Thursday September 15 2016, @06:00PM (#402389)

    Now use the AI to make the company care.

  • (Score: 2) by bob_super on Thursday September 15 2016, @07:13PM

    by bob_super (1357) on Thursday September 15 2016, @07:13PM (#402418)

    Just pay the fee to connect to the Apple/Google API, which will report your level of aggravation in the last 5 minutes based on your internet queries and what your microphone heard...
    Knowing whether they also searched for weapons and explosives before calling tech support? Yep, that's an extra fee.

  • (Score: 0) by Anonymous Coward on Thursday September 15 2016, @08:22PM

    by Anonymous Coward on Thursday September 15 2016, @08:22PM (#402451)

    You seem to be stressed. Would you like me to start a therapy session? *wags tail while looking friendly*

  • (Score: 2) by halcyon1234 on Friday September 16 2016, @03:02AM

    by halcyon1234 (1082) on Friday September 16 2016, @03:02AM (#402597)
    Hey, I can solve this for you! All you need is to detect the following voice signature:

    GIVE ME A FUCKING HUMAN YOU GODDAMN MOTHERFUCKING MACHINE!
    {sound of the 0 key being pounded repeatedly}
    I DON'T WANT TO TALK TO A GODDAMN COMPUTER! DON'T YOU DARE FUCKING HANG UP ON ME AGAIN!
    {sound of the 0 key being pounded repeatedly}
    GIVE! ME! A! GODDAMN! HUMAN!

    --
    Original Submission [thedailywtf.com]
    • (Score: 1) by charon on Friday September 16 2016, @10:31PM

      by charon (5660) on Friday September 16 2016, @10:31PM (#402967) Journal

      I did exactly this this morning. I repeatedly said, "Operator" and the bot's only reply was, "I'm sorry, I didn't catch that." Eventually I swore and magically, the bot transferred me to a human.

    • (Score: 2) by Joe Desertrat on Saturday September 17 2016, @09:25AM

      by Joe Desertrat (2454) on Saturday September 17 2016, @09:25AM (#403073)

      That's your problem. Try pounding the 9 key repeatedly. If you pound the 0 key they know you want an operator and hey, that's not going to happen.