from the stay-calm-and-take-your-meds dept.
SRI International, the Silicon Valley research lab where Apple's virtual assistant Siri was born, is working on a new generation of virtual assistants that respond to users' emotions.
As artificial-intelligence systems such as those from Amazon, Google, and Facebook increasingly pervade our lives, there is an ever greater need for the machines to understand not only the words we speak, but what we mean as well—and emotional cues can be valuable here (see "AI's Language Problem").
"[Humans] change our behavior in reaction to how whoever we are talking to is feeling or what we think they're thinking," says William Mark, who leads SRI International's Information and Computing Sciences Division. "We want systems to be able to do the same thing."
[...] The system is designed to identify emotional state based on a variety of cues, including typing patterns, speech tone, facial expressions, and body movements.
SenSay could, for example, add intelligence to a pharmacy phone assistant. It might be able to tell from a patient's pattern of speech if he or she were becoming confused, then slow down.
The machine-learning-based technology is trained on different scenarios, depending on how it will be used. The new virtual assistants can also monitor for specific words that give away a person's mental state.
It works via text, over the phone, or in person. If someone pauses as he or she types, it could indicate confusion. In person, the system uses a camera and computer vision to pick up on facial characteristics, gaze direction, body position, gestures, and other physical signals of how a person is feeling.
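The article describes inference from typing pauses and giveaway keywords. As a rough illustration only (the word lists, thresholds, and function are invented for this sketch and are not SRI's actual model), such cue-based inference might look like:

```python
# Hypothetical sketch: guess a user's state from giveaway keywords and
# typing-pause durations, as described above. All word lists and
# thresholds here are invented for illustration.
CONFUSION_WORDS = {"huh", "what", "confused", "lost", "repeat"}
AGITATION_WORDS = {"angry", "ridiculous", "cancel", "supervisor"}

def infer_state(transcript: str, pause_seconds: list[float]) -> str:
    words = set(transcript.lower().split())
    long_pauses = sum(1 for p in pause_seconds if p > 3.0)
    if words & AGITATION_WORDS:
        return "agitated"
    if words & CONFUSION_WORDS or long_pauses >= 2:
        return "confused"
    return "neutral"

print(infer_state("sorry, I'm lost", [4.2, 5.0]))  # confused
```

A real system would of course use trained classifiers over many more signals (tone, gaze, posture), but the shape of the decision is the same: map observed cues to an estimated state, then adapt the dialogue accordingly.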

As we've seen in a recent story ("Customer Service Bots Are Getting Better at Detecting Your Agitation"), facial recognition software has moved beyond matching faces to trying to infer a person's emotional state from their facial expression. At the heart of this effort is the assumption that facial expressions generally convey the same emotional state across cultures. Recent research suggests this might not be the case.
In the 1960s, psychologist Paul Ekman devised what has become the standard way to test this: present a collection of pictures of Westerners with different facial expressions to people living in isolated cultures and ask them what emotion is being conveyed. His research showed universality in the understanding of facial expressions across cultures, and that has been an accepted axiom of the field ever since. However, in 2011, psychologists Carlos Crivelli and José-Miguel Fernández-Dols began investigating the assumptions and methodology of Ekman's experiments. They traveled to the Trobriand Islands off the coast of Papua New Guinea and performed their own experiment using pictures of facial expressions.
Crivelli found that the Trobrianders matched smiling with happiness almost every time. Results for the other combinations were mixed, though. For example, the Trobrianders could not widely agree on which emotion a scowling face corresponded with, and the same was true of the nose-scrunching, pouting, and neutral expressions. There was one facial expression, though, that many of them did agree on: a wide-eyed, lips-parted gasping face (similar to above [link]) that Western cultures almost universally associate with fear and submission. The Trobrianders said it looked "angry."
The work has been well received in the field. Social psychologist Alan Fridlund, for example, noted that the researchers did an excellent job of immersing themselves in Trobriander culture before conducting the experiment.
Despite agreeing broadly with the study’s conclusions, Fridlund doubts it will sway hardliners convinced that emotions bubble forth from a common font. Ekman’s school of thought, for example, arose in the post–World War II era when people were seeking ideas that reinforced our common humanity, Fridlund says. “I think it will not change people’s minds. People have very deep reasons for adhering to either universality or cultural diversity.”
An abstract is available: The fear gasping face as a threat display in a Melanesian society.
[How might this affect Unicode's emoticons (i.e. code points starting at \U0001F600)? -Ed.]
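[For reference, the Unicode "Emoticons" block the editor mentions runs from U+1F600 through U+1F64F. A quick way to inspect those code points and their official names, using only Python's standard library:

```python
import unicodedata

# The Unicode "Emoticons" block spans U+1F600 through U+1F64F.
# Print the first few code points with their official character names.
for cp in range(0x1F600, 0x1F603):
    ch = chr(cp)
    print(f"U+{cp:04X} {ch} {unicodedata.name(ch)}")
```

Note that the Unicode names encode appearance ("GRINNING FACE"), not a culturally fixed emotional reading, which is arguably consistent with the research above. -Ed.]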
(Score: 3, Informative) by Rosco P. Coltrane on Thursday September 15 2016, @09:42AM
Deploy it at a telemarketing company. If it fails to detect 100% extreme customer annoyance, it's not working right.
(Score: 2) by iwoloschin on Thursday September 15 2016, @10:06AM
As long as I can still spoof it into dumping me right to a human, I'm ok with this. If it tries to get "smart" with me though and help me itself, then I'd get pretty agitated.
(Score: 2) by rts008 on Thursday September 15 2016, @11:41AM
If all of your customers are robots, then there is a legitimate reason to use bots for customer service.
If your customers are human, then your customer service should be human too.
If you are calling customer service, you are already agitated, getting a bot to handle your problem will only increase that agitation.
Using a bot for customer service shouts loud and clear that you only care about your customers' money, not their satisfaction.
It seems that when we went to a 'service economy', we became 'consumers' instead of 'customers'. And as consumers, we are only a resource to be harvested for the corporation's benefit.
(Score: 0) by Anonymous Coward on Thursday September 15 2016, @01:36PM
Because it feels so much better cussing at a human who can't help you, calling their education into question, insulting their mother, and threatening them?
(Score: 0) by Anonymous Coward on Thursday September 15 2016, @03:09PM
Except... look at all the places where we already strive for no, or minimized, human interaction with services (or where service providers minimize interaction with their users): self-checkout and check-in systems, kiosks, online banking websites, countless mobile phone apps, NFC payment, etc.
Even the drive-thru lines can be seen as such...
(Score: 0) by Anonymous Coward on Thursday September 15 2016, @09:42PM
Self-checkouts are garbage. You can't buy alcohol or cigarettes or a number of other things without calling in a store employee, the machines frequently have bugs, and the machines run proprietary software. Online banking websites are riddled with exploits and unsafe. Many mobile phone apps spy on you and generally waste your battery.
Yeah, great examples.
(Score: 3, Insightful) by Joe Desertrat on Thursday September 15 2016, @05:46PM
If you are calling customer service, you are already agitated, getting a bot to handle your problem will only increase that agitation.
It isn't the bot handling the issue that is the problem; if it actually handled it, there would be no problem. It is the same thing that makes using a FAQ page for "answers" so pointless and aggravating: these things are geared only toward the simplest and most common problems. Maybe, just maybe, some of us only call when all the normal things do not work. We have it plugged in, we have it turned on, we have done this, that, and everything else. We already know to try those things; we have been forced by circumstances to use crap for a long time and have run into all these problems many times already. Just give us an option that is a solution and we will be happy, bot or human.
(Score: 0) by Anonymous Coward on Thursday September 15 2016, @12:56PM
...Sirius Cybernetics Corporation...
"A bunch of mindless jerks who'll be the first against the wall when the revolution comes."
...
...
...
Profit??
(Score: 0) by Anonymous Coward on Thursday September 15 2016, @03:52PM
What is the first "H" for?
(Score: 0) by Anonymous Coward on Thursday September 15 2016, @02:44PM
If a telephone tree ever slows down to 'help' me, I swear I will drive in person to that company.. lol :P
(Score: 0) by Anonymous Coward on Thursday September 15 2016, @06:00PM
Now use the AI to make the company care.
(Score: 2) by bob_super on Thursday September 15 2016, @07:13PM
Just pay the fee to connect to the Apple/Google API, which will report your level of aggravation in the last 5 minutes based on your internet queries and what your microphone heard...
Knowing whether they also searched for weapons and explosives before calling tech support? Yep, that's an extra fee.
(Score: 0) by Anonymous Coward on Thursday September 15 2016, @08:22PM
You seem to be stressed. Would you like me to start a therapy session? *wags tail while looking friendly*
(Score: 2) by halcyon1234 on Friday September 16 2016, @03:02AM
GIVE ME A FUCKING HUMAN YOU GODDAMN MOTHERFUCKING MACHINE!
{sound of the 0 key being pounded repeatedly}
I DON'T WANT TO TALK TO A GODDAMN COMPUTER! DON'T YOU DARE FUCKING HANG UP ON ME AGAIN!
{sound of the 0 key being pounded repeatedly}
GIVE! ME! A! GODDAMN! HUMAN!
(Score: 1) by charon on Friday September 16 2016, @10:31PM
I did exactly this, this morning. I repeatedly said, "Operator" and the bot's only reply was, "I'm sorry, I didn't catch that." Eventually I swore, and magically, the bot transferred me to a human.
(Score: 2) by Joe Desertrat on Saturday September 17 2016, @09:25AM
That's your problem. Try pounding the 9 key repeatedly. If you pound the 0 key they know you want an operator and hey, that's not going to happen.