Interacting with other people is almost always a game of reading cues and volleying back. We think a smile conveys happiness, so we offer a smile in return. We think a frown shows sadness, and maybe we attempt to cheer that person up.
Some businesses are even working on technology to determine customer satisfaction through facial expressions.
But new research suggests facial expressions might not be reliable indicators of emotion. In fact, it might be more accurate to say we should never trust a person’s face.
“The question we really asked is: ‘Can we truly detect emotion from facial articulations?’” said Aleix Martinez, a professor of electrical and computer engineering at The Ohio State University.
“And the basic conclusion is, no, you can’t.”
Martinez, whose work has focused on building computer algorithms that analyze facial expressions, and his colleagues presented their findings on Feb. 16, 2020, at the annual meeting of the American Association for the Advancement of Science (AAAS) in Seattle.
The researchers analyzed the kinetics of muscle movement in the human face and compared those muscle movements with a person’s emotions. They found that attempts to detect or define emotions based on a person’s facial expressions were almost always wrong.
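To make the kind of comparison described above concrete, here is a minimal, hypothetical sketch in Python: it scores a naive one-to-one "expression means emotion" mapping against what people report actually feeling. The mapping and the observation records are invented placeholders for illustration, not the team's data or method.

```python
# Hypothetical sketch: scoring a naive "facial expression -> emotion" mapping
# against self-reported emotion labels. The mapping and the records below are
# invented placeholders, not data from the study.

# The classic (and, per the researchers, unreliable) assumption:
# one expression, one emotion.
NAIVE_EXPRESSION_TO_EMOTION = {
    "smile": "happy",
    "frown": "sad",
    "scowl": "angry",
    "neutral": "calm",
}

# Each record pairs an observed expression with what the person reported feeling.
observations = [
    {"expression": "smile",   "reported": "anxious"},  # polite smile under stress
    {"expression": "neutral", "reported": "happy"},    # content, but not smiling
    {"expression": "scowl",   "reported": "focused"},  # concentrating, not angry
    {"expression": "smile",   "reported": "happy"},
]

def naive_prediction(expression: str) -> str:
    """Map an expression label to an emotion using the one-to-one assumption."""
    return NAIVE_EXPRESSION_TO_EMOTION.get(expression, "unknown")

hits = sum(
    naive_prediction(obs["expression"]) == obs["reported"] for obs in observations
)
print(f"Naive expression-only accuracy: {hits}/{len(observations)}")
```

In a toy run like this, the expression-only guess is right only when the stereotype happens to hold, which is the gap the researchers describe.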
“Everyone makes different facial expressions based on context and cultural background,” Martinez said. “And it’s important to realize that not everyone who smiles is happy. Not everyone who is happy smiles. I would even go to the extreme of saying most people who do not smile are not necessarily unhappy. And if you are happy for a whole day, you don’t go walking down the street with a smile on your face. You’re just happy.”
It is also true, Martinez said, that people sometimes smile out of an obligation to social norms. This would not inherently be a problem, he said — people are certainly entitled to put on a smile for the rest of the world — but some companies have begun developing technology to recognize facial muscle movements and assign emotion or intent to those movements.
The research group that presented at AAAS analyzed some of those technologies and, Martinez said, largely found them lacking.
“Some claim they can detect whether someone is guilty of a crime or not, or whether a student is paying attention in class, or whether a customer is satisfied after a purchase,” he said. “What our research showed is that those claims are complete baloney. There’s no way you can determine those things. And worse, it can be dangerous.”
The danger, Martinez said, lies in the possibility of missing the real emotion or intent of another person, and then making decisions about that person’s future or abilities based on that misreading.
For example, consider a classroom environment, and a teacher who assumes that a student is not paying attention because of the expression on the student’s face. The teacher might expect the student to smile and nod along if the student is paying attention. But maybe that student, for reasons the teacher doesn’t understand — cultural reasons, perhaps, or contextual ones — is listening intently, but not smiling at all. It would be, Martinez argues, wrong for the teacher to dismiss that student because of the student’s facial expressions.
After analyzing data about facial expressions and emotion, the research team — which included scientists from Northeastern University, the California Institute of Technology and the University of Wisconsin — concluded that it takes more than expressions to correctly detect emotion.
Facial color, for example, can help provide clues.
“What we showed is that when you experience emotion, your brain releases peptides — mostly hormones — that change the blood flow and blood composition, and because the face is inundated with these peptides, it changes color,” Martinez said.
The human body offers other hints, too, he said: body posture, for example. And context plays a crucial role as well.
In one experiment, Martinez showed study participants a picture cropped to display just a man’s face. The man’s mouth is open in an apparent scream; his face is bright red.
“When people looked at it, they would think, wow, this guy is super annoyed, or really mad at something, that he’s angry and shouting,” Martinez said. “But when participants saw the whole image, they saw that it was a soccer player who was celebrating a goal.”
In context, it’s clear the man is very happy. But isolate his face, Martinez said, and he appears almost dangerous.
Cultural biases play a role, too.
“In the U.S., we tend to smile a lot,” Martinez said. “We are just being friendly. But in other cultures, that means different things — in some cultures, if you walked around the supermarket smiling at everyone, you might get smacked.”
Martinez said the research group’s findings could indicate that people — from hiring managers to professors to criminal justice experts — should consider more than just a facial expression when they evaluate another person.
And while Martinez said he is “a big believer” in developing computer algorithms that try to understand social cues and the intent of a person, he added that two things are important to know about that technology.
“One is you are never going to get 100 percent accuracy,” he said. “And the second is that deciphering a person’s intent goes beyond their facial expression, and it’s important that people — and the computer algorithms they create — understand that.”
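As a rough, hypothetical illustration of that final point — weighing facial expression alongside face color, posture, and context rather than trusting the face alone — the sketch below uses the soccer-player example from the article. The cue names and the decision rules are illustrative assumptions, not the researchers' algorithm.

```python
# Hypothetical sketch of the multi-cue idea: combine facial expression with
# face color, posture, and context instead of reading the face in isolation.
# The cue names and decision rules are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Cues:
    expression: str   # e.g. "open-mouth shout"
    face_color: str   # e.g. "flushed"
    posture: str      # e.g. "arms raised"
    context: str      # e.g. "soccer goal celebration"

def guess_intent(cues: Cues) -> str:
    """Combine cues; context and posture can override the face alone."""
    looks_aggressive = (
        cues.expression == "open-mouth shout" and cues.face_color == "flushed"
    )
    celebratory_scene = "celebration" in cues.context or cues.posture == "arms raised"
    if looks_aggressive and celebratory_scene:
        return "joy / celebration"            # the full scene flips the reading
    if looks_aggressive:
        return "possible anger (low confidence)"
    return "uncertain"

# The cropped face alone would read as anger; the full scene reads as joy.
print(guess_intent(Cues("open-mouth shout", "flushed", "arms raised",
                        "soccer goal celebration")))
```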
Source: Ohio State University.
Published on February 28, 2020