Happy or Sad? Recognizing Emotion Through Facial Skin Color


Members of the Vision and Learning Lab, including (from left) Olac Fuentes, Ph.D., associate professor of computer science, undergraduate Juanita Ordoñez, and doctoral student Maria Jimenez, are studying how to recognize emotion through facial skin color.
Photo by J.R. Hernandez / UTEP News Service

Watch a YouTube video of a giggling baby and you'll likely end up with a grin on your face. Your mouth will turn upward into a smile and you might even let out a chuckle yourself. The skin on your face may change color, too, turning a different hue as you experience positive emotions.

Computer scientist Olac Fuentes, Ph.D., is keenly interested in subtle facial changes like these and how they offer clues about what a person is feeling. The University of Texas at El Paso researcher is particularly focused on how the human face fluctuates between colors depending on the emotion being felt.

"Changes in a person's emotional state result in changes in blood flow in the face," Fuentes explained. "For instance, if someone experiences embarrassment, their face will blush."

According to Fuentes, who specializes in computer vision, understanding facial colors and the emotions they are linked to could lead to several high-tech applications in diverse areas such as human-computer interaction, computer graphics and security.

"Facial expression can be controlled," the associate professor of computer science said. "You can pretend to be happy and fake a smile, but color is a lot more difficult to fake."

In pursuit of a new, potentially more accurate way to detect emotions, Fuentes and students within the Vision and Learning Lab are conducting a number of experiments. Their first mission is to verify that skin color can be used to recognize emotions.

In 2013, the team received an Interdisciplinary Research (IDR) grant from the Office of Research and Sponsored Projects (ORSP). The funds kicked off a collaboration with UTEP psychologist Stephen Crites, Ph.D., who helped design an experiment to study how facial color changes with different stimuli.

More than 100 volunteers of different ages and ethnicities were recruited to watch three different videos on a computer screen. Each video was selected to elicit a particular reaction from the viewer: positive, neutral or negative. For instance, one clip featured a cute puppy playing. Another depicted calming ocean waves gently hitting the shore, while a third clip came from the movie 127 Hours; it showed the gruesome scene in which actor James Franco's character is forced to cut off his own arm.

As the participants watched the clips one by one, a video camera placed on top of the computer monitor recorded their reactions. Once each clip was over, the participants logged how they were feeling: happy, sad, normal or another way. Geovany Ramirez, Ph.D., a doctoral student at the time, then analyzed each of the video recordings, studying how the color of each person's face changed in reaction to the videos. Colors were sampled from three specific regions of the face: the left cheek, the right cheek and the forehead. These colors were then analyzed and linked with the emotion the participant reported feeling.
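The color-sampling step described above can be sketched in a few lines of code. This is a hypothetical illustration rather than the lab's actual pipeline: the frame format (nested lists of RGB pixels) and the region coordinates are assumptions made for the example.

```python
# Hypothetical sketch: averaging skin color over rectangular face
# regions of a video frame. A frame is assumed to be a nested list of
# [R, G, B] pixel values; real systems would first locate the face.

def region_mean_color(frame, top, left, height, width):
    """Average the RGB values over one rectangular face region."""
    totals = [0, 0, 0]
    count = 0
    for row in frame[top:top + height]:
        for pixel in row[left:left + width]:
            for c in range(3):
                totals[c] += pixel[c]
            count += 1
    return [t / count for t in totals]

def face_color_features(frame, regions):
    """Concatenate the mean colors of each region into one feature vector."""
    features = []
    for (top, left, height, width) in regions:
        features.extend(region_mean_color(frame, top, left, height, width))
    return features

# Example: a tiny uniform 4x4 "frame" where every pixel is mid-gray.
frame = [[[128, 128, 128] for _ in range(4)] for _ in range(4)]
# Stand-ins for the left cheek, right cheek and forehead.
regions = [(0, 0, 2, 2), (0, 2, 2, 2), (2, 0, 2, 4)]
print(face_color_features(frame, regions))  # -> nine values, all 128.0
```

One feature vector like this per frame, paired with the participant's self-reported emotion, is the kind of data the next step would learn from.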

"At that point in the study, we just wanted to make sure that color changes do occur as human emotion changes," said computer science senior Paola Gallardo, who presented the research at last year's SACNAS conference. "We needed to make sure that skin color changes are reliable for inferring human emotion."

The team then fed the data – the facial colors and the emotions they were tied to – to a machine-learning algorithm. The system learned which colors were associated with which emotions, and was then tested to see whether it could accurately predict how a person was feeling by analyzing skin color alone.
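The article does not say which learning algorithm the team used, so as a simple stand-in, the learn-then-predict step can be sketched with a nearest-centroid classifier: average the color features seen for each emotion label, then label new faces by the closest centroid. All data values below are invented for illustration.

```python
# Minimal nearest-centroid classifier over (color-feature, label) pairs.
# This is an illustrative substitute for whatever algorithm the lab used.

def train_centroids(samples):
    """Average the feature vectors seen for each emotion label."""
    sums, counts = {}, {}
    for features, label in samples:
        if label not in sums:
            sums[label] = [0.0] * len(features)
            counts[label] = 0
        for i, value in enumerate(features):
            sums[label][i] += value
        counts[label] += 1
    return {label: [s / counts[label] for s in vec]
            for label, vec in sums.items()}

def predict(centroids, features):
    """Return the label whose centroid is closest in squared distance."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(centroids[label], features))

# Toy data: redder cheek colors labeled "positive", paler ones "negative".
training = [([150, 100, 100], "positive"), ([155, 105, 102], "positive"),
            ([120, 110, 110], "negative"), ([118, 108, 112], "negative")]
model = train_centroids(training)
print(predict(model, [152, 101, 99]))  # -> positive
```

Held-out predictions like this, compared against the participants' self-reports, are how an accuracy figure such as the team's 77 percent would be computed.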

Its ability to detect a positive, neutral or negative mood was startling: the system classified emotions correctly 77 percent of the time.

Doctoral student Maria Jimenez is now taking the research a step further. With the help of undergraduate Juanita Ordoñez, the lab will recruit volunteers to watch the recordings of the former participants.

"In this second experiment we are going to go in and digitally alter the video recordings, attenuating or exaggerating the color of the individuals' cheeks and forehead," Jimenez said. "The viewers will then try to identify how the person is feeling based on the manipulated skin colors."
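The manipulation Jimenez describes, attenuating or exaggerating a region's color, could be sketched as scaling pixel values toward or away from a neutral reference. The scaling scheme, neutral value and frame format below are assumptions made for this example, not the lab's method.

```python
# Hypothetical sketch of the color-manipulation step: scale one face
# region's color toward (factor < 1) or away from (factor > 1) a
# neutral gray reference, leaving the rest of the frame untouched.

def adjust_region_color(frame, top, left, height, width, factor, neutral=128):
    """Attenuate or exaggerate the color of one rectangular region."""
    out = [[list(pixel) for pixel in row] for row in frame]  # deep copy
    for r in range(top, top + height):
        for c in range(left, left + width):
            for ch in range(3):
                delta = frame[r][c][ch] - neutral
                value = neutral + factor * delta
                out[r][c][ch] = max(0, min(255, round(value)))
    return out

# A blushing 1x1 "cheek": a factor of 2.0 doubles its distance from gray.
frame = [[[148, 118, 118]]]
print(adjust_region_color(frame, 0, 0, 1, 1, 2.0))  # -> [[[168, 108, 108]]]
```

Applying this per frame with factors below and above 1.0 would yield the attenuated and exaggerated versions of a recording that viewers could then judge.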

Jimenez suspects the participants will be able to accurately detect the individuals' moods. The new data will become part of her dissertation and may lead to further clarification of which colors are tied to certain emotions.

"This research could be very useful," she explained. "Especially in surveillance situations that try to figure out if a person is lying or is nervous."

The scientists are hoping the research pans out, saying that such studies could lead to an exciting development in technology: computers that can recognize exactly how you feel.

