
Are you ready to have your face read by a computer?

People interested in emotional intelligence will be fascinated by this episode of Big Picture Science, a radio program produced by the SETI Institute in Mountain View. Hosts Dr. Seth Shostak and Molly Bentley speak with researchers who are teaching computers to recognize subtle emotions, and who, along the way, are learning more about how our brains do the same thing. Will your laptop know when you’re feeling “happily confused”? In the program, that is what Aleix Martinez, a cognitive neuroscientist at The Ohio State University, is exploring.

For years, cognitive scientists thought that facial expressions fell into six simple categories: surprised, happy, sad, disgusted, fearful, and angry. But it turns out there are upwards of 21 facial expressions and corresponding emotions. Martinez’s students practiced showing expressions in the mirror after reading scenarios designed to evoke specific emotions, including happily surprised, fearfully angry, angrily disgusted, and even happily disgusted. It turns out our facial muscles contract involuntarily in response to these emotions, sometimes in a fraction of a second, and often before we are conscious of them.

Are you “happily disgusted” or “angrily terrified”?

 


A happily disgusted face could be a response to a joke with a gross punch line. Students were photographed, and their expressions were analyzed and categorized by a computer, using some of the same technology video game animators use to make characters look realistic. Martinez is trying to correlate these expressions to neural pathways in our brains so that some day computers might help diagnose emotional disturbances.

What happens to the face during certain emotions? There are 46 muscles involved in making emotional expressions, causing wrinkles to form around the nose, nostrils to contract, or the mouth to open. Martinez says these emotional facial expressions are remarkably consistent across humans. If a computer can be taught to recognize and name the subtle emotions playing across our faces, can we be taught the same thing?

Watch this short video by Josh Freedman, “Emotions and Where They Come From.”

At Six Seconds we teach emotional literacy as one of the core competencies of “Know Yourself.” Six Seconds has some great resources for learning and practicing emotional literacy here: http://6seconds.org/2013/03/07/enhance-emotional-literacy/

We already know people pay a lot of attention to the configuration of facial features, such as the distances between the eyes, brows, nose, and mouth, to understand others’ emotional states. Even small babies react to these cues, so the ability is likely hard-wired as a survival skill. Studies have shown that mothers who mirror their babies’ expressions, and vice versa, have more contented, happy babies, and the babies’ brain development is linked to this early empathetic interaction. People whose mirror neuron systems work differently, such as some people with autism, can have a more difficult time recognizing basic emotions in others.
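The “configuration of facial features” idea can be made concrete with a toy sketch. This is not Martinez’s actual method; the landmark coordinates and emotion templates below are invented purely for illustration. The idea is simply that distances between labeled points on a face, normalized for scale, form a feature vector that can be matched against templates:

```python
# Toy illustration: turning facial-landmark geometry (distances between
# eyes, nose, mouth corners) into a feature vector, then matching it to
# the closest emotion template. All values are invented for demonstration.
import math

def distances(landmarks):
    """Normalized pairwise distances between labeled landmark points."""
    names = sorted(landmarks)              # fixed ordering of features
    pts = [landmarks[n] for n in names]
    # Use the inter-eye distance as the scale, so the features do not
    # depend on how close the face is to the camera.
    scale = math.dist(landmarks["left_eye"], landmarks["right_eye"])
    return [math.dist(a, b) / scale for i, a in enumerate(pts) for b in pts[i + 1:]]

def classify(landmarks, templates):
    """Return the emotion whose template feature vector is closest."""
    feats = distances(landmarks)
    def err(label):
        return sum((f - r) ** 2 for f, r in zip(feats, templates[label]))
    return min(templates, key=err)

# Invented example faces: a smile widens the mouth relative to neutral.
neutral = {"left_eye": (30, 40), "right_eye": (70, 40),
           "nose": (50, 60), "mouth_left": (38, 80), "mouth_right": (62, 80)}
smiling = {"left_eye": (30, 40), "right_eye": (70, 40),
           "nose": (50, 60), "mouth_left": (33, 78), "mouth_right": (67, 78)}
templates = {"neutral": distances(neutral), "happy": distances(smiling)}
print(classify(smiling, templates))  # matches the "happy" template
```

Real systems work from video, dozens of landmarks, and learned models rather than hand-built templates, but the underlying intuition, geometry in, emotion label out, is the same.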

 

But why teach a computer to do what humans can already do? Dr. Martinez wants to learn more about the chemical and genetic markers for emotional and brain disorders. If a computer model can tell the difference between clinical depression, PTSD, and autism, doctors will have a better chance of treating the right problem with the right approach. This doesn’t mean the doctor would ignore their own diagnostic conclusions, but it could be another tool to help diagnose and effectively treat brain chemistry disorders.

Eventually, Martinez wants to create an app that would let you hold up an iPhone to someone’s face and have it interpret the emotions behind their facial expressions for you. While that might cause more problems socially than it solves, it is an intriguing idea for the socially awkward. Through training we can improve our brain’s capacity for emotional literacy, so we may not need an iPhone to tell us whether someone is sad.

Rachel Goodman

Peabody Award-winning broadcaster and communications professional, editor, producer, and writer for effective outcomes. Ms. Goodman has been a radio producer for much of her career, specializing in short features and documentaries. Some of her work includes Southern Songbirds: the Women of Early Country Music, Pastures of Plenty: A History of California's Farmworkers, and The Boomtown Chronicles: Reflections on a Changing California. Ms. Goodman teaches journalism at Cabrillo College in Santa Cruz County. Her goals are to facilitate positive change in the world through effective communication, and to continue conducting her work with the highest level of integrity possible.
