Emotion recognition

A recent article in the New York Times, “When Algorithms Grow Accustomed to Your Face”, discussed the advances in identifying the micromodifications in our facial expressions and mapping them to the emotional states that produce them. While the article focused on the privacy concerns this raises, as well as the various uses the techniques could be put to, what caught my attention was that we are trying to make explicit what we do, for the most part, at a subconscious level every time we interact with another person.

Through the effort to create algorithms that associate face shape, skin tone and other factors with emotions, we are having to ask ourselves, “How did I know that my wife didn’t like the idea of going to the gym, even if she did not say anything?” There was a whole range of changes to her face as I was speaking that stimulated in me memories of other encounters, which led me to feel she did not like the idea. 99% of the time, we are not aware of this process. It happens and we respond accordingly. I drop the idea of going to the gym and instead suggest we go to dinner.

The technology here is leading us down a path of self-knowledge. If I want to build an emotion recognition program, or create a synthetic cop like the one in the US TV series Almost Human, I need to understand my emotions better: how they are created, how they manifest, and how I understand others’ emotions. In the end, leaving aside for a moment the uses of this knowledge, which are worth discussing, the knowledge itself is important.
