
The Promise of Emotion-Enabled Augmented Reality (AR)

10.02.15

By: Forest Handford, DevOps Lead

According to internet tests, I’m “balanced” when it comes to extroversion and introversion.

Community is very important to me. When my wife and I moved to the Boston area, we chose to live in a co-housing community so that we, and our children, could develop close relationships with our neighbors.

I just returned from a developer conference that didn’t offer many opportunities for meeting people. I only knew one person there, so at the cocktail hour and the after-party we spent most of the time together. Everyone else seemed to be in private conversations or to want to be left alone.

Augmented Reality (AR) is a view of the real world with an overlay of computer-generated information. The most widespread use of AR today is in cars with Head-Up Displays (HUDs) that project the current speed onto the windshield. If you’ve seen any of the Iron Man movies, you’ve seen how the character Tony Stark views the world through an interactive AR visor.

With AR, the conference experience would be very different. Using an AR headset, phone, or tablet, AR software can identify the people in view through facial recognition or identity information broadcast via Bluetooth. Once someone is identified, the device can show social information about them, such as their Facebook relationship status and their current job from their LinkedIn profile, displayed above or beside the person being viewed.
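To make that concrete, here’s a rough sketch of the loop an AR app might run: identify whoever is in view, look up whatever they’ve chosen to share, and draw a label beside them. Every type below (IdentityService, SocialProfile, Overlay, and so on) is a placeholder I’ve invented for illustration, not part of any real SDK.

```java
import java.util.List;
import java.util.Optional;

// Placeholder types so the sketch is self-contained.
record ScreenPoint(float x, float y) {}
record Person(String displayName, ScreenPoint screenPosition) {}
interface CameraFrame {}

interface IdentityService {
    // Identifies people in view, e.g. via face recognition or a Bluetooth identity broadcast.
    List<Person> identify(CameraFrame frame);
}

interface SocialProfile {
    Optional<String> relationshipStatus(Person p); // shared Facebook status, if any
    Optional<String> currentJob(Person p);         // shared LinkedIn job title, if any
}

interface Overlay {
    void drawLabelAbove(ScreenPoint position, String text);
}

final class SocialAnnotator {
    private final IdentityService identities;
    private final SocialProfile profiles;
    private final Overlay overlay;

    SocialAnnotator(IdentityService identities, SocialProfile profiles, Overlay overlay) {
        this.identities = identities;
        this.profiles = profiles;
        this.overlay = overlay;
    }

    // Called once per camera frame: identify, look up, draw.
    void onFrame(CameraFrame frame) {
        for (Person person : identities.identify(frame)) {
            StringBuilder label = new StringBuilder(person.displayName());
            profiles.currentJob(person).ifPresent(job -> label.append('\n').append(job));
            profiles.relationshipStatus(person).ifPresent(status -> label.append('\n').append(status));
            overlay.drawLabelAbove(person.screenPosition(), label.toString());
        }
    }
}
```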

Since the inception of the chat room, chat users have been able to indicate when they don’t want to be disturbed. Chat rooms also allow for both private and public conversations. AR will bring some of these features to the real world: we will be able to have indicators appear over our heads with statuses like “do not disturb,” “free hugs,” or “let’s talk.” That alone would help me get past the fear of being rejected when starting a social interaction.

Affectiva’s software can emotion-enable AR. People who are on the autism spectrum, like my son, have trouble reading other people’s emotions. Seeing words or icons that show another person’s emotions will help people know what is appropriate to say, and emotion recognition will give rapid feedback on how an interaction is going.
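Here’s the kind of mapping I have in mind, in a tiny sketch: take per-face emotion metrics and turn them into a short word or icon the AR display can render next to a person. The EmotionScores type and the 0-100 scale are assumptions for illustration, not the actual classes from our SDK.

```java
// Hypothetical per-face emotion metrics on a 0-100 scale, as many emotion SDKs report.
record EmotionScores(float joy, float sadness, float anger, float surprise) {}

final class EmotionLabeler {
    // Assumed cutoff above which an emotion is treated as clearly present.
    private static final float THRESHOLD = 50f;

    // Picks a short label (word plus icon) to draw beside the person's face.
    static String labelFor(EmotionScores scores) {
        if (scores.joy() >= THRESHOLD)      return "🙂 happy: good time to say hello";
        if (scores.sadness() >= THRESHOLD)  return "😢 sad: approach gently";
        if (scores.anger() >= THRESHOLD)    return "😠 upset: maybe give some space";
        if (scores.surprise() >= THRESHOLD) return "😮 surprised";
        return "neutral";
    }
}
```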

Imagine having emotion and availability information in extreme situations, like seeing a person crying in a parking lot. A few weeks ago I saw an older woman crying outside my office building as I was walking in. She was alone, and I worried she needed help. I was afraid to ask, but I set my fears aside and walked up to her. She appreciated the gesture but said she would be fine and that her husband would be along soon. With emotion-enabled AR, I would have had far more detail to guide me through the situation: it would have helped me decide whether to approach her, and it would have let me know how she truly felt about my talking to her.

Before I travel to a country where the locals speak another language, I spend months learning their language. Learning someone’s language is a way of saying that I value their culture. Language has long been a key indicator of ingroup membership, and we have evolved to trust, and be kinder to, people within our ingroup than to outsiders. Alexander Haslam, a professor of psychology at the University of Queensland in Australia, and his research team ran a study in which people attending a sports game passed an actor who appeared to be hurt. They found that when the actor wore a shirt with the local team’s logo, the actor was significantly more likely to be helped than when wearing a plain shirt or a rival team’s shirt.

With AR, I will be able to travel to another country and see translations of the local language as I hear it, and the people there will get translations of my words through their own AR devices. Breaking down that communication barrier can help people accept each other as ingroup members. With emotion recognition, we can also see when others react to something the way we do, which creates the same ingroup feeling. For people uncomfortable with sharing social media or emotional information, an AR profile will let them opt out of social media sharing and emotion recognition, and other users’ devices will be programmed to respect that choice.
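Respecting that choice could be as simple as checking a person’s shared preferences before anything is looked up or drawn. Again, the ArProfile flags below are something I’m imagining for illustration, not an existing standard.

```java
// Hypothetical per-person AR preferences, shared alongside their identity broadcast.
record ArProfile(boolean shareSocialMedia, boolean allowEmotionRecognition) {}

final class OverlayPolicy {
    // Only annotate with social media details if the person has opted in.
    static boolean maySocialAnnotate(ArProfile profile) {
        return profile != null && profile.shareSocialMedia();
    }

    // Only run (or display) emotion recognition if the person has opted in.
    static boolean mayShowEmotion(ArProfile profile) {
        return profile != null && profile.allowEmotionRecognition();
    }
}
```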

I’m really excited about how AR will enable us to reduce social barriers and increase empathy. If you’re working on AR designs, you can try our mobile SDKs for free to emotion-enable your work. All of the technologies to make this happen exist today; it’s just a matter of putting them together in an affordable and efficient device. Google Glass is the closest the world has come to a consumer device, but sadly it is only being pursued for commercial uses.

Follow Forest on Twitter: @ForestJay
