By: Boisy G. Pitre, Mobile Visionary
Communication is one of the drivers of the technological progress that we enjoy today.
The ease with which I can reach another person over audio and video services such as FaceTime, Skype, or a simple voice call is due to the innovations that have blossomed in the telecommunications industry. At Affectiva, we excel at bridging human emotions with digital experiences, and communication is certainly a large part of that. A video call with someone you care about, or among a group of colleagues, provides the opportunity to capture a range of emotions from all of the participants. This information, gathered by analyzing facial expressions or voice within the context of the conversation, adds depth and clarity to what is being discussed and how it is being perceived. It is changing the way we understand each other.
There is continued innovation happening within the telecommunications space. Twilio is empowering its customers with new and interesting ways of providing communication within specific contexts. Imagine being able to connect via voice or video to your airline about a reservation issue right from within your web browser. In such a scenario, there is no need to pick up your phone and make a call, potentially losing the context in which the concern was raised. Twilio provides solutions like these, and empowers its developer community to find new and interesting ways to solve existing communication problems.
Two weeks ago at SIGNAL, Twilio product manager Evan Cummack and I demonstrated an integration of Affectiva’s emotion recognition SDK for iOS with Twilio’s own iOS SDK, creating a fully functional video chat app. By merely typing in a “handle” instead of a phone number, I was quickly able to establish a video call from my iPhone to Evan’s. With the call started, we could see each other’s face, hear each other’s voice, and, with Affectiva’s Emotion AI technology, see a representation of the other’s emotional state via an emoticon placed on the screen. Thanks to the simplicity of both SDKs, we went from nothing to a working prototype in less than a day (the project is available on GitHub).
Here’s the video of our presentation:
Seeing an iconic representation of one’s emotions on the same screen as the face may appear at first glance to have limited utility; the demo is simply meant to illustrate how easy the integration is. The real-world possibilities become more compelling when you ponder how to use emotional data within the context of the call. Valence, the measure of the overall positivity or negativity of one’s facial expression, might be tracked for both participants throughout the call in order to gain insight into just how well the communication went. This is especially useful in customer support environments, where satisfaction is constantly gauged.
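To make that idea concrete, here is a minimal sketch of the kind of post-call valence analytics a support team might run. It assumes valence samples on a −100 (very negative) to +100 (very positive) scale, collected as timestamped readings per participant; the data format and function name here are hypothetical illustrations, not part of either SDK.

```python
def summarize_valence(samples):
    """Summarize one participant's valence over a call.

    `samples` is a hypothetical list of (timestamp_seconds, valence)
    tuples, with valence in [-100, 100]. Returns the mean valence and
    a crude trend: the second half's mean minus the first half's, so
    a positive trend suggests the conversation improved over time.
    """
    if not samples:
        return {"mean": 0.0, "trend": 0.0}
    values = [v for _, v in sorted(samples)]  # order by timestamp
    mid = len(values) // 2
    first = values[:mid] or values   # guard against one-sample calls
    second = values[mid:] or values
    return {
        "mean": sum(values) / len(values),
        "trend": sum(second) / len(second) - sum(first) / len(first),
    }

# Example: a support call that starts frustrated and ends happy.
call = [(0, -40), (30, -10), (60, 20), (90, 55)]
print(summarize_valence(call))  # {'mean': 6.25, 'trend': 62.5}
```

A dashboard could flag calls whose trend stays sharply negative for supervisor follow-up, which is the kind of customer-satisfaction signal described above.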
The communication space is growing by leaps and bounds, and emotion sensing is destined to be a part of that growth. We’d love to talk to you about your ideas for leveraging Affectiva’s emotion technology with the Twilio SDK!
Email me or Tweet at me with your ideas.
And … you can give this a whirl yourself: download our free SDKs here so you can add emotion-sensing and analytics capabilities to your own apps and digital experiences.