By: Ashley McManus, Marketing Manager
A few weeks back we attended MobileBeat 2016 in San Francisco to learn more about artificial intelligence (AI) and chatbots. While chatbots seem limited in purpose today, we envision a world of possibilities with this technology.
For example, Apple’s Siri and Amazon’s Alexa are both, at their most elementary level, chatbots. Using some sort of input – in these cases, your voice – Siri looks up information for you, and Alexa performs an action you want her to do. But what if these bots could improve their interactions with us by applying machine learning to more sophisticated inputs, such as our emotions?
Level-Setting: What are Chatbots, and What’s the Point?
Let’s take a step back for a minute: under the hood, it can be argued that better-known systems like Siri and Alexa are glorified chatbot technology, but these conversational interfaces have their time and place. Bots were developed for a specific purpose: to help brands and companies leverage the fact that chat applications are the most engaged-with apps across all platforms. This allows companies to tap into chat as a platform, which was previously unavailable to them. The limitation of conversational interfaces is that information exchange is slower – the industry is still sorting this out, and in the meantime the best applications are the “I’m feeling lucky” responses, or direct-request use cases where the answer is known or very easy to pick: hailing a taxi to your exact location, for example.
However, these bots can actually be used in much more creative ways than just asking a question and receiving an automatic result, or requesting that a simple task be completed. Use cases range from weather bots you can text, to social bots that post to your social profiles on your behalf based on your preferences, to customer support bots that triage requests or auto-file tickets. In that last case, bots have the ability to single-handedly drive customer relationship management: a bot can manage thousands of individual customer relationships at once and, through machine learning, nurture and convert those relationships over time.
Chatbot Technology Today: Shortcomings
Because bots are still designed at the most basic level, that’s also where their shortcomings lie. A bot is limited to responding only in the ways its developer intended, which can lead to frustrating conversations where you constantly have to rephrase yourself in order to be understood. In this way, bots are similar to today’s apps and websites: no universal chatbot exists that will do everything you need it to do, like a virtual butler or personal assistant would.
The functionality to perform specific tasks exists for chatbots now, but the natural language processing is not yet robust enough to understand our intent. Bots are also evolving along a curve similar to the Play Store’s, in that it is difficult to discover the bot most relevant to what you are trying to do – right now, it’s just a one-way conversation with a hit-or-miss result. In this way, chatbots – like much of the technology surrounding us every day – have a high IQ but a low EQ. This reinforces the need for a universal bot, one that can behave like a real person. That requires a lot of context, and bots are already getting smarter about context, such as knowing your schedule and your location. But we think that understanding your emotional context would make the interaction between bots and humans a much more natural experience.
Filling the Gap with #EmotionAI
Bots today are trying to pull in more and more of this context so that they can improve the relevance and quality of their actions – while also sparing you from having to repeat yourself. As a simple example, a bot trying to schedule a dinner should factor in your current schedule and potentially those of your invitees.
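To make that concrete, here is a minimal sketch in Python of what such context-merging might look like. The calendar data and helper names below are purely hypothetical stand-ins, not any real calendar API: the idea is simply that the bot only proposes a dinner time once every invitee’s availability has been checked.

```python
from datetime import date, datetime, timedelta

def _is_free(busy_slots, start, end):
    # A proposed window is free if it overlaps none of the busy slots.
    return all(end <= busy_start or start >= busy_end
               for busy_start, busy_end in busy_slots)

def propose_dinner_time(calendars, day, earliest=18, latest=21):
    """Return the first whole hour on `day` that works for every invitee."""
    for hour in range(earliest, latest + 1):
        start = datetime(day.year, day.month, day.day, hour)
        end = start + timedelta(hours=2)  # assume a two-hour dinner
        if all(_is_free(slots, start, end) for slots in calendars.values()):
            return start
    return None  # no common slot; a real bot would ask about another day

# Hypothetical busy calendars for two invitees.
calendars = {
    "you":    [(datetime(2016, 8, 5, 18), datetime(2016, 8, 5, 19))],
    "friend": [(datetime(2016, 8, 5, 21), datetime(2016, 8, 5, 22))],
}
print(propose_dinner_time(calendars, date(2016, 8, 5)))  # -> 2016-08-05 19:00
```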
According to Natalie Monbiot, SVP Managing Partner, Strategic Innovation at UM Worldwide, this next wave of chatbot technology is on the horizon. In a recent post she published on Early Learnings of the Bot-Verse, she emphasizes how important conversations are as a highly effective medium for the chatbot – that is, something more than one-way communication. This is because conversations carry an implicit understanding of context and tone, much like a relationship with a real person, where that understanding only gets better the more you talk to each other. She argues that the same concept applies to bots – as long as they are built with machine learning.
Part of what we need to do is improve the inputs we provide to bots so that we can be more proactive about what they respond with – and at Affectiva we see emotion as one of those inputs. We are excited about experimenting with chatbots to make our engagements with them even more personal and nuanced. These kinds of applications could range from interacting with your car to one day walking into your house and having it know how you feel, thanks to the growing number of connected devices in the Internet of Things (IoT).
Looking to the Future: What’s Next for Bots?
After this conference, it’s clear that we aren’t quite there yet: the technology must advance to make these concepts a reality. Keeping that in mind, it’s conceivable that bots will eventually replace some of the apps we know and love, giving rise to future “superbots” – generalized butlers that learn from your behavior, serve up all of the information you will ever need about anything, and make that dinner reservation on your behalf. Bots are a new channel for how we communicate with technology, and these conversational interfaces are the way of the future.
The Bottom Line
Bots can do better. If we ever want their capabilities to extend beyond informational exchanges toward more natural, personalized interactions, we need to look at emotion sensing technologies to help close the feedback loop: reinforcing positive interactions and discouraging negative ones. Natural language processing will play a role, but non-verbal signals also play an important part. It is the difference between having a conversation in person and reading a transcript.
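Purely as a hedged illustration of that feedback loop (the emotion signal here is simulated, not an actual Affectiva API), a bot could keep a score for each of its response strategies, reinforcing a strategy when the detected emotional reaction is positive and discouraging it when the reaction is negative:

```python
import random
from collections import defaultdict

def detect_emotion_valence(user_reaction):
    # Placeholder: a real system would infer valence from non-verbal signals
    # such as facial expression or tone of voice.
    return {"smile": 1.0, "neutral": 0.0, "frown": -1.0}.get(user_reaction, 0.0)

class EmotionFeedbackBot:
    def __init__(self, strategies, learning_rate=0.1, explore=0.1):
        self.scores = defaultdict(float, {s: 0.0 for s in strategies})
        self.learning_rate = learning_rate
        self.explore = explore  # occasionally try a lower-scored strategy

    def choose_strategy(self):
        if random.random() < self.explore:
            return random.choice(list(self.scores))
        return max(self.scores, key=self.scores.get)

    def update(self, strategy, valence):
        # Positive reactions reinforce the strategy; negative ones discourage it.
        self.scores[strategy] += self.learning_rate * valence

bot = EmotionFeedbackBot(["short_answer", "detailed_answer", "clarifying_question"])
for reaction in ["frown", "smile", "smile", "neutral", "smile"]:
    strategy = bot.choose_strategy()
    bot.update(strategy, detect_emotion_valence(reaction))
print(dict(bot.scores))
```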
When looking to develop the next generation of chatbots, we need to add another layer of context to their interface – and we believe that layer can be built with Emotion AI. It will help bots become more conversational and adaptive, learning from our engagements with them in much the same way our human relationships do. How we react emotionally to the information they give us will be instrumental in helping them improve the next engagement with us.