
Emotion AI Affectiva Automotive AI

Trust is a Two-Way Street: Join Us at Emotion AI Summit 2018

08.09.18

Many of us spend time every day in our cars, and so the extensive technological changes taking place in how we navigate our vehicles are deeply personal. Self-driving cars, automated driver-assist features, and enhanced connectivity are leading an explosion of change in our in-cabin experience, all fueled by AI.

It’s a new automotive frontier, but how can we achieve real trust in AI?

Join us at the 2018 Emotion AI Summit, which will include a special exploration of the future of the automotive experience and how our relationship with our cars is about to change:


AUTOMOTIVE PANEL


Building Trust in Next Generation Vehicles

Moderator: Regina Savage, Managing Director, Morgan Stanley

Panelists:

  • Karl Iagnemma, PhD, President & Co-Founder at nuTonomy (recently acquired by Aptiv)
  • Ola Bostrom, PhD, Vice President Research & Patents at Veoneer
  • Bryan Reimer, PhD, Research Scientist at the MIT AgeLab, Associate Director at the New England University Transportation Center


DEMOS

The exhibit area will feature hands-on demos, including Affectiva's car simulator, which shows how we can measure driver drowsiness, distraction, and emotional state in real time.


WORKSHOPS


How AI Helps Next Generation Vehicles Become More Human
  • Nils Lenke, PhD, Senior Director Innovation Management at Nuance Communications
  • Adam Emfield, Principal UX Manager at Nuance Communications

Dr. Nils Lenke and Adam Emfield of Nuance Communications will explore two areas of improvement in communication between humans and our vehicles. The first is how multi-modality (combining visual and auditory expressions) is a key feature of all human communication, and how these capabilities can be rebuilt with technology. Second, Nils and Adam will discuss how a whole dimension of this communication is still missing: the emotional state of a speaker expressed through non-verbal signals. So how will we be able to know the emotional states of drivers and passengers? And how will the systems themselves express emotions?



Cognitive Load Estimation from Video with Deep Learning
  • Lex Fridman, PhD, Research Scientist at MIT

This talk will explore how deep learning approaches can be used to estimate a person's cognitive load from video of their face. We will discuss why this is an important and useful metric for multiple applications, especially in the context of semi-autonomous driving.

Download Emotion AI Summit 2018 content now
