What if a vehicle could be designed and engineered in VR, allowing customers to sit in a car before it exists and preview the in-vehicle experiences of tomorrow? What may once have seemed futuristic is becoming a reality today with the Unity development platform.
In our latest Affectiva Asks podcast, we interviewed Danny Lange, VP of AI and Machine Learning at Unity. During the interview, he talks to us about his work with car companies at Unity, some of the projects and challenges he has seen OEMs and Tier 1s encounter, and his thoughts on the technical hurdles to overcome in the design of autonomous vehicles.
Danny Lange speaking at the 2017 Emotion AI Summit
Let's start with your background. Can you speak to your career trajectory and how you arrived at your role at Unity today?
I'm an engineer, and I have worked at Microsoft, IBM, Amazon, and Uber, applying machine learning, analytics, and big data in various ways. A few years ago, I started thinking about how to escape the boundary of human-generated data. I wanted to move into simulation—creating environments where I can actually simulate massive amounts of data, and in that effort, progress the field of AI. That's really what we're doing at Unity now.
Can you describe your role at Unity, and what you're currently working on?
We are trying to do two things: we're trying to bring AI to games and gameplay, but we're also trying to use games and the concept of gaming to progress AI. We have a big focus on reinforcement learning, and the whole theme that we have around our development is to create a flywheel of AI where the game interacts with humans, and produces data. The AI takes action on that, and interacts with humans in this flywheel where the system learns more and more about the user.
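The flywheel Lange describes, in which the system takes an action, observes the human's reaction, and learns from it, can be sketched as a tiny bandit-style loop. This is purely illustrative: the action names, the simulated user, and the update rule are assumptions for the sketch, not a real Unity or Affectiva API.

```python
import random

# Illustrative "AI flywheel": act, observe the (simulated) human reaction,
# and update estimates so the system gets a little better each cycle.
ACTIONS = ["calm_ui", "energetic_ui"]  # hypothetical actions the system can take

def simulated_user_reaction(action):
    """Stand-in for a perceived human reaction, as a reward in [0, 1].
    This hypothetical user happens to respond better to the calm interface."""
    base = 0.8 if action == "calm_ui" else 0.3
    return min(1.0, max(0.0, base + random.uniform(-0.1, 0.1)))

def run_flywheel(steps=500, epsilon=0.1, seed=42):
    random.seed(seed)
    value = {a: 0.0 for a in ACTIONS}  # estimated reward per action
    count = {a: 0 for a in ACTIONS}
    for _ in range(steps):
        # Explore occasionally; otherwise exploit the best-known action.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=value.get)
        reward = simulated_user_reaction(action)        # "perceive" the reaction
        count[action] += 1
        value[action] += (reward - value[action]) / count[action]  # running mean
    return value

estimates = run_flywheel()
```

After enough cycles the loop has learned which action this user prefers; the point is the closed loop itself, not the particular (epsilon-greedy) learning rule.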
You spoke at our first Emotion AI Summit back in 2017. With our third annual event coming up on October 15th, can you take us through what you spoke about then, what you thought about the event, and would you recommend that listeners attend?
That was a very exciting event. First of all, I actually spoke about the flywheel. I spoke about how AI systems should perceive human emotion, take action on it, and then fine-tune and understand the interaction between what the system is doing and how the person reacts to it. This creates a feedback loop that includes emotions. Not just, you know, what you touch or click with your mouse, but basically your state of mind while you interact with the system, and taking that into account.
What was interesting about the event was the very cross-disciplinary attendance. There were a lot of people attending who had very different backgrounds. What I liked was listening to speakers, from psychologists to software engineers, discuss the many different aspects of system-building, of user interfaces, and of interaction.
Recently, I've noticed Unity working in the automotive space: from your website it looks like you've engaged in projects with Audi and Toyota. Could you speak more on what kind of projects you collaborated with them on?
Yes: one example is visualizing CAD drawings, so that when you have engineers and designers dream up a car in software, you can export all of that data into Unity and visualize the car in virtual reality. This allows you to sit in the car seat and look around through your VR goggles or headset, which makes it much more real. This gives you a very good impression of how the vehicle is going to feel before you even build it. We also have some simulations and training around autonomous vehicles.
What, from your perspective, are the key challenges the automotive industry faces in terms of designing these vehicles?
There are many areas of challenge in automotive design: one in particular is getting an autonomous vehicle to fit into a social environment—that is, driving along with pedestrians, bicyclists, and other drivers. Autonomous vehicles have to be more than mechanical robots keeping a safe distance: they need to understand the interaction that goes on between the other agents on the roadway. For example, when I step into a crosswalk, I look at the driver of the approaching vehicle to see if the person sees me before I step. Now imagine a self-driving car approaching the crosswalk: how am I going to interact with it? That is a huge challenge, generally unsolved at this point, but something that I know Affectiva, for instance, is also very interested in, especially with regard to understanding the emotions of humans interacting with vehicle systems.
How do you see the role of OEMs, Tier 1s, and car companies transforming with so much industry change? Are you seeing any trends in your interactions or conversations with them?
I see a lot of development. What we have seen in the past is OEMs being traditionally conservative. But now I think that the technological development of simulation environments, the VR and AR environments that Unity has developed, and startups developing various kinds of technologies for vehicles have all been pushing the OEMs to be much more innovative, agile, and open to these new technologies. They want to improve their effectiveness and agility as well, and they are looking at technologies like the Unity environment to speed up the development process: to do more trial and error, to fail faster, to not have to build the entire vehicle to test elements of that vehicle. So we are seeing a shift where OEMs are becoming much more open to trying these new approaches.
Do you have any call to action for people listening today or asks you have of them?
I would ask our listeners to always look for what I call the feedback loop, the AI feedback loop or the flywheel. If someone tells them that something is AI because a very smart software engineer sat down and wrote an algorithm that does something very clever in one fixed way, it's not really AI.
AI is when there's a true interaction between the environment and the system that makes the system a little better all the time. It's very important to look for that, to aim for that, because that's real AI. That's why I'm so fascinated by the work that Affectiva is doing because it is about capturing human emotions, doing something in response, and then seeing how their emotions change. So you have an interaction that carries the promise of creating better systems around us: systems that are more adaptive to our needs and even our mental state.