Award-winning Driver Monitoring System helps improve safety


SRI researchers teach AI system to recognize human emotional states and react accordingly


A driver can be in many different emotional states behind the wheel: calm, anxious, angry, excited, relaxed or drowsy, to name just a few. SRI International researchers are teaching vehicles how to read and respond to a driver’s emotional state as part of a project with Toyota Motor Corporation. The project seeks to improve driving safety and personalize the interaction between a person and their car.

These efforts center on integrating emotional artificial intelligence (AI) into cars, and SRI scientists are making strides with the development of the Driver Monitoring System (DMS). In 2020, DMS was named Auto Sensor Innovation of the Year by AutoTech Breakthrough, which, as part of the Tech Breakthrough network, recognizes leading AI innovations across industry and academia.

While stories about robo-taxis and other types of autonomous vehicles dominate the headlines, some automakers recognize that it will take time before drivers are ready to let the car do the driving. Until then, what if an AI co-pilot could help drivers remain alert and stay safe by keeping an eye on what happens behind the wheel?

SRI’s DMS uses a suite of infrared and three-dimensional cameras to track the driver’s eye movements, facial expressions and general body language. A supervised machine learning platform — a type of AI system partly trained by humans — analyzes the driver’s behavior in real time.

It then identifies when someone is feeling drowsy and can even recognize emotional states that could affect driving, such as anxiety or road rage. In response, the vehicle might blast the air conditioning to help a drowsy driver stay alert, suggest an alternate route if it detects someone is bored, or give more detailed directions if someone is nervous in an unfamiliar location. The more miles a specific person logs with the car, the better the AI recognizes and responds to their needs.
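As a rough illustration of that detect-and-respond loop, the sketch below maps a classified driver state to one of the in-cabin responses described above. It is a minimal sketch, not SRI’s implementation: the state labels, confidence threshold and response table are all invented for illustration.

```python
# Minimal sketch, not SRI's implementation: state labels, the confidence
# threshold and the response table below are invented for illustration.
from dataclasses import dataclass

@dataclass
class DriverState:
    label: str         # e.g. "drowsy", "anxious", "bored"
    confidence: float  # classifier confidence in [0, 1]

# Hypothetical mapping from detected state to an in-cabin response,
# mirroring the examples in the paragraph above.
RESPONSES = {
    "drowsy":  "turn up the air conditioning to help the driver stay alert",
    "bored":   "suggest a more interesting alternate route",
    "anxious": "give more detailed turn-by-turn directions",
}

def respond(state: DriverState, threshold: float = 0.8) -> str | None:
    """Trigger a response only when the classifier is confident enough."""
    if state.confidence < threshold:
        return None  # avoid reacting to an uncertain prediction
    return RESPONSES.get(state.label)

print(respond(DriverState("drowsy", 0.93)))
# -> "turn up the air conditioning to help the driver stay alert"
```

In a personalized system of the kind the article describes, the threshold and the response table would presumably be tuned per driver as the car logs more miles with that person.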

“We’re trying to create a model of preferences, habits and reactionary behavior by observing you over a long period of time. It’s a very personalized model that we build and varies according to culture, gender, age and associated behavior patterns,” explained Amir Tamrakar, Senior Technical Manager in SRI’s Center for Vision Technologies, in an interview with Forbes magazine about the potential behind imbuing machines like cars with the ability to “understand” human emotion and respond accordingly.

As the principal investigator for the DMS project for Toyota’s high-tech concept car, the LQ, Tamrakar and his team spent the last five years figuring out how to teach a machine to recognize and react to human emotion across a range of physical cues. Similar AI systems tend to rely only on eye-tracking technology.


From DARPA training tool to driver behavior analytics

It was a long road to develop the award-winning sensor system.

SRI’s innovative DMS platform is based on technology first developed for a project with the Defense Advanced Research Projects Agency (DARPA). The project, DARPA SSIM (Strategic Social Interaction Module), focused on social interaction training for soldiers deployed overseas, using virtual environments and a nonverbal-behavior sensor system. This sensor system now serves as the nucleus of SRI’s computer vision-based platform for analyzing behavior. Called MIBA, for Multimodal Integrated Behavior Analysis, the system uses sensors and software to track human movements, from head pose to facial expressions and eye gaze.

“Then we attach various meanings, semantic meanings, to the gestures and actions, according to the use cases,” said Tamrakar, who also has a background in psychology. The team works with subject-matter experts to help train the higher-functioning algorithms for specific uses.
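To make that idea concrete, here is a minimal sketch of how low-level cues such as head pose, eye closure and gaze might be mapped to use-case-specific semantic labels. The cue names, thresholds and the labels themselves are hypothetical, chosen only to mirror the description above; a production system would learn such mappings rather than hand-code them.

```python
# Hypothetical sketch: low-level cues tracked by a MIBA-like system are
# mapped to semantic labels for one use case. Cue names, thresholds and
# labels are invented for illustration.
from typing import TypedDict

class Cues(TypedDict):
    head_pitch_deg: float  # downward head tilt, in degrees
    eye_closure: float     # fraction of time the eyes are closed
    gaze_on_road: float    # fraction of time the gaze is on the road

def label_driving(cues: Cues) -> list[str]:
    """Attach driving-specific semantic labels to the raw cues."""
    labels = []
    if cues["eye_closure"] > 0.3 and cues["head_pitch_deg"] > 20:
        labels.append("drowsy")      # eyes closing plus a drooping head
    if cues["gaze_on_road"] < 0.5:
        labels.append("distracted")  # gaze is mostly off the road
    return labels

print(label_driving({"head_pitch_deg": 25.0, "eye_closure": 0.4, "gaze_on_road": 0.8}))
# -> ['drowsy']
```

The same cues could feed a different labeling function for a different use case, which is the flexibility Tamrakar describes.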

Before the Toyota program, the Federal Highway Administration (FHWA) awarded SRI several grants to develop a platform that could analyze and automatically annotate video recordings of drivers. Under the Naturalistic Driving Study (NDS) for the second Strategic Highway Research Program (SHRP 2), the agency collected about five million hours of footage — dating back to 1997 — that needed to be analyzed for driving-safety research. The goal of the study was to address the role of driver performance and behavior in traffic safety. Under this project, SRI adapted MIBA into its Driver Monitoring System (DMS) platform to detect driver inattention, distraction, drowsiness and other behavioral cues that may be related to driver impairment, including negative emotional states such as anxiety and road rage.

To maintain driver privacy, SRI also developed a face-masking technology that replaced people’s faces with computer-generated avatars that faithfully reproduced the original drivers’ movements and facial expressions.
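Conceptually, such a pipeline detects facial landmarks, uses them to drive a synthetic face, and composites that avatar over the real one, so identity is hidden while expression and motion survive. The sketch below captures that flow with stub functions; every function here is a hypothetical placeholder, not SRI’s code.

```python
# Conceptual sketch with stub functions; none of this is SRI's code.
# Landmarks drive a synthetic face that is composited over the real one,
# hiding identity while preserving the motion the landmarks encode.
import numpy as np

def detect_landmarks(frame: np.ndarray) -> np.ndarray:
    """Stub: a real system would run a facial landmark detector here."""
    h, w = frame.shape[:2]
    rng = np.random.default_rng(0)
    return rng.uniform([w * 0.3, h * 0.2], [w * 0.7, h * 0.6], size=(68, 2))

def render_avatar(landmarks: np.ndarray, shape: tuple) -> np.ndarray:
    """Stub: a real system would render an avatar posed by the landmarks."""
    return np.full(shape, 128, dtype=np.uint8)

def mask_face(frame: np.ndarray) -> np.ndarray:
    """Replace the face region with the rendered avatar."""
    lm = detect_landmarks(frame)
    x0, y0 = lm.min(axis=0).astype(int)
    x1, y1 = lm.max(axis=0).astype(int)
    avatar = render_avatar(lm, frame.shape)
    out = frame.copy()
    out[y0:y1, x0:x1] = avatar[y0:y1, x0:x1]  # composite over the face box
    return out

masked = mask_face(np.zeros((480, 640, 3), dtype=np.uint8))
```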

The Toyota program presented additional challenges because the AI needed to recognize behaviors in real time with the limited computational resources on board the vehicle. Low latency is critical, since the system must react immediately if it detects something safety-critical such as driver inattention or microsleeps. “You can’t have a lot of lag,” Tamrakar noted.
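One common way to honor that kind of latency budget (an assumption on our part, not a detail from the article) is to keep only the newest camera frame and discard any frame the analyzer could not process in time:

```python
# Illustrative only: a single-slot frame buffer bounds latency by letting
# fresh camera frames evict stale, unprocessed ones. The buffer size and
# frame counts are assumptions, not details from the article.
from collections import deque

frames: deque[int] = deque(maxlen=1)  # holds at most the newest frame

def on_camera_frame(frame_id: int) -> None:
    frames.append(frame_id)  # a stale unprocessed frame is silently dropped

def analyze_next() -> int | None:
    return frames.popleft() if frames else None

for fid in range(5):     # the camera outpaces the analyzer here
    on_camera_frame(fid)
print(analyze_next())    # -> 4: only the freshest frame is analyzed
```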

The SRI team also had to account for both facial and cultural differences in developing the LQ concept car’s AI system because the primary users would be Japanese. Culturally, the Japanese tend not to display negative emotions as openly.

“The cultural component is very important,” Tamrakar explained.

Future uses of SRI’s behavioral analytics AI platform

The behavioral analytics technology based on MIBA is also being used in some very different settings. In one use case, SRI is applying the platform to monitor and assess collaborative interactions among middle school students. The concept is to analyze how children collaborate and help teachers manage and gauge classroom activity when students work in groups.

“Our goal is to not only build an assessment system but also a recommendation system so that students can learn to collaborate better, which I think would be very useful,” Tamrakar said, adding that SRI is expanding the platform to college students learning online.


Another early-stage project involves putting a sensor system into clinics for autism studies, helping clinicians accurately monitor and tag patient behaviors. Initially, Tamrakar and his team are trying to help medical practitioners with early detection of autism, followed by identifying which interventions show the best results as behaviors change.

Ajay Divakaran, senior technical director of SRI’s Computer Vision Technologies Lab, noted that because the institute works broadly across industries, it can often find interesting intersections among various research modes, with developments in one area generating ideas and insight in another. “That’s a good example of how our technology trajectory can often flow,” Divakaran said.

Research is also ongoing into combining behavioral analytics with a voice assistant like the SRI-created Siri. Here, the challenge is teaching a computer to emote or gesture through an embodied avatar in ways that improve human-machine interaction.

“The hope is that computer communication and interaction is more natural going forward,” Tamrakar said.

