
Emotion Recognition, Emotion AI, Healthcare

SDK on the Spot: Suicide Prevention Project with Emotion Recognition

08.14.17

Emotions have a significant impact on our mental health and wellbeing. Today, not only in the doctor’s office but also in our daily lives, we can measure and analyze all aspects of our physical health. However, we are not yet tracking and quantifying emotions to get a complete understanding of our overall wellbeing.

As part of our Emotion AI Summit coming up on September 13th, we are featuring a healthcare panel: “Emotion AI in Mental Health, Well Being and Medical Research”. In this panel we will discuss how facial expressions and vocal biomarkers can be indicators of mental health issues and disease, and how Emotion AI is helping improve research, diagnosis, and treatment in areas such as autism support, suicide prevention, and early detection of diseases like Parkinson’s.

Moderated by Dr. Joseph Kvedar, the panel includes Yuval Mor, CEO of Beyond Verbal Communication; Ned Sahin, PhD, Cognitive Neuroscientist and CEO of Brainpower; and Steven Vannoy, Associate Professor at the University of Massachusetts Boston.

We wanted to highlight Steven’s work in this area and dive deeper into his project, how he is developing it, and the role emotion-enabled technology can play in suicide prevention.


Steven Vannoy: Suicide Prevention with Emotion Recognition

With more than 48,000 Americans dying by suicide every year, suicide rates in the United States have climbed approximately 36% between 2000 and 2021. (Source: CDC, “Suicide Statistics”) Of these tens of thousands of people, over 90% have a mental illness, with depression being very common. (It’s important to note that while most people who die by suicide are depressed, that doesn’t mean that most depressed people will die by suicide.) While we know many of the long-term risk factors for suicide, our ability to predict who will attempt suicide, and when, is severely limited. This absence of help for so many suicidal people is an alarming problem. However, the ever-growing presence of technology could be the solution.

What if our phones could flag an emotional crisis and then provide the necessary resources to prevent suicide? After all, we’re practically always attached to our screens, so perhaps our phones could routinely monitor our emotions and behaviors to gain deeper, more objective insights regarding the state of our mental health.

Steven Vannoy, a mental health services researcher, is digging into this idea. He is developing a project that aims to predict a user’s short-term risk of attempting suicide. Using Affectiva’s SDK, the product will routinely analyze the user’s emotions through periodic “check-ins” in which the user describes what they are doing, who they are with, and how they are feeling about the future. Although the project is experimental at this point, the ultimate goal is to allow the user’s mental health care provider to easily monitor the patient’s mental health status and be alerted to changes that indicate a move toward crisis.


What is the long term goal for the project? What will it do?

The project is still very experimental right now, but the long-term goal is to develop an app that will do facial and verbal emotion recognition for the purpose of identifying an impending mental health crisis. To analyze the user’s emotional state, the app will gauge their well-being via questions like, “Hey, what are you doing? How are you feeling about the rest of the day? What are you doing that is interesting (rewarding, fun, etc.)?” These check-ins will be scheduled, and users will receive notifications at predetermined times of the day.

During every check-in, the user’s response gets compared to previous answers to see if they’re more positive or negative than their baseline and their most recent check-in. To strike a balance between being unobtrusive and still effective, the app will most likely conduct three check-ins per day. The user’s provider will be able to see this log. If the patient isn’t checking in, the provider will be notified and can then conduct a safety check.
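To make that comparison concrete, here is a minimal sketch in Python. The data structure, the single “valence” score, and the thresholds are our own illustrative assumptions, not part of Steven’s actual project: the idea is simply that each new check-in is compared against the user’s baseline and most recent check-in, and that a long gap between check-ins could prompt a provider notification.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from statistics import mean

# Hypothetical check-in record: 'valence' stands in for whatever combined
# facial/verbal emotion score the app ultimately produces (range -1..1).
@dataclass
class CheckIn:
    timestamp: datetime
    valence: float

def assess_check_in(history: list[CheckIn], new: CheckIn,
                    drop_threshold: float = 0.3) -> str:
    """Compare a new check-in against the baseline and the most recent check-in.

    The baseline here is simply the mean valence of prior check-ins, and the
    threshold is an illustrative placeholder, not a clinically validated value.
    """
    if not history:
        return "baseline"  # first check-in only establishes the baseline
    baseline = mean(c.valence for c in history)
    latest = history[-1].valence
    if (baseline - new.valence) > drop_threshold or (latest - new.valence) > drop_threshold:
        return "flag_for_provider"  # notable negative shift from baseline or last check-in
    return "ok"

def missed_check_in(history: list[CheckIn], now: datetime,
                    expected_interval: timedelta = timedelta(hours=8)) -> bool:
    """Three check-ins per day implies roughly one every eight hours; a gap
    well beyond that could prompt the provider to conduct a safety check."""
    return bool(history) and (now - history[-1].timestamp) > 2 * expected_interval
```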

What are the first steps or short term goals for the app?

Before the app can accurately determine whether a given person is suicidal, it can be used in situations that involve fewer parameters and less of a grey area. Individuals who have been hospitalized for attempting suicide, or for being at high risk of attempting it, are at extremely high risk for future attempts. Their hospital stay is a perfect time to collect baseline data, and their day of discharge provides an excellent comparison point for future check-ins.

What role does emotion and emotion technology play in the concept of your project?

Emotion technology plays a huge role in this concept. With Affectiva’s SDK, the project will be able to analyze the user’s emotions based on their facial expressions. From there, the app can draw conclusions regarding the user’s mental health and respond accordingly.
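As a rough illustration of how frame-level emotion scores could feed the check-in logic sketched above: Affectiva’s SDK reports per-frame emotion metrics from the camera, and a reduction like the following might collapse them into a single valence value per check-in. The metric names, 0–100 scale, and weighting here are assumptions for illustration only, not the SDK’s actual output format or the project’s method.

```python
from statistics import mean

def frame_valence(metrics: dict[str, float]) -> float:
    """Collapse one frame's emotion metrics (assumed 0-100 scores keyed by
    names like 'joy' and 'sadness') into a rough valence in -1..1.
    The choice of metrics and weights is purely illustrative."""
    positive = metrics.get("joy", 0.0)
    negative = max(metrics.get("sadness", 0.0), metrics.get("anger", 0.0),
                   metrics.get("fear", 0.0))
    return (positive - negative) / 100.0

def check_in_valence(frames: list[dict[str, float]]) -> float:
    """Average per-frame valence over the frames captured during a check-in."""
    return mean(frame_valence(f) for f in frames) if frames else 0.0
```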

About Steven Vannoy

With a master’s in public health and a postdoctoral fellowship in geriatric mental health services, Steven is a mental health services researcher. He has focused his current research on suicide prevention, and he is a member of the APA and the American Association of Suicidology.

