I’m Listening: How AI and Mental Health Services Are Joining Forces

Last Updated June 28, 2022


The integration of AI and mental health isn’t meant to replace human therapists. It can, however, offer timely support and ultimately save lives.

COVID-19 led to a dramatic increase in the digital delivery of health-related services and information. In-person information and medical attention were harder to access, so virtual options became necessary and commonplace.

Interestingly, a RAND study revealed that the increased use of telehealth during the height of the pandemic was driven more by people seeking mental health services than physical care.

Because of that uptick in demand, organizations turned to advances in artificial intelligence (AI) to increase the access to and availability of mental health services.

In other words, people were getting real and meaningful mental health assistance from non-sentient beings. Think chatbots and virtual assistants, for example. All of that requires conversational speech data that's annotated by humans.

Let’s look at how AI and mental health services come together, and examine a couple of caveats to keep in mind.

The Connection Between AI and Mental Health

According to the Canadian Mental Health Association, about 50% of the population will have or have had a mental illness by age 40. The added toll of all that we’ve been through since the onset of COVID-19 increased the need for additional tools and resources.

How exactly is AI helping?

Detecting the Signs

Did you know conditions like depression and post-traumatic stress disorder can be spotted by analyzing speech patterns and facial expressions?

One study showed how algorithms using language analysis were 100% accurate at identifying teens who were likely to develop psychosis. These tools already exist, and they’re incredibly powerful.
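For a sense of the mechanics, and only as a sketch rather than that study's actual method, language-analysis tools of this kind typically turn transcript text into numerical features and train a classifier on examples labeled by clinicians. The snippets and labels below are invented placeholders:

# Minimal sketch: classify transcript snippets as flagged / not flagged.
# The texts and labels here are invented placeholders, not clinical data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

transcripts = [
    "I haven't slept and nothing feels real anymore",
    "work was busy but the weekend was relaxing",
    "I keep hearing things other people don't hear",
    "looking forward to seeing friends tomorrow",
]
labels = [1, 0, 1, 0]  # 1 = flag for clinician follow-up, 0 = no flag

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(transcripts, labels)

print(model.predict(["lately everything feels disconnected and strange"]))

Real systems rely on far richer features (prosody, disfluencies, facial cues) and far more data, but the basic loop of labeled examples feeding a model is the same.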

For example, a smartphone app called Mindstrong alerts your doctor when you’re at risk of depression based on how fast you’re typing or how often you’re leaving your house.

Wearables like a watch or bracelet can flag signs of distress based on heart rate or sleeping patterns, and they can refer you to a mental health or emergency response provider as needed.
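To make that concrete, here is a minimal sketch of the kind of screening rule a phone app or wearable might apply to a day's sensor summary. The field names and thresholds are assumptions for illustration, not any product's actual logic:

# Sketch of a simple screening rule over a daily sensor summary.
# Field names and thresholds are illustrative assumptions only.
def flag_for_follow_up(day):
    signals = []
    if day["avg_resting_heart_rate"] > 100:
        signals.append("elevated resting heart rate")
    if day["hours_slept"] < 4:
        signals.append("very little sleep")
    if day["hours_outside_home"] == 0 and day["messages_sent"] == 0:
        signals.append("no activity or contact")
    return signals

day = {"avg_resting_heart_rate": 104, "hours_slept": 3.5,
       "hours_outside_home": 0, "messages_sent": 0}

signals = flag_for_follow_up(day)
if signals:
    print("Suggest a check-in with a provider:", signals)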

Appropriate and well-timed action is crucial in many areas of mental health. AI tools provide invaluable support to human providers and patients between appointments.

24/7 Support

Chatbots and apps are accessible no matter where you live. They’re a low-cost support option, and because they never sleep, they’re available around the clock.

If you’re struggling in the middle of the night, initiating a conversation with a chatbot can be a lifesaver. The AI can be a listening ear, recommend calming techniques, or even point you to an emergency helpline.
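As a toy illustration of that triage, and not how any particular chatbot is built, a support bot needs at minimum a way to decide between offering a calming technique and escalating to a helpline. The keyword list and responses below are placeholders:

# Toy sketch: route a message to a calming prompt or a crisis escalation.
# The keyword list, responses, and helpline wording are placeholders.
CRISIS_TERMS = {"hurt myself", "end it", "suicide", "can't go on"}

def respond(message):
    text = message.lower()
    if any(term in text for term in CRISIS_TERMS):
        return "It sounds like you're in real distress. Please contact a crisis helpline right now."
    return "I'm here with you. Try a slow breath in for 4 seconds, hold for 4, out for 4."

print(respond("I can't sleep and I'm anxious"))
print(respond("I feel like I want to end it"))

Production systems use trained classifiers rather than keyword lists, but the design question is the same: when does the bot listen, and when does it hand off to a human?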

X2, for example, is described as “a transformative digital behavior change program that guarantees people get access and receive the help they need.”

Their clinical psychologists came up with “Tess,” a mental health chatbot that provides self-help chats and text message exchanges – similar to texting with a friend or coach.

Here’s the story of X2 founder and CEO Michiel Rauws:

“Ten years ago, I was struggling with depression. Through trial and error, I found a wonderful psychologist, who was able to help me through that time by using talk-therapy. Later on, I realized that when I was speaking with my friends and colleagues, I was simply repeating the conversations I previously had with my psychologist. That’s when I first realized: If I can help people by repeating these conversations, then we could teach a machine to do the same.”

At the end of the day and in the dark of night, some people may feel more comfortable sharing their struggles with an anonymous chatbot than a human being.

Review and Supervision

AI can also review a therapist’s approach, suggest areas of improvement, and show clinicians how to respond better in certain situations.

ieso, for instance, uses natural language processing (NLP) to analyze the language used in therapy sessions from machine-processed transcripts.

The clinic aims to give therapists better insight into their work, both to maintain high standards of care and to help trainees improve.
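As a rough sketch of what that kind of transcript review can look like, and not ieso's actual pipeline, one simple approach is to tally phrases associated with recognized therapeutic techniques across a session. The categories and phrase lists here are illustrative:

# Sketch of transcript review: tally simple markers of therapeutic technique.
# Categories and phrase lists are illustrative, not a clinical taxonomy.
from collections import Counter

MARKERS = {
    "open_question": ["how did that", "what was that like", "tell me more"],
    "reflection": ["it sounds like", "what i'm hearing is"],
    "agenda_setting": ["today we'll focus on", "shall we start with"],
}

def review(utterances):
    counts = Counter()
    for line in utterances:
        text = line.lower()
        for category, phrases in MARKERS.items():
            if any(p in text for p in phrases):
                counts[category] += 1
    return counts

session = [
    "Shall we start with how your week went?",
    "It sounds like the mornings were the hardest part.",
    "Tell me more about what happened on Tuesday.",
]
print(review(session))  # each toy category is counted once for this session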

A tremendous example of this is the Trevor Project, a suicide prevention and crisis intervention organization for LGBTQ youth.

Their research shows an estimated 1.8 million LGBTQ youth between the ages of 13 and 24 in the US seriously consider suicide each year. Furthermore, at least one LGBTQ youth between these ages attempts suicide every 45 seconds, according to Kendra Gaunt, data and AI product manager at The Trevor Project.

People in need can connect with a crisis counselor 24/7, 365 days a year, from anywhere in the United States. It’s 100% confidential and offers immediate support for those struggling and thinking about self-harm.

Recently, they launched the Crisis Contact Simulator, a model that simulates digital conversations with LGBTQ youth in crisis. It lets aspiring counselors work through realistic practice conversations before taking live ones.

The goal is to connect with every LGBTQ youth who needs support, so the organization leverages AI to equip as many counselors as possible. The technology also improves the flexibility and quality of their training process.

Therapists monitor patient progress as well, with AI analyzing patient utterances during sessions to pinpoint signs of progress, rapport building, or regression.

AI in Mental Health Caveats

Human Connection is Important

There’s something key in that Trevor Project example to keep in mind. They’re using AI to simulate sessions for the purpose of education and training, not to replace one-on-one meetings between patient and therapist.

Conversational therapy, personal rapport, and connection still matter. That’s hard to replicate through virtual calls, and AI is certainly not there yet.

AI can’t fully detect context and nuance either, and some situations require on-your-feet thinking, especially in a crisis.

This requires further developments in conversational AI, a range of technologies that bring together machine learning, natural language processing (NLP), contextual awareness, and other advanced tools to facilitate a real-time dialog between human and machine.
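A toy example of that contextual awareness, assuming nothing more than a very simple state machine and certainly not a production dialog engine, shows why carrying state between turns matters: a bare "yes" only makes sense against the bot's previous question.

# Minimal sketch of contextual awareness: carry state between turns
# so a follow-up like "yes" is interpreted against the previous question.
# Everything here is an illustrative toy.
class Dialog:
    def __init__(self):
        self.pending = None  # what the bot last offered

    def turn(self, user_text):
        text = user_text.lower()
        if self.pending == "breathing_exercise" and text in {"yes", "ok", "sure"}:
            self.pending = None
            return "Great. Breathe in for 4 seconds, hold for 4, breathe out for 4."
        if "anxious" in text or "panic" in text:
            self.pending = "breathing_exercise"
            return "Would you like to try a short breathing exercise?"
        return "I'm listening. Tell me more."

d = Dialog()
print(d.turn("I'm feeling really anxious"))
print(d.turn("yes"))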

Developers Need More Data

Annotation and labeling for both speech and text is also huge here. When your goal is to further improve the accuracy of the AI’s recognition of input – whether spoken or typed – and its responses, you need the help of real-life human transcribers. They record what people say, when and how they say it, and who says it.

When clients come to us for customized speech data collection and transcription, they’re trying to solve for the edge cases where automatic speech recognition (ASR) still struggles.

In the case of AI and mental health, that means tone, context, and red flag terms, to name a few.

We therefore initiate natural data collection and err on the side of human speech transcription to ensure accuracy and inclusivity, and to handle complex environments and use cases.
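As one example of what that annotation can capture, here is a sketch of a single labeled utterance record. The field names and label values are illustrative assumptions rather than a fixed schema:

# Sketch of one annotated utterance as a transcription team might store it.
# Field names and label values are illustrative assumptions, not a real schema.
utterance = {
    "speaker": "caller",
    "start_time_s": 182.4,
    "end_time_s": 188.9,
    "text": "I just feel like nobody would notice if I was gone.",
    "tone": "flat",  # how it was said, judged by the human annotator
    "red_flag_terms": ["nobody would notice if I was gone"],
    "context_note": "follows a long pause",
    "transcriber_id": "anon-042",
}

Records like this give a model not just the words, but the delivery and context that matter most in mental health use cases.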

Developing an AI Solution for Mental Health?

There are many variables to consider when it comes to optimizing your requirements for cost and delivery speed.

You need a provider that’s adaptable, flexible, and looking out for your best interests.

If they’re not diving deep into your end use case and offering a variety of solutions, they’re likely not the best fit.

At Summa Linguae, our data solutions experts work with you to understand exactly what level of transcription you need.

And if your requirements aren’t yet fully defined, we can help you choose the right solution.

Contact us now to get help with your AI mental health solution.
