

Will AI Help Address Our Behavioral Health Crisis?
It is still early in its evolution, but artificial intelligence (AI) is already helping some clinicians diagnose and provide therapy for behavioral health patients.
The hope is that AI may be able to help providers improve access for the growing number of patients who need care. The applications, whether homegrown in health systems or developed by innovative startups, are drawing interest from researchers, payers and investors alike.
Applying AI in Therapy
At Cedars-Sinai Medical Center in Los Angeles, for example, physician-scientists are studying what they say is a first-of-its-kind program that uses virtual reality and generative AI to provide mental health support.
Known as XAIA, the program offers users an immersive therapy session led by a trained digital avatar.
XAIA was created to offer patients self-administered, AI-enabled, conversational therapy in relaxing virtual reality environments, such as a creekside meadow or a sunny beach retreat, where patients also can engage in deep-breathing exercises and meditation. It employs a large language model (LLM) programmed to resemble a human therapist.
Mock therapy sessions were performed by experts and transcribed to capture firsthand how a trained psychologist can, and should, interact with patients. From these sessions, recurring exchanges and patterns were identified and encoded into the LLM as rules and preferred responses for XAIA to emulate. The findings include more than 70 best practices for mental health therapy.
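As a rough sketch of what encoding best practices into an LLM can look like, the Python below folds a few rules into a system prompt that steers each conversational turn. It is illustrative only, not the XAIA implementation; the THERAPY_RULES list and the client.chat() call are assumptions standing in for the program's actual rules and model interface.

```python
# Illustrative sketch only: folding clinician-derived best practices into an
# LLM's system instructions. Not Cedars-Sinai's actual implementation; the
# rules and the client.chat() call are hypothetical placeholders.

THERAPY_RULES = [
    "Reflect the patient's feelings back in your own words before offering guidance.",
    "Ask one open-ended question at a time.",
    "Never diagnose; encourage professional care when safety concerns arise.",
]

def build_system_prompt(rules: list[str]) -> str:
    """Fold the encoded best practices into the model's system instructions."""
    bullet_list = "\n".join(f"- {rule}" for rule in rules)
    return (
        "You are a supportive conversational therapy assistant.\n"
        "Follow these clinician-derived practices in every reply:\n" + bullet_list
    )

def respond(client, history: list[dict], user_message: str) -> str:
    """One conversational turn: system rules, prior history, new patient message."""
    messages = [{"role": "system", "content": build_system_prompt(THERAPY_RULES)}]
    messages += history
    messages.append({"role": "user", "content": user_message})
    return client.chat(messages)  # hypothetical chat-completion call
```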
A study found that participants accepted the technology and that it was a safe form of AI psychotherapy that warrants further research.
Early Identification for Better Outcomes
In pediatric care, researchers are working with scientists from the University of Cincinnati, the University of Colorado and Oak Ridge National Laboratory in Tennessee to use AI to identify the risk of anxiety, depression and suicide in children and teens.
The scientists are developing tools that combine information routinely collected by physicians, then use AI and natural language processing (NLP) to sift through the data and identify the kids at greatest risk of developing a mental illness so they can receive earlier care and intervention. The team has developed software that learns from unstructured data (written texts or spoken words) to identify key features described in the health record and other data.
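For readers curious how flagging risk from unstructured notes can work in principle, here is a minimal baseline in the same spirit: a TF-IDF text classifier over clinical note text. It is a generic sketch, not the research team's software; the sample notes and labels are hypothetical placeholders.

```python
# Generic illustration of risk flagging from unstructured clinical text.
# Not the research team's software; the notes and labels are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: de-identified note text with clinician-assigned labels.
notes = [
    "Patient reports persistent sadness and withdrawal from friends.",
    "Routine visit; patient describes good sleep and stable mood.",
]
labels = [1, 0]  # 1 = elevated risk, flagged for earlier follow-up

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(notes, labels)

# Score a new note; a high probability would route the chart for clinician review.
risk = model.predict_proba(["Teen mentions feeling hopeless most days."])[0][1]
print(f"Estimated risk score: {risk:.2f}")
```

In practice a tool like this would train on far larger labeled datasets and combine note text with structured record data, but the pipeline shape, text features feeding a classifier whose scores trigger clinician review, is the same.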
Meanwhile, the British startup Limbic has delivered AI-powered psychological assessment and triage tools in large-scale clinical settings in the U.K.'s National Health Service. The company reports 45% fewer treatment changes due to increased triage accuracy.
The company's Limbic Access tool screens for a range of common mental health conditions, including depression, anxiety and post-traumatic stress disorder. Limbic Access augments clinician assessments, saving 12.7 minutes per referral and reducing wait times for screening and treatment. The company plans to expand in the U.S.
Supporting Overstretched Clinicians
Elsewhere, the Boston-based behavioral health startup Eleos Health uses AI, voice analysis and NLP to improve patient outcomes and workflow efficiency. The company also is working to increase clinician access to AI education, research and provider-centric tools.
Used by behavioral health organizations in 29 states, Eleos' clinically validated technology has been proven to reduce provider documentation time by more than 50%, double client engagement and drive three to four times better care outcomes, the company states.
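As a rough illustration of how AI-assisted documentation tools of this kind can work, the sketch below turns a session transcript into a draft progress note for clinician review. It is not Eleos' product; the model.generate() call and the note template are assumptions.

```python
# Rough illustration of AI-assisted session documentation, not Eleos' product.
# The model.generate() call stands in for any generative model; the prompt
# template and its sections are hypothetical.

NOTE_TEMPLATE = (
    "Summarize this behavioral health session transcript into a draft "
    "progress note with sections: Presenting Concerns, Interventions Used, "
    "Patient Response, Plan. Flag anything needing clinician verification.\n\n"
    "Transcript:\n{transcript}"
)

def draft_progress_note(model, transcript: str) -> str:
    """Generate a draft note; the clinician edits and signs off on the result."""
    return model.generate(NOTE_TEMPLATE.format(transcript=transcript))  # hypothetical API
```

The design point is that the AI drafts and the clinician verifies, which is how documentation tools cut charting time while keeping the provider responsible for the record.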
4 Challenges for the Future of AI Deployment in Behavioral Health
1 | In any application of AI, especially in direct clinical interactions, does the clinician feel comfortable with the level of human oversight for the treatment or interaction?
Is the human "in the loop," and do they feel that the potential benefits to the care clearly outweigh the risks associated with using AI in these cases?
2 | Will patients share their deepest and most personal feelings with an avatar or chatbot?
3 | Can AI overcome bias?
A report from last year found that there are still significant gaps in our understanding of how AI is applied in mental health care. The report cited flaws in how existing AI health care applications process data and insufficient evaluation of the risks around bias.
4 | How will AI address subjective judgment?
Diagnosing behavioral health issues often requires more subjective judgment than diagnosing physical conditions. Decisions must be based on patients' self-reported feelings rather than medical test data, noted futurist Bernard Marr. This could lead to more uncertainty around diagnosis and the need for careful monitoring and follow-up to ensure the best patient outcomes.
Learn More
The AHA Behavioral Health website provides a wealth of resources about child and adolescent health and the AHA's behavioral health strategic priorities.