

3 Ways AI Could Aid Behavioral Health Screenings

Using artificial intelligence (AI) to supplement traditional behavioral health screenings is gaining momentum in primary care.
Some areas being explored include predicting which adolescents are at risk of developing mental illness; reducing readmissions by screening hospitalized patients for opioid-use disorder and treating those who test positive; and deploying AI therapy chatbots to supplement cognitive therapy.
These tools typically are designed to address common challenges providers face, such as inefficient workflows and workforce shortages. And while many of these applications can extend access to care, they are not a replacement for providers. Here are several recent developments that caught our eye.
1 | Predicting Risks, Potential Causes of Adolescent Mental Illness
An AI model developed by Duke Health researchers accurately predicted when adolescents were at high risk for future serious mental health issues, before symptoms became severe.
Unlike prior models that primarily rely on existing symptoms, the AI model identified underlying causes, such as sleep disturbances and family conflict, as indicators that can guide preventive interventions. The capability to identify early warning signs and proactively intervene with prophylactic treatments could greatly expand access to mental health services, with assessments and care available through primary care providers, researchers said.
The AI model could be used in primary care settings, enabling pediatricians and other providers to know immediately whether the child in front of them is at high risk and empowering them to intervene before symptoms escalate, notes Jonathan Posner, M.D., professor of psychiatry and behavioral sciences at Duke and senior author of a study describing the model.
Posner and colleagues analyzed psychosocial and neurobiological factors associated with mental illness using data from an ongoing study that conducted psychosocial and brain development assessments of more than 11,000 children over five years.
Using AI, the researchers built a neural network (a model that mimics brain connections) to predict which children would transition from lower to higher psychiatric risk within a year. That model is then used to score a questionnaire that ranks responses from the patient or parent about current behaviors, feelings and symptoms, predicting the likelihood of an escalation.
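To make the general idea concrete, here is a minimal sketch of a feed-forward neural network that scores questionnaire responses and outputs a probability of escalating risk. It is not the Duke model; the feature names, network size and alert threshold are hypothetical assumptions.

```python
# Illustrative sketch only -- not the Duke model. Feature names, layer sizes
# and the 0.5 alert threshold below are hypothetical assumptions.
import torch
import torch.nn as nn

FEATURES = ["sleep_disturbance", "family_conflict", "anxiety_score", "mood_score"]

class RiskNet(nn.Module):
    """Small feed-forward network: questionnaire features -> escalation probability."""
    def __init__(self, n_features: int):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(n_features, 16),
            nn.ReLU(),
            nn.Linear(16, 1),
            nn.Sigmoid(),  # probability that risk escalates within a year
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.layers(x)

model = RiskNet(len(FEATURES))

# Score one hypothetical questionnaire; values are normalized to the 0-1 range.
responses = torch.tensor([[0.8, 0.6, 0.4, 0.7]])
prob = model(responses).item()
if prob > 0.5:  # assumed alert threshold
    print(f"High risk of escalation ({prob:.0%}); consider preventive intervention.")
else:
    print(f"Lower risk of escalation ({prob:.0%}).")
```

In practice such a model would be trained on longitudinal assessment data before its scores carried any clinical meaning; the sketch only shows how questionnaire responses could flow through a network into a single risk score.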
Takeaway
The model was 84% accurate in identifying patients in the study whose illness escalated within the next year, the study found. Duke researchers also analyzed an alternative model that identified the potential mechanisms that might lead to or trigger worsening mental illness. With an accuracy rate of 75%, the new modeling system's ability to identify underlying causes can alert doctors and families to potential interventions, researchers conclude.
2 | Reducing Readmissions by Screening for Opioid-Use Disorder
The National Institutes of Health April 3 released a study that found an AI screening tool was as effective as health care providers in identifying hospitalized adults at risk for opioid-use disorder and referring them to inpatient addiction specialists.
When compared with patients who received consultations with providers, patients screened by AI had 47% lower odds of hospital readmission within 30 days after their initial discharge, saving nearly $109,000 in care costs.
The study, published in Nature Medicine, reports the results of a completed clinical trial, demonstrating AI's potential to affect patient outcomes in real-world health care settings. The study suggests that investment in AI may be a promising strategy specifically for health systems seeking to increase access to addiction treatment while improving efficiencies and saving costs.
The AI screener was built to recognize patterns in data, much as the brain processes visual information. It analyzed all the documentation available in the electronic health record in real time, such as clinical notes and medical history, to identify features and patterns associated with opioid-use disorder. Upon identification, the system issued an alert to providers when they opened patients' medical charts, with recommendations to order an addiction medicine consultation and to monitor and treat withdrawal symptoms.
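As a rough illustration of this kind of workflow (not the screener used in the NIH trial), the sketch below scores clinical notes with a placeholder keyword model and raises an in-chart alert with a consultation recommendation when the score crosses a threshold. The record fields, keyword scoring and threshold are all assumptions.

```python
# Illustrative workflow sketch only -- not the screener evaluated in the trial.
# The keyword-based scoring, threshold and record fields are assumptions.
from dataclasses import dataclass, field

RISK_TERMS = {"opioid", "oxycodone", "withdrawal", "naloxone", "heroin"}

@dataclass
class PatientRecord:
    patient_id: str
    clinical_notes: list[str] = field(default_factory=list)

def risk_score(record: PatientRecord) -> float:
    """Toy stand-in for a trained model: fraction of risk terms found in the notes."""
    text = " ".join(record.clinical_notes).lower()
    hits = sum(term in text for term in RISK_TERMS)
    return hits / len(RISK_TERMS)

def on_chart_open(record: PatientRecord, threshold: float = 0.4) -> None:
    """Fire an in-chart alert when the screening score crosses the threshold."""
    if risk_score(record) >= threshold:
        print(f"ALERT ({record.patient_id}): possible opioid-use disorder.")
        print(" - Recommend addiction medicine consultation.")
        print(" - Monitor and treat withdrawal symptoms.")

record = PatientRecord("12345", ["Patient reports oxycodone misuse and withdrawal symptoms."])
on_chart_open(record)
```

The design point the trial highlights is the trigger, not the model: the alert fires automatically at the moment a provider opens the chart, rather than depending on a clinician remembering to request a consultation.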
Takeaway
The trial found that AI-prompted consultation was as effective as provider-initiated consultation, ensuring no decrease in quality while offering a more scalable and automated approach. Specifically, the study showed that 1.51% of hospitalized adults received an addiction medicine consultation when health care professionals used the AI screening tool, compared with 1.35% without the assistance of the AI tool. Additionally, the AI screener was associated with fewer 30-day readmissions, with approximately 8% of hospitalized adults in the AI screening group being readmitted to the hospital, compared with 14% in the traditional provider-led group.
3 | Deploying AI Therapy Chatbots vs. Standard Cognitive Therapy
Generative AI (GenAI) chatbots hold promise for building highly personalized, effective mental health treatments at scale, while also addressing user engagement and retention issues common among digital therapeutics, notes a recent study.
The randomized controlled trial of Therabot by Dartmouth College researchers found "significantly greater reductions of symptoms" among participants with major depressive disorder, generalized anxiety disorder and those at high risk for eating disorders.
Trial participants felt they could trust the therapy chatbot to a degree that was comparable to working with a real therapist, notes a press release from Dartmouth.
Takeaway
Fine-tuned GenAI chatbots offer a feasible approach to delivering personalized mental health interventions at scale, but further research with larger clinical samples is needed to confirm their effectiveness and generalizability, the study notes.
Michael Heinz, M.D., the study's first author and an assistant professor of psychiatry at the Dartmouth College Center for Technology and Behavioral Health and the Geisel School of Medicine, said that "no generative AI agent is ready to operate fully autonomously in mental health." He highlighted, "We still need to better understand and quantify the risks associated with generative AI used in mental health contexts."
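Conceptually, the kind of human-in-the-loop guardrail Heinz describes might look like the sketch below, where a safety check runs before any generated reply reaches the user. The model call, crisis terms and escalation hook are hypothetical placeholders, not Therabot's design.

```python
# Conceptual sketch of a guarded therapy-chatbot turn -- not Therabot.
# generate_reply(), CRISIS_TERMS and escalate_to_clinician() are hypothetical.
CRISIS_TERMS = {"suicide", "self-harm", "overdose"}

def generate_reply(user_message: str) -> str:
    """Placeholder for a fine-tuned generative model's response."""
    return "It sounds like this has been weighing on you. Can you tell me more?"

def escalate_to_clinician(user_message: str) -> str:
    """Placeholder hook that routes the conversation to a human provider."""
    return "I'm connecting you with a clinician who can help right now."

def chatbot_turn(user_message: str) -> str:
    # Safety guardrail runs before any generated text is shown to the user.
    if any(term in user_message.lower() for term in CRISIS_TERMS):
        return escalate_to_clinician(user_message)
    return generate_reply(user_message)

print(chatbot_turn("I've been anxious and can't sleep."))
```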
Learn More
Visit the AHA Behavioral Health website to access a wealth of resources, including reports on child and adolescent mental health, rural behavioral health issues and more. Also, read the AHA Insights report "Building and Implementing an Artificial Intelligence Action Plan for Health Care" for information on how AI can transform your operations.