Generative artificial intelligence demonstrates potential as a powerful tool for mental health care by enabling professionals to identify access barriers experienced by diverse populations and create tailored treatments that improve patient outcomes, according to research from the University of Illinois Urbana-Champaign.
Researchers harnessed generative AI in tandem with measurement-based care and access-to-care models in a simulated case study, creating a framework that promotes personalised mental health treatment, addresses common access barriers and improves outcomes for diverse individuals.
Social work professor Cortney VanHook led the research, collaborating with co-authors to employ generative AI in simulating the mental health experiences of “Marcus Johnson,” a fictitious character. The researchers designed Marcus as a composite representing a young, middle-class Black man experiencing depressive symptoms while seeking care through Atlanta, Georgia’s health care system.
The AI platform created a detailed case study and treatment plan for the fictional client in response to the researchers’ prompts. Based on the personal details in the prompt, the platform examined the simulated client’s protective factors, such as supportive family members, and potential barriers to care, including gendered cultural and familial expectations and concerns about obtaining culturally sensitive treatment given the shortage of Black male providers in his employer-sponsored health plan’s network.
Test and refine
Simulations reflecting real-world scenarios help practitioners grasp how individuals access mental health care, identify common obstacles and recognise demographic disparities, VanHook said. Employing a simulated client addresses concerns about patient privacy law violations, allowing practitioners, trainees, and students to test and refine potential interventions in a low-risk environment, thereby promoting more equitable, responsive, and effective mental health systems.
“What’s unique about this work is it’s practical and it’s evidence-based,” VanHook said. “It goes from just theorising to actually using AI in mental health care. I see this framework applying to educating students about populations they might not be familiar with but will come in contact with in the field, as well as its being used by supervisors in the field when they’re training their students or by clinicians on how to understand and best support clients that come to their facilities.”
VanHook and co-authors Daniel Abusuampeh of the University of Pittsburgh and Jordan Pollard of the University of Cincinnati prompted the AI platform to apply three theoretical, evidence-based frameworks in creating its simulated case study and treatment plan.
The researchers prompted the AI software to apply Andersen’s Behavioural Model, a theory of the factors that determine individuals’ use of health services, to examine the personal, cultural and systemic factors that supported or hindered the client’s use of mental health services. The proposed treatment plans incorporated Penchansky and Thomas’s “five A’s” theory of access to evaluate the availability, accessibility, accommodation, affordability and acceptability of care for the client, along with measurement-based care, a clinical approach that uses standardised, reliable measures to continuously monitor a client’s symptoms and functioning.
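The paper does not publish the researchers’ exact prompts, but the structure they describe can be sketched in code. The Python fragment below is an illustrative sketch only: the prompt wording and the `call_llm` helper are hypothetical stand-ins for whatever generative AI interface is used, not the study’s actual tooling.

```python
# Illustrative sketch only: the prompt text and the call_llm helper are
# hypothetical, not the researchers' actual prompts or platform.

CLIENT_PROFILE = (
    "Marcus Johnson, a fictitious composite client: a young, middle-class "
    "Black man in Atlanta, Georgia, experiencing depressive symptoms and "
    "seeking care through an employer-sponsored health plan."
)

FRAMEWORK_INSTRUCTIONS = """
Apply three evidence-based frameworks to this simulated client:
1. Andersen's Behavioural Model: identify the predisposing, enabling and
   need factors that support or hinder his use of mental health services.
2. The five components of access: evaluate availability, accessibility,
   accommodation, affordability and acceptability of care.
3. Measurement-based care: propose standardised measures for ongoing
   monitoring of symptoms and functioning.
Produce a detailed case study and treatment plan.
"""

def build_prompt(profile: str, instructions: str) -> str:
    """Compose one prompt combining the client vignette and framework tasks."""
    return f"Client profile:\n{profile}\n\nTasks:\n{instructions}"

prompt = build_prompt(CLIENT_PROFILE, FRAMEWORK_INSTRUCTIONS)
# case_study = call_llm(prompt)  # call_llm stands in for any generative AI API
```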
The team used measurement-based care to refine the treatment approaches the AI recommended. To ensure the AI-generated simulation accurately reflected real-world clinical practice, VanHook and Pollard, both licensed mental health professionals, reviewed the proposed treatment plan for clinical accuracy and compared the case brief with published research findings.
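The measurement-based care loop itself is straightforward to express in code. The sketch below is a minimal illustration, not the authors’ tooling; it assumes the PHQ-9, a widely used nine-item depression measure, since the paper excerpted here does not name a specific instrument, and it flags when repeated scores suggest the treatment plan should be revisited.

```python
# Minimal sketch of measurement-based care logic. The PHQ-9 severity bands
# are standard; the choice of instrument is an assumption for illustration.

PHQ9_BANDS = [
    (0, "minimal"), (5, "mild"), (10, "moderate"),
    (15, "moderately severe"), (20, "severe"),
]

def severity(score: int) -> str:
    """Map a PHQ-9 total (0-27) to its conventional severity band."""
    label = PHQ9_BANDS[0][1]
    for cutoff, band in PHQ9_BANDS:
        if score >= cutoff:
            label = band
    return label

def review_needed(scores: list[int], min_drop: int = 5) -> bool:
    """Flag a treatment review if symptoms have not improved meaningfully.

    A reduction of at least 5 points on the PHQ-9 is a commonly cited
    marker of clinically meaningful change; anything less is treated here
    as a cue to revisit the plan with the client.
    """
    return len(scores) >= 2 and (scores[0] - scores[-1]) < min_drop

# Example: totals recorded across four sessions.
session_scores = [16, 15, 14, 14]
print(severity(session_scores[-1]))   # "moderate"
print(review_needed(session_scores))  # True: only a 2-point drop so far
```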
Because all three authors identify as Black men, they were also able to vet the materials’ cultural sensitivity and their conceptualisation of the barriers that Black men often face in the US mental health system.
“Every population ― regardless of race, age, gender, nationality and ethnicity ― has a unique mental health care pathway, and there is a lot of information out there in AI to understand different populations and how they interact with the mental health field. AI can account for the complex barriers as well as the facilitators of population-wide mental health care,” VanHook said.
Data and patterns
The authors acknowledged that AI-generated content is constrained by the data and patterns in the platform’s training set and may fail to capture the diversity, unpredictability or emotional nuance of real clinical encounters. VanHook added that even the evidence-based frameworks applied in the project do not address all of the systemic and structural barriers experienced by Black men, nor do they capture every social, cultural or individual factor that influences clients’ care.
The team maintained in the paper, published in the journal Frontiers in Health Services, that generative AI holds significant promise for improving access, cultural competence and client outcomes in mental health care when integrated with evidence-based models.
“AI is a train that’s already in motion, and it’s picking up speed. So, the question is: How can we use this amazing tool to improve mental health care for many populations? My hope is that it is used in the field, as a tool for teaching and within higher-order management and administration when it comes to mental health services,” VanHook said.
In August, Illinois Gov. JB Pritzker signed a new law, the Wellness and Oversight for Psychological Resources Act, which limits the use of AI in mental health care “to administrative and supplementary support services” by licensed behavioural health professionals. The policy came in response to reports of young people in the US dying by suicide after interactions with AI chatbots.
“The use of AI in the manner in our study complies with the new state law if it is used in the process of education and clinical supervision,” VanHook said. “The measurement-based process described may blur the lines, so I would urge caution against its use beyond education and clinical supervision purposes until we receive more guidance from the state.”