Chinese youth. Photo credit: Zhang Kaiyv/Pexels

China’s youth are facing an unprecedented mental health emergency, and experts say that stigma and a severe shortage of doctors are driving a generation toward a new source of comfort: artificial intelligence.

In a new book, DeepSeek and Mental Health Support Among Chinese Youth, clinical psychologist Dr Olive Woo and AI expert Dr Yuk Ming Tang detail how platforms like DeepSeek are becoming critical lifelines in a society where psychological struggles are often viewed as a sign of weakness.

“The emotional well-being of young people in China is in crisis,” the authors write in the book’s preface. “Yet, societal stigma and a severe shortage of professional mental health resources leave countless individuals without the support they desperately need.”

The ‘silent crisis’

The scale of the problem is immense. Data cited in the book reveals that suicide rates among urban Chinese adolescents aged 15 to 20 nearly doubled between 2017 and 2021. Furthermore, depression symptoms are estimated to affect 20 per cent of Chinese teenagers.

However, getting help is difficult. The authors point out that China has only two psychiatrists for every 100,000 people, and fewer than 500 child psychiatrists in the entire country.

Beyond the lack of doctors, deep-rooted cultural norms often prevent young people from speaking out. Mental illness is frequently tied to the concept of “losing face”, leading many to suffer in silence to protect their family’s reputation.

An anonymous listener

This is where AI steps in. The book suggests that tools like DeepSeek are filling the void by offering anonymity and “cultural customisation” that traditional Western-style therapy might lack.

Because the AI provides a non-judgmental space, it lowers the barrier for those who are afraid to seek professional help.

“For many, it has become a trusted companion, offering solace and guidance in moments of distress,” the authors note.

A complementary tool

Despite the potential benefits, the authors warn that relying on algorithms for mental healthcare carries significant risks.

They caution that AI systems lack genuine empathy and require strict ethical oversight to avoid “hallucinations” or providing harmful feedback.

“This book is not just about technology — it is about people,” Dr Woo and Dr Tang conclude. “It is about the potential of technology to bridge the gap between need and access… provided it is implemented with care and responsibility.”
