China’s youth are facing an unprecedented mental health emergency, and experts say that stigma and a severe shortage of doctors are driving a generation toward a new source of comfort: artificial intelligence.
In a new book, DeepSeek and Mental Health Support Among Chinese Youth, clinical psychologist Dr Olive Woo and AI expert Dr Yuk Ming Tang detail how platforms like DeepSeek are becoming critical lifelines in a society where psychological struggles are often viewed as a sign of weakness.
“The emotional well-being of young people in China is in crisis,” the authors write in the book’s preface. “Yet, societal stigma and a severe shortage of professional mental health resources leave countless individuals without the support they desperately need.”
The ‘silent crisis’
The scale of the problem is immense. Data cited in the book reveals that suicide rates among urban Chinese adolescents aged 15 to 20 nearly doubled between 2017 and 2021. Furthermore, depressive symptoms are estimated to affect 20 per cent of Chinese teenagers.
However, getting help is difficult. The authors point out that China has only two psychiatrists for every 100,000 people, and fewer than 500 child psychiatrists in the entire country.
Beyond the lack of doctors, deep-rooted cultural norms often prevent young people from speaking out. Mental illness is frequently tied to the concept of “losing face”, leading many to suffer in silence to protect their family’s reputation.
An anonymous listener
This is where AI steps in. The book suggests that tools like DeepSeek are filling the void by offering anonymity and “cultural customisation” that traditional Western-style therapy might lack.
Because the AI provides a non-judgmental space, it lowers the barrier for those who are afraid to seek professional help.
“For many, it has become a trusted companion, offering solace and guidance in moments of distress,” the authors note.
A complementary tool
Despite the potential benefits, the authors warn that relying on algorithms for mental healthcare carries significant risks.
They caution that AI systems lack genuine empathy and require strict ethical oversight to guard against “hallucinations” and harmful feedback.
“This book is not just about technology — it is about people,” Dr Woo and Dr Tang conclude. “It is about the potential of technology to bridge the gap between need and access… provided it is implemented with care and responsibility.”