This year’s Australian Youth Digital Index provides striking insights into where young people turn when dealing with mental health challenges: nearly half (47%) say their first source of support for a mental health concern would be online, including 7% who say they would go straight to an AI chatbot.
In contrast, only 1% would first approach a mental health professional.
The findings reflect both the realities and the risks of the digital age. Young people go online for support because it’s accessible, anonymous, and immediate, while the rise of AI brings responses to mental health queries that appear tailored, at least on the surface.
But Associate Professor Shane Cross, Clinical Psychologist and Director of Digital Service Transformation and Research at Orygen Digital, says the rise of chatbots in youth mental health is not a straightforward development. There is a significant risk that apps will do more harm than good for some users, he says, and an urgent need for responsible design.
“When young people talk about ‘AI chatbots’, they’re usually talking about the big, general-purpose tools – ChatGPT, Gemini and similar platforms – not purpose-built mental health systems,” Shane says.
“There are apps in the app store that look like they are built for mental health purposes, but they have not been tested for clinical accuracy and safety. They rely on vague ‘wellbeing’ claims like improving mood or happiness as a way to avoid regulation.”
Accessible, anonymous, immediate: what drives digital help‑seeking
In Orygen’s own research last year, 30% of community respondents had used commercial AI tools for mental health purposes, reporting benefits such as accessibility and non-judgemental responses. But well over half of them also reported downsides or harm from the experience, including privacy risks and inaccurate information.
“One of the biggest risks is the level of agreeableness these systems have, the sycophancy that sees it agreeing with everything you say,” Shane explains. “This makes you feel good and stay engaged, but in a mental health setting it can be really unhelpful, especially if you’re starting to express some thoughts or beliefs that need to be gently challenged.”
Another serious risk: chatbots detect only around 50% of crisis indicators, compared with clinicians, who detect well over 90%. “And even when they do pick it up, the escalation isn’t always appropriate,” Shane says. He notes that millions of users, including teens, have disclosed suicidal or psychotic thoughts to these systems, often without receiving safe and effective guidance.
Privacy pitfalls when conversations train models
There are also concerns about data privacy, especially when chat transcripts are analysed by third parties or used to train models.
“Young people are reaching for AI because they need support,” Shane says. “But the support they’re getting is inconsistent, unregulated, and in many cases unsafe.”
That’s not to say AI has no role in mental health support, provided it is designed with evidence, ethics and safety at the centre. Alongside the untested products flooding app stores, there is a smaller but growing group of organisations, Orygen among them, building clinically grounded, responsibly governed AI.
“The opportunities are huge,” Shane says. “But you need guardrails: clinical evidence, safety mechanisms, training data that represents diverse users, strong user testing, and proper clinical trials.”
There should also be appropriate escalation pathways in place. He’d like to see any platform detect risk at least as reliably as clinicians, if not better.
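What might such a guardrail look like in code? The sketch below is a deliberately simplified illustration, not a clinical tool or anything Orygen has built: it screens each message before it ever reaches a chat model, and routes anything high-risk to a human escalation pathway. Every name in it (screen_message, CRISIS_PHRASES, handle_message) is hypothetical, and the keyword matching is a stand-in for the clinically validated risk models a real platform would need.

```python
# A minimal, illustrative sketch of an escalation guardrail. Not a clinical
# tool: all names are hypothetical, and a production system would use a
# clinically validated risk model, not keyword matching.

from dataclasses import dataclass
from enum import Enum


class RiskLevel(Enum):
    LOW = "low"
    HIGH = "high"


# Hypothetical placeholder list. Real crisis detection cannot rely on
# phrase matching, which is one reason detection rates stay low.
CRISIS_PHRASES = ("want to die", "kill myself", "end it all")


@dataclass
class ScreenResult:
    level: RiskLevel
    reason: str


def screen_message(text: str) -> ScreenResult:
    """Screen a user message before it ever reaches the chat model."""
    lowered = text.lower()
    for phrase in CRISIS_PHRASES:
        if phrase in lowered:
            return ScreenResult(RiskLevel.HIGH, f"matched phrase: {phrase!r}")
    return ScreenResult(RiskLevel.LOW, "no crisis indicators found")


def handle_message(text: str) -> str:
    result = screen_message(text)
    if result.level is RiskLevel.HIGH:
        # Escalation pathway: stop the model and hand over to humans.
        # A real service would alert an on-call clinician and surface
        # local crisis lines, not just return a message.
        return (
            "It sounds like you're going through something serious. "
            "Connecting you with a person who can help right now."
        )
    # Low risk: the message may proceed to the (hypothetical) chat model.
    return "ROUTE_TO_CHAT_MODEL"


if __name__ == "__main__":
    print(handle_message("I've been feeling flat this week"))
    print(handle_message("Honestly I just want to die"))
```

The design point is the ordering: the risk screen sits in front of the model, so escalation does not depend on the chatbot itself choosing to respond safely.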
Co‑design with young people and clinicians at every stage
And finally, Shane says, young people and clinicians need to be brought into every stage of the architecture, testing and evaluation process.
“Nothing we build will be designed to work alone,” Shane says. “We want AI that enhances the relationship between young people and clinicians, not replaces it.”
About Orygen Digital
Telstra Foundation has proudly supported Orygen Digital for over a decade to create innovative digital solutions that support young people’s wellbeing. While Orygen is yet to launch an AI chatbot for mental health, it offers two respected and clinically tested digital products – MOST and Mello – for young people experiencing mental health challenges.

