When OpenAI revealed this week that about 1.2 million people worldwide use ChatGPT each week in conversations signalling suicidal intent, the figures struck an uneasy chord in Australia. The numbers are global, but the reality here is confronting: in a country where nine people take their own lives every day, artificial intelligence is quietly becoming both counsellor and confessional.
The tech company’s disclosure—part of its October 27 update on safety features—showed that roughly 0.15% of its 800 million weekly users send messages with “explicit indicators of potential suicide planning or intent”. Scaled to Australia’s population, this translates to about 41,000 Australians turning to ChatGPT every week when they are at their lowest.
OpenAI says ChatGPT directs users to crisis helplines such as Lifeline and the Samaritans, but admits that “in some rare cases, the model may not behave as intended in these sensitive situations.” In its latest tests involving more than 1,000 conversations about self-harm or suicide, GPT-5 followed safety guidelines 91% of the time. The company is working to strengthen long-conversation safeguards, where consistency can weaken over time.
For Indigenous and rural Australians—who already face suicide rates up to three times the national average—these risks are magnified. Internet access is often patchy and professional help scarce. The temptation to offload pain into a chat window is understandable. Mental health experts warn that while AI tools can help people express their distress, they may fail to recognise cultural context or the urgency behind a cry for help.
OpenAI’s blog acknowledged that mental distress “is universally present in human societies,” and that as its user base grows, more such cases will appear. The company has been working with more than 170 mental health professionals to refine GPT-5’s responses, introducing features such as break reminders, safer handovers to helplines, and region-specific support links. Yet critics say self-regulation is not enough. Following reports of teens engaging with AI chatbots for hours, Education Minister Jason Clare said in October that “AI chatbots are hurting children,” urging stronger national rules for digital wellbeing.
The federal government’s National Suicide Prevention Strategy for 2025–2035 now calls for hybrid support systems that blend technology with human oversight. This includes pilot programs where AI tools flag crisis indicators to trained counsellors in real time. Suicide Prevention Australia’s 2025 report notes that while digital platforms can expand access—particularly among young people and men reluctant to seek help—they must be “culturally sensitive, transparent and supervised”.
The debate isn’t confined to policy. Across social media, reactions range from gratitude to alarm. Some users credit ChatGPT with saving them by “listening without judgement”. Others warn it creates false intimacy that can deepen loneliness. Researchers at the eSafety Commissioner’s office found growing evidence of emotional over-reliance on AI companions, particularly among teens and the elderly.
Globally, the disclosure comes amid mounting legal scrutiny. In the United States, the family of 16-year-old Adam Raine has filed a wrongful death lawsuit, claiming their son’s suicide in 2025 followed unsafe responses from an earlier version of ChatGPT. OpenAI has denied liability but said it continues to improve safeguards and cooperate with authorities.
Back home, Lifeline and Beyond Blue have taken cautious steps into the digital era, introducing chatbot assistants that are always monitored by humans. Their message remains simple: AI can help start a conversation, but it cannot replace the human one that might save a life.
As OpenAI chief executive Sam Altman put it, “Mental health symptoms are part of being human, and with an increasing user base, some portion of ChatGPT conversations will include these situations.” In Australia, where isolation often wears a smile and help can be hours away, that portion may already be too large to ignore.
If this article raises concerns for you or someone you know, contact Lifeline on 13 11 14 or Beyond Blue on 1300 22 4636.