When OpenAI revealed this week that about 1.2 million people worldwide use ChatGPT each week in conversations signalling suicidal intent, the figures struck an uneasy chord in Australia. The numbers are global, but the reality here is confronting: in a country where nine people take their own lives every day, artificial intelligence is quietly becoming both counsellor and confessional.
The tech company’s disclosure, part of its October 27 update on safety features, showed that roughly 0.15% of its 800 million weekly users send messages with “explicit indicators of potential suicide planning or intent”. If that rate held across Australia’s population of roughly 27 million, it would amount to about 41,000 Australians turning to ChatGPT every week when they are at their lowest.
OpenAI says ChatGPT directs users to crisis helplines such as Lifeline and the Samaritans, but admits that “in some rare cases, the model may not behave as intended in these sensitive situations.” In its latest tests involving more than 1,000 conversations about self-harm or suicide, GPT-5 followed safety guidelines 91% of the time. The company is working to strengthen long-conversation safeguards, where consistency can weaken over time.
For Indigenous and rural Australians—who already face suicide rates up to three times the national average—these risks are magnified. Internet access is often patchy and professional help scarce. The temptation to offload pain into a chat window is understandable. Mental health experts warn that while AI tools can help people express their distress, they may fail to recognise cultural context or the urgency behind a cry for help.
OpenAI’s blog acknowledged that mental distress “is universally present in human societies,” and that as its user base grows, more such cases will appear. The company has been working with more than 170 mental health professionals to refine GPT-5’s responses, introducing features such as break reminders, safer handovers to helplines, and region-specific support links. Yet critics say self-regulation is not enough. Following reports of teens engaging with AI chatbots for hours, Education Minister Jason Clare said in October that “AI chatbots are hurting children,” urging stronger national rules for digital wellbeing.
The federal government’s National Suicide Prevention Strategy for 2025–2035 now calls for hybrid support systems that blend technology with human oversight. This includes pilot programs where AI tools flag crisis indicators to trained counsellors in real time. Suicide Prevention Australia’s 2025 report notes that while digital platforms can expand access—particularly among young people and men reluctant to seek help—they must be “culturally sensitive, transparent and supervised”.
The debate isn’t confined to policy. Across social media, reactions range from gratitude to alarm. Some users credit ChatGPT with saving them by “listening without judgement”. Others warn it creates false intimacy that can deepen loneliness. Researchers at the eSafety Commissioner’s office found growing evidence of emotional over-reliance on AI companions, particularly among teens and the elderly.
Globally, the disclosure comes amid mounting legal scrutiny. In the United States, the family of 16-year-old Adam Raine has filed a wrongful death lawsuit, claiming their son’s suicide in 2025 followed unsafe responses from an earlier version of ChatGPT. OpenAI has denied liability but said it continues to improve safeguards and cooperate with authorities.
Back home, Lifeline and Beyond Blue have taken cautious steps into the digital era, introducing chatbot assistants that are always monitored by humans. Their message remains simple: AI can help start a conversation, but it cannot replace the human one that might save a life.
As OpenAI chief executive Sam Altman put it, “Mental health symptoms are part of being human, and with an increasing user base, some portion of ChatGPT conversations will include these situations.” In Australia, where isolation often wears a smile and help can be hours away, that portion may already be too large to ignore.
If this article raises concerns for you or someone you know, contact Lifeline on 13 11 14 or Beyond Blue on 1300 22 4636.