Artificial intelligence is changing how people seek help, yet psychiatrist and medical director Dr Sampath Arvapalli of The Banyans Healthcare says technology still can’t replace the signals and support found in real conversations. Speaking to Saleha Singh on Chai Chat and Community, he responded to new figures showing more than 41,000 Australians each week may be using ChatGPT to talk about suicidal thoughts, and set out where AI can help and where it falls short.
“People are trying to turn towards AI. Firstly, because it’s very accessible. Secondly, it’s very affordable and thirdly, it’s also anonymous to get these kinds of help, and it’s available 24/7,” he said. He added that “a study in Australia showed about a third of those surveyed turn to AI therapy chatbots to beat the costs and also the long queues.”
Dr Arvapalli cautioned against seeing chatbots as a substitute for therapy. “I don’t want to throw the baby out with the bathwater yet. At the same time, we have to be very cautious in interpreting what’s happening at the moment.” He said AI can be useful when there is clear accountability. “These AI services could be used as long as they have the oversight of a human.”
Younger Australians are the keenest adopters. “The most prominent AI therapy users are aged between 16 to 25 years, and this is a generation that’s accustomed to working through relationships, life issues and communications through digital technology,” he said. That convenience comes with risk. “They are at risk of dropping out, and the tendencies are quite high because they don’t stick with it for long enough, and that premature abandonment can also risk the outcome. That’s why the human oversight is very important.”
Asked whether chatbots can help someone in distress or make things worse, he described patterns he has seen when people form attachments to AI tools. “When people start interacting with AI, the first stage, what I’ve seen, is the mirroring, where AI starts to agree with them, giving them more of the emotional responses or understanding that they would want to hear.” Over time, he said, “there’s something called boundary dissolution, where the AI becomes more of a partner rather than a tool. AI can’t be a true replacement for human emotion and human understanding.” Eventually, he warned, “people start to have a reality drift, where they seek validation from AI rather than humans for everything they’re doing in their life.”
For clinicians, the core problem is that a chatbot can’t assess what a trained professional can. “The AI doesn’t have the skill of studying the emotional responses, what we call the mental state examination. Watch the body language. Watch the subtle changes in the person’s presentation. Are they presenting as being psychotic? Are they presenting agitated? There is no way to study that.” He noted a further risk when AI-generated notes make their way into clinics. He has seen AI summary documents that did not reflect the patient’s history. “They were like, oh Dr Sam, I don’t know where that came from. That wasn’t me.”
He drew a firm line between using AI to find help and using it as help. “In the early stages it’s okay to connect through to AI to understand where to get services, where to get help. What is a free resource? What is a low-cost, effective resource? You can use AI to get you that information, but not directly as a chatbot that will give you the psychological or emotional support.”
On multicultural communities, Dr Arvapalli said technology can miss what matters most. “For example, if you do a WhatsApp call or a FaceTime call to talk to your parents versus seeing your parents is very different, isn’t it? Being there with them is different,” he said. “That emotional connection is very important, especially among the migrants and also the Indigenous community. There’s a lot of nonverbal communication that goes on.” Pandemic telehealth widened access, yet many still prefer to attend in person. “Even today a lot of patients still want to come and see us in person, because it’s not the same thing seeing a psychologist or a psychiatrist over a video conversation.”
Stigma remains a barrier, especially in South Asian communities. “Judgment and, I think, a lot of stigma is what affects people’s ability to access treatments,” he said. “I always encourage people to connect to people with a similar cultural background to express yourself, at least as a first line, before you can step out and access help.”
Dr Arvapalli said regulated care exists for a reason. “The clinical services in Australia have to go through the TGA approval,” he said, warning that some digital tools avoid safeguards by framing themselves as wellness products. “We need to have a lot more evidence to support any use of this and because this is early stages we have to be very careful.”
At The Banyans, he said integrated teams can see the whole picture. “I can proudly say that, you know, we at Banyans Healthcare work as a multidisciplinary team, so we have the advantage of checking how the person is doing medically, and then discussing that with your psychologist from a psychology point of view, and also talking to someone else and seeing how they are doing socially, in their communication, in the outside environment.” By contrast, “through AI it can be a bit fragmented.”
His take-home message is simple. “AI does promise a lot for our future, for the point of improving treatment efficacy, efficiency, and the way we deliver care. But we are still in the early stages. So please keep a human oversight over whatever we do.” And at the heart of care, he was clear. “No technology can replace the warmth of a real human conversation, whether it’s a friend, a family member, or a psychiatrist like you. Reaching out can actually and truly save a life.”
If this story raises concerns for you or someone you know, contact Lifeline on 13 11 14 or Beyond Blue on 1300 22 4636.