AI therapy is becoming popular, but is it safe?


True confession: at one point in my adult life I used to joke (accurately) that the only one in our family (two teens, me, a dog and cat) not having therapy was the cat, Maurice. A physically abusive, narcissistic and self-absorbed, yet incredibly handsome 12-year-old feline, with a history of rejection and abandonment in his very early years, he probably needed therapy more than the rest of us, but part of me was relieved not to be paying yet more fees.

So the news that AI is fast becoming the therapist of choice in our overstressed, mentally literate society shouldn’t really come as a surprise. Affordable, accessible 24/7 and judgment-free, could this be the tool we need? Admittedly, and to no little consternation online, research published in the Harvard Business Review revealed that therapy and companionship—the latter a nebulous, potentially concerning quagmire in its own right—became the number one use case for generative AI in 2025. This marks a seismic shift, with more people turning to AI not to write emails or debug code, but to process grief, manage anxiety and find meaning in their lives.

With many patients unable to find a therapist who fits them well, and the cost of private therapy upwards of ₹5,000 per session, AI “therapy” offers immediate and democratic access. But it’s also raising urgent questions: is this a helpful stopgap or a dangerous substitution? Are we witnessing a mental health revolution or a crisis in the making?

Dr Tara Porter, a clinical psychologist and bestselling author who works extensively with adolescents and young adults, describes the current movement towards using AI as our therapeutic confidant as “a perfect storm”. One in five people aged eight to 25 now has a probable mental disorder, according to a 2023 survey. “My patients who have tried it tend to use it when they have an anxiety that’s popped into their heads and they’re spiralling.” ChatGPT is often used to help “find purpose” (the third most common AI use case according to the Harvard Business Review), “organise my life” (second) and process everything from relationship breakdowns to existential dread. Unlike googling symptoms—which can convince you that a headache is a brain tumour—AI tends to offer reassurance. As Porter notes, it’s “very good at validating”.

Yet its limitations are worrisome. Porter has tested ChatGPT with real-life examples of what depressed young people actually say. “When I typed in, ‘I can’t be bothered with going to uni. I’m not really that interested,’ it offered validation and psychoeducation, but it didn’t pick up that you might be suffering from a mental health problem,” she says. “In the context of a therapy session, you’d see that perhaps that’s a person with depression. You’d be curious about that as a therapist, noticing their non-verbal communication, seeing the pattern over time, checking in with them.” More recently, ChatGPT has been updated to respond better to typical signs of depression and to direct users towards real-life help.

Disclaimer: This story is auto-aggregated by a computer programme and has not been created or edited by DOWNTHENEWS. Publisher: vogue.in