To me, using AI is like when the hitman in a movie drives someone into the woods and makes them dig their own grave before they get murdered. Every time we ask it to do something for us, we’re training it – free of charge – so it can come for us, and take our jobs and livelihoods even faster.
Obviously journalism is child’s play to AI. Thank goodness I have a totally future-proof Plan B: psychotherapy.
So you can imagine exactly how thrilled I was to read that research published in the New England Journal of Medicine has revealed that, given the right training, bots can deliver therapy with as much efficacy as - or more than - human clinicians.
Or more than.
AI can gulp down every book and paper ever written, and - the perfect therapist – will always be at its patients’ beck and call, 24/7/365. How can any mere person compete?
A friend’s sister can’t afford an IRL therapist at the moment, so has gone down this route, and when my friend told me about it, my first instinct was that AI therapy is better than no therapy at all. Insert hollow laugh here. Apparently ChatGPT quickly noticed that the sister had been in a string of disappointing relationships with men, identified that she struggled to make healthy choices when it came to partners, and linked this to issues with her father in childhood. So far, so Freud, but on fast-forward.
It then told the sister that in future, she should simply pick better blokes to go out with - obviously a revelation that had never occurred to her, and also really easy to instantly start doing once your phone tells you to.
The problem here - yippee, there is one! - is that AI lacks nuance.
Another friend is divorced and now in a new relationship with a man who was also previously married. They’ve had a surprisingly easy time of blending their families and their kids get on well; the only fly in the ointment is his mother. She was close to his first wife, and cannot - will not - accept my friend. The man is driving himself crazy trying to juggle the two women in his life and keeps asking my friend to be patient. His mum was a great support during his marriage breakdown, is a wonderful, very involved grandma, and he’s confident that while she needs time to adjust to the new situation, she’ll come round eventually. My friend had to pluck up the courage to leave a deeply unsatisfying marriage and endure the slow, relentless pain of the divorce process, so is now keen to get on with her happily ever after. She’s frustrated at being left out of family gatherings, and at her next chapter being made more complicated by this stubborn woman’s refusal to move forward. You can see both sides, can’t you? Enter ChatGPT.
Interestingly, my friend would never see a real therapist, but was happy to pour her heart out to AI. The bot didn’t do anything dull like ask her to explore and work through her feelings, instead – perfect therapist again – it immediately told her the magic solution to all her woes. More helpful still, it crafted a text that my friend should send to her partner, explaining that if he could not put her first, as she deserved, she’d have to seriously think about the future they had been planning together.
She read it to me later, saying she’d been blown away that AI could write something so insightful and personal, so specific to her situation, that she’d sent it straight away. When I asked if she really felt like that - because I had no idea she was so unhappy she was considering ending the relationship - she was aghast. That wasn’t what she’d intended at all. I told her it was quite an ultimatum-y message.
“That’s what he said!” she wailed, realising what she’d done.
ChatGPT hadn’t understood that this man wasn’t stringing her along, having his cake and eating it; he was navigating a difficult, complex situation. Trying his best. Working hard to keep everyone happy. And basically getting dumped by a robot for his trouble.
Luckily, my friend was able to turn the situation round with some messages she wrote all by herself, which turns out to be a better way to communicate with your nearest and dearest than outsourcing. Who knew?
Everyone seems resigned to AI taking over, but maybe this outcome isn’t inevitable. Imagine if AI actually proves to be a double bluff, only serving to remind us of the importance of human contact, creativity, truth.
Yeah, it’ll definitely either be that, or people will keep training it for free so it gets better and better, and buries us in the hole we dug for ourselves.


Thanks for sharing this. I strongly believe AI therapy bots are a terrible idea for anything beyond triage and very basic assessment. AI will never replace a human therapist. Study upon study has demonstrated that the most significant factor in the success of psychotherapy is the therapeutic bond established between therapist and client. Human interaction isn't just a delivery method for therapy; it's the entire point of the process.