
The Artificial ‘Emotionally’ Intelligent Therapist – Are you ready for an ‘AeI’ and would you confide in a machine?

I’ve no problem with divulging that I’m a big fan of ChatGPT. My day job is software development, and a lot of that is writing what we call in the trade boilerplate code and routines. Stuff that simply has to be done, but that doesn’t add much value in the doing.


I use ChatGPT for that stuff now, or more specifically its first cousin, Copilot. It literally saves me days of effort. It augments my skillset by at least 10 times, probably even 100 times. As a result, I get more time to do less. And I think we all need that in our lives.

But aside from work, I’ve also used ChatGPT for a few things on a more personal level. As my personal therapist, in fact.

AI, or ‘AeI’ as I call it, is a daily sounding board for me. One minute I’m asking it why my ‘public async void task c# code’ isn’t working as expected, and the next why people kill each other over religion. And if I have time on my hands, I really enjoy playing devil’s advocate with it.
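(As an aside, and purely for illustration: the usual culprit behind that sort of C# question is a method declared ‘async void’ where ‘async Task’ was intended, which leaves the caller with nothing to await. A minimal sketch, with made-up method names, of the kind of fix an LLM will typically suggest:)

```csharp
using System;
using System.Threading.Tasks;

class AsyncPitfallSketch
{
    // Problematic: 'async void' is fire-and-forget. The caller has nothing to
    // await, so it carries on before the work finishes, and any exception
    // thrown in here escapes the caller entirely.
    public async void SaveFireAndForget()
    {
        await Task.Delay(100);
    }

    // The usual fix: return Task ('public async Task'), so the caller can
    // await the work and observe any failure.
    public async Task SaveAsync()
    {
        await Task.Delay(100);
    }

    static async Task Main()
    {
        var demo = new AsyncPitfallSketch();
        demo.SaveFireAndForget();   // returns immediately; nothing to await
        await demo.SaveAsync();     // completes, and exceptions would surface here
        Console.WriteLine("done");
    }
}
```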

I use it to help me reflect on my thoughts, so I can question my thinking and develop opinions. I also use it as a self-discovery tool. For me, it’s therapeutic. And I’m under no illusion that I’m alone in using ‘AI’ this way. ‘LLM therapy’ is, after all, ‘a thing’.

But it’s made me start to want to further explore whether an LLM (a large language model, like ChatGPT or Google Bard) could actually take on the job of a qualified therapist. In other words, should professional psychotherapists, hypnotherapists, CBT specialists, life coaches and counsellors be expecting their P45s any time soon?

So that’s the question I’m putting through the mill in this article.


What is the job of a therapist?

I believe the job of a therapist is not so much to provide advice as to nudge their patients or clients towards self-discovery; to question the assumptions they’re making that might be controlling their thought patterns and behaviours.

Therapists know the tools that will best help their patients improve. And the tools that will best help them do their jobs. Good ones do anyway. And having been through therapy myself, I’ve formed strong opinions on what makes a therapist ‘good’.

I’ve had a few therapists in my life but the standout one for me, Suzanne, was like my personal Dalai Lama. A wise hand holder who accompanied me to my own realisations.

So, can an LLM be a therapist?

Possibly, to some extent, it can.

I know ChatGPT, for example, has helped me contemplate my feelings on certain things, guiding me towards choices and decisions.

I’ve actually trusted it with my deepest thoughts. Some of them very personal, contentious even.

I don’t think we are ready to get a formal diagnosis of a serious mental health issue from a machine, or to trust any simulation of empathy from one… yet.

Sometimes though, isn’t it the case that all you need is a bit of a two-way between your thoughts and your ears?

We think abstractly but speak formally, so just the simple act of having to communicate what’s on your mind is sometimes all that’s needed to unlock your own solutions.

I mean, how many times have you struggled with a problem, even just trying to remember something, and then someone asks what’s on your mind, you say it out loud and, bingo, you have the answer you were looking for?

Maybe you were surprised at how you reacted to something. Or maybe someone else was. So you ask the question, why did I react like that to “xxx”? And ChatGPT (other LLMs are available) will give you an explanation.

You may well discover that your reaction could be considered “normal”, or at least common. Just knowing that can make you feel a whole lot better. Like you’re not actually going ever so slightly crazy.

You see, for this kind of self-help I’d argue that you don’t really need a human as such; all you need is a coherent response to your input.

In the case of LLMs, that input is you typing (or even now talking) to ChatGPT and getting the distillation of a million therapy articles in return. And that’s an important point to remember.

It’s not like ChatGPT is just completely making up random words and fitting them together. I’d wager it has been trained on more articles about grief, melancholy, despondency, desolation, hopelessness and every other human malaise than any human expert anywhere in the world. So, you decide, who is more qualified? At this entry level at least.

Personally, I believe that for a lot of things, I already have the answers. I just need an articulate echo to coax them out of me. Therapists do that of course. But I reckon ChatGPT could do it in many cases. And if someone can find solace in the responses it generates, what harm can it do?

So I do think there’s a place for LLMs at the therapeutic table. But in terms of replacing the pros…?


Are LLMs a substitute for professional therapy?

I asked ChatGPT how it could potentially do the work of therapists. The reply began with a much-anticipated ‘always seek professional advice’ caveat, but it did provide some tangible ways in which it could complement professional therapy. Here’s what it said:

I'm not a substitute for professional therapy, but I can provide some ways in which ChatGPT can complement the work of professional therapists

ChatGPT by OpenAI, 2023

  1. Information and Education: I can provide information on various mental health topics, helping you better understand your condition or issues you might be facing.
  2. Emotional Support: I can offer a listening ear and provide support by engaging in conversation. Sometimes, talking about your feelings and concerns can be a helpful part of the therapeutic process.
  3. Coping Strategies: I can suggest coping strategies, relaxation techniques, or mindfulness exercises that can be used between therapy sessions to manage stress or anxiety.
  4. Self-Help Resources: I can recommend books, articles, or online resources that are relevant to your mental health concerns.
  5. Goal Setting: We can work together to set and track goals related to your therapy journey, helping you stay motivated and focused.
  6. Practice Communication: If you want to work on improving communication skills or express your thoughts and feelings, I can provide guidance and practice conversations.
  7. Mindfulness and Meditation: I can guide you through mindfulness exercises or meditation sessions to help with relaxation and stress reduction.

It’s a pretty impressive CV, for a machine. But then, as mentioned previously, it has a hell of a lot of training under its belt.


My personal experience with therapy

When I was younger, I had a profound fear of going to the dentist. The reasons were beyond me, but I was nothing short of terrified of the whole experience. Thankfully, I managed to get through much of my youth without any dental emergencies, so I was mercifully able to steer clear of ‘the chair’.

But when I was 29, I developed a gum abscess. Now I don’t know if you’ve ever had anything like that, but I can tell you, the pain is excruciating. I knew I was going to have to get inside that dentist’s room somehow, but the ‘how’ was eluding me. Until I spoke to a neighbour who’d recently tried hypnotherapy (albeit for something unrelated to acute dental distress). Lightbulb moment: I made the call straight away and was on the therapist’s couch that afternoon.

In that one initial session, the therapist was able to give me some coping mechanisms to get me over the dentist’s doorstep. But, as she said, there were clearly underlying issues that needed attention. A fear of the dentist as profound as mine was definitely not normal. So, we agreed I’d come back for ’a bit of a chat’ once my pearly whites were in order.

Cutting a long story short, and leaving out the really personal bits, a year’s worth of sessions later, I’d definitely gained a better understanding of my thought patterns and behaviours. To be honest, I never really knew I had so many problems. And I only went to get over my fear of the dentist.

But for sure I’d do it all over again if I felt it could help me with something else. And I can still go to the dentist without fear. I don’t particularly relish the experience, but I’m not petrified anymore.

So, would an LLM have been able to get the same results as the hypnotherapist? I’m not convinced it would.

I think there would have been key traits missing. Like human understanding, and empathy. That’s not to say LLMs won’t have these traits in the future. The technology is advancing at breakneck speed. And I for one am ‘watching this space’. Anything is possible.

But the thing is, even if an LLM like ChatGPT could dish out therapeutic advice, don’t people prefer to talk to and interact with humans?


Don’t people prefer to talk to humans?

It would appear they do. I do, for certain things but not everything.

The “Dodo Bird Verdict” is an interesting model to look at in this context. It’s a controversial psychotherapy topic which refers to the claim that all psychotherapies produce equivalent outcomes.

The Dodo Bird Verdict is named after the Dodo in Alice’s Adventures in Wonderland, who set up a competition instructing all those who’d got wet to run around until they were dry. Nobody cared about the distance anyone had run, or how long they ran for. The verdict issued by the Dodo was that everyone had won the competition, because they had all successfully dried themselves. They’d achieved the set goal, and it didn’t matter how they’d done it.

Supporters of the Dodo Bird Verdict believe that all therapies, no matter how different they may be from each other, are on an equal footing because of the common factors shared across all treatments. One of those common factors is the forging of a relationship with a warm and caring therapist who believes in your personal success.

If that’s true, then even if an LLM ultimately learns the skills of a fully-fledged, qualified professional therapist, will patients/clients actually want to talk to them? I’m not sure even I, a huge fanboy of this tech, would consider a relationship with ChatGPT ‘warm and caring’. And sometimes, perhaps most times, issues boil down to seeking reassurance we are loved. And love is something I would never believe could emanate from a machine.

I imagine that currently, the vast majority of people would not dream of sharing their innermost feelings with a machine.

But there will also be people who would rather talk to a machine than face the alternative of either nothing or a colleague. Many large companies do the ‘right thing’ by training members of staff as mental health first aiders. But these are people you work with. Are you really going to want to hang your deep-seated mental health washing out to dry in front of them?

And, for completeness, sadly there will be some who will trust no one and nothing with their thoughts. And that is of course utterly heartbreaking.

But back to LLMs: they clearly offer a way for someone to feel less exposed. If you could talk to an LLM therapist, that might do for you. If you could just type in whatever’s on your mind while no one was looking, and get some instant reassurance or insight into why you feel the way you do, and what you can do about it… it would feel a lot more private, that’s for sure.

But would it be confidential? Whilst professional therapists are prohibited by law and ethics from sharing client information, save in cases of harm to themselves and/or others, anyone using ChatGPT or any other LLM as a sounding board isn’t currently protected by such well-established norms.

Anything entered into the free or paid-for versions of ChatGPT, for instance, is used to continually train and improve it. Of course, it’s anonymised, but if push came to shove it could be traced back to the author. According to OpenAI, the firm behind ChatGPT, if you pay for the Enterprise version, then entered text is not used for training.

It has crossed my mind that there might be a bunch of nerds, and I use the term as one of endearment, sitting behind these things, having a good old laugh at what I’ve written. But you know what, ultimately, so what, who cares? More than likely there is no bunch of nerds. After all, as of writing, ChatGPT has over 100 million active users every week. What I ask really is not that interesting.

If I was in real distress, even I would want to talk to a human and have that reassurance of confidentiality. But, again, I think there are situations where ChatGPT could work, as-is, now, this very day.

Take for example a 24/7 text-based crisis management service for people in distress, facing anxious or sad thoughts or suffering overwhelm. They text their issues, and highly trained counsellors are on hand to support them and help make them feel safe and understood. Importantly, exchanged conversations are confidential and anonymous.

Now, while not wishing to upset the Witchsmeller Pursuivant and get myself tried by fire, I could well imagine such a service swapping in an enterprise version of ChatGPT for one of its human counsellors: a model that has already consumed every known therapy text and case history on the planet, further refined using the service’s own specific, historic but anonymised, chats.

Probably not now or anytime soon, but I think it would be naïve to think this won’t happen.

I’ll bet anyone it will. I imagine at first it will be in the form of highlighting specific warning signs in a patient’s words, their choice of words, the pauses between replies, and so on. Much like how a more experienced overseer currently supports the army of trained call handlers. But then, over time, it will start to be used to suggest replies. And then, at some point, it will be offered as an alternative to a human.

I’d argue that not many, if any, people would even know if they were having a support text exchange with an LLM as advanced as ChatGPT. And would it matter if they gleaned the same benefit they’d have got had they been talking to a trained human? People may prefer to talk to humans, but what if they thought they were talking to a human? Same reassurance?

All said, I’m not suggesting that LLMs are a replacement for the role of the counsellor or therapist. Absolutely not. Professionals working one-on-one with clients to provide tailored, long-term guidance and support are, I believe, irreplaceable.


LLM therapy – a cautionary tale

I know ChatGPT wasn’t designed to be a therapist. There are chatbots designed specifically for mental health. They work in the same conversational vein as ChatGPT, but they are engineered for purpose. Some even use traditional techniques, such as cognitive behavioural therapy (CBT). But they are still machines.

A machine has the ability to come up with answers that sound pretty feasible and convincing. But, unlike the human therapist, it won’t usually acknowledge when it doesn’t know the answer to a question. And it probably doesn’t have the capacity to help a client reflect on their situation and develop their own insights either.

So why is ‘AI therapy’ on the rise? Maybe it’s got something to do with the record NHS waiting lists for mental health patients.

I read that people are interested in the constant availability of digital mental health tools, feeling it’s like having a ‘therapist in your pocket’. It was medical anthropologist Dr Elizabeth A. Carpenter-Song of Dartmouth College who said it’s understandable why people are turning to the likes of ChatGPT for mental health support.

She said it’s a “low barrier way to access answers and potentially support for mental health”. But she also warned that we need to remain cautious of any approach to complex issues that could be considered a ‘silver bullet’.

I found myself aligned with her thinking when she noted that digital mental health tools are best used as part of a ‘spectrum of care’. In other words, they have their place in therapy, but aren’t necessarily a replacement for it in its traditional form.

I think there is certainly a place for ‘AeI’ to help clinical professionals free up their time to focus more on their patients. Initial discovery interviews for example could be managed by an AeI chatbot. LLMs could certainly assimilate conversations into case notes and produce documentation.

AI-based apps are already in use across mental health services. Self-help and mental well-being apps are fairly widespread. And AI is also being used to monitor patients in the community, ensuring they’re following their treatment programmes.

Earlier this year, a number of mental health apps were fast tracked to be offered within the NHS Talking Therapies primary care counselling services to treat anxiety and depression. But over half of them were recommended for use only with the support of a therapist.

I’ve read a lot about the potential of LLMs in mental healthcare. Pretty much all of them emphasise the importance of a patient-focused care approach, and highlight the importance of ongoing evaluation of AI-powered therapeutics to ensure their effectiveness and safety.

Deep Patient, for example, uses deep learning algorithms to analyse health records and identify patterns that could help clinicians make better-judged diagnoses and develop customised treatment plans.

But it’s important to remember that platforms like these are only as good or bad as the data they are trained on. Bias has to be a major stumbling block. What if all the health records or conversations used for training were biased towards a certain demographic with a certain issue? So there’s a pitfall that could use some attention.


An aside… AeI for self-help

I touched on the point that self-help apps are fairly widespread. I can understand why.

I’ve certainly tried my fair share of self-help ‘tapes’ over the years (younger readers may need to look up what a ‘tape’ was).

Tape or YouTube, self-help recordings are great tools for escaping those verge-of-despair-and-depression feelings, and for self-motivation. But I think these things are ripe for an LLM management buyout. They work to a basic script, which an LLM could easily personalise to suit an individual’s specific needs and situation.

So for example, if you wanted your own set of morning motivational affirmations, you could quite easily create those with ChatGPT or a similar LLM. And you could even have that LLM read them to you every morning. Much more convincing and less self-indulgent than saying them to yourself in the mirror.
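Purely as an illustration, here’s a minimal sketch of how that could be wired up. It assumes an OpenAI API key in the OPENAI_API_KEY environment variable and calls the chat completions REST endpoint; the model name, prompt and class name are made up for the example, and the ‘read them to you’ part would be a separate text-to-speech step.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class MorningAffirmationsSketch
{
    static async Task Main()
    {
        // Assumes your OpenAI API key is set in the OPENAI_API_KEY environment variable.
        var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", apiKey);

        // Illustrative prompt and model name: personalise these to your own needs.
        var body = JsonSerializer.Serialize(new
        {
            model = "gpt-4o-mini",
            messages = new[]
            {
                new { role = "system", content = "You are a warm, supportive morning coach." },
                new { role = "user", content = "Write five short, personal morning affirmations for someone juggling a busy week." }
            }
        });

        var response = await http.PostAsync(
            "https://api.openai.com/v1/chat/completions",
            new StringContent(body, Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();

        // Pull the generated affirmations out of the first choice in the reply.
        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        var affirmations = doc.RootElement
            .GetProperty("choices")[0]
            .GetProperty("message")
            .GetProperty("content")
            .GetString();

        Console.WriteLine(affirmations);
        // To have them read aloud each morning, this text could then be passed
        // to a text-to-speech service instead of simply being printed.
    }
}
```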

I especially like the thought of a mirror, near my front door, that reminds me of what’s on for the day and gives me a few words of encouragement and support based on what it ‘knows’ about my schedule, email messages and inner thoughts over the preceding days. Knowing me like my Mum did, and sending me off much like she used to send me off to school.


So, summing up… would I use an LLM in place of a qualified therapist?

Would I? I have. But for the future it would depend on the issue in hand.

I think that human element of understanding and genuine sympathy will always be important. That clever ability to help us articulate our issues and find ways to help ourselves. My favourite LLM, ChatGPT, is not there yet.

With an LLM, you need to know the questions to ask so that you can get to the bottom of your issue. There really is a bit of an art to prompting the ‘AeI’, and in the context of self-help I’d argue that prompting it effectively hints that you already know the answer you’re looking for.

So what if you don’t actually know what to ask? A qualified therapist would have the skill to coax the right questions out of you. ChatGPT would not currently do that.

There is also the chance that you could ‘steer’ the AeI towards the answer you want to get. A human would see that ploy a mile away and respond very intelligently.

You’d also benefit from the reassurance of confidentiality with a human therapist. You might also feel safer, and more at ease talking to them.

AI is making ground-breaking moves and attracting huge investment. Recently, the UK Government revealed a £100 million fund to capitalise on AI’s game-changing potential in life sciences and healthcare. They’ve also advised police chiefs to plough resources into harnessing AI technologies to help prevent and solve crimes.

And look at the new Beatles song. How would that ever have been possible without AI?

In my opinion, artificial intelligence and LLMs could for sure play a pivotal role in the future of therapy. They certainly have the ability to support and enhance it. To make it more accessible, for example to those who live in remote places or who can’t afford to see a traditional therapist.

AI has the power to give therapists ‘more time to do less’. More time to connect with patients on a human level in a way that only they ever could. I don’t think therapists should worry about, or even try to compete with, AI (or AeI as I refer to it). They are on a hiding to nothing if they do. Instead, they should embrace the technology and use it to 10x or 100x their already deeply impressive and quite frankly beautiful skillset.

First published: 8th November 2023 | Author: Anthony Kirrane.
