The UK is facing a deepening mental health crisis, with rising demand, long waiting times, and growing concerns over the use of AI as a stopgap solution.
New NHS figures show that 3,790,826 people were in contact with mental health services during 2023/24, compared to 2,726,721 in 2018/19.
Despite the Government spending £11.31 billion on mental health in 2023/24 and £11.79 billion under the Mental Health Investment Standard (MHIS) in 2024/25, this is not enough to address the crisis. Waiting times to see a mental health professional can stretch to two years (Rethink Mental Illness), and while people wait, their mental health often deteriorates rapidly.
The NHS is attempting to close this critical gap by piloting AI assistants that offer therapy to people while they wait to be seen by a human health professional.
This quick-fix solution doesn't feel right: using an AI bot to offer advice and therapy on people's devices only encourages more reliance on technology. Surely AI is better placed to improve efficiency in the back office and support diagnostics in the lab than to offer support to a group of fragile and vulnerable people, especially when it could exacerbate their condition. The real answer is for the Government to invest more in recruiting additional mental health nurses and offering attractive salaries.
Mental health cases escalating
Mental health cases have escalated since the COVID-19 pandemic, lockdowns and the cost-of-living crisis. The speed of modern life and the overuse of, and reliance on, technology and social media, particularly among children, are contributing factors.
A staggering one million-plus children are seeking NHS support for their mental health, with 16-year-olds the most likely to do so. Ofcom reported that 46% of adolescents are online 'almost constantly', and that 97% of children have a smartphone by the age of 12.
Recruiting mental health nurses is challenging, and demand is outstripping capacity. A Royal College of Nursing survey last year revealed widespread dissatisfaction among nurses, with nearly half of respondents (45%) saying they were planning to quit or considering leaving the profession.
AI therapists
Several NHS Trusts are piloting AI apps, known as Digital Mental Health Interventions (DMHIs), to help people with mental health challenges.
Unlike everyday well-being apps, these are subject to clinical standards and regulation. They must integrate into existing care pathways and delivery infrastructure, and staff must be upskilled to use them.
It is also suggested that they should supplement rather than replace face-to-face delivery. Small-scale trials have reported that fully automated chatbot DMHIs can reduce anxiety and depression, and academics have stated that these could be used to support people while they wait to be seen by a professional.
Naturally, this poses many risks, from data privacy to patient safety. There is also the danger that human professionals stop overseeing what is happening with the patient altogether, leaving the patient alone with the bot and leaving decision-making to an AI!
However, a new Dartmouth study reported using a generative-AI-powered chatbot to provide mental health treatment in a randomised controlled trial. The bot was fine-tuned on expert-created mental health conversations grounded in cognitive behavioural therapy (CBT). Participants who used the personalised Therabot experienced significant improvements over an eight-week period: symptoms of depression decreased by 51%, anxiety by 31%, and eating disorders by 19%.
It is still early days: these chatbots and apps have been trialled only briefly, with too little data to justify such critical decisions. Can therapy really work, or be effective, without human connection?
AI cannot replace the human therapist
People are using ChatGPT for therapy because it's free and instant. One benefit of speaking to an AI bot is that it doesn't judge, so people don't worry about what they ask or talk about in the way they would with a human.
They can ask for advice on social or work situations, and children use it to ask about friendship issues. The answers are friendly, creative, and helpful, but they come from a machine driven by an algorithm, so they are not personalised. Let's not forget that it is a machine, and one not trained to deal with individual mental health needs.
A human therapist knows the patient as a person: their back story, their likes and dislikes; they can read facial expressions and body language. They challenge patients with questions, pushing them to think outside the box or from another person's perspective to aid their healing journey.
An AI bot cannot do any of this; it lacks empathy, nuance, reflection, and understanding. As humans, we are all different and don't fit neatly into an algorithm, least of all someone with complex mental health issues.
A significant concern is that whatever you put into the algorithm may be recorded and used elsewhere, posing serious security risks for people's personal, sensitive, and highly confidential data.
AI for adolescence
Any parent who has watched the harrowing global hit TV show Adolescence has almost certainly reduced their child's screen time and monitored usage.
Are parents really going to be comfortable with the suggestion that, while their child waits 18 months to two years to see a mental health professional, they can use an AI bot in the interim? I would have thought not!
AI has its place
AI is the perfect solution for streamlining back-office processes and relieving mental health professionals of unnecessary burdens, enabling them to focus on patient care rather than administrative tasks and reporting. It can also improve triaging, prioritising, and screening of emails and calls in the contact centre, and flag emergency calls.
Plus, it can automatically send personalised appointment reminders in advance and follow up with patients who miss appointments, which are very costly to the NHS. This can lead to shorter waiting times, improved patient care, and better retention of staff relieved of the administrative overburden.
A balanced approach
Using AI to improve back-office processes and efficiency is a no-brainer, and it can also be used in labs for precise diagnosis, monitoring, and risk assessment.
But to start using it to plug the gap for fragile and vulnerable people, especially children who may wait 18 months to two years to see a mental health professional, is high risk.
It is an easy, quick distraction from the real issue that needs addressing: increasing investment to recruit more mental health nurses and offering them the higher pay they deserve. If we don't, the mental health crisis will only escalate, robots or not.