The NHS must rethink its plans to replace mental health counselors with artificial intelligence (AI), experts warn.
Smartphone apps designed to support people with anxiety and depression are being launched in parts of England. The software is even being offered to some patients on NHS waiting lists as part of an ongoing trial.
The interactive “chatbots” help people with mental illness by guiding them through cognitive behavioral therapy – a form of talk therapy – as well as meditation and breathing exercises to ease their suffering.
But the initiative – first proposed by former health secretary Matt Hancock in June 2021 – has raised alarm that some patients who need good mental health care may be turning to apps instead of getting the help they need.
And some experts worry that a lack of human involvement may actually exacerbate the mental health problems of vulnerable people.
The British Association for Counselling and Psychotherapy (BACP), the leading professional body for mental health practitioners, told The Mail on Sunday that the NHS should not try to address the national shortage of such professionals by simply replacing them with AI-powered chatbots.
The organization called on the NHS to focus instead on hiring more staff. “We do not believe that AI can recreate and replace the human elements of therapy,” said Martin Bell, head of policy and public affairs at BACP.
“Counseling is based on a deeply human process involving complex emotions. The relationship between therapist and client plays a crucial role in therapy.”
Around five million Britons suffer from anxiety or depression, and around 1.2 million are waiting to see an NHS mental health specialist. According to government figures, this includes nearly 700,000 children. Waiting times are so long that thousands of patients come to emergency departments seeking help, according to the Royal College of Psychiatrists.
AI chatbots are now being deployed to tackle this growing crisis. The smartphone app Wysa has already been made available to thousands of teenagers in West London to help them cope with mental illness.
When a user logs in, the app asks how their day is going. For example, if they’re feeling anxious, the chatbot will guide them through meditation and breathing exercises to calm their mood, using language meant to express empathy and support.
The app is also being used in a £1m study for patients on the NHS mental health waiting list in North London and Milton Keynes, comparing their wellbeing with that of other patients on the waiting list without access to the app.
However, many of the published studies highlighting the benefits of Wysa – and another widely used app called Woebot – were conducted by the companies themselves, and experts fear this could mean the evidence overstates how effective the software really is.
“Some people may feel less embarrassed to talk to a chatbot about their mental state,” says Dr. Elizabeth Cotton, lecturer at Cardiff School of Management and author of a forthcoming book, UberTherapy: The New Business Of Mental Health.
“But a chatbot can’t handle clinical depression. They don’t do much more than say, ‘There, there, dear.’ And they are no help for young people living in poverty, excluded from school or dealing with abusive parents.”
Another problem is that some chatbots “hallucinate,” meaning they make up answers when they can’t give an appropriate response—which is inherently dangerous for someone in a delicate mental state.
Meanwhile, AI was also implicated in the case of 21-year-old Jaswant Singh Chail, who was sentenced to nine years in prison last month after breaking into the grounds of Windsor Castle in 2021 with a crossbow, intending to kill the Queen.
It was revealed at the Old Bailey trial that Chail exchanged more than 5,000 messages with an online bot he created through an app called Replika – which describes itself on its website as an “empathetic friend”.
And the National Eating Disorders Association in the US was forced earlier this year to pull the plug on Tessa, a chatbot it had introduced to replace its human helpline staff. It followed claims by Sharon Maxwell of San Diego, a former eating disorder patient, that the bot told her a good way to cope was to weigh herself regularly and even measure her body fat percentage with calipers – steps that would likely have worsened her condition.
“If I had used this chatbot when I was in the middle of my eating disorder… I wouldn’t be alive today,” she wrote on Instagram.
A spokesperson for Wysa said the AI chatbot is programmed to give only clinician-approved answers. “Wysa’s responses are predetermined by doctors, which reduces the likelihood of it saying anything inappropriate,” she added. “We do not market ourselves as an app suitable for someone who has suicidal thoughts or wants to harm themselves.”
An NHS spokesman said: “The National Institute for Health and Care Excellence has made it clear that the digital therapies it has provisionally approved for mental health care are not a replacement for NHS therapists.”