Public experiences with AI in NHS care reveal both optimism about potential benefits and concerns about risks, accessibility, and accuracy as AI systems are integrated into health services.
As the government aims for a world-leading AI-enabled NHS, initial feedback suggests that while AI tools may improve care, access, and efficiency, people have mixed feelings about their impact on safety, accessibility, and the accuracy of patient support. Healthwatch’s 2025 analysis reports both promising benefits in health advice and information, and challenges when patients use AI for appointments, advice, and communication.
Why patients turn to AI for NHS support
The 10 Year Health Plan aims to make the NHS the world’s most AI-enabled care system. The NHS is progressing, with providers using AI note-taking tools to generate real-time transcriptions and summaries of clinician-patient conversations.
Healthwatch examined how and why people use AI for health information and advice, and explored their attitudes towards, and experiences of, AI in NHS administration.
The research team found that people were generally using AI to support their health because of difficulties accessing NHS services or poor experiences with the NHS. This included:
- Someone who turned to ChatGPT following frustration trying to get advice from NHS 111
- One person who was struggling to get a diagnosis from their GP
- Someone who used AI tools for advice because of a lack of physiotherapy appointments
“The earliest appointment was in 12 days, so I went to ChatGPT for advice.” Story shared by Healthwatch Somerset.
“Trying to talk to NHS 111 was exhausting, so in desperation, I put my info to ChatGPT. My age, my weight, etc, and it gave me the advice I needed like a good pharmacist.” Story shared by Healthwatch Hertfordshire.
AI advice: Personalised but not always accurate
People were often impressed by the personalised, detailed information from AI, noting that these tools sometimes offered more relevant advice than existing NHS online services.
However, a significant concern was the risk that AI tools would provide inaccurate information.
One person told Healthwatch they had consulted ChatGPT about a rash and accompanying pain. ChatGPT suggested they might have shingles, mild cases of which can be monitored without treatment. The person was eventually diagnosed with Lyme disease, which should be treated as soon as possible, as delayed treatment increases the risk of complications. Although they were able to get the treatment they needed the same day, this might easily not have been the case.
In another example, a nurse personally tested an AI tool to see what it advised when they were unwell. The AI tool advised blood tests, though the nurse was certain they could manage their symptoms at home.
“I inputted [sic] my symptoms, but it didn’t ask me the questions I thought it would ask…It advised me I needed a blood test, but I knew I didn’t; it could all be done with self-care. I worry AI will be sending patients for tests they don’t need.”
AI struggles with NHS admin tasks
Healthwatch’s exploration of AI use in booking appointments and managing medications found that most people’s experiences were poor.
Many reported that AI systems could not understand their health or care needs. People also cited problems with WhatsApp AI receptionists, finding them hard to navigate and often unable to transfer bookings to the NHS.
Requesting or changing medications through AI was challenging. One person requested a change due to side effects but received the same medication, causing a treatment delay.
One person reported that: “The AI system is not able to work anything out for itself without being prompted… The AI system is the one left to work out my medication requirements, and it’s obvious that it is not that intelligent as it is unable to read detailed individual notes, or work out individual medication dispensing information.”
AI and accessibility for people with communication needs
Accessibility for people with communication needs is a particular area of concern. For example, one person registered as blind told Healthwatch that some AI chatbots are not compatible with screen-reading software. People with autism and/or learning difficulties also found that AI systems could not understand them or their requests.
One person said AI would be inaccessible for some vulnerable groups, including digitally excluded people, those with sensory or cognitive impairments, communication needs, or mental health issues.
“All I want to do is order a prescription, but because of my learning difficulties and speech issues, this AI system won’t understand me like a real human.”