Patient Voices in the Age of AI: Opportunity or Visibility Theft?
Is a patient’s story still their story if it is told - or retold - by a machine?

Written by Patrick Toland - Ada Lovelace Centre AI Community Researcher (previously PPI Lead for a large health organisation and CEO of the Northern Ireland Rare Disease Partnership).
Artificial Intelligence (AI) is moving quickly into many corners of health and research. From summarising large datasets to generating new forms of communication, AI tools are now part of the conversations we need to have in patient and public involvement (PPI). At first glance, the idea of using AI to support patient storytelling sounds promising. Stories can be translated across languages, turned into images, made more accessible for different audiences, and woven into reports or campaigns more efficiently. But as with any technological advance, the opportunities come hand-in-hand with ethical and practical challenges.
Amplifying voices: the art of story-holding in an AI age
Patient stories have long been the heart of PPI. They bring texture and humanity to often dry evidence, help decision-makers see beyond the numbers, and remind us who is at stake in policy and practice. AI tools can offer new ways to amplify these voices. But amplifying a story has always carried a responsibility: to hold it with care, in context, and on the storyteller's terms.
That responsibility does not disappear with new technology - if anything, it becomes more important. AI should never replace human empathy, listening, and co-creation. The challenge is not whether we use AI, but how we ensure that patients remain at the centre of decisions about when and how their stories are told.
Ethical frameworks and emerging risks
The T.E.S.T. framework (Transformational Ethical Story Telling), developed by Our Race Community (2021), is a helpful example. It emphasises creating safer spaces, sharing power, and giving story-holders control over how their narratives are used. Story-holding is not just about capturing words; it is about holding the dignity, context, and meaning of someone’s lived experience with care (Toland, 2023, The Art of Story-Holding).
However, new technologies can complicate this. In AI, Rare Disease Research, and the Risks of “Synthetic Patients” (SWII, 2025), the creation of AI-generated profiles to stand in for real people is presented as a potential solution to evidence gaps in rare disease research. While efficient, this approach risks visibility theft - making patient experience more visible on the surface while stripping away the nuance, authenticity, and trust that come from lived voices. Crucially, digital stand-ins cannot meet regulatory requirements for genuine patient input, nor can they reflect the cultural and contextual richness of real lives. Synthetic patients may complement engagement by mapping gaps or improving accessibility, but they should never substitute for real voices.
Safeguarding authenticity in PPI
So what does this mean in practice? A few guiding principles may help:
Consent: Patients need to know if and how AI will be used to process or present their stories. Transparency around AI involvement is key.
Narrative ownership: Use recognised, accredited story-gathering frameworks so that patients (and their partner story-holders) can decide which parts of their experience are shared, edited, or amplified - and make clear that they will always remain in control.
Keep ‘the human’ in the loop: Even when AI helps with summarisation, translation, or the generation of synthetic data, there must be human review and oversight - especially to protect nuance and avoid bias or oversimplification.
These safeguards are not about slowing down innovation but about ensuring trust. Without trust, patient stories risk becoming content to be managed rather than truths to be honoured.
Looking ahead
AI is here to stay, and its potential in health research and PPI is undeniable. It can help us hear more voices, connect across boundaries, and strengthen the visibility of patient experience. But visibility without authenticity is a hollow gain.
The task for all of us in the PPI community is to make sure that new tools enhance, rather than overwrite, patient voices. If we get this right, AI can become a powerful ally in story-holding. If we get it wrong, we risk turning lived experience into something polished but disconnected from its source - in the most disingenuous way possible.
So let me leave you with two questions: How do you see AI fitting into the future of PPI storytelling? And what safeguards or opportunities do you think matter most?
References
SWII (2025) AI, Rare Disease Research, and the Risks of “Synthetic Patients”. Rare Insights, 9 September. Available at: swii.ch health.
Our Race Community (2021) T.E.S.T. Framework: Transformational Ethical Story Telling. Available at: Our Race Community [Accessed 23 September 2025].
Toland, P. (2023) The Art of Story-Holding: How Ethical Storytelling Helps Charities Create Real Change. Inside the Joseph Rowntree Foundation, Medium.


