ChatGPT ‘uncovered woman’s rare condition’ after years of misdiagnosis

ChatGPT has helped to uncover a woman’s rare condition after years of being misdiagnosed by doctors.

Phoebe Tesoriere, 23, claims she was told she was anxious, depressed, had epilepsy and warned she’d be treated as a mental health patient if she kept returning to A&E.

Following three days in a coma after a seizure, Phoebe, from Cardiff, put her symptoms into the AI chatbot.

She said it suggested a number of conditions, including hereditary spastic paraplegia, which Phoebe presented to her GP. Genetic testing confirmed the diagnosis.

Cardiff and Vale University Health Board said: "We are sorry to hear about Phoebe's experience while in our care."

GP Dr Rebeccah Tomlinson said if people use tools such as AI chatbots to research health concerns, these should then be discussed with a medical professional.

A recent University of Oxford study found that people using AI for healthcare advice were given a mix of good and bad responses, making it hard to identify what advice they should trust.

Phoebe understands the challenges the hospital faced diagnosing her, but said she turned to AI after finding the experience "really lonely".

"I had to fight to be listened to," she added.

"All my childhood I had a limp.

"I was born without a hip socket and had operations as a baby, so thought it was to do with that."

She also struggled with her balance as a child and was tested for dyspraxia – a condition which affects physical co-ordination.

But she did not have this.

When Phoebe was 19, she collapsed and suffered a seizure at work.

But she says doctors told her it was anxiety, which was then added to her medical records.

"I had no history of anxiety, I was a really happy, bubbly person," she said.

In 2022, Phoebe says, she was diagnosed with epilepsy and prescribed medication.

In December 2024, she began to feel unwell again.

She could not keep her epilepsy medication down, which led to more seizures.

Phoebe was struggling to walk and was misdiagnosed with Todd's paralysis, a neurological condition experienced by people with epilepsy, in which a seizure is followed by a brief period of temporary paralysis.

In January 2025, Phoebe fell down the stairs, which led to three months in hospital and inconclusive tests.

Then in July 2025, a significant seizure left Phoebe in a coma for three days.

When she recovered, she claims a doctor told her she did not have epilepsy – she had anxiety.

It was then that Phoebe put her symptoms into ChatGPT.

The chatbot came back with a list of possible conditions, including hereditary spastic paraplegia.

"I went back and forth with my partner, questioning 'do I go to the doctors?', 'do I not?', 'what should I do?', 'surely it can't be that'," she said.

Thankfully, the GP agreed it could be a "plausible reason".

Genetic testing then confirmed the AI suggestion.

The NHS says it is not known how many people have hereditary spastic paraplegia because it is often misdiagnosed.

Symptoms can be managed through physiotherapy.

Phoebe is no longer able to continue working as a special educational needs teacher due to her symptoms and uses a wheelchair.

She is now pursuing a new career path, studying for a master's degree in psychology, because she still wants to "do something that helps people".

A spokesperson for the Cardiff and Vale health board said: "As it would be inappropriate to comment on an individual patient case, we are unable to comment further.

"Phoebe is welcome to contact our concerns team should she wish to discuss any aspect of the care she received at Cardiff and Vale University Health Board."

Dr Rebeccah Tomlinson is a GP serving Cardiff and the Vale of Glamorgan, and said: "It's difficult for GPs to know everything.

"With the pressure on the NHS, we have to know even more.

"Patients coming with information helps me understand what they are thinking and guide the discussion more clearly.

"[AI tools are] good as a starting point for a conversation, which should be followed by going to a medical professional to discuss concerns further.

"It's helpful for patients to come armed with information, but the GP has to be open and receptive to the patient.

"General practice has to be a two-way conversation."

While AI is increasingly becoming part of everyday life, using it for health purposes has divided opinion.

Earlier this year, a study from the University of Oxford found that AI chatbots gave inaccurate and inconsistent medical advice that could present risks to users.

In January, a new ChatGPT feature was launched in the US with the aim of analysing people's medical records to give them "better answers", according to developer OpenAI.

The company said it was not meant to be for "diagnosis or treatment", but 230 million people ask its chatbot questions about their health and wellbeing every week.

While campaigners raised concerns about ChatGPT Health having access to sensitive health data, OpenAI said it was designed to "support, not replace, medical care".

It is unclear if or when the feature will be introduced in the UK.

While debate rages over their use, millions of people, including Phoebe, are increasingly using AI tools for tasks that range from personalising social media feeds, to spotting friends and family in smartphone photos, and asking for advice on everyday issues.
