Across the NHS, technology is often slowing people down rather than speeding them up. We’ve got a lot to do to find ways to release precious clinical time to be put to other purposes.
So acknowledged Tara Donnelly, Chief Digital Officer (CDO) of NHSX, at the Intelligent Health UK conference in London this week (Wednesday), where she gave an update on how the seven-month-old unit is addressing this challenge.
One of the key problems is a lack of interoperability across NHS systems; here, Donnelly pointed to the NHS login. Originally built as a way to authenticate users of the NHS App, it has since been integrated with many other products and has now been opened up as a self-service API.
This is really good news for digital health innovators. Authentication is tricky, time-consuming and costly to develop. This is a wheel we don't think needs constant reinvention; we would much rather open up national NHS tools to the market, to the eco-system, so they can use it for good for more patients.
NHSX ran a three-month programme from last July to September, asking health innovators what the unit should focus on, and one of the biggest areas raised was procurement, especially for new players. In response, NHSX has already made changes to the procurement process. Donnelly explained:
Procurement can be really difficult for companies just beginning within the NHS. Sometimes the procurement rules that the NHS sets up, not on purpose but inadvertently, can shut out people who are earlier stage.
NHSX has now opened up Lot 0, or the "innovation greenhouse", which features different criteria from the other lots to help startups and smaller businesses get their tech out across the NHS. Donnelly said:
That was a way we can try to link some of the work we’re doing with digital health standards and say that if you meet these standards, it should be easier for NHS organisations to commission you. We’re pleased to have something practical moving forward in the procurement space.
More pressing concerns?
However, not everyone is as enthused about the potential of tech across the NHS as Donnelly. She recounted a recent conversation with someone who had just spent the evening in the busiest A&E in the country on a wet winter night:
They said, ‘I really love this digital innovation stuff but AI felt very far away from my experience that evening.’ I have a lot of sympathy for that point, I come from an operational background, I am absolutely rooted in the real life of the NHS.
But I think if we select those tools that meet our most pressing problems, the opportunity is huge to make a difference in that A&E, and in many others across the country.
She went on to share examples of technology projects already having an impact.
Skin Analytics provides an attachment that goes on the back of a smartphone and looks for early-stage skin cancer. A study across seven NHS trusts, led by the Royal Free, compared the ability of clinicians and AI to detect cancer from images of moles. The AI tool successfully identified all the melanomas, matching the clinicians' performance.
Donnelly noted that if skin cancer is caught at stage 1, the patient has a 95 percent chance of complete recovery; if it is only caught at stage 4, this drops to as low as 8 percent. The AI tool found a large number of cancers at stage 0 and stage 1, which could make an enormous difference to how people recover.
Health Navigator, meanwhile, uses AI to identify the patients most likely to attend A&E, so that care can be focused on them. The Nuffield Trust ran a test with a cohort of 1,000 patients: those who received nurse coaching had 34 percent fewer A&E attendances than the control group. Donnelly said:
That argument that these tools are futuristic and that they are not dealing with today’s problems is not necessarily the case. It may be we need to prioritise those for which the NHS and social care has the most pressing demand. Data-driven technology doesn’t have to be fancy to make a difference.
A central focus on AI
Looking ahead, AI will be a core focus for NHSX. The unit received £250m of government investment to establish an AI Lab, which includes £140m for a newly launched AI Award.
However, Dr Allison Gardner, Programme Director for Data Science Degree Apprenticeships at Keele University and co-founder of Women Leading in AI, gave a rather stark warning about the use of AI across the NHS.
Gardner cautioned that with systems like Skin Analytics, the AI might not be as good at detecting skin cancer from mole images on black skin as on white skin, especially if the data used to build the algorithm comes primarily from fair-skinned populations in the United States, Australia and Europe.
Another example is an algorithm used in the US to assess potential child neglect and flag the need for further investigation. Its results are inaccurate about 60 percent of the time, and it has been shown to be biased against poor families. Yet because the system displays a risk score on a big red screen at the end of the assessment, Gardner noted, it is hard for social workers to argue against it, and there is evidence that they default to the score because it protects them: they can blame the algorithm. She added:
One of the things that has happened is that if you are a less skilled user of these systems, you trust them more. Highly skilled people tend to not trust the algorithm as much. So not only have they reduced the workforce, in terms of numbers, but they’ve actually lowered the skill level of the workforce as well, as they found this algorithm was beginning to train the workers.
As the experienced ones leave or retire, the new ones are becoming trained by that algorithm, which isn’t that accurate and is biased. Be conscious of those problems in these implementations.
Gardner also called out an algorithm that predicts paediatric illnesses and was found to perform as well as experts. However, this was only the case when the AI was compared against sample groups of junior doctors, not against senior consultant-level staff. Gardner said:
How are they going to use that algorithm? If the junior doctors are using it to train with, will they ever get to be as good as the senior doctors?
This is a little warning that we need to be a little bit more careful about how we look at using AI.