Saving money and lives
In a recent case Gladding was involved with, AI applied retrospectively to an X-ray found lung cancer in a woman in her sixties – a lesion that had been missed when the image was taken nine months earlier. “Unfortunately, this particular technology is not approved for clinical use, but hospitals need to seriously consider how to use AI, because it has the potential to significantly streamline our workload.” He says large hospitals in the US are developing AI technologies, validating them and using them clinically. “We need to start thinking about doing the same.” In another case, involving an advanced ECG, he says the test identified the genetic cause of a sudden cardiac arrest in a 32-year-old woman.
Gladding believes AI can play an important role in public health, but needs an influential supporter at government level to drive its use. He’d like to use it as a tool for screening and triaging heart patients waiting for hospital clinic appointments, for example, after advanced ECGs performed well in a pilot study. He says it could save money and lives. “A lot of the focus is on big-ticket items. We see lists blowing out in CT or MRI scanning, so people buy a new machine for $1-2 million. These things get funded almost without much question. People seem to overlook tools they already have, thinking you can’t get any more out of the basics, but that just isn’t true.”
He says the public, and health authorities, need to overcome their scepticism about the technology. “The public should know more so they are kept in the loop about what’s happening with their data, and discover that it’s not all hype and that it is exciting. With the right governance and oversight, it won’t become Big Brother or rogue intelligence that could prevent people getting access to healthcare. If anything, you expect it would do the opposite, by enabling primary care to make much more informed decisions, rather than the ambulance at the bottom of the cliff approach we have now.”
In two studies supervised by Gladding at the Waitematā DHB and published in 2015 and 2017, more than 370 ECG readings were assessed by advanced ECG AI, cardiologists and GPs. The AI-read ECG was better at identifying disease than either group of doctors, and predicted which patients would be readmitted with heart failure or suffer a sudden cardiac death.
“There is a temptation in medicine to be cautious, and medicolegally defensive, which results in overcalling disease, over-ordering investigations and procedures to make sure not to miss anything. That reduces the ability to see health when no disease is present. The advanced ECG is not only better at seeing disease, but substantially better at seeing health than a human.”
Gladding says some patients have to wait longer than they should with current, time-consuming triaging methods, and the AI could also speed up diagnoses. “When I am triaging, I might spend a minute or two or less reading the GP’s notes, so they have to be brief and to the point. Generally, I won’t go into the rest of that patient’s clinical records to see what’s happened in the past. I just don’t have time. But a machine could do that in an instant and bring other information to the fore.”
At present, machine triaging is being investigated by Precision Driven Health, an initiative managed by Auckland-based software company Orion Health. “It will be a while before we get to the point where we have only machine-assisted triaging, if we do at all. We may discover things you just can’t predict.” However, Gladding says, a retrospective analysis of patients referred for ECGs and ultrasound has found that machines predicted, with about 85% accuracy, what an ultrasound would later find, based on information doctors had already gathered.
The advanced ECG was pioneered by flight surgeon Todd Schlegel when he worked at the National Aeronautics and Space Administration’s (Nasa) Johnson Space Center and has been used on the International Space Station. Schlegel, now in Switzerland, has been working with Gladding to fine-tune the algorithms for cloud-based analyses of advanced ECGs done in Auckland. More than 1000 have been collected and analysed so far.
Nasa technology is also fundamental to research by a 19-year-old software-engineering wunderkind at the Auckland Bioengineering Institute (ABI), who is training artificial intelligence to interpret cardiac ultrasounds.
Will Hewitt “dropped out” of Nelson College without completing his final year and set up his own electronics consultancy doing small projects, before approaching public-health physician and health-technology expert Robyn Whittaker about his passion for AI and cardiac ultrasound imaging. Whittaker put the then 17-year-old in touch with Gladding, who directed him to the University of Auckland’s ABI, which, under the leadership of internationally renowned professor Peter Hunter, specialises in building computational models of the heart. The work has aimed to better understand the biology of the heart and how it works, in the hope of improving diagnostic technology and potentially developing drugs or therapies to correct abnormalities. Information fed into the diagnostic algorithms is highly personalised, including blood, biomarker and genetic data.
Hunter had no qualms about taking on the undergraduate as a part-time researcher, and describes the machine-learning work as “Will’s baby”. People with the most in-depth knowledge on any subject are those who are passionate about the subject, not those tied to their textbooks, he says.
“He’s incredibly fast at absorbing information, and very focused,” Gladding says of Hewitt. “When people turn up as he did, you wonder how much they can know about something complex, such as cardiac ultrasound. It requires years of training to understand, but he’s like a sponge. He’s soaked it up in six months to a year.”
Hewitt is doing an undergraduate degree in applied maths and physics, while also working to commercialise the software that interprets the ultrasounds. He has submitted a paper to the European Society of Cardiology conference to be held in Paris in August, about one of his studies, which showed that AI assessment is more consistently accurate than that of the human eye. More computer power could make AI readings much faster and far more accurate, he believes. “Patrick and a sonographer can spend anything from 40 to 50 minutes annotating the ultrasound and taking measurements off the scan. The object of the deep-learning project is to take those measurements for him, in 10 minutes rather than an hour. We are trying to see if we can get the same numbers he does, but faster.”
If all the research went perfectly, it could still be five years or more before the technology is applied clinically or commercially, says Hewitt. And there is a risk of doctors being too reliant on an AI technology, which is only as good as the data that’s fed into it, including outliers. “You can’t release a tool that might miss 5% of patients.”