DeepMind AI matches health experts at spotting eye diseases

DeepMind has successfully developed a system that can analyze retinal scans and spot symptoms of sight-threatening eye diseases. Today, the AI division -- owned by Google's parent company Alphabet -- published "early results" of a research project with the UK's Moorfields Eye Hospital. They show that the company's algorithms can quickly examine optical coherence tomography (OCT) scans and make diagnoses with the same accuracy as human clinicians. In addition, the system can show its workings, allowing eye care professionals to scrutinize the final assessment.

At the moment, hospitals and clinics rely on flesh-and-blood specialists to interpret OCT scans. The sheer volume they have to process, however -- Moorfields Eye Hospital analyzes over 1,000 every day -- means there can be substantial delays between the initial scan, diagnosis and treatment. Occasionally, problems are caught too late and the developing symptoms cause irreversible sight loss.

DeepMind's ultimate aim is to develop and implement a system that can assist the UK's National Health Service with its ever-growing workload. Accurate AI judgements would lead to faster diagnoses and, in theory, treatment that could save patients' vision. "These incredibly exciting results take us one step closer to that goal," said Mustafa Suleyman, co-founder and head of applied AI at DeepMind Health. "And could, in time, transform the diagnosis, treatment and management of patients with sight-threatening eye conditions, not just at Moorfields, but around the world."

A consultant ophthalmologist analyzes an OCT scan.

DeepMind's system uses two separate 'networks' to tackle the problem. The first, called a segmentation network, converts the raw OCT scan into a 3D tissue map with clearly defined, color-coded slices. "That map doesn't only describe the layers of the eye, but if there's disease in the eye, and where that disease is," said Alan Karthikesalingam, a senior clinician scientist at Google DeepMind. The network was trained to do this with a dataset of 877 OCT scans manually segmented by trained ophthalmologists.
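For readers who want a concrete picture of what this kind of segmentation network does, here is a minimal, illustrative sketch in Python using PyTorch. It is not DeepMind's published architecture; the layer sizes and the number of tissue classes are assumptions chosen only to show the idea of mapping a raw OCT volume to per-voxel tissue labels.

```python
# Illustrative sketch only (not DeepMind's model): a toy 3D fully convolutional
# network that maps a raw OCT volume to per-voxel tissue-class logits -- a
# "tissue map". Layer sizes and the class count are assumptions.
import torch
import torch.nn as nn

NUM_TISSUE_CLASSES = 15  # assumption: number of tissue/pathology labels


class ToySegmentationNet(nn.Module):
    def __init__(self, num_classes: int = NUM_TISSUE_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        # A 1x1x1 convolution produces one score per tissue class at every voxel.
        self.classifier = nn.Conv3d(32, num_classes, kernel_size=1)

    def forward(self, volume: torch.Tensor) -> torch.Tensor:
        # volume: (batch, 1, depth, height, width) raw OCT intensities
        return self.classifier(self.features(volume))


if __name__ == "__main__":
    # Per-voxel cross-entropy against ophthalmologist-drawn segmentations is one
    # standard way to train such a model.
    net = ToySegmentationNet()
    scan = torch.randn(1, 1, 16, 64, 64)                         # dummy OCT volume
    target = torch.randint(0, NUM_TISSUE_CLASSES, (1, 16, 64, 64))  # dummy labels
    loss = nn.CrossEntropyLoss()(net(scan), target)
    loss.backward()
    tissue_map = net(scan).argmax(dim=1)                          # discrete 3D tissue map
```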

A second 'classification' network analyzes the 3D tissue map and makes decisions about what the diseases might be and how urgent they are for referral and treatment. It was trained on 14,884 tissue maps that were produced by the segmentation network and checked by a trained ophthalmologist and optometrist.
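A rough sketch of that second stage, again purely illustrative: a toy classifier that consumes a one-hot encoded tissue map and produces both disease scores and a referral-urgency suggestion. The disease count and referral categories below are assumptions, not the published model's outputs.

```python
# Illustrative sketch only: a toy classification network over tissue maps.
import torch
import torch.nn as nn

NUM_TISSUE_CLASSES = 15   # must match the segmentation output (assumption)
NUM_DISEASES = 10         # assumption: number of pathologies scored
REFERRAL_LEVELS = ["urgent", "semi-urgent", "routine", "observation only"]  # assumption


class ToyClassificationNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(NUM_TISSUE_CLASSES, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
        )
        self.disease_head = nn.Linear(32, NUM_DISEASES)            # multi-label scores
        self.referral_head = nn.Linear(32, len(REFERRAL_LEVELS))   # urgency scores

    def forward(self, tissue_map_onehot: torch.Tensor):
        h = self.features(tissue_map_onehot)
        return torch.sigmoid(self.disease_head(h)), self.referral_head(h)


if __name__ == "__main__":
    net = ToyClassificationNet()
    # Fake tissue map: (batch, classes, depth, height, width), one-hot per voxel.
    fake_map = torch.zeros(1, NUM_TISSUE_CLASSES, 16, 64, 64)
    fake_map[:, 0] = 1.0
    disease_probs, referral_logits = net(fake_map)
    print(REFERRAL_LEVELS[referral_logits.argmax(dim=1).item()])
```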

The two-stage process is unusual. A conventional AI system would start with the original retinal scan and go straight to the final diagnosis. DeepMind developed its tool this way so clinicians can check the tissue map and see how the AI came to its final conclusion. "You might wonder, 'why did the system think that there's macular edema, which means fluid in the eye?'" Karthikesalingam explained. "And you could look back at that interpretable tissue map and say, 'Oh, I see, it's highlighting some fluid here.' And that would be, we think, potentially helpful."
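Tying the two toy networks above together makes that design choice visible in code: the intermediate tissue map is a first-class output a clinician could inspect alongside the automated suggestion. This is a sketch of the flow, not DeepMind's implementation.

```python
# Illustrative sketch of the two-stage flow described in the article, using the
# toy networks defined earlier. The tissue map is returned so it can be reviewed.
import torch
import torch.nn.functional as F


def two_stage_triage(raw_scan, seg_net, cls_net, num_tissue_classes=15):
    """Return (tissue_map, disease_probs, referral_logits) for one OCT volume."""
    with torch.no_grad():
        logits = seg_net(raw_scan)                          # (1, C, D, H, W)
        tissue_map = logits.argmax(dim=1)                   # interpretable 3D map
        onehot = F.one_hot(tissue_map, num_tissue_classes)  # (1, D, H, W, C)
        onehot = onehot.permute(0, 4, 1, 2, 3).float()      # back to (1, C, D, H, W)
        disease_probs, referral_logits = cls_net(onehot)
    return tissue_map, disease_probs, referral_logits
```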

An OCT scan.

That breakdown promotes trust and gives eye care professionals the information they need to debate complex cases with multiple treatment options. The separation also ensures that the system can be used with any type of OCT scanner. Oftentimes, a change in hardware will produce OCT scans that deviate from the dataset the AI was originally trained on, and that shift can be enough to break the system, requiring hundreds, if not thousands, of new training images before it works reliably again.

"It's a big step toward AI systems that are agnostic."

"That's fine in many non-healthcare fields because of the widely available nature of non healthcare images, but in healthcare it could be a real challenge, because imagine a scanner gets updated — it wouldn't be great for patients to then have to wait for several years before hundreds of thousands of new images have been taken on the new scanner," Karthikesalingam said. Deepmind's system, by comparison, only requires 100 images to retrain the segmentation network. The classification network, meanwhile, can be left alone. "It's a big step toward AI systems that are agnostic, or at least very flexible and quick to adapting to new devices."

The system needs to pass clinical trials and regulatory approval before it can be used on the front lines of the NHS. DeepMind also wants to validate its results with further testing and refinements to the underlying algorithms. That, according to Karthikesalingam, could take another three to five years. Moorfields Eye Hospital is hopeful about the technology's future, though. Professor Sir Peng Tee Khaw, director of the NIHR Biomedical Research Centre at Moorfields Eye Hospital, said: "I am in no doubt that AI has a vital role to play in the future of healthcare, particularly when it comes to training and helping medical professionals so that patients benefit from vital treatment earlier than might previously have been possible."

A technician performs an OCT scan on a patient.

Last year, DeepMind was criticized over a data-sharing deal struck with the Royal Free NHS Foundation Trust. The company used 1.6 million patient records to develop an app called Streams that could diagnose acute kidney injuries in NHS patients. The legal basis of the transfer, though, was deemed inappropriate by the UK's national data guardian and the Information Commissioner's Office. "There was some controversy about that project, and concerns were raised around, for example, how patients at the Royal Free were informed about how their data was being used," said Dominic King, clinical lead at DeepMind Health. "We have certainly learned from that work and the guidance given."

"We're working very hard to be as transparent as we can be."

The AI system developed with Moorfields, however, is strictly a research project and falls under a different legal purview. Moorfields applied to the UK's Health Research Authority for anonymized OCT scans and retains control of the resulting database. (Moorfields is already using the database for nine other medical research studies.) King says the team has taken steps to inform the public about the project through various charities, the Moorfields website and hospital, and DeepMind's own website. The Alphabet-owned division is now working on technical infrastructure that logs every time one of its AI systems interacts with sensitive data.

"Patients and the public have an absolute right to know how their data is being used, and by who," King said. "We're working very hard to be as transparent as we can be."
