AI is just as good as a doctor at analyzing chest X-rays

Shawn Knight

Forward-looking: Artificial intelligence is as good as – and in some cases even better than – doctors at analyzing X-rays and diagnosing certain medical conditions. Researchers from the University of Warwick trained their AI on 2.8 million historic chest X-rays from more than 1.5 million patients to check for 37 different conditions. The X-rays were collected over a period of 13 years from three hospital networks in the UK.

For 35 of the 37 conditions, the AI was just as accurate or more accurate than a doctor's analysis at the time the X-rays were taken.

To verify the AI's accuracy, a selection of over 1,400 X-rays from the study was also examined by a group of senior radiologists. They compared the results from the AI with the diagnoses made by radiologists when the scans were originally captured.

The researchers' AI is able to analyze scans as soon as they are taken and flag any potential conditions or abnormalities. It can also leverage a large language model to digest historical reports accompanying scans for a more in-depth understanding.

Dr. Giovanni Montana, a professor of data science at Warwick and lead author on the study, said the tool could be useful as an early screening step or as an "ultimate second opinion."

AI would eliminate the unavoidable element of human error and could also weed out bias. As Dr. Montana highlights, if a patient were referred for an X-ray because of a heart problem, the doctor would inevitably focus on that organ and could overlook a problem with the lungs.

The tech could also help lessen doctors' workload and bring concerning scans to their attention sooner than they might otherwise get to them. According to a recent poll by the Royal College of Radiologists, 97 percent of the UK's cancer treatment facilities experienced treatment delays due to a shortage of radiologists.

The team's AI, dubbed X-Raydar, is available for the research community to trial in non-clinical applications through a pair of APIs.

Image credit: Anna Shvets


 
This is AI's actual use: as an assistant for work you already know how to do, where the AI can give a view of the problem with its bias instead of yours. It's a way to get another take on a problem without having to deal with other personalities, egos, or simple availability problems.
 
Trust but verify.
That’s not exactly the meaning of trust.

What you have said is: trust your bungee instructor, but verify that all the ropes are in good order, rated for the correct weight, and tied off correctly. Not exactly trusting.
 
The name, X-Raydar, has to change. It sounds like a camp superhero with chiselled good looks and the imagination of a walnut.
 
To say that this is as good as, or better than, doctors is one thing, and if that's true, then good for them! Congratulations. But to say that it can offer an unbiased second opinion (impossible; it will be biased by whatever training data was used) and will eliminate human error (also impossible; the system designers can still make mistakes, doctors can still look at the results of a completely different person, ...) are falsehoods.

Let's not overhype these achievements please.
 
I wonder if malpractice insurers will start either charging a premium for such tools or offering a discount, depending on the statistical advantage, or on whether the tools create false-positive analyses based on biases that weren't previously there. I guess to mitigate their risk (from the malpractice insurer's perspective), I could see AI tools being pushed; insurers will probably push for AI tools in the hope that AI will improve over time. The same goes for third-party insurers paying for such services. Does the third party pay for multiple services and opinions, or do they pay a higher premium for a practitioner who gets it right on the first try using AI tools that are statistically better than non-AI tools?
 