THE CMG VOICE

Artificial Intelligence continues making inroads into medicine by interpreting x-rays

Followers of this blog may recall a recent [post](https://cmglaw.com/Blog/2018/04/Radiologists-watch-out-Artificial-I) on how Artificial Intelligence (AI) was being used to screen patients for the eye disease diabetic retinopathy.

Mere months later, more data has come out about the algorithm “CheXNeXt”, created by researchers at Stanford. It was created to review chest x-rays for 14 different medical conditions.

Recently, it was “trained” with over 100,000 x-rays, after which it was tested against a panel of three trained (human) radiologists. CheXNeXt and each of the radiologists reviewed 420 x-rays one by one. The results were encouraging (for AI): for 11 of the 14 diseases, CheXNeXt was as good as or better than the radiologists at catching the disease.

Not only is CheXNeXt’s accuracy similar to that of board-certified radiologists, it comes with the added advantage of being very fast. While each of the radiologists took about three hours to review the 420 images, it took CheXNeXt 90 seconds.

This has some practical advantages. First, it could be used in underserved parts of the world where skilled radiologists are lacking. Second, it could serve as a triage tool. For example, if a patient came into the ER and his physical exam and lab results were consistent with pneumonia, the ER doctor could use CheXNeXt to read the patient’s x-ray more quickly than waiting for the radiologist to read it. In such a circumstance, confidence in the diagnosis would be high, and antibiotics could be given to the patient sooner. However, if CheXNeXt came up with a different diagnosis, a radiologist could review the images and consult with the ER doctor as needed.

Additionally, it might serve as a quality-control tool, scanning the images interpreted by radiologists during the day and making sure that there were no “missed” diagnoses.

Forecasting the future: as much as the article makes an effort not to conclude that the jobs of radiologists may be at stake, one cannot help but draw that conclusion. From a medical negligence perspective, it will be interesting to see how claims involving radiology “misses” are handled when an algorithm is essentially making the diagnosis. If no human health care provider is involved, did anyone commit malpractice?

It may be that such claims in the future focus on either the manufacturer of the algorithm or on how it is implemented. Either way, the future of medicine – and accountability for harms resulting from negligent care – looks to be changing. And quickly.

You can read more here:

[Artificial intelligence rivals radiologists in screening X-rays for certain diseases](http://med.stanford.edu/news/all-news/2018/11/ai-outperformed-radiologists-in-screening-x-rays-for-certain-diseases.html)