Events Calendar

A Behavioral Health Collision At The EHR Intersection
2014-09-30    
2:00 pm - 3:30 pm
Hear why many organizations are changing EHRs in order to remain competitive in the new value-based health care environment [...]
Meaningful Use and The Rise of the Portals
2014-10-02    
12:00 pm - 12:45 pm
Meaningful Use and The Rise of the Portals: Best Practices in Patient Engagement. Thu, Oct 2, 2014, 10:30 PM - 11:15 PM IST. Join Meaningful [...]
AdvaMed 2014: The MedTech Conference
2014-10-06    
All Day
AdvaMed 2014: The MedTech Conference, October 6-8, 2014, McCormick Place, Chicago, IL. For more information, visit advamed2014.com. For registration details, click here.
Public Health Measures Meaningful Use
2014-10-09    
12:00 pm - 12:45 pm
Public Health Measures Meaningful Use: Reporting on Public Health Measures. Join Meaningful Use expert Jim Tate for a three-part series of webinars addressing MU [...]
2014 Hospital & Healthcare I.T. Conference
2014-10-13    
All Day
Join us at our 2014 Hospital & Healthcare I.T. Conference and experience the following: Up to 125 Hospital & Healthcare I.T. executives from America’s most prestigious [...]
Connected Health Care 2014
2014-10-14
Key trends that will be discussed at the conference! Connected Healthcare 2014 is set to explore the crucial topics that are revolutionizing the connected health industry: [...]
HealthTech Conference
2014-10-14    
All Day
HealthTech Capital is a group of private investors dedicated to funding and mentoring new "HealthTech" start-ups at the intersection of healthcare with the computer [...]
Health Informatics & Technology Conference (HITC-2014)
2014-10-20    
All Day
Information technology has the ability to improve the quality, productivity and safety of health care management. However, relatively few health care providers have adopted IT. [...]
HIMSS Amsterdam 2014
2014-10-20    
12:00 am
About HIMSS Amsterdam 2014: This year, the second annual HIMSS Amsterdam event will be taking place on 6-7 November 2014 at the Hotel Okura. The [...]
Patient Portal Functionality and EMR Integration Demonstration
2014-10-22    
2:00 pm - 3:30 pm
The purpose of this webcast is to present a demonstration of how the Patient Portal integrates with the EMR, as well as to discuss how this [...]
Connected Health Symposium 2014
Symposium 2014 - Connected Health in Practice: Engaging Patients and Providers Outside of Traditional Care Settings. Collaborating with industry visionaries, clinical experts, patient advocates and [...]
CHIME College of Healthcare Information Management Executives
2014-10-28 - 2014-10-31    
All Day
The Premier Event for Healthcare CIOs. Hotel accommodations: JW Marriott San Antonio Hill Country, 23808 Resort Parkway, San Antonio, Texas 78761. Telephone: 210-276-2500. Guest fax: [...]
The Myth of the Paperless EMR
2014-10-29    
2:00 pm - 3:00 pm
Is Paper Eluding Your Current Technologies? The Myth of the Paperless EMR. Please join Intellect Resources as we present Is Paper Eluding Your Current Technologies? The Myth [...]
Latest News

AI matched, outperformed radiologists in screening X-rays for certain diseases

In a matter of seconds, a new algorithm read chest X-rays for 14 pathologies, performing as well as radiologists in most cases, a Stanford-led study says.

A new artificial intelligence algorithm can reliably screen chest X-rays for more than a dozen types of disease, and it does so in less time than it takes to read this sentence, according to a new study led by Stanford University researchers.

The algorithm, dubbed CheXNeXt, is the first to simultaneously evaluate X-rays for a multitude of possible maladies and return results that are consistent with the readings of radiologists, the study says.

Scientists trained the algorithm to detect 14 different pathologies: For 10 diseases, the algorithm performed just as well as radiologists; for three, it underperformed compared with radiologists; and for one, the algorithm outdid the experts.
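
As a rough sketch of what "one algorithm, 14 pathologies" looks like in code, the example below builds a multi-label classifier with one sigmoid output per finding, using a DenseNet-121 backbone as in the group's earlier CheXNet work. The backbone choice, input size and label list (taken from the NIH ChestX-ray14 release) are illustrative assumptions, not the authors' published code.

```python
# Minimal sketch (not the study's code): one network scores all 14
# pathologies in a single forward pass over a chest X-ray.
import torch
import torch.nn as nn
from torchvision import models

PATHOLOGIES = [  # 14 labels from the NIH ChestX-ray14 release
    "Atelectasis", "Cardiomegaly", "Effusion", "Infiltration", "Mass",
    "Nodule", "Pneumonia", "Pneumothorax", "Consolidation", "Edema",
    "Emphysema", "Fibrosis", "Pleural_Thickening", "Hernia",
]

class MultiLabelCXR(nn.Module):
    def __init__(self, num_labels: int = len(PATHOLOGIES)):
        super().__init__()
        # Randomly initialized here; pretrained weights could be loaded instead.
        self.backbone = models.densenet121()
        in_features = self.backbone.classifier.in_features
        self.backbone.classifier = nn.Linear(in_features, num_labels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sigmoid (not softmax): each pathology is scored independently,
        # so one image can carry several findings at once.
        return torch.sigmoid(self.backbone(x))

model = MultiLabelCXR()
dummy_xray = torch.randn(1, 3, 224, 224)   # one preprocessed frontal chest film
probs = model(dummy_xray)                  # shape (1, 14): one probability per pathology
```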

“Usually, we see AI algorithms that can detect a brain hemorrhage or a wrist fracture — a very narrow scope for single-use cases,” said Matthew Lungren, MD, MPH, assistant professor of radiology. “But here we’re talking about 14 different pathologies analyzed simultaneously, and it’s all through one algorithm.”

The goal, Lungren said, is to eventually leverage these algorithms to reliably and quickly scan a wide range of image-based medical exams for signs of disease without the backup of professional radiologists. And while that may sound disconcerting, the technology could eventually serve as high-quality digital “consultations” to resource-deprived regions of the world that wouldn’t otherwise have access to a radiologist’s expertise. Likewise, there’s an important role for AI in fully developed health care systems too, Lungren added. Algorithms like CheXNeXt could one day expedite care, empowering primary care doctors to make informed decisions about X-ray diagnostics faster, without having to wait for a radiologist.

“We’re seeking opportunities to get our algorithm trained and validated in a variety of settings to explore both its strengths and blind spots,” said graduate student Pranav Rajpurkar. “The algorithm has evaluated over 100,000 X-rays so far, but now we want to know how well it would do if we showed it a million X-rays — and not just from one hospital, but from hospitals around the world.”

A paper detailing the findings of the study was published online Nov. 20 in PLOS Medicine. Lungren and Andrew Ng, PhD, adjunct professor of computer science at Stanford, share senior authorship. Rajpurkar and fellow graduate student Jeremy Irvin are the lead authors.

Practice makes perfect

Lungren and Ng’s diagnostic algorithm has been in development for more than a year. It builds on their work on a previous iteration of the technology that could outperform radiologists when diagnosing pneumonia from a chest X-ray. Now, they’ve boosted the abilities of the algorithm to flag 14 ailments, including masses, enlarged hearts and collapsed lungs. For 11 of the 14 pathologies, the algorithm made diagnoses with the accuracy of radiologists or better.

Back in the summer of 2017, the National Institutes of Health released a set of hundreds of thousands of X-rays. Since then, there’s been a mad dash for computer scientists and radiologists working in artificial intelligence to deliver the best possible algorithm for chest X-ray diagnostics.

The scientists used about 112,000 X-rays to train the algorithm. A panel of three radiologists then reviewed a different set of 420 X-rays, one by one, for the 14 pathologies. Their conclusions served as a “ground truth”— a diagnosis that experts agree is the most accurate assessment — for each scan. This set would eventually be used to test how well the algorithm had learned the telltale signs of disease in an X-ray. It also allowed the team of researchers to see how well the algorithm performed compared to the radiologists.

“We treated the algorithm like it was a student; the NIH data set was the material we used to teach the student, and the 420 images were like the final exam,” Lungren said. To further evaluate the performance of the algorithm compared with human experts, the scientists asked an additional nine radiologists from multiple institutions to also take the same “final exam.”
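
One common way to score such a head-to-head comparison is to compute the algorithm's area under the ROC curve for each pathology, and a sensitivity/specificity pair for each radiologist's binary reads, all measured against the panel's ground-truth labels. The sketch below illustrates that bookkeeping on randomly generated placeholder data shaped like the 420-image, 14-label test set; it is an assumed scoring recipe, not the study's analysis code.

```python
# Minimal sketch with placeholder data: score the model (AUC) and one
# radiologist (sensitivity/specificity) against the panel's ground truth.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(0)
n_images, n_labels = 420, 14                                   # mirrors the test set size

ground_truth = rng.integers(0, 2, size=(n_images, n_labels))   # panel consensus labels
model_probs  = rng.random(size=(n_images, n_labels))           # algorithm's probabilities
radiologist  = rng.integers(0, 2, size=(n_images, n_labels))   # one reader's binary calls

for j in range(n_labels):
    auc = roc_auc_score(ground_truth[:, j], model_probs[:, j])
    tn, fp, fn, tp = confusion_matrix(ground_truth[:, j], radiologist[:, j]).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    print(f"pathology {j:2d}: model AUC={auc:.2f}  "
          f"radiologist sens={sensitivity:.2f} spec={specificity:.2f}")
```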

“That’s another factor that elevates this research,” Lungren said. “We weren’t just comparing this against other algorithms out there; we were comparing this model against practicing radiologists.”

What’s more, to read all 420 X-rays, the radiologists took about three hours on average, while the algorithm scanned and diagnosed all pathologies in about 90 seconds.

Next stop: the clinic

Now, Lungren said, his team is working on a subsequent version of CheXNeXt that will bring the researchers even closer to in-clinic testing. The algorithm isn’t ready for that just yet, but Lungren hopes that it will eventually help expedite the X-ray-reading process for doctors diagnosing urgent care or emergency patients who come in with a cough.

“I could see this working in a few ways. The algorithm could triage the X-rays, sorting them into prioritized categories for doctors to review, like normal, abnormal or emergent,” Lungren said. Or the algorithm could sit bedside with primary care doctors for on-demand consultation, he said. In this case, Lungren said, the algorithm could step in to help confirm or cast doubt on a diagnosis. For example, if a patient’s physical exam and lab results were consistent with pneumonia, and the algorithm diagnosed pneumonia on the patient’s X-ray, then that’s a pretty high-confidence diagnosis and the physician could provide care right away for the condition. Importantly, in this scenario, there would be no need to wait for a radiologist. But if the algorithm came up with a different diagnosis, the primary care doctor could take a closer look at the X-ray or consult with a radiologist to make the final call.
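
A minimal sketch of the triage idea might look like the following, where the algorithm's per-pathology probabilities are mapped to "emergent," "abnormal" or "normal" queues for doctors to review. The grouping of findings and the probability cutoffs are invented for illustration and are not from the study.

```python
# Minimal sketch (hypothetical thresholds and label grouping): sort one
# X-ray into a priority queue from the model's per-pathology probabilities.
from typing import Dict

# Findings a hospital might treat as emergent; the grouping is illustrative only.
EMERGENT_FINDINGS = {"Pneumothorax", "Pneumonia", "Edema"}

def triage(probs: Dict[str, float],
           emergent_cutoff: float = 0.7,
           abnormal_cutoff: float = 0.5) -> str:
    """Return the review queue for a single X-ray."""
    if any(probs.get(p, 0.0) >= emergent_cutoff for p in EMERGENT_FINDINGS):
        return "emergent"
    if max(probs.values()) >= abnormal_cutoff:
        return "abnormal"
    return "normal"

print(triage({"Pneumonia": 0.82, "Cardiomegaly": 0.10}))  # -> emergent
print(triage({"Nodule": 0.55}))                           # -> abnormal
print(triage({"Atelectasis": 0.12}))                      # -> normal
```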

“We should be building AI algorithms to be as good or better than the gold standard of human, expert physicians. Now, I’m not expecting AI to replace radiologists any time soon, but we are not truly pushing the limits of this technology if we’re just aiming to enhance existing radiologist workflows,” Lungren said. “Instead, we need to be thinking about how far we can push these AI models to improve the lives of patients anywhere in the world.”

Other Stanford authors of the study are biostatistician Robyn Ball, PhD; undergraduate student Kaylie Zhu; former research assistant Brandon Yang; data scientist Hershel Mehta; research assistants Tony Duan and Daisy Ding; former research assistant Aarti Bagul; professor of radiology and of medicine Curtis Langlotz, PhD; assistant professor of radiology Bhavik Patel, MD; associate professor of radiology Kristen Yeom, MD; research associate Katie Shpanskaya; associate professor of radiology Francis Blankenberg, MD; clinical assistant professor of radiology Jayne Seekins, MD; clinical associate professor of radiology Safwan Halabi, MD; and clinical assistant professor of radiology Evan Zucker, MD.

Researchers from Duke University and from the University of Colorado also contributed to the study.

Lungren is a member of Stanford Bio-X, the Stanford Child Health Research Institute and the Stanford Cancer Institute.

Stanford’s departments of Radiology and of Computer Science along with the Stanford Center for Artificial Intelligence in Medicine & Imaging supported the work.

Source