Artificial Intelligence Is (and Isn’t) Transforming Radiology

Source – columbusceo.com

Radiologists say that using AI can make their practice better without rendering them obsolete.

Artificial intelligence conjures up scenarios of robots building other robots or self-driving vehicles putting truck drivers out of work. But these days, IBM’s Watson computer is just as likely to interpret a CT scan, using AI to revolutionize radiology and other medical fields.

Researchers believe AI will not put human radiologists on the endangered species list anytime soon, if ever. But AI and deep learning offer speed, accuracy and consistency to extend the capabilities of human imaging professionals.

“We feel in the future the radiologists who know how to deploy AI will be better than the ones that do not use the technology, but we do not see the technology replacing us,” says Dr. Vikram Krishnasetty of Columbus Radiology, one of the largest radiology practices in central Ohio.

Early efforts to substitute smart computers for radiologists “didn’t understand fully what radiology is,” he says. The field is a complex environment that goes well beyond the technical aspects of photographic images and facial recognition. Only trained professionals can combine information from images with patient history and clinical concerns into an integrated assessment, Krishnasetty says.

Still, Dr. Luciano Prevedello, chief of the imaging informatics division at the Ohio State University Wexner Medical Center, says healthcare has reached a significant inflection point for AI in radiology, a specialty already driven by technologies like computed tomography (CT), magnetic resonance imaging (MRI) and cryo- and micro-surgery on the interventional side.

At OSU, a team of AI data scientists and radiology specialists is using deep-learning techniques to digest thousands of radiological images and discern the most serious cranial emergencies: large brain masses, tumors, internal bleeding and stroke. The OSU project seeks to amplify the powers of radiologists to identify those conditions and prioritize them out of thousands of cranial images flowing into the hospital system. “The tool we created can recognize these diseases as soon as the images go through them. In a few seconds, it can determine whether these diseases are present,” Prevedello says.
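For illustration only, the sketch below shows the general shape of that kind of triage: a model assigns each incoming head CT a probability of a critical finding, and the reading worklist is reordered so the highest-risk studies surface first. The function names, scores and study IDs are hypothetical stand-ins, not OSU’s actual system.

```python
# Hypothetical sketch of the triage idea: score each incoming head CT and
# push the most urgent studies to the top of the reading worklist.
def prioritize(studies, classify):
    """Return study IDs ordered most-urgent-first by the model's score."""
    # classify(pixels) -> probability that a critical finding is present
    return sorted(studies, key=lambda study_id: classify(studies[study_id]), reverse=True)

# Made-up scores standing in for model output on three head CTs.
fake_scores = {"CT-001": 0.02, "CT-002": 0.91, "CT-003": 0.40}
worklist = prioritize({k: k for k in fake_scores}, lambda study: fake_scores[study])
print(worklist)  # ['CT-002', 'CT-003', 'CT-001'] -- the likely bleed is read first
```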

What’s behind AI’s prominence in radiology research? Machine learning allows a software program to learn and improve from experience without being explicitly programmed, an advance over the exhaustively detailed, step-by-step instructions of traditional coding. Deep learning, a subset of machine learning, uses layered algorithms loosely modeled on the structure and function of the human brain, with each layer passing its output on to the next.
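As a rough illustration of those ideas, and not of any system described in this article, the Python sketch below stacks a few layers that each hand their output to the next and adjusts their weights from labeled examples instead of hand-written rules. The layer sizes and data are placeholders.

```python
# Minimal, illustrative sketch of deep learning: stacked layers, each passing
# its output to the next, with weights learned from examples rather than
# hand-coded rules. All sizes and data here are placeholders.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(64, 32),   # layer 1: 64 input features -> 32
    nn.ReLU(),
    nn.Linear(32, 16),   # layer 2
    nn.ReLU(),
    nn.Linear(16, 1),    # layer 3: a single output score
    nn.Sigmoid(),
)

# "Learning from experience": adjust weights to reduce error on labeled examples.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

inputs = torch.randn(8, 64)                    # placeholder inputs (8 examples)
labels = torch.randint(0, 2, (8, 1)).float()   # placeholder labels

prediction = model(inputs)
loss = loss_fn(prediction, labels)
loss.backward()
optimizer.step()
```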

These efforts take massive computing power. The most advanced home computers today have Intel and AMD processors with eight cores, whereas the three supercomputers at OSU have 15,000 cores apiece. “That’s 44 teraflops per second. It’s an insane amount of processing in one single computer. That wasn’t even dreamed of five years ago,” Prevedello says.

Late last year, a group of Stanford University researchers published a study pitting four professional radiologists against AI-based software, testing whether the humans or the computer program could better read chest X-rays. CheXNet, a 121-layer “convolutional neural network,” beat the radiologists on at least one significant measure of performance. The program can now identify 14 diseases at “state-of-the-art” levels, the researchers said. Training the deep-learning model involved feeding more than 100,000 chest X-rays into the program.
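For readers curious what a 121-layer network looks like in practice, the sketch below shows how a CheXNet-style classifier could be assembled from the publicly available DenseNet-121 architecture, with a 14-output final layer so a single X-ray can be scored for several findings at once. It is an assumption-laden outline, not the Stanford group’s actual training pipeline; the published model reportedly started from pretrained weights, which are omitted here to keep the example self-contained.

```python
# A rough sketch of a CheXNet-style setup, assuming a torchvision DenseNet-121
# backbone (a 121-layer convolutional network). Data and weights are placeholders.
import torch
import torch.nn as nn
from torchvision import models

NUM_FINDINGS = 14  # the 14 chest pathologies the study reports

# Replace the final classification layer with 14 sigmoid outputs, one per
# finding (multi-label, since a single X-ray can show several conditions).
backbone = models.densenet121(weights=None)  # weights=None avoids a download
backbone.classifier = nn.Sequential(
    nn.Linear(backbone.classifier.in_features, NUM_FINDINGS),
    nn.Sigmoid(),
)

# One forward pass on a placeholder batch of 224x224 images.
dummy_batch = torch.randn(4, 3, 224, 224)
probabilities = backbone(dummy_batch)
print(probabilities.shape)  # torch.Size([4, 14]) -- one score per finding
```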

The excitement has spread to hundreds of AI startups in radiology alone, many of them seeking automated ways to translate radiology images into full-fledged diagnoses. But the Stanford study in some ways raises questions about what constitutes accuracy and whether isolated “binary” tests of machine vs. human will change clinical practice, Krishnasetty says.

“There are different sectors of radiology that AI is being applied to. The flashiest is whether AI can interpret an image, but I think it goes beyond that. We’re seeing AI improve workflows and improve detection of specific diagnoses,” Krishnasetty says.

Radiology Partners, the parent company of Columbus Radiology, is working closely with Illumination Works, of Dublin, on ways to help practicing radiologists turn image analysis into diagnosis and treatment more quickly and accurately.

Kelly Denney, principal data scientist with Illumination Works, also predicts radiologists won’t be replaced any time soon. “If anything, it’s going to be a machine-radiologist team thing. Computers are really good at measuring things and comparing them to other things,” Denney says. “But when it comes to experience, it’s different from anything you can put into a computer, even if it’s learning. That experience is invaluable and irreplaceable.”

So partners Columbus Radiology and Illumination Works are developing RecoMD, a software suite that helps speed healthcare workflows, communications and billing as radiologists apply diagnostic standards and state-of-the-art clinical recommendations to their work. Essentially, it uses AI and deep-learning techniques to supply radiologists with clinical options and classifications as they write image-based reports.

The beta release of the product covers only three conditions: abdominal aortic aneurysms, ovarian cysts and thyroid nodules. “We’re analyzing reports in real time, looking for ways radiologists can best adhere to hundreds of clinical guidelines and best practices that are always changing and hard to keep track of,” Krishnasetty says. “The computer is reading and processing language, developing where a clinical finding or recommendation can be chosen so we can pop it right into our reports.”
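To make the idea concrete, here is a deliberately simplified, hypothetical sketch of that kind of report analysis: scan dictated text for findings tied to a guideline and surface a suggested recommendation the radiologist can accept or ignore. The patterns and recommendation wording are invented for illustration and are not RecoMD’s actual rules or guideline content.

```python
# Toy guideline lookup: a pattern that flags a finding plus a canned suggestion.
# The entries mirror the three conditions in the beta release, but the
# recommendation text is invented for illustration.
import re

GUIDELINES = [
    (re.compile(r"thyroid nodule", re.I),
     "Consider ultrasound follow-up per current thyroid-nodule guidance."),
    (re.compile(r"abdominal aortic aneurysm", re.I),
     "Consider a surveillance interval based on aneurysm size criteria."),
    (re.compile(r"ovarian cyst", re.I),
     "Consider follow-up based on cyst size and menopausal status."),
]

def suggest(report_text):
    """Return guideline suggestions triggered by findings in the report text."""
    return [advice for pattern, advice in GUIDELINES if pattern.search(report_text)]

print(suggest("Impression: 1.8 cm thyroid nodule in the right lobe."))
```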

Another benefit the radiologists recognized quickly is that billing is tied to specific medical diagnoses and procedures, so radiology reports must provide the evidence and classify conditions correctly for payment, Denney says. Many more complex conditions and standards remain to be covered, she says.

Overall, Prevedello says the current pace of AI and deep learning work is accelerating, but applications in typical radiology practices are much more likely to surface gradually. “We can tackle one drop in the ocean at a time and make improvements, but you may be asking when are we going to take on the entire ocean?” he says. “That’s a very valid point. We are still learning how to analyze and be comfortable with one single drop, let alone the entire ocean.”
