<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Radiology Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/radiology/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/radiology/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Tue, 29 Jun 2021 10:43:46 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Artificial intelligence predicts delayed radiology turnaround times during nights and weekends</title>
		<link>https://www.aiuniverse.xyz/artificial-intelligence-predicts-delayed-radiology-turnaround-times-during-nights-and-weekends/</link>
					<comments>https://www.aiuniverse.xyz/artificial-intelligence-predicts-delayed-radiology-turnaround-times-during-nights-and-weekends/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 29 Jun 2021 10:43:45 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[delayed]]></category>
		<category><![CDATA[Predicts]]></category>
		<category><![CDATA[Radiology]]></category>
		<category><![CDATA[turnaround]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14633</guid>

					<description><![CDATA[<p>Source &#8211; https://www.radiologybusiness.com/ Imaging experts have developed an artificial intelligence tool that can help predict delays in radiology turnaround times during nights and weekends, key info for <a class="read-more-link" href="https://www.aiuniverse.xyz/artificial-intelligence-predicts-delayed-radiology-turnaround-times-during-nights-and-weekends/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-predicts-delayed-radiology-turnaround-times-during-nights-and-weekends/">Artificial intelligence predicts delayed radiology turnaround times during nights and weekends</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.radiologybusiness.com/</p>



<p>Imaging experts have developed an artificial intelligence tool that can help predict delays in radiology turnaround times during nights and weekends, key info for quality improvement efforts.</p>



<p>University of California, San Francisco, researchers created the machine learning model utilizing more than 15,000 CT scans. Testing the tool out, they produced solid early results predicting delays greater than 245 minutes (area under the curve of 0.85) and interpretation setbacks of 57 minutes or longer (AUC 0.71).</p>
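<p>As an illustration of the AUC figures reported above (0.85 and 0.71), the area under the ROC curve can be computed directly from model risk scores via the Mann&#8211;Whitney rank formulation. This is a generic Python sketch, not the UCSF team&#8217;s code; the labels and scores below are hypothetical.</p>

```python
def auc(labels, scores):
    """AUC = probability that a randomly chosen delayed case scores
    higher than a randomly chosen on-time case (ties count as half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]  # delayed cases
    neg = [s for y, s in zip(labels, scores) if y == 0]  # on-time cases
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical labels (1 = delay > 245 min) and model risk scores:
print(auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

<p>An AUC of 0.5 corresponds to random guessing and 1.0 to perfect ranking, which puts the reported 0.85 in context.</p>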



<p>“As delays in radiology are an important measure of patient safety and&nbsp;hospital efficiency, having the ability to predict such potential delays has important benefits,” Jae Ho Sohn, MD, a cardiothoracic radiology fellow at UCSF, and colleagues wrote June 27 in&nbsp;<em>Academic Radiology</em>. “Furthermore, prediction of delays in radiology can improve the referrer and radiologist relationship and help clinicians to prepare alternative options in case a delay is expected.”</p>



<p>For their study, San Francisco scientists gathered retrospective CT data from two hospitals within the same organization, logged between 2018 and 2019. The original set included nearly 30,000 inpatient and emergency cases, whittled down to about half that for their analysis. They tracked order and scan time, first communication by radiologist, free-text indications and more.</p>



<p>Sohn et al. used 85% of this data to train their ensemble machine learning model and the remaining 15% for testing. The AI was tasked with predicting delays between when an exam was ordered and the radiologist&#8217;s first communication, along with delays between scan completion and interpretation.</p>
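<p>The 85/15 partition described above corresponds to a standard shuffled train/test split. The sketch below is illustrative only, not the study&#8217;s actual pipeline; the record list and seed are invented for the example.</p>

```python
import random

def train_test_split(records, test_frac=0.15, seed=42):
    """Shuffle once with a fixed seed, then carve off the final
    test_frac of records as the held-out test set."""
    rng = random.Random(seed)
    idx = list(range(len(records)))
    rng.shuffle(idx)
    cut = int(round(len(records) * (1 - test_frac)))
    train = [records[i] for i in idx[:cut]]
    test = [records[i] for i in idx[cut:]]
    return train, test

train, test = train_test_split(list(range(100)))
print(len(train), len(test))  # 85 15
```

<p>Fixing the seed keeps the split reproducible, so the same exams stay in the test set across experiments.</p>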



<p>The team discovered that CT study description, time of day and year in training were much more predictive features than body part imaged, inpatient status and hospital campus. In addition, some protocols, such as CT of the neck with contrast, were associated with delayed turnaround times because of the complexity of those cases.</p>



<p>Future studies could potentially add additional variables, such as hospital and ED patient census, number of providers, transportation and average technology operating time. Sohn and colleagues see their work as an important starting point for quality improvement projects.</p>



<p>“Given the complexity of real-world radiology workflow, no algorithm can make perfect predictions on which cases will be delayed. However, attaining a reasonable prediction of such cases can be relevant,” the authors advised.</p>



<p></p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-predicts-delayed-radiology-turnaround-times-during-nights-and-weekends/">Artificial intelligence predicts delayed radiology turnaround times during nights and weekends</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/artificial-intelligence-predicts-delayed-radiology-turnaround-times-during-nights-and-weekends/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>What has Artificial Intelligence done for radiology lately in 21st Century?</title>
		<link>https://www.aiuniverse.xyz/what-has-artificial-intelligence-done-for-radiology-lately-in-21st-century/</link>
					<comments>https://www.aiuniverse.xyz/what-has-artificial-intelligence-done-for-radiology-lately-in-21st-century/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 27 Dec 2019 09:36:06 +0000</pubDate>
				<category><![CDATA[Human Intelligence]]></category>
		<category><![CDATA[Artificial intelligence (AI)]]></category>
		<category><![CDATA[Computer systems]]></category>
		<category><![CDATA[Machine intelligence]]></category>
		<category><![CDATA[Radiology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5845</guid>

					<description><![CDATA[<p>Source: standardmedia.co.ke/ Artificial Intelligence (AI), sometimes called machine intelligence, is the theory and engineering of computer systems capable of performing tasks that typically require human intelligence, such as visual <a class="read-more-link" href="https://www.aiuniverse.xyz/what-has-artificial-intelligence-done-for-radiology-lately-in-21st-century/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/what-has-artificial-intelligence-done-for-radiology-lately-in-21st-century/">What has Artificial Intelligence done for radiology lately in 21st Century?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: standardmedia.co.ke/</p>



<p>Artificial Intelligence (AI), sometimes called machine intelligence, is the theory and engineering of computer systems capable of performing tasks that typically require human intelligence, such as visual processing, speech recognition, decision-making, and language translation.</p>



<p>Extending this concept of AI to the scope of radiology yields &#8220;a computer science unit dealing with the processing, reconstruction, analysis and/or interpretation of medical images by simulating intelligent human behavior in computers.&#8221;</p>



<p>Radiology, also known as diagnostic imaging, is a series of different tests that take images of different parts of the body. Radiologists perform a wide array of diagnostic tests, including X-rays, ultrasound, bone mineral densitometry, fluoroscopy, mammography, nuclear medicine, CT, and MRI.</p>



<p>Artificial intelligence (AI) algorithms, particularly deep learning, have demonstrated remarkable progress in image-recognition tasks. Methods ranging from convolutional neural networks to variational autoencoders have found myriad applications in the medical image analysis field, propelling it forward at a rapid pace.</p>



<p>Radiologists are incredibly busy medical professionals who can&#8217;t afford blunders. They need to interact with a diverse range of referring doctors: gastroenterologists, gynecologists, orthopedic practitioners, and more. They must always be sharp. How can AI make such stretched radiologists even better at what they&#8217;re doing?</p>



<p>In April 2016, Drs. Tim Dowdell, Joe Barfett, and Errol Colak – all radiologists – created the Machine Intelligence in Medicine Lab (MIMLab) in order to teach computers with artificial intelligence (AI) how to interpret medical images.</p>



<p>&#8220;With these AI methods, it&#8217;s very unreasonable to think that no one will ever be missing a lung nodule on a chest X-ray in the next five years,&#8221; Dr. Barfett says. &#8220;AI can span from rare to extremely rare instances like this.&#8221;</p>



<p>The three radiologists recruited AI specialist Hojjat Salehinejad, a Ph.D. student at the University of Toronto&#8217;s Department of Electrical and Computer Engineering, shortly after founding the MIMLab, who Dr. Barfett said is now the driving force behind their work.</p>



<p>The team found that AI algorithms could not be trained sufficiently to analyze X-rays using hospital databases due to imbalances in the datasets. A new solution was implemented: the team strengthened its database by programming AI algorithms to create computer-generated chest X-rays rather than relying solely on real medical images. Enough pictures of rare conditions were produced, which, in conjunction with the real ones, gave the team exactly what it needed to teach a computer how to spot conditions on a very wide spectrum, including those rare cases that could mean the difference between a patient&#8217;s life and death.</p>



<p>A Stanford research group created an algorithm that detected pneumonia in the participating patients at that site with a better average F1 score (a statistical measure based on precision and recall) than the radiologists involved in that trial. The Radiological Society of North America incorporated AI visualization presentations during its annual meeting. Some specialists view the advent of AI technology in radiology as a threat, as the technology, in isolated cases, can outperform specialists on certain statistical metrics.</p>
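<p>For readers unfamiliar with the F1 metric mentioned above, it is the harmonic mean of precision and recall. A minimal, generic Python sketch (the labels are made up for illustration; this is not the Stanford study&#8217;s code):</p>

```python
def f1_score(y_true, y_pred):
    """Harmonic mean of precision and recall for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical pneumonia labels (1 = present) vs. model calls:
print(round(f1_score([1, 1, 1, 0, 0], [1, 1, 0, 1, 0]), 3))  # 0.667
```

<p>Because it combines precision and recall, F1 penalizes both missed findings and false alarms, which is why it is often preferred over raw accuracy for imbalanced medical data.</p>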



<p><strong>Benefits</strong></p>



<p>Support appropriate treatment. Most AI systems are focused on delivering more information. This can be achieved by quantifying information contained in an image that is typically reported only qualitatively, or the system can incorporate population-based reference values, enabling physicians to compare a patient&#8217;s measurements against accepted norms.</p>



<p>Pick up repetitive routine tasks. AI isn&#8217;t good at everything, at least not yet. What are the right tasks to turn over to AI at this point? Tasks for which we have access to lots of data and that are relatively straightforward, not demanding many different inputs to be merged. Radiologists do a lot of such simple, repetitive routine tasks.</p>



<p>Reduce inter- and intra-observer variability. Even the best trained, most experienced radiologists may sometimes vary in their diagnoses. Well rested in the morning, a radiologist may notice something different than after a long day&#8217;s work. In addition, different radiologists could highlight different aspects in their reports. This can be difficult for doctors to respond to, as they need to take these differences into account when synthesizing all the details they have before the final diagnosis is made. AI software can reduce or even remove this heterogeneity between radiologists&#8217; reports.</p>



<p><strong>Realizing the benefits of AI</strong></p>



<p>In radiology, there are many tasks that AI can do. Many require only a medical image as input and depend on pixels (or voxels) to construct the analysis. This can be done manually, but many radiologists perceive it as a tedious job, making it an appropriate candidate for AI assistance. Some applications go a step further, combining radiological images with other sources of information. The integration of medical images with other knowledge can lead to insights that radiologists do not always find easy to obtain. These types of analyses are usually regarded as more modern. For example, an algorithm can extract pathology information from a medical image by linking image data to pathology laboratory results.</p>



<p><strong>Could AI take over the work of radiologists?</strong></p>



<p>The obvious answer: no, radiologist jobs won&#8217;t be taken over. AI will certainly take over certain radiologist tasks, though. By performing automated assessments that are currently very time consuming, it will assist radiologists. It will pick up repetitive tasks that many radiologists find burdensome. However, radiologists have a much more differentiated job than just these kinds of tasks. Radiologist jobs are going to change, but they won&#8217;t go away.</p>
<p>The post <a href="https://www.aiuniverse.xyz/what-has-artificial-intelligence-done-for-radiology-lately-in-21st-century/">What has Artificial Intelligence done for radiology lately in 21st Century?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/what-has-artificial-intelligence-done-for-radiology-lately-in-21st-century/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Google’s AI team classifies chest X-rays with superior levels of accuracy</title>
		<link>https://www.aiuniverse.xyz/googles-ai-team-classifies-chest-x-rays-with-superior-levels-of-accuracy/</link>
					<comments>https://www.aiuniverse.xyz/googles-ai-team-classifies-chest-x-rays-with-superior-levels-of-accuracy/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 07 Dec 2019 07:18:45 +0000</pubDate>
				<category><![CDATA[Google AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Google]]></category>
		<category><![CDATA[HealthTech]]></category>
		<category><![CDATA[Radiology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5538</guid>

					<description><![CDATA[<p>Source: siliconcanals.com While millions of diagnostic examinations are carried out annually, chest X-rays play a vital role in diagnosing several diseases. But the usefulness of the same can be <a class="read-more-link" href="https://www.aiuniverse.xyz/googles-ai-team-classifies-chest-x-rays-with-superior-levels-of-accuracy/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/googles-ai-team-classifies-chest-x-rays-with-superior-levels-of-accuracy/">Google’s AI team classifies chest X-rays with superior levels of accuracy</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: siliconcanals.com</p>



<p>While millions of diagnostic examinations are carried out annually, chest X-rays play a vital role in diagnosing several diseases. But their usefulness can be limited by the challenges of interpretation, which requires thorough and rapid evaluation of a 2D image depicting complex 3D organs and disease processes. Sometimes, major details can be missed on chest X-rays, resulting in adverse outcomes for patients.</p>



<p>Recent efforts have improved lung cancer detection in radiology, differential diagnosis in dermatology, and prostate cancer grading in pathology. A key challenge, however, is obtaining accurate clinical labels to train deep learning models for X-ray interpretation.</p>



<p>Most efforts have applied rule-based natural language processing to radiology reports or relied on image review by human readers. Both approaches can introduce inconsistencies that are problematic at the time of model evaluation.</p>



<h3 class="wp-block-heading">Deep learning models to resolve challenges!</h3>



<p>In an effort to resolve this, researchers at Google devised artificial intelligence models to spot four findings on human chest X-rays. Advances in machine learning present an opportunity to create new tools to help experts interpret medical images. The deep learning models for chest radiograph interpretation were published in the journal Radiology.</p>



<p>The team developed deep learning models for four important clinical findings: pneumothorax (collapsed lung), nodules and masses, airspace opacities (filling of the pulmonary tree with material), and fractures. These were chosen in consultation with clinical colleagues and radiologists to focus on conditions that are critical for patient care.</p>



<p>These deep learning models were evaluated using several thousand held-out images from the dataset, for which high-quality labels were collected using a panel-based adjudication process among board-certified radiologists. The held-out images were then reviewed independently by separate radiologists to make sure the labels were accurate.</p>



<p>The team leveraged more than 600,000 images sourced from two de-identified datasets. The first one was developed along with co-authors at the Apollo Hospitals and has a diverse set of chest X-rays gathered over several years from the hospital network across locations. The second one has been released publicly by the National Institutes of Health and served as a vital resource for machine learning efforts. But the same has limitations related to accuracy and clinical interpretation of available labels.</p>



<h3 class="wp-block-heading">High-quality reference standard labels</h3>



<p>In order to generate high-quality reference standard labels for model evaluation, the team used a panel-based adjudication process in which three radiologists reviewed the final tune and test set images and addressed disagreements via discussion. This allowed difficult findings initially detected by only a single radiologist to be identified and documented. The discussions took place anonymously via an online adjudication system.</p>
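<p>The triage step of the adjudication workflow described above can be approximated in code: findings every panelist reports are accepted outright, while findings reported by only some panelists are routed to discussion. This is a hypothetical sketch of such logic, not Google&#8217;s actual system; the label names are invented.</p>

```python
from collections import Counter

def triage_findings(reads):
    """reads: one label set per radiologist for a single image.
    Returns (unanimous findings, findings needing panel discussion)."""
    counts = Counter(label for read in reads for label in set(read))
    unanimous = {lbl for lbl, c in counts.items() if c == len(reads)}
    discuss = {lbl for lbl, c in counts.items() if c < len(reads)}
    return unanimous, discuss

reads = [{"pneumothorax"}, {"pneumothorax", "nodule"}, {"pneumothorax"}]
print(triage_findings(reads))  # ({'pneumothorax'}, {'nodule'})
```

<p>Routing partial agreements to discussion, rather than simply majority-voting them away, is what lets subtle single-reader findings survive into the reference standard.</p>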



<p>Google notes that while the models achieved an overall expert-level accuracy, the performance varied based on the corpora. For instance, the sensitivity to detect pneumothorax among radiologists was nearly 79% for ChestX-ray14 images and just 52% for the same radiologists in other datasets.</p>



<p>The team hopes to lay the groundwork for future methods with a corpus of adjudicated labels for the ChestX-ray14 dataset that they have made available in open source. This comprises 2,412 training and validation set images and 1,962 test set images, or 4,374 images in total.</p>
<p>The post <a href="https://www.aiuniverse.xyz/googles-ai-team-classifies-chest-x-rays-with-superior-levels-of-accuracy/">Google’s AI team classifies chest X-rays with superior levels of accuracy</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/googles-ai-team-classifies-chest-x-rays-with-superior-levels-of-accuracy/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>AI adds big data power to radiology</title>
		<link>https://www.aiuniverse.xyz/ai-adds-big-data-power-to-radiology/</link>
					<comments>https://www.aiuniverse.xyz/ai-adds-big-data-power-to-radiology/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 05 Dec 2019 08:10:56 +0000</pubDate>
				<category><![CDATA[AI-ONE]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Big data]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[Radiology]]></category>
		<category><![CDATA[Transformation]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5487</guid>

					<description><![CDATA[<p>Source: tmc.edu When the&#160;X-ray was discovered&#160;at the end of the 19th century, a new medical discipline&#160;was born. Radiology became a way to study, diagnose and treat disease. <a class="read-more-link" href="https://www.aiuniverse.xyz/ai-adds-big-data-power-to-radiology/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/ai-adds-big-data-power-to-radiology/">AI adds big data power to radiology</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: tmc.edu</p>



<p>When the&nbsp;X-ray was discovered&nbsp;at the end of the 19th century, a new medical discipline&nbsp;was born. Radiology became a way to study, diagnose and treat disease. Today, expertise among radiologists, radiation oncologists, nuclear medicine physicians, medical physicists and technicians includes many forms of medical imaging—from diagnostic and cancer imaging to mammography, radiation therapy, ultrasound, computed tomography (CT) and magnetic resonance imaging (MRI).</p>



<p>As we move into the third decade of the 21st century, radiology—perhaps more than any other medical specialty—is poised for transformation. Thanks to&nbsp;artificial intelligence (AI), radiologists foresee a future in which machines enhance patient outcomes and avoid misdiagnosis.</p>



<p>“In the old days, X-rays were very shadowy, very difficult to interpret. They required a lot of expertise. Nowadays, with MRIs, you can see the anatomy really, really well. So now, the next step from that—which is a big jump—is artificial intelligence,” said&nbsp;Eric Walser, M.D., chairman of radiology at&nbsp;The University of Texas Medical Branch (UTMB) at Galveston.</p>



<p>Early machine learning used information from a few cases to teach computers basic tasks, like identifying the human anatomy. Today, AI can distinguish patterns and irregularities in large collections of data, which makes radiology an ideal application. Software can draw from millions of images and make diagnoses with speed and accuracy.</p>



<p>“Ultimately, what you would want out of an AI algorithm is value,” said&nbsp;Eric M. Rohren, M.D., Ph.D., professor and chair of radiology at&nbsp;Baylor College of Medicine&nbsp;and radiology service line chief for&nbsp;Baylor St. Luke’s Medical Center. “It’s pretty clear that the algorithms can do some pretty amazing things. They can detect abnormalities with a high degree of precision. They can see what, perhaps, the human eye cannot see. What’s not known is: What is the value of that in a health care system? Are you truly improving patient outcomes and patient care by introducing a particular algorithm into your practice? Does it have measurable impact in terms of better patient experience, hospital stay and better outcomes following surgery?”</p>



<p>To that end, Baylor College of Medicine has created an internal library of all imaging data from the last decade.</p>



<p>“It’s being put into a research archive so that investigators can come in, pull that data, be able to research that data, develop their own algorithms and link it with the electronic medical records to see pathology and laboratory values and outcomes for those patients,” Rohren said. “Computers, since they don’t have the level of intuition that we have as humans, they really need to be trained in a systematic fashion off a data set that’s been developed specifically for them to learn. That data set can be tens of thousands of examinations in order for the algorithm to be able to determine prospectively and predictively what it’s going to do when it’s faced with a scan it’s never seen before.”</p>



<p>The American College of Radiology joined the AI revolution by creating its Data Science Institute that aims to “develop an AI ecosystem beyond single institutions,” according to the institute’s 2019 annual report.</p>



<p><strong>Houston as a hub</strong></p>



<p>The region dubbed “Silicon Bayou” for its innovation ecosystem is becoming a hot spot for artificial intelligence ventures.</p>



<p>Houston was ranked among the world’s top large cities prepared for artificial intelligence, according to the Global Cities’ AI Readiness Index released in September 2019. The city ranked No. 9 among places with metro area populations of 5 million to 10 million residents.</p>



<p>The report was based on a survey conducted by the Oliver Wyman Forum, part of the Oliver Wyman management consulting firm that tracks how well major cities are prepared to “adapt and thrive in&nbsp;the coming age of AI.”</p>



<p>Researchers at The University of Texas Health Science Center at Houston (UTHealth) have demonstrated that distinction by building an AI platform called DeepSymNet that has been trained to evaluate data from patients who suffered strokes or had similar symptoms.</p>



<p>A team including Sunil Sheth, M.D., an assistant professor of neurology at UTHealth’s McGovern Medical School, and Luca Giancardo, Ph.D., an assistant professor at UTHealth’s School of Biomedical Informatics, created an algorithm to assist doctors outside of major stroke treatment facilities with diagnoses. The work was published online in September in the journal Stroke.</p>



<p>The project started because of difficulties identifying patients who could benefit from an endovascular procedure that opens blocked blood vessels in the brain, a common cause of stroke.</p>



<p>“It’s one of the most effective treatments we can render to patients. It takes them from having severe disability to sometimes almost completely back to normal,” said Sheth, who practices as a vascular neurologist with Memorial Hermann Health System. “The challenge is that we don’t know who will benefit from the treatment.”</p>



<p>Finding out depends on advanced imaging techniques that are not available at most community hospitals, the first stop for the vast majority of stroke patients.</p>



<p>“What we are trying to do, in using Dr. Giancardo’s software,&nbsp;is to see if we could generate the same type of results that we get with advanced imaging techniques but with the type of imaging that&nbsp;we already do routinely in stroke in the less-advanced centers,” Sheth said. “The purpose of this software is that—no matter what hospital you show up at—you can get the same type of advanced evaluation and all of the information you need to make a treatment decision.”</p>



<p>At Memorial Hermann-Texas Medical Center, patients have access to advanced technology and undergo standard imaging called a CT angiogram along with a CT perfusion, which is used to decide if someone would benefit from an endovascular procedure to remove a blood clot.</p>



<p>“We took patients who had both—the CT angiogram, which&nbsp;can be done at any hospital, and the CT perfusion imaging—and then&nbsp;we sent that into Dr. Giancardo’s software. What that essentially did is trained the algorithm to take the CT images and to generate the type of output that the CT perfusion&nbsp;was telling us,” Sheth said. “Then we tested it. Here are a bunch of patients that you’ve never seen before. How good are you at predicting what the CT perfusion is going to say? And that’s what we did in our paper and showed that it did a very good job.”</p>



<p>The study included more than 200 images from a single hospital. The technology hasn’t been implemented clinically.</p>



<p>“The main benefit to a patient is that a lot of hospitals from other nations have the basic imaging, but not more advanced capabilities. So, the approach is that they could get the same information with the infrastructure that is already there,” Giancardo said.</p>



<p><strong>Replacing the radiologist?</strong></p>



<p>The AI disruption in radiology may be predictive of what’s to come in other areas of medicine. But does AI mean that the demand for radiologists will decline?</p>



<p>Walser, UTMB’s radiology chairman, thinks so.</p>



<p>“There will be fewer of us probably needed,” he said. “Radiologists are going to become more the managers of the data rather than&nbsp;the creators of the diagnoses.”</p>



<p>Sheth, the UTHealth neurologist, views AI and radiology as “decision support” for the specialists.</p>



<p>“I don’t think we will ever be at the point where we can say ‘Do x,&nbsp;y and z to this patient because Dr. Giancardo’s software told us to.’ This is going to be something that will help all of the physicians taking care of the patients—the ER doc, the neurologist, the radiologist—make treatment decisions with better data,” he said.</p>



<p>Rohren, the Baylor radiology chair, agrees that AI will improve decision-making without replacing the physicians who interpret imaging.</p>



<p>“It will make our jobs and roles a little different than they are&nbsp;now, but it will absolutely not put radiologists out of business,” he said. “Machines and machine learning are very good at information handling, but they are very poor at making judgments based on that information. … The radiologist will continue to be critical.”</p>



<p>Yet Rohren anticipates a transformed discipline.</p>



<p>“I think radiology in the future will be a different profession than it is today,” he said. “The radiologist has the potential to be the curator and the purveyor of a large amount of data and information that, given the limitations of human brain power and information systems, we just don’t have access to today. But with AI working alongside the radiologist, I think the radiologist is in the position to be at the hub of a lot of health care.”</p>



<p><strong>Innovation in the TMC</strong></p>



<p>Optellum, a United Kingdom-based company, brought its AI software for lung cancer diagnosis and treatment to the TMC Innovation Institute this year and spent several months in the TMCx accelerator. The product identifies patients at risk for lung cancer, expedites optimal therapy for those with cancer and reduces intervention for millions who do not need treatment.</p>



<p>Company founder and CEO Vaclav Potesil, Ph.D., said he decided to focus on lung cancer after losing an aunt within a year of her Stage 4 diagnosis. She never smoked.</p>



<p>“I’ve seen firsthand how very healthy people can be killed and it’s still the most common and deadliest cancer worldwide,” he said. “We are really focused on enabling cancer patients to be diagnosed at the earliest possible stage and be cured. It’s not just the modeled data on the computer. It’s addressing the right clinical problems to add value to doctors.”</p>



<p>Potesil noted that two cancerous growths on the left lung of U.S. Supreme Court Associate Justice Ruth Bader Ginsburg were discovered early because of tests on her broken ribs, which were examined after she fell in 2018.</p>



<p>UTMB is starting a new AI radiology project soon, Walser said, that also aids in early detection.</p>



<p>“The computer scans every X-ray from the ICU and if they see something that looks like it might be a problem, they pop it up to the top&nbsp;of my list, so I look at that one first,” he said. “In other words, if there&nbsp;are 200 chest X-rays from Jennie [Sealy Hospital] and there’s one that’s a pneumothorax [collapsed lung] that could kill a patient and it’s all the way down on the bottom of my stack because the last name is Zimmerman, the computer will push it up to the top for me. That’s our first foray into AI.”</p>
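<p>The worklist reordering Walser describes is, at its core, a priority queue: instead of sorting studies alphabetically by last name, the reading list is ordered by an AI-assigned urgency flag. A hypothetical Python sketch (names and urgency scores are invented for illustration, and this is not UTMB&#8217;s actual system):</p>

```python
import heapq

def prioritized_worklist(studies):
    """studies: (patient_name, ai_urgency) pairs, lower urgency = read first.
    Returns patient names ordered by urgency, with ties broken by name."""
    heap = [(urgency, name) for name, urgency in studies]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]

# Zimmerman's flagged pneumothorax jumps the alphabetical queue:
studies = [("Adams", 2), ("Baker", 2), ("Zimmerman", 0)]
print(prioritized_worklist(studies))  # ['Zimmerman', 'Adams', 'Baker']
```

<p>The heap keeps the most urgent study at the front no matter where its patient falls in the alphabet, which is exactly the behavior described in the quote above.</p>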



<p>Baylor College of Medicine and&nbsp;Baylor St. Luke’s Medical Center researchers have explored using algorithms to interpret breast lesions on mammograms, improve cancer detection with breast MRIs and predict which sinusitis patients might benefit most from surgery, Rohren said.</p>



<p>He hopes for an AI collaboration among institutions in Houston’s medical city.</p>



<p>“I personally believe the Texas Medical Center could become the international hub of AI—not only for radiology, but for any aspect of medicine,” Rohren said. “We have so many elite health care institutions sitting right here right next door to each other. We have outstanding undergraduate universities right here in the city with data scientists and engineers. We have all the components to be able to develop a program. What it will require is for us all to work together.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/ai-adds-big-data-power-to-radiology/">AI adds big data power to radiology</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/ai-adds-big-data-power-to-radiology/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Rad AI launches with $4 million led by Google AI venture fund to use machine learning to transform radiology</title>
		<link>https://www.aiuniverse.xyz/rad-ai-launches-with-4-million-led-by-google-ai-venture-fund-to-use-machine-learning-to-transform-radiology/</link>
					<comments>https://www.aiuniverse.xyz/rad-ai-launches-with-4-million-led-by-google-ai-venture-fund-to-use-machine-learning-to-transform-radiology/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 26 Nov 2019 10:50:24 +0000</pubDate>
				<category><![CDATA[Google AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Rad AI]]></category>
		<category><![CDATA[Radiology]]></category>
		<category><![CDATA[transform]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5405</guid>

					<description><![CDATA[<p>Source: techstartups.com Rad AI, an artificial intelligence startup that uses machine learning to transform the practice of radiology, today announced its formal launch and a $4 million seed <a class="read-more-link" href="https://www.aiuniverse.xyz/rad-ai-launches-with-4-million-led-by-google-ai-venture-fund-to-use-machine-learning-to-transform-radiology/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/rad-ai-launches-with-4-million-led-by-google-ai-venture-fund-to-use-machine-learning-to-transform-radiology/">Rad AI launches with $4 million led by Google AI venture fund to use machine learning to transform radiology</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: techstartups.com</p>



<p>Rad AI, an artificial intelligence startup that uses machine learning to transform the practice of radiology, today announced its formal launch and a $4 million seed round led by Gradient Ventures, Google’s AI-focused venture fund. Other backers included UP2398, Precursor Ventures, GMO Venture Partners, Array Ventures, Hike Ventures, Fifty Years VC and various angels. Rad AI will use the new capital infusion to build out its engineering team and expand the rollout of its first product to more radiology groups and customers.</p>



<p>Founded in 2018 by Doktor Gurson and Dr. Jeff Chang, Rad AI uses machine learning to transform the practice of radiology. Its AI products are designed by radiologists, for radiologists. By streamlining existing workflow and automating repetitive manual tasks, Rad AI increases daily productivity while reducing radiologist burnout. In addition, Rad AI provides more consistent radiology reports for ordering clinicians, and higher accuracy for the patients it serves.</p>



<p>The idea for Rad AI took shape when co-founder Dr. Jeff Chang, the youngest radiologist and second-youngest doctor on record in the US, grew troubled by high error rates, radiologist burnout, and rising imaging demand amid a worsening shortage of US radiologists. He decided to pursue graduate work in machine learning to identify ways AI could help. After meeting serial entrepreneur Doktor Gurson, the two founded Rad AI in 2018 at the intersection of radiology and AI. Built by radiologists, for radiologists, the company is transforming the field with that inside perspective as its driving force.</p>



<p>“Radiology is facing severe pressures that range from falling reimbursements to market consolidation. There is also a radiologist shortage that is exacerbated by rising imaging volumes nationwide. We help radiology groups significantly increase productivity, while reducing radiologist burnout and improving report accuracy. By working closely with radiologists, we can make a positive impact on patient care,” said Dr. Chang.</p>



<p>Rad AI uses state-of-the-art machine learning to automate repetitive tasks for radiologists so they have more time to focus on what matters: accurate and timely diagnosis for patients. The first product automatically generates the impression section of radiology reports, customized specifically to the preferred language of each radiologist. Initial customers have seen significant reductions in radiologist burnout, error rates, and turnaround time, improving radiologists&#8217; well-being and patient care.</p>
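<p>Rad AI has not published how its impression generator works, so the following is purely a stand-in to convey the "customized to the preferred language of each radiologist" idea via a per-radiologist phrase table; the real product uses machine learning, and every name, finding, and template below is invented.</p>

```python
# Purely illustrative: Rad AI uses machine learning, not a lookup table; this
# toy only shows the notion of per-radiologist preferred phrasing. All names,
# findings, and templates are invented.
PREFERRED_PHRASING = {
    "radiologist_a": {
        "none": "No acute cardiopulmonary abnormality.",
        "pneumothorax": "Pneumothorax identified; urgent clinical correlation recommended.",
    },
    "radiologist_b": {
        "none": "Lungs are clear. No acute findings.",
        "pneumothorax": "Findings consistent with pneumothorax.",
    },
}

def draft_impression(findings, radiologist):
    """Assemble an impression from detected findings, in the reading
    radiologist's preferred wording; fall back to their normal-study phrase."""
    phrases = PREFERRED_PHRASING[radiologist]
    lines = [phrases[f] for f in findings if f in phrases]
    return " ".join(lines) or phrases["none"]

print(draft_impression(["pneumothorax"], "radiologist_a"))
```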



<p>Rad AI&#8217;s current partners include Greensboro Radiology, Medford Radiology, Einstein Healthcare Network, and Bay Imaging Consultants, one of the largest private radiology groups in the United States, as well as other radiology groups that have yet to be announced. Product rollouts have demonstrated an average of 20% time savings on the interpretation of CTs and 15% time savings on radiographs — translating into an hour a day saved for each radiologist.</p>



<p>“The team at Rad AI is uniquely suited to apply innovative technology to this field, with strong radiology and AI experts and firsthand knowledge of this market. It’s exciting to see the quantitative benefits and positive feedback from their radiology customers, and we’re looking forward to the impact of their future products,”&nbsp;Zachary Bratun-Glennon, Partner at Gradient Ventures, added.</p>
<p>The post <a href="https://www.aiuniverse.xyz/rad-ai-launches-with-4-million-led-by-google-ai-venture-fund-to-use-machine-learning-to-transform-radiology/">Rad AI launches with $4 million led by Google AI venture fund to use machine learning to transform radiology</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/rad-ai-launches-with-4-million-led-by-google-ai-venture-fund-to-use-machine-learning-to-transform-radiology/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Adopting Machine Learning in Radiology Requires Further Research</title>
		<link>https://www.aiuniverse.xyz/adopting-machine-learning-in-radiology-requires-further-research/</link>
					<comments>https://www.aiuniverse.xyz/adopting-machine-learning-in-radiology-requires-further-research/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 23 Jul 2019 12:59:37 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Adopting]]></category>
		<category><![CDATA[Further]]></category>
		<category><![CDATA[JMIR Medical Informatics]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Radiology]]></category>
		<category><![CDATA[Research]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4115</guid>

					<description><![CDATA[<p>Source: healthitanalytics.com To advance the use of machine learning in medical imaging, researchers will have to examine radiologists’ perceptions of the technology, as well as the cost-effectiveness <a class="read-more-link" href="https://www.aiuniverse.xyz/adopting-machine-learning-in-radiology-requires-further-research/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/adopting-machine-learning-in-radiology-requires-further-research/">Adopting Machine Learning in Radiology Requires Further Research</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: healthitanalytics.com</p>



<p>To advance the use of machine learning in medical imaging, researchers will have to examine radiologists&#8217; perceptions of the technology, as well as the cost-effectiveness of these tools, according to a study published in <em>JMIR Medical Informatics</em>.</p>



<p>Countless studies have shown the diagnostic accuracy of machine learning tools. Organizations across the care continuum, from research universities to companies like Google, have developed machine learning algorithms that can identify breast cancer in medical images as effectively as human clinicians. </p>



<p>While the results of these studies have promising implications for the future of radiology and pathology, JMIR researchers noted that more investigations may be necessary before machine learning tools can become part of routine clinical practice.</p>



<p>“Advancements in computer algorithms are becoming increasingly sophisticated and widespread in the field of radiology, with the potential to be cost-effective for increasing detection rates of various medical conditions and improve the efficiency of radiologists,” the team said.</p>



<p>“As we continue to head into an artificial intelligence era, it is essential that we understand the implementation of technologies in healthcare settings and its impact on health care providers and their potentially shifting roles.”&nbsp;</p>



<p>The group analyzed nine peer-reviewed articles that focused on the implementation and adoption of computer-aided detection (CAD) in breast cancer screening. CAD, a form of machine learning, can help clinicians interpret medical images by acting as a double check or a second pair of eyes, replacing the typical double reading by a second radiologist.</p>



<p>CAD scans digital mammograms and marks areas of potential cancer, which radiologists then review to reach a final assessment of the image. Although the use of CAD has increased significantly over the past several years, researchers stated that studies have largely overlooked radiologists&#8217; perceptions of the technology, as well as its cost-effectiveness and efficiency.</p>
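<p>The CAD-as-second-reader workflow described above can be sketched as follows. The region coordinates, confidence scores, and review callback are illustrative assumptions, not any vendor's actual API: CAD proposes marks, and only those the radiologist confirms contribute to the final read.</p>

```python
# Toy CAD-as-second-reader workflow: CAD proposes regions of interest; only
# marks the radiologist confirms contribute to the final read. Coordinates,
# scores, and the review callback are made up for illustration.
def final_assessment(cad_marks, radiologist_confirms):
    """CAD output is advisory: the radiologist has the final say on each mark."""
    confirmed = [m for m in cad_marks if radiologist_confirms(m)]
    return {"recall": bool(confirmed), "regions": confirmed}

marks = [
    {"x": 120, "y": 340, "cad_score": 0.81},
    {"x": 400, "y": 90,  "cad_score": 0.42},
]
# Example reviewer: dismisses the low-confidence mark on inspection.
result = final_assessment(marks, lambda m: m["cad_score"] > 0.6)
print(result["recall"], len(result["regions"]))  # → True 1
```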



<p>After reviewing past articles, the team found that incentives for adopting CAD included improved cancer detection rates, breast imaging profitability, and reduced radiologist reading time.</p>



<p>However, researchers also found that providers didn’t have an overly positive view of the technology. In general, radiologists had more favorable perceptions of double reading by a colleague rather than single reading with CAD. One study showed that 74 percent of radiologists believed double reading improved cancer detection rates, while just 55 percent thought that CAD improved detection rates.</p>



<p>Additionally, the group found that the use of CAD was associated with higher interpretation times. CAD may take less time than double reading by a second radiologist, but researchers saw that when radiologists reviewed CAD-marked images, the mean interpretation time increased by 19 percent.&nbsp;</p>



<p>CAD implementation was also associated with a significant increase in recall rates; a recall occurs when a patient is called back for follow-up imaging. Moreover, the use of CAD for breast cancer screening can be associated with higher financial costs, depending on the accuracy of CAD, the number of patients screened, and whether it is compared against single or double reading.</p>



<p>These results indicate that more research is needed to identify and overcome barriers to machine learning adoption in the medical imaging field.&nbsp;</p>



<p>“Through our scoping review of the adoption and implementation of CAD in clinical settings for breast cancer detection and other related articles, CAD use by radiologists is based on trade-offs between the barriers and facilitators,” researchers said.&nbsp;</p>



<p>“The use of CAD for breast cancer screening involves several tradeoffs including weighing the impact on detection rates and patient outcomes, costs and financial incentives, time saved from double reading, increased recall rates, and radiologist perceptions.”</p>



<p>The study was limited in that researchers reviewed only a small number of articles. However, the results indicate that further research is needed to assess the implementation and adoption of machine learning in medical imaging.</p>



<p>“Our review suggests that there is a large focus on the diagnostic accuracy of CAD, but little focus on CAD implementation and perceptions of radiologists—the end users,” researchers said.</p>



<p>“We propose that further studies be carried out to better understand CAD adoption and implementation in clinical settings. Specifically, there should be a focus on investigating radiologists’ perceptions of CAD use in various settings, as we only came across one such study based in the United States, which cannot be generalized to other settings and health care systems.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/adopting-machine-learning-in-radiology-requires-further-research/">Adopting Machine Learning in Radiology Requires Further Research</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/adopting-machine-learning-in-radiology-requires-further-research/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
