<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Imaging Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/imaging/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/imaging/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Tue, 15 Jun 2021 05:08:28 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Combining AI with cardiac imaging helps predict heart attacks, cardiovascular deaths</title>
		<link>https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/</link>
					<comments>https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 15 Jun 2021 05:08:26 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[attacks]]></category>
		<category><![CDATA[cardiac]]></category>
		<category><![CDATA[Cardiovascular]]></category>
		<category><![CDATA[Combining]]></category>
		<category><![CDATA[deaths]]></category>
		<category><![CDATA[Imaging]]></category>
		<category><![CDATA[predict]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14305</guid>

					<description><![CDATA[<p>Source &#8211; https://www.cardiovascularbusiness.com/ Researchers have developed a deep learning network capable of accurately predicting a person’s risk of adverse cardiac events, presenting their findings virtually at the <a class="read-more-link" href="https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/">Combining AI with cardiac imaging helps predict heart attacks, cardiovascular deaths</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.cardiovascularbusiness.com/</p>



<p>Researchers have developed a deep learning network capable of accurately predicting a person’s risk of adverse cardiac events, presenting their findings virtually at the Society of Nuclear Medicine and Molecular Imaging (SNMMI) 2021 Annual Meeting.</p>



<p>The new analysis included data from more than 20,000 patients who underwent single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI). The advanced algorithm used those SPECT MPI results to determine each patient’s risk of a major adverse cardiac event—myocardial infarctions or cardiovascular deaths, for example—and then patients were followed for an average of nearly five years to test the algorithm’s accuracy.</p>



<p>Overall, the authors found, the annual rate of major adverse cardiac events among patients with the highest deep learning scores was 9.7%. This represented a 10.2-fold increase compared to the annual rate among patients with the lowest scores.</p>
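<p>As a rough check on the figures above (a sketch using only the numbers reported in the article; the underlying per-group event counts are not given here), the implied annual event rate in the lowest-score group follows directly from the reported fold increase:</p>

```python
# Figures reported in the article (assumption: taken at face value)
high_score_annual_rate = 0.097   # 9.7% annual MACE rate, highest deep learning scores
fold_increase = 10.2             # reported increase vs. the lowest-score group

# Implied annual rate for the lowest-score group
low_score_annual_rate = high_score_annual_rate / fold_increase
print(f"{low_score_annual_rate:.2%}")  # roughly 0.95% per year
```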



<p>“These findings show that artificial intelligence could be incorporated in standard clinical workstations to assist physicians in accurate and fast risk assessment of patients undergoing SPECT MPI scans,” Ananya Singh, MS, a research software engineer in the Slomka Lab at Cedars-Sinai Medical Center in Los Angeles, said in a prepared statement. “This work signifies the potential advantage of incorporating artificial intelligence techniques in standard imaging protocols to assist readers with risk stratification.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/">Combining AI with cardiac imaging helps predict heart attacks, cardiovascular deaths</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Machine learning analyses of lung imaging for COVID-19 falls short, Minded launches to streamline psychiatric med refills and more digital health news briefs</title>
		<link>https://www.aiuniverse.xyz/machine-learning-analyses-of-lung-imaging-for-covid-19-falls-short-minded-launches-to-streamline-psychiatric-med-refills-and-more-digital-health-news-briefs/</link>
					<comments>https://www.aiuniverse.xyz/machine-learning-analyses-of-lung-imaging-for-covid-19-falls-short-minded-launches-to-streamline-psychiatric-med-refills-and-more-digital-health-news-briefs/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 18 Mar 2021 06:29:03 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[analyses]]></category>
		<category><![CDATA[COVID-19]]></category>
		<category><![CDATA[Imaging]]></category>
		<category><![CDATA[launches]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Minded]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13594</guid>

					<description><![CDATA[<p>Source &#8211; https://www.mobihealthnews.com/ Also: PainChek&#8217;s app picks up European and Australian regulatory clearances; Digital health access as a social determinant of health. AI isn&#8217;t ready for COVID-19 prime <a class="read-more-link" href="https://www.aiuniverse.xyz/machine-learning-analyses-of-lung-imaging-for-covid-19-falls-short-minded-launches-to-streamline-psychiatric-med-refills-and-more-digital-health-news-briefs/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/machine-learning-analyses-of-lung-imaging-for-covid-19-falls-short-minded-launches-to-streamline-psychiatric-med-refills-and-more-digital-health-news-briefs/">Machine learning analyses of lung imaging for COVID-19 falls short, Minded launches to streamline psychiatric med refills and more digital health news briefs</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.mobihealthnews.com/</p>



<p>Also: PainChek&#8217;s app picks up European and Australian regulatory clearances; Digital health access as a social determinant of health.</p>



<p><strong>AI isn&#8217;t ready for COVID-19 prime time. </strong>A systematic review published this week in <em>Nature Machine Intelligence</em> warns that new models using machine learning to review chest radiographs and chest computed tomography scans for COVID-19 have major methodological deficiencies or underlying biases.</p>



<p>Among the 62 published or pre-print papers outlining these approaches, the authors wrote that not a single one was of potential clinical use. Many were hampered by low-quality data, in particular the high likelihood of duplicated images across different sources, which results in so-called &#8220;Frankenstein datasets.&#8221; All of the proposed models also suffered from some degree of bias, they wrote, such as including samples from nonrepresentative populations.</p>
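<p>The review itself does not describe a fix, but one generic way such cross-dataset duplicates are caught (an illustrative sketch, with synthetic byte strings standing in for image files) is to compare content digests across sources:</p>

```python
import hashlib

def content_hash(data: bytes) -> str:
    # Exact-duplicate detection via a content digest; real pipelines
    # often add perceptual hashes to also catch resized/recompressed copies.
    return hashlib.sha256(data).hexdigest()

# Two "datasets" that secretly share an image (hypothetical names and bytes)
dataset_a = {"img_001": b"\x89PNG-fake-scan-1", "img_002": b"\x89PNG-fake-scan-2"}
dataset_b = {"case_17": b"\x89PNG-fake-scan-2", "case_18": b"\x89PNG-fake-scan-3"}

hashes_a = {content_hash(v) for v in dataset_a.values()}
duplicates = [k for k, v in dataset_b.items() if content_hash(v) in hashes_a]
print(duplicates)  # → ['case_17']
```

<p>Merging the two sources without a check like this would let the shared image appear in both training and test splits, inflating reported performance.</p>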



<p>&#8220;Despite the huge efforts of researchers to develop machine learning models for COVID-19 diagnosis and prognosis, we found methodological flaws and many biases throughout the literature, leading to highly optimistic reported performance,&#8221; the reviewers wrote.</p>



<p>&#8220;Higher-quality datasets, manuscripts with sufficient documentation to be reproducible and external validation are required to increase the likelihood of models being taken forward and integrated into future clinical trials to establish independent technical and clinical validation, as well as cost-effectiveness.&#8221;</p>



<hr class="wp-block-separator"/>



<p><strong>Easy med refills for psychiatric patients.&nbsp;</strong>Today marked the launch of New York-based Minded, a digital service that helps those taking psychiatric medications renew, adjust, refill and order delivery of their prescriptions.</p>



<p>The startup, which has raised more than $5 million from investors, aims to cut down the burden and cost of regular visits to a traditional provider for assessment and prescription renewal.</p>



<p>Through its app-based platform, users can instead complete a five-minute online assessment regarding their mental health and a 10-minute video consultation. If appropriate, they can either have their prescription filled at a local pharmacy or delivered to their home for free.</p>



<p>The subscription service costs $30 per month plus $5 for each medication, and includes 24/7 access to the company&#8217;s care team and other long-term medication management support.</p>



<p>&#8220;Once I found what worked for me, I did not want to go to the doctor every 90 days to pay&nbsp;$300&nbsp;for a five-minute appointment. I wanted to take the frustrating, time-consuming, and expensive process of renewing my prescription and make it magically simple,&#8221; David Ronick, Minded cofounder and CEO, said in a statement. &#8220;We&#8217;re tackling the critical issues of access and affordability facing millions of Americans.&#8221;</p>



<hr class="wp-block-separator"/>



<p><strong>Regulatory wins for pain measurement app.&nbsp;</strong>PainChek, the maker of a pain assessment and monitoring app for smartphones, announced this week that it&#8217;s received a CE Mark and a Therapeutic Goods Administration clearance for its Universal Pain Assessment Solution.</p>



<p>Designed for caretakers and other providers, the tool helps assess pain severity among those who cannot adequately describe it, or otherwise document quantified pain levels for those who can self-report. With these clearances, the company said it would be rolling out the app in the U.K. and Australia next month, and then moving on to the rest of Europe and other international markets.</p>



<p>&#8220;PainChek can become a single, simple and&nbsp;rapid point-of-care solution for healthcare professionals in assessing and documenting pain across all their patients, in a broad range of settings including the larger home care and hospital care markets,&#8221; CEO Philip Daffas said in a statement. &#8220;Based on initial market feedback, we expect this novel solution will be well received by our existing users and attract a wider global audience.”</p>



<hr class="wp-block-separator"/>



<p><strong>Not everyone has a smartphone.&nbsp;</strong>A comment letter published today in&nbsp;<em>NPJ Digital Medicine&nbsp;</em>argues that healthcare stakeholders should increasingly view access to digital tools, and by extension mobile health technologies, as another social determinant of health (SDOH).</p>



<p>Economic access, Internet connectivity and general tech literacy are becoming core issues as care delivery is digitized and novel tools are built using software or devices, they wrote. As such, they recommended that health systems adopt &#8220;a digital-inclusion-informed strategy regarding mobile health&#8221; that not only takes access into account, but works to assess and support patients as they learn digital skills.</p>



<p>&#8220;Mobile health technologies hold significant promise to increase the efficiency of care and improve health outcomes. Yet, we must be cognizant of their potential to increase health disparities,&#8221; they wrote.</p>
<p>The post <a href="https://www.aiuniverse.xyz/machine-learning-analyses-of-lung-imaging-for-covid-19-falls-short-minded-launches-to-streamline-psychiatric-med-refills-and-more-digital-health-news-briefs/">Machine learning analyses of lung imaging for COVID-19 falls short, Minded launches to streamline psychiatric med refills and more digital health news briefs</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/machine-learning-analyses-of-lung-imaging-for-covid-19-falls-short-minded-launches-to-streamline-psychiatric-med-refills-and-more-digital-health-news-briefs/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>New technology uses near-infrared imaging and machine learning to find hidden tumors</title>
		<link>https://www.aiuniverse.xyz/new-technology-uses-near-infrared-imaging-and-machine-learning-to-find-hidden-tumors/</link>
					<comments>https://www.aiuniverse.xyz/new-technology-uses-near-infrared-imaging-and-machine-learning-to-find-hidden-tumors/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 03 Feb 2021 05:41:25 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Imaging]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[near-infrared]]></category>
		<category><![CDATA[New]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[tumors]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=12657</guid>

					<description><![CDATA[<p>Source &#8211; https://www.news-medical.net/ Tumors can be damaging to surrounding blood vessels and tissues even if they&#8217;re benign. If they&#8217;re malignant, they&#8217;re aggressive and sneaky, and often irrevocably <a class="read-more-link" href="https://www.aiuniverse.xyz/new-technology-uses-near-infrared-imaging-and-machine-learning-to-find-hidden-tumors/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/new-technology-uses-near-infrared-imaging-and-machine-learning-to-find-hidden-tumors/">New technology uses near-infrared imaging and machine learning to find hidden tumors</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.news-medical.net/</p>



<p>Tumors can be damaging to surrounding blood vessels and tissues even if they&#8217;re benign. If they&#8217;re malignant, they&#8217;re aggressive and sneaky, and often irrevocably damaging. In the latter case, early detection is key to treatment and recovery. But such detection can sometimes require advanced imaging technology, beyond what is available commercially today.</p>



<p>For instance, some tumors occur deep inside organs and tissues, covered by a mucosal layer, which makes it difficult for scientists to directly observe them with standard methods like endoscopy (which inserts a small camera into a patient&#8217;s body via a thin tube) or reach them during biopsies. In particular, gastrointestinal stromal tumors (GISTs)&#8211;typically found in the stomach and the small intestines&#8211;require demanding techniques that are very time-consuming and prolong the diagnosis.</p>



<p>Now, to improve GIST diagnosis, Drs. Daiki Sato, Hiroaki Ikematsu, and Takeshi Kuwata from the National Cancer Center Hospital East in Japan, Dr. Hideo Yokota from the RIKEN Center for Advanced Photonics, Japan, and Drs. Toshihiro Takamatsu and Kohei Soga from Tokyo University of Science, Japan, led by Dr. Hiroshi Takemura, have developed a technology that uses near-infrared hyperspectral imaging (NIR-HSI) along with machine learning. Their findings are published in Nature&#8217;s <em>Scientific Reports</em>.</p>



<p>Because near-infrared light penetrates deeper into tissue than visible light, NIR-HSI should allow scientists to safely investigate structures hidden inside tissues. Until the study by Dr. Takemura and his colleagues, however, no one had attempted to use NIR-HSI on deep tumors like GISTs. Speaking of what led them down this line of investigation, Dr. Takemura pays homage to the late professor who began their journey: &#8220;This project has been possible only because of late Prof. Kazuhiro Kaneko, who broke the barriers between doctors and engineers and established this collaboration. We are following his wishes.&#8221;</p>



<p>Dr. Takemura&#8217;s team performed imaging experiments on 12 patients with confirmed cases of GISTs, who had their tumors removed through surgery. The scientists imaged the excised tissues using NIR-HSI, and then had a pathologist examine the images to determine the border between normal and tumor tissue. These images were then used as training data for a machine-learning algorithm, essentially teaching a computer program to distinguish between the pixels in the images that represent normal tissue versus those that represent tumor tissue.</p>
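<p>The workflow described above — pathologist-labelled pixels from hyperspectral images used to train a tumor-versus-normal classifier — can be sketched in miniature. This is an illustrative reconstruction, not the authors' code: the synthetic spectra, band count, and choice of a random forest are all assumptions.</p>

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-in for NIR-HSI data: each pixel is a spectrum
# (100 synthetic bands here); labels mark tumor (1) vs. normal (0)
# tissue, as a pathologist annotated on the excised specimens.
n_pixels, n_bands = 2000, 100
X = rng.normal(size=(n_pixels, n_bands))
y = rng.integers(0, 2, size=n_pixels)
X[y == 1] += 0.5  # give tumor pixels a slightly shifted spectrum

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Pixel-wise classifier; the study's actual model is not specified
# in this summary, so a random forest stands in.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"pixel accuracy: {accuracy:.2f}")
```

<p>Coloring each pixel by its predicted label is what produces the tumor/non-tumor maps the study evaluated.</p>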



<p>The scientists found that even though 10 out of the 12 test tumors were completely or partly covered by a mucosal layer, the machine-learning analysis was effective in identifying GISTs, correctly color-coding tumor and non-tumor sections at 86% accuracy. &#8220;This is a very exciting development,&#8221; Dr. Takemura explains, &#8220;Being able to accurately, quickly, and non-invasively diagnose different types of submucosal tumors without biopsies, a procedure that requires surgery, is much easier on both the patient and the physicians.&#8221;</p>



<p>Dr. Takemura acknowledges that there are still challenges ahead, but feels they are prepared to solve them. The researchers identified several areas that would improve on their results, such as making their training dataset much larger, adding information about how deep the tumor is for the machine-learning algorithm, and including other types of tumors in the analysis. Work is also underway to develop an NIR-HSI system that builds on top of existing endoscopy technology.</p>



<p>&#8220;We&#8217;ve already built a device that attaches an NIR-HSI camera to the end of an endoscope and hope to perform NIR-HSI analysis directly on a patient soon, instead of just on tissues that had been surgically removed,&#8221; Dr. Takemura says, &#8220;In the future, this will help us separate GISTs from other types of submucosal tumors that could be even more malignant and dangerous. This study is the first step towards much more groundbreaking research in the future, enabled by this interdisciplinary collaboration.&#8221;</p>
<p>The post <a href="https://www.aiuniverse.xyz/new-technology-uses-near-infrared-imaging-and-machine-learning-to-find-hidden-tumors/">New technology uses near-infrared imaging and machine learning to find hidden tumors</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/new-technology-uses-near-infrared-imaging-and-machine-learning-to-find-hidden-tumors/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Deep Learning Tool Accurately Selects High-Quality Embryos for IVF</title>
		<link>https://www.aiuniverse.xyz/deep-learning-tool-accurately-selects-high-quality-embryos-for-ivf/</link>
					<comments>https://www.aiuniverse.xyz/deep-learning-tool-accurately-selects-high-quality-embryos-for-ivf/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 17 Sep 2020 08:40:29 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[Analytics]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[Imaging]]></category>
		<category><![CDATA[Medical Imaging]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=11653</guid>

					<description><![CDATA[<p>Source: healthitanalytics.com A deep learning system was able to choose the most high-quality embryos for in-vitro fertilization (IVF) with 90 percent accuracy, according to a study published in eLife. When <a class="read-more-link" href="https://www.aiuniverse.xyz/deep-learning-tool-accurately-selects-high-quality-embryos-for-ivf/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/deep-learning-tool-accurately-selects-high-quality-embryos-for-ivf/">Deep Learning Tool Accurately Selects High-Quality Embryos for IVF</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: healthitanalytics.com</p>



<p>A deep learning system was able to select the highest-quality embryos for in-vitro fertilization (IVF) with 90 percent accuracy, according to a study published in <em>eLife</em>.</p>



<p>When compared with trained embryologists, the deep learning model performed with an accuracy of approximately 75 percent while the embryologists performed with an average accuracy of 67 percent.</p>



<p>The average success rate of IVF is 30 percent, researchers stated. The treatment is also expensive, costing patients over $10,000 per IVF cycle, and many patients require multiple cycles to achieve a successful pregnancy.</p>



<p>While multiple factors determine the success of IVF cycles, the non-invasive selection of the highest-quality embryos available from a patient remains one of the most important challenges in achieving successful IVF outcomes.</p>



<p>Currently, tools available to embryologists are limited and expensive, leaving most embryologists to rely on their observational skills and expertise. Researchers from Brigham and Women’s Hospital and Massachusetts General Hospital (MGH) set out to develop an assistive tool that can&nbsp;<a href="https://healthitanalytics.com/news/medical-imaging-machine-learning-to-align-in-10-key-areas">evaluate images</a>&nbsp;captured using microscopes traditionally available at fertility centers.</p>



<p>“There is so much at stake for our patients with each IVF cycle. Embryologists make dozens of critical decisions that impact the success of a patient cycle. With assistance from our AI system, embryologists will be able to select the embryo that will result in a successful pregnancy better than ever before,”&nbsp;<a href="https://www.eurekalert.org/pub_releases/2020-09/bawh-ais091520.php">said</a>&nbsp;co-lead author Charles Bormann, PhD, MGH IVF Laboratory director.</p>



<p>The team trained the deep learning system using images of embryos captured at 113 hours post-insemination. Among 742 embryos, the AI system was 90 percent accurate in selecting the highest-quality embryos.</p>



<p>The investigators further assessed the system’s ability to distinguish among high-quality embryos with the normal number of human chromosomes and compared the system’s performance to that of trained embryologists.</p>



<p>The results showed that the system was able to differentiate and identify embryos with the highest potential for success significantly better than 15 experienced embryologists from five different fertility centers across the US.</p>



<p>Researchers pointed out that in its current state, the deep learning system is meant to act&nbsp;<a href="https://healthitanalytics.com/news/artificial-intelligence-in-healthcare-augmentation-or-companionship">only as an assistive tool</a>&nbsp;for embryologists to make judgments during embryo selection.</p>



<p>“We believe that these systems will benefit clinical embryologists and patients,” said corresponding author&nbsp;Hadi Shafiee, PhD, of the Division of Engineering in Medicine at the Brigham. “A major challenge in the field is deciding on the embryos that need to be transferred during IVF. Our system has tremendous potential to improve clinical decision making and access to care.”</p>



<p>The team also stated that while the study demonstrates the potential for&nbsp;<a href="https://healthitanalytics.com/features/what-is-deep-learning-and-how-will-it-change-healthcare">deep learning</a>&nbsp;to outperform human clinicians, further research is needed before these tools can be deployed in regular clinical care.</p>



<p>“Advances in artificial intelligence have fostered numerous applications that have the potential to improve standard-of-care in the different fields of medicine. While other groups have also evaluated different use cases for machine learning in assisted reproductive medicine, this approach is novel in how it used a deep learning system trained on a large dataset to make predictions based on static images,” researchers said.</p>



<p>“Although the current retrospective study shows that these systems can perform better than highly-trained embryologists, randomized control trials are required before routine use in clinical practice is adopted.”</p>



<p>The findings offer hope for individuals seeking to undergo IVF, the group concluded.</p>



<p>“Our approach has shown the potential of AI systems to be used in aiding embryologists to select the embryo with the highest implantation potential, especially amongst high-quality embryos,” said Manoj Kumar Kanakasabapathy, one of the co-lead authors.</p>
<p>The post <a href="https://www.aiuniverse.xyz/deep-learning-tool-accurately-selects-high-quality-embryos-for-ivf/">Deep Learning Tool Accurately Selects High-Quality Embryos for IVF</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/deep-learning-tool-accurately-selects-high-quality-embryos-for-ivf/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
