<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>predict Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/predict/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/predict/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Tue, 13 Jul 2021 09:53:26 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Data science can’t predict Covid trajectory, yet</title>
		<link>https://www.aiuniverse.xyz/data-science-cant-predict-covid-trajectory-yet/</link>
					<comments>https://www.aiuniverse.xyz/data-science-cant-predict-covid-trajectory-yet/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 13 Jul 2021 09:53:24 +0000</pubDate>
				<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Covid]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[predict]]></category>
		<category><![CDATA[trajectory]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14936</guid>

					<description><![CDATA[<p>Source &#8211; https://www.thehindubusinessline.com/ The behaviour of the new and unknown disease is too complicated and unpredictable for data science to handle People today aspire to use Big <a class="read-more-link" href="https://www.aiuniverse.xyz/data-science-cant-predict-covid-trajectory-yet/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/data-science-cant-predict-covid-trajectory-yet/">Data science can’t predict Covid trajectory, yet</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.thehindubusinessline.com/</p>



<h2 class="wp-block-heading">The behaviour of the new and unknown disease is too complicated and unpredictable for data science to handle</h2>



<p>People today aspire to use Big Data in elections, sports, healthcare, business, national planning, and what not. Michael Lewis’s 2003 book&nbsp;<em>Moneyball</em>&nbsp;depicted how the manager of the Oakland Athletics built a successful baseball team by using data and computer analytics to recruit new players. The ‘Moneyball’ culture soon began to dominate every bit of our lives. And Silicon Valley entered the field with a new set of professionals, data scientists, whose role, according to&nbsp;<em>Harvard Business Review</em>, is the most attractive job of the 21st century.</p>



<p>People today expect data science to devise profit-making business strategies, come up with winning election tactics, or generate World Cup-winning game plans. Often, data scientists aspire to do just that by reading the heartbeat of the data. But can they succeed? Big Data analytics may be like ‘churning the ocean’ in search of the ‘nectar’ hidden deep within it, as depicted in the great epic <em>Mahabharata</em>. That is a gigantic project, for sure. One needs a lot of effort and expertise to obtain the nectar, and there is every chance of being deceived by the other substances, including deadly poison, that the churning throws up.</p>



<p>The ongoing pandemic, however, provided a golden opportunity for data science to exhibit its strength. It was its litmus test as well. As early as April 2020, a&nbsp;<em>Harvard Business Review</em>&nbsp;article observed: “In many ways, this is our most meaningful Big Data and analytics challenge so far. With will and innovation, we could rapidly forecast the spread of the virus not only at a population level but also, and necessarily, at a hyper-local, neighbourhood level.”</p>



<h2 class="wp-block-heading">Misleading predictions</h2>



<p>As Covid-19 yielded loads of freely available data, data scientists came up with lots of predictions and strategies: the eventual number of infections, the eventual number of deaths, the duration of lockdown needed to control the pandemic, and so on.</p>



<p>In fact, forecasting the trajectory of the disease over time became almost a fashionable exercise for many. No wonder these forecasts were often mutually contradictory, and eventually most of them proved to be utterly wrong, misleading and useless.</p>



<p>Predicting the future course of events with the techniques of data science recalls the 2002 Spielberg movie&nbsp;<em>Minority Report</em>, starring Tom Cruise, in which the PreCrime police force of Washington DC in 2054 even predicts future murders using data mining and predictive analytics!</p>



<p>In practice, data science often relies on statistical models and techniques that rest on various underlying assumptions. Often, real data do not satisfy the assumptions of these models.</p>



<p>For example, for analysing the data of the pandemic, models such as SIR, SEIR or some of their variants were widely used. But the dynamics of a new and unknown disease may be far more complicated and unpredictable, and it is most likely that they will fail to satisfy the assumptions of those classical models or their tweaked versions. Serious error is then bound to occur, and it gets compounded as the data pile up. Simply running routine software packages over big data is never adequate, and is often plainly incorrect.</p>
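


<p>As a rough illustration of the classical compartmental models mentioned above, here is a minimal SIR sketch in Python; the parameter values are illustrative assumptions, not estimates fitted to any real Covid data:</p>



<pre class="wp-block-code"><code># Minimal SIR (Susceptible-Infected-Recovered) sketch.
# beta and gamma are illustrative assumptions, not fitted values.
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    s, i, r = y
    ds = -beta * s * i             # susceptibles become infected
    di = beta * s * i - gamma * i  # infections grow, then recover
    dr = gamma * i                 # recoveries accumulate
    return ds, di, dr

t = np.linspace(0, 160, 161)       # days
y0 = (0.999, 0.001, 0.0)           # initial S, I, R fractions
beta, gamma = 0.3, 0.1             # assumed contact and recovery rates

s, i, r = odeint(sir, y0, t, args=(beta, gamma)).T
print("peak infected fraction: %.3f" % i.max())</code></pre>



<p>Everything here hinges on homogeneous mixing and constant rates; if a new disease violates those assumptions, the forecast inherits the error.</p>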



<p>With the ever-expanding horizon of the ‘Internet of Things’, data are growing exponentially. The size of the digital universe was predicted to double every two years beyond 2020, and the ongoing pandemic might have pushed the rate of increase even higher!</p>



<p>However, until an event like the Cambridge Analytica scandal breaks, we rarely realise that our every footstep is added to the ocean of data. The world has become data-addicted. But with so much data, the needle sits in an ever larger haystack.</p>



<p>In 2008, Google launched its ‘<em>Google Flu Trends</em>’ web service, with the objective of making accurate predictions about flu outbreaks by aggregating Google Search queries. The project failed: people often search for disease symptoms that resemble flu but are not actually flu. And when the much-hyped ‘<em>Google Flu Trends</em>’ project turned into a disastrous failure, people came to understand that big data might not be the holy grail.</p>



<p>Also, current computational equipment is certainly inadequate for handling millions of variables and billions of data points. And the number of pairs of variables showing significant ‘spurious’ or ‘nonsense’ correlation grows on the order of the square of the number of variables; such pairs are almost impossible to identify.</p>
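


<p>To see why, note that p variables yield p(p-1)/2 pairs, so at a 5% significance level roughly 5% of those pairs will look ‘significant’ even in pure noise. A small simulation (an illustrative sketch, not from the article) makes the point:</p>



<pre class="wp-block-code"><code># Count 'significant' correlations among pure-noise variables.
# Illustrative sketch: the columns are independent by construction,
# so every flagged pair is spurious.
import numpy as np
from itertools import combinations
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_obs, n_vars = 100, 50
x = rng.normal(size=(n_obs, n_vars))

pvals = [pearsonr(x[:, a], x[:, b])[1]
         for a, b in combinations(range(n_vars), 2)]
flagged = int(np.count_nonzero(np.less(pvals, 0.05)))  # p-value below 0.05
pairs = n_vars * (n_vars - 1) // 2   # grows like the square of n_vars
print(flagged, "of", pairs, "pairs flagged; chance alone predicts about",
      round(0.05 * pairs))</code></pre>



<p>With just 50 variables there are already 1,225 pairs, so around 61 ‘significant’ correlations are expected from chance alone; with millions of variables the haystack becomes astronomical.</p>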



<p>Thus, churning the ocean of big data may yield both nectar and poison. Separating them is a daunting task. Statistics is still in its infancy in this context, and is not yet equipped to handle these kinds of problems. Let us be honest enough to admit that.</p>



<p>Overall, data science, being reliant on ‘statistics’ for its models and analyses, may not yet be ready for complex predictions such as the complicated yet verifiable trajectory of Covid-19. For the time being, data science’s best bet may be to engage with open-ended, unverifiable problems.</p>



<p>The writer is Professor of Statistics, Indian Statistical Institute, Kolkata</p>
<p>The post <a href="https://www.aiuniverse.xyz/data-science-cant-predict-covid-trajectory-yet/">Data science can’t predict Covid trajectory, yet</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/data-science-cant-predict-covid-trajectory-yet/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Artificial Intelligence May Soon Predict How Electronics Fail</title>
		<link>https://www.aiuniverse.xyz/artificial-intelligence-may-soon-predict-how-electronics-fail/</link>
					<comments>https://www.aiuniverse.xyz/artificial-intelligence-may-soon-predict-how-electronics-fail/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 23 Jun 2021 11:01:57 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[electronics]]></category>
		<category><![CDATA[predict]]></category>
		<category><![CDATA[Soon]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14483</guid>

					<description><![CDATA[<p>Source &#8211; https://www.eletimes.com/ Think of them as master Lego builders, only at an atomic scale. Engineers at CU Boulder have taken a major step forward in combining <a class="read-more-link" href="https://www.aiuniverse.xyz/artificial-intelligence-may-soon-predict-how-electronics-fail/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-may-soon-predict-how-electronics-fail/">Artificial Intelligence May Soon Predict How Electronics Fail</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.eletimes.com/</p>



<p>Think of them as master Lego builders, only at an atomic scale. Engineers at CU Boulder have taken a major step forward in combining advanced computer simulations with artificial intelligence to try to predict how electronics, like the transistors in your cell phone, will fail.</p>



<p>In the latest study, researchers mapped out the physics of small building blocks made up of atoms, then used machine learning techniques to estimate how larger structures created from those same building blocks might behave. It’s a bit like looking at a single Lego brick to try to predict the strength of a much larger castle.</p>



<p>It’s a pursuit that could be a boon for the electronics that underpin our daily lives, from smartphones and electric cars to emerging quantum computers. One day, engineers could use the team’s methods to pinpoint in advance weak points in the design of electronic components.</p>



<p>The project is part of a larger focus on how the world of very small things, such as the wiggling of atoms, can help people build new and more efficient computers—even ones that take their inspiration from human brains. Artem Pimachev, a research associate in aerospace engineering at CU Boulder, is a co-author of the new study.</p>



<p>Rather than wait for years to figure out why devices fail, the researchers say, their methods can give a priori knowledge of how a device is going to work before it is even built.</p>



<p><strong>Heating up</strong></p>



<p>Their latest research focuses on a big sticking point in the electronics industry: hotspots.</p>



<p>And, no, that doesn’t mean the mobile WiFi kind. Most modern computing tools carry a large number of imperfections: small defects in electronic components that cause heat to build up at certain sites, a bit like how a bicycle slows down when you ride over rough terrain. Such “hotspots” also make your smartphone a lot less efficient.</p>



<p>The problem is that engineers drawing on computer simulations, or models, struggle to predict ahead of time where those weak points are likely to turn up.</p>



<p>We can use physics models to understand systems with approximately 100 atoms in them. But that doesn’t compare to the billions of atoms in these devices.</p>



<p><strong>From atoms to devices</strong></p>



<p>Think back to those individual Lego bricks, which, in this case, are clumps of 16 silicon and germanium atoms, the main ingredients in many computer components.</p>



<p>In the new study, researchers developed a computer model that uses artificial intelligence&nbsp;to learn the&nbsp;physical properties&nbsp;within those building blocks—or how atoms and electrons come together to determine the energy landscape within a material. The model can then extrapolate from those basic blocks to estimate the distribution of energy in a much larger chunk of atoms.</p>



<p>It collects information from each individual unit and combines them to predict the final properties of the collective system, which can be made up of two, three or more units.</p>
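


<p>A loose sketch of that building-block idea, under heavy assumptions (made-up block descriptors and a generic regressor, not the authors’ actual model), might look like this:</p>



<pre class="wp-block-code"><code># Hedged sketch: learn a property per small block, then estimate a
# larger structure by combining per-block predictions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Toy descriptors standing in for 16-atom silicon-germanium blocks;
# in reality these would come from physics simulations.
block_features = rng.normal(size=(200, 8))
true_weights = rng.normal(size=8)
block_energy = block_features @ true_weights + rng.normal(scale=0.1, size=200)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(block_features, block_energy)

# A 'device' assembled from three blocks: predict each block's
# contribution, then combine them for the collective system.
device_blocks = rng.normal(size=(3, 8))
estimate = model.predict(device_blocks).sum()
print("estimated energy of the 3-block system: %.3f" % estimate)</code></pre>



<p>The real models learn quantum-scale energy landscapes rather than toy linear targets, but the workflow is the same in spirit: train on small units, then combine per-unit predictions into an estimate for the larger system.</p>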



<p>The team still has a long way to go before it can pinpoint all of the potential weak points in a device the size of your phone. But, so far, the group’s model has proved effective.</p>



<p>The researchers are also drawing on their understanding of how heat and energy flow at very small scales, not just to improve existing devices but also to help create the devices of the future.</p>



<p>What I want to do is poke at this world of atoms in your handheld device and understand how materials and electronics come together to make a device work.</p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-may-soon-predict-how-electronics-fail/">Artificial Intelligence May Soon Predict How Electronics Fail</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/artificial-intelligence-may-soon-predict-how-electronics-fail/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Combining AI with cardiac imaging helps predict heart attacks, cardiovascular deaths</title>
		<link>https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/</link>
					<comments>https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 15 Jun 2021 05:08:26 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[attacks]]></category>
		<category><![CDATA[cardiac]]></category>
		<category><![CDATA[Cardiovascular]]></category>
		<category><![CDATA[Combining]]></category>
		<category><![CDATA[deaths]]></category>
		<category><![CDATA[Imaging]]></category>
		<category><![CDATA[predict]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14305</guid>

					<description><![CDATA[<p>Source &#8211; https://www.cardiovascularbusiness.com/ Researchers have developed a deep learning network capable of accurately predicting a person’s risk of adverse cardiac events, presenting their findings virtually at the <a class="read-more-link" href="https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/">Combining AI with cardiac imaging helps predict heart attacks, cardiovascular deaths</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.cardiovascularbusiness.com/</p>



<p>Researchers have developed a deep learning network capable of accurately predicting a person’s risk of adverse cardiac events, presenting their findings virtually at the Society of Nuclear Medicine and Molecular Imaging (SNMMI) 2021 Annual Meeting.</p>



<p>The new analysis included data from more than 20,000 patients who underwent single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI). The advanced algorithm used those SPECT MPI results to determine each patient’s risk of a major adverse cardiac event—a myocardial infarction or cardiovascular death, for example—and then patients were followed for an average of nearly five years to test the algorithm’s accuracy.</p>



<p>Overall, the authors found, the annual rate of major adverse cardiac events among patients with the highest deep learning scores was 9.7%. This represented a 10.2-fold increase compared to the annual rate among patients with the lowest scores.</p>



<p>“These findings show that artificial intelligence could be incorporated in standard clinical workstations to assist physicians in accurate and fast risk assessment of patients undergoing SPECT MPI scans,” Ananya Singh, MS, a research software engineer in the Slomka Lab at Cedars-Sinai Medical Center in Los Angeles, said in a prepared statement. “This work signifies the potential advantage of incorporating artificial intelligence techniques in standard imaging protocols to assist readers with risk stratification.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/">Combining AI with cardiac imaging helps predict heart attacks, cardiovascular deaths</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/combining-ai-with-cardiac-imaging-helps-predict-heart-attacks-cardiovascular-deaths/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Deep Learning, Genomic Data May Help Predict Alzheimer’s Disease</title>
		<link>https://www.aiuniverse.xyz/deep-learning-genomic-data-may-help-predict-alzheimers-disease/</link>
					<comments>https://www.aiuniverse.xyz/deep-learning-genomic-data-may-help-predict-alzheimers-disease/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 02 Apr 2021 06:35:02 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[Alzheimer’s]]></category>
		<category><![CDATA[data]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[Disease]]></category>
		<category><![CDATA[Genomic]]></category>
		<category><![CDATA[predict]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13883</guid>

					<description><![CDATA[<p>Source &#8211; https://healthitanalytics.com/ Deep learning methods analyzed genomic data from whole blood samples and found differences in patients with Alzheimer’s disease. Using deep learning and genomic data, <a class="read-more-link" href="https://www.aiuniverse.xyz/deep-learning-genomic-data-may-help-predict-alzheimers-disease/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/deep-learning-genomic-data-may-help-predict-alzheimers-disease/">Deep Learning, Genomic Data May Help Predict Alzheimer’s Disease</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://healthitanalytics.com/</p>



<p>Deep learning methods analyzed genomic data from whole blood samples and found differences in patients with Alzheimer’s disease.</p>



<p>Using deep learning and genomic data, researchers from Beaumont Health have discovered a simple blood test that may help predict Alzheimer’s disease in patients.</p>



<p>In a study published in <em>PLOS ONE</em>, the team described using deep learning processes to analyze extracted genomic DNA from whole blood samples. The analysis uncovered 152 significant genetic differences in patients with Alzheimer’s compared to healthy patients.</p>



<p>The new deep learning method has the potential to diagnose patients much earlier in the disease process, before symptoms develop and the brain is irreversibly damaged. Experts believe that the brain changes in Alzheimer’s disease precede the onset of symptoms by years.</p>



<p>Globally, more than 47 million individuals have Alzheimer’s, with women making up more than 60 percent of patients. As the population continues to age, it’s expected that 75 million people will be affected by Alzheimer’s by 2030, with a subsequent rise to 131 million by 2050.</p>



<p>“The holy grail is to identify patients in the pre-clinical stage so effective early interventions, including new medications, can be studied and ultimately used,” said Ray Bahado-Singh, chairman of the Beaumont Department of Obstetrics and Gynecology and an expert in women&#8217;s health. “That&#8217;s why we are excited about the results of this research.”</p>



<p>Most patients with Alzheimer’s aren’t diagnosed until later stages of the disease, when the brain has already suffered irreversible damage. There is currently no cure for Alzheimer’s, and treatment is limited to drugs that attempt to treat symptoms and have little impact on the disease’s progression.</p>



<p>“Drugs used in the late stage of the disease do not seem to make much difference, so there is a tremendous amount of interest in diagnosis in the early stages of the disease,” said Khaled Imam, Beaumont Health&#8217;s Director of Geriatric Medicine.</p>



<p>“Any delay in symptom onset is likely to be very beneficial.&nbsp; Also, a spinal tap or MRI can identify the start of the disease. But that is invasive and/or expensive. And you cannot do a spinal tap on everyone over age 65. So, blood is thought to be a desirable way of approaching this. And it would be relatively cheap and minimally invasive as compared to an MRI or spinal tap.”</p>



<p>In the analysis, researchers compared blood samples from 24 Alzheimer’s patients and 24 cognitively healthy patients. The team analyzed white blood cells in the blood samples and compared biomarkers to see whether they were altered in patients with Alzheimer’s disease.</p>



<p>Part of the Alzheimer’s disease process is brain inflammation, which is thought to trigger the production of white blood cells, or leukocytes, which then become genetically altered while fighting the disease. Researchers looked for telling genetic markers called methylation marks: important chemical modifications of genes, leading to altered gene function, that indicate the disease process has started.</p>



<p>“It&#8217;s almost as if the leukocytes have become a newspaper to tell us, &#8216;This is what&#8217;s going on in the brain,&#8217;” Bahado-Singh said.</p>



<p>The team used six different artificial intelligence and deep learning platforms to look at about 800,000 changes in the genome of the leukocytes.</p>
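


<p>A hedged sketch of that general analysis pattern (toy numbers throughout; the feature-screening-plus-logistic-regression pipeline is an illustrative assumption, not one of the study’s six platforms):</p>



<pre class="wp-block-code"><code># Hedged sketch: screen a wide methylation matrix for discriminative
# sites, then classify with cross-validation. Toy data, not the cohort.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_sites = 5000                           # stand-in for ~800,000 sites
beta = rng.random(size=(48, n_sites))    # 24 cases + 24 controls
y = np.repeat([1, 0], 24)
beta[:24, :150] += 0.15                  # plant a weak signal in 150 sites

# Site selection stays inside each fold to avoid leakage; k=152 echoes
# the 152 markers the study reported.
pipe = make_pipeline(SelectKBest(f_classif, k=152),
                     LogisticRegression(max_iter=1000))
scores = cross_val_score(pipe, beta, y, cv=5)
print("cross-validated accuracy: %.2f" % scores.mean())</code></pre>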



<p>Researchers noted that the results could potentially advance precision medicine for Alzheimer’s disease, and provide evidence that epigenetic factors may play a critical role in Alzheimer’s development.</p>



<p>Going forward, the group will aim to organize a much larger study to replicate the study’s initial findings over the next year or so.</p>



<p>“What the results said to us is there are significant changes in accessible blood cells that we can use possibly to detect Alzheimer&#8217;s,” Bahado-Singh said.</p>



<p>“We found that the genetic analysis accurately predicted the absence or presence of Alzheimer&#8217;s, allowing us to read what is going on in the brain through the blood.&nbsp; The results also gave us a readout of the abnormalities that are causing Alzheimer&#8217;s disease. This has future promise for developing targeted treatment to interrupt the disease process.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/deep-learning-genomic-data-may-help-predict-alzheimers-disease/">Deep Learning, Genomic Data May Help Predict Alzheimer’s Disease</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/deep-learning-genomic-data-may-help-predict-alzheimers-disease/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>To Predict Mortality After MI, Machine Learning Needs Better Intel</title>
		<link>https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/</link>
					<comments>https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 13 Mar 2021 06:40:08 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[better]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[MI]]></category>
		<category><![CDATA[Mortality]]></category>
		<category><![CDATA[needs]]></category>
		<category><![CDATA[predict]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13445</guid>

					<description><![CDATA[<p>Source &#8211; https://www.tctmd.com/ In order for AI-based algorithms to perform better, data sets need to become less crude, study author says. Squelching some of the mounting excitement <a class="read-more-link" href="https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/">To Predict Mortality After MI, Machine Learning Needs Better Intel</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.tctmd.com/</p>



<p>In order for AI-based algorithms to perform better, data sets need to become less crude, study author says.</p>



<p>Squelching some of the mounting excitement over artificial intelligence, a new study shows no improvement in predicting in-hospital mortality after acute MI with machine learning over standard logistic regression models.</p>



<p>“Existing models were not perfect, and our thought was using advanced models we could derive additional insights from these presumably rich data sets,” lead author Rohan Khera, MBBS (Yale School of Medicine, New Haven, CT), told TCTMD. “But we were unable to discern any additional information, suggesting that our current way of abstracting data into fixed fields, like we do in registries, does not capture the entirety of the patient phenotype. And patients still have a lot of features that we probably capture in our day-to-day clinical care that are not put into these structured fields in a registry.”</p>



<p>It’s not that the data show a problem with machine learning, echoed Ann Marie Navar, MD, PhD (UT Southwestern Medical Center, Dallas, TX), who co-authored an editorial accompanying the study. “It&#8217;s as much a reflection that our current statistical tools for more traditional risk prediction are actually pretty good,” she told TCTMD. “So it&#8217;s kind of hard to build a better mouse trap there.”</p>



<p>For the study, published online this week in&nbsp;<em>JAMA Cardiology</em>, Khera and colleagues compared the predictive values of several machine-learning-based models with logistic regression for in-hospital death among 755,402 patients who were hospitalized for acute MI between 2011 and 2016 and enrolled in the American College of Cardiology Chest Pain &#8211; MI Registry. Overall in-hospital mortality was 4.4%.</p>



<p>Model performance, including area under the receiver operator curve (AUROC), sensitivity, and specificity, was similar for logistic regression and all machine learning-based algorithms.</p>
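


<p>A hedged sketch of that kind of head-to-head comparison, using synthetic data with a roughly 4.4% event rate in place of the registry data:</p>



<pre class="wp-block-code"><code># Hedged sketch: logistic regression versus XGBoost on AUROC,
# with synthetic data standing in for the Chest Pain - MI Registry.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=20000, n_features=20,
                           weights=[0.956], random_state=0)  # ~4.4% events
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
xgb = XGBClassifier(n_estimators=200, eval_metric="logloss").fit(X_tr, y_tr)

for name, m in (("logistic regression", lr), ("xgboost", xgb)):
    auroc = roc_auc_score(y_te, m.predict_proba(X_te)[:, 1])
    print("%s AUROC: %.3f" % (name, auroc))</code></pre>



<p>On structured tabular data like this, the two approaches typically land close together, which is essentially the study’s finding.</p>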



<p>Notably, both the XGBoost and meta-classifier models showed near-perfect calibration in independent validation, reclassifying 27% and 25%, respectively, of the people whom logistic regression had deemed low risk as moderate-to-high risk, which was more consistent with observed events.</p>



<p>“The general conclusion that we draw is that our data streams have to become better for us to be able to leverage them completely for all clinical applications,” Khera said. “Our current data are very crude—they&#8217;re manually abstracted into a fixed number of data fields—and our assumption that a model that does a little better at detecting relationships in these few variables will do better is probably not the case.”</p>



<p>If currently available models work, “why would you replace it with something else that has more computational power but requires more coding skill and everything involved?” Khera asked. “If both the skill set and the computational power required to develop such models are higher, it only makes sense to develop them if your application markedly improves the quality of predictions or uncovers new signatures of patients.”</p>



<p>This means that healthcare systems have work to do, he continued. “Hospitals and healthcare systems should band together to participate in rich data-sharing platforms that can allow us to aggregate this rich information from individual hospitals into a common consortium,” Khera said, noting that current electronic health record (EHR) research is often single institution based. “What registries offer at the other end of the spectrum is you could have a thousand hospitals contributing their data.”</p>



<p>Similarly, he called for national cardiovascular societies “to now go to the next level by incorporating these rich signals from the EHR directly into a higher dimensional registry rather than these manually extracted registries.”</p>



<p><strong>In the ‘Gray Area’</strong></p>



<p>In their editorial, Navar, along with Matthew M. Engelhard, MD, PhD, and Michael J. Pencina, PhD (both Duke University School of Medicine, Durham, NC), writes that “when working with images, text, or time series, machine learning is almost sure to add value, whereas when working with a few weakly correlated clinical variables, logistic regression is likely to do just as well. In the substantial gray area between these extremes, judgment and experimentation are required.”</p>



<p>This study falls in this category while also hinting at the potential benefits of machine learning. “When correctly applied, it might lead to more meaningful gains in calibration than discrimination,” they say. “This is an important finding, because the role of calibration is increasingly recognized as key for unbiased clinical decision-making, especially when threshold-based classification rules are used. The correctly applied caveat is also important; unfortunately, many developers of machine learning models treat calibration as an afterthought.”</p>



<p>Navar explained that the importance of calibration is dependent on how the model is being used. For example, if it is being deployed to find the patients within the top 10% highest risk in order to best dole out a targeted intervention, discrimination is more important, she said. “But if you have a model to tell somebody that their chance of a heart attack in the next few years is 20% or 10% or 15% and you&#8217;re giving that actual number to a patient, you kind of want to make sure that number is as close to right as possible.” Calibration is also vital for cost-effective models, Navar added.</p>
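


<p>A minimal calibration check, sketched on synthetic data (the 10-bin choice is an arbitrary assumption), shows what “as close to right as possible” means in practice:</p>



<pre class="wp-block-code"><code># Hedged sketch: compare predicted risk with observed event rate per bin.
# Perfect calibration puts every bin on the diagonal: patients told
# '20% risk' should have roughly 20% observed events.
from sklearn.calibration import calibration_curve
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
probs = clf.predict_proba(X_te)[:, 1]

observed, predicted = calibration_curve(y_te, probs, n_bins=10)
for p, o in zip(predicted, observed):
    print("predicted %.2f  observed %.2f" % (p, o))</code></pre>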



<p>In this case, for risk prediction, “a traditional modeling approach is really nice because you can see what is going on with all the different variables, you can cross that to what you know about the biology and the epidemiology of whatever it is that you&#8217;re looking at, and then providers can see it,” she said. “We can see how, if we&#8217;re using a model, blood pressure goes up, risk goes up; someone&#8217;s a smoker, risk goes up; and that&#8217;s not always so obvious if you just package up a machine-learning model and just deploy it to a physician without them being able to see what&#8217;s going on underneath the hood.”</p>



<p>For now, this advantage gives traditional models the “upper hand,” Navar said. “But that doesn&#8217;t mean that the insights from those machine learning models are wrong. It just means that the other models are a little bit easier to use.”</p>



<p>“Recent feats of machine learning in clinical medicine have seized our collective attention, and more are sure to follow,” the editorial concludes. “As medical professionals, we should continue building familiarity with these technologies and embrace them when benefits are likely to outweigh the costs, including when working with complex data. However, we must also recognize that for many clinical prediction tasks, the simpler approach—the generalized linear model—may be all that we need.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/">To Predict Mortality After MI, Machine Learning Needs Better Intel</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Now Artificial Intelligence can help predict storms, cyclones: Here&#8217;s all you need to know</title>
		<link>https://www.aiuniverse.xyz/now-artificial-intelligence-can-help-predict-storms-cyclones-heres-all-you-need-to-know/</link>
					<comments>https://www.aiuniverse.xyz/now-artificial-intelligence-can-help-predict-storms-cyclones-heres-all-you-need-to-know/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 08 Jul 2019 12:44:43 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[cyclones]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[predict]]></category>
		<category><![CDATA[storms]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=3998</guid>

					<description><![CDATA[<p>Source: timesnownews.com New York:&#160;Using Artificial Intelligence (AI), researchers have developed an algorithm to detect cloud formations that lead to storms, hurricanes, and cyclones. The study, published in <a class="read-more-link" href="https://www.aiuniverse.xyz/now-artificial-intelligence-can-help-predict-storms-cyclones-heres-all-you-need-to-know/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/now-artificial-intelligence-can-help-predict-storms-cyclones-heres-all-you-need-to-know/">Now Artificial Intelligence can help predict storms, cyclones: Here&#8217;s all you need to know</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: timesnownews.com</p>



<p><strong>New York:</strong>&nbsp;Using Artificial Intelligence (AI), researchers have developed an algorithm to detect cloud formations that lead to storms, hurricanes, and cyclones. The study, published in the journal IEEE Transactions on Geoscience and Remote Sensing, shows a model that can help forecasters recognise potential severe storms more quickly and accurately.</p>



<p>The researchers created a framework based on Machine Learning (ML) &#8212; a kind of AI &#8212; that detects rotational movements in clouds from satellite images that might have otherwise gone unnoticed.&nbsp;</p>



<p>&#8220;The very best forecasting incorporates as much data as possible, there&#8217;s so much to take in as the atmosphere is infinitely complex. By using the models and the data we have, we&#8217;re taking a snapshot of the most complete look of the atmosphere,&#8221; said Steve Wistar, Senior Forensic Meteorologist at AccuWeather in the US.</p>



<p>For the study, researchers analysed more than 50,000 US weather satellite images, identifying and labelling the shape and motion of &#8216;comma-shaped&#8217; clouds.&nbsp;These cloud patterns are strongly associated with cyclone formation, which can lead to severe weather events including hail, thunderstorms, high winds, and blizzards, they said.</p>



<p>Then, using computer vision and ML techniques, the researchers taught computers to automatically recognize and detect &#8216;comma-shaped&#8217; clouds in satellite images.&nbsp;The computers could then assist experts by pointing out, in real time, where in an ocean of data they should focus their attention in order to detect the onset of severe weather.</p>



<p>&#8220;Because the &#8216;comma-shaped&#8217; cloud is a visual indicator of severe weather events, our scheme can help meteorologists to forecast such events,&#8221; said study lead author Rachel Zheng from Penn State University in the US. The researchers found that their method can effectively detect &#8216;comma-shaped&#8217; clouds with 99 per cent accuracy, at an average of 40 seconds per prediction.&nbsp;</p>
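


<p>A hedged sketch of that pipeline, with random arrays standing in for labelled satellite patches (the HOG-features-plus-linear-SVM choice is an illustrative assumption, not the authors&#8217; published architecture):</p>



<pre class="wp-block-code"><code># Hedged sketch: shape features from image patches, then a classifier
# that flags comma-shaped cloud patterns. Placeholder data only.
import numpy as np
from skimage.feature import hog
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Real inputs would be satellite image patches labelled 'comma-shaped'
# or not by meteorologists; here they are random stand-ins.
patches = rng.random(size=(400, 64, 64))
labels = rng.integers(0, 2, size=400)

features = np.array([hog(p, pixels_per_cell=(16, 16)) for p in patches])
X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)

clf = LinearSVC().fit(X_tr, y_tr)
print("held-out accuracy: %.2f" % accuracy_score(y_te, clf.predict(X_te)))</code></pre>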



<p>The method was also able to predict 64 per cent of severe weather events, outperforming other existing severe weather detection methods. This research is an early attempt to show the feasibility of AI-based interpretation of weather-related visual information.&nbsp;</p>
<p>The post <a href="https://www.aiuniverse.xyz/now-artificial-intelligence-can-help-predict-storms-cyclones-heres-all-you-need-to-know/">Now Artificial Intelligence can help predict storms, cyclones: Here&#8217;s all you need to know</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/now-artificial-intelligence-can-help-predict-storms-cyclones-heres-all-you-need-to-know/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
