<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>better Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/better/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/better/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Fri, 09 Jul 2021 08:54:27 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>IMPLEMENTATION OF BIG DATA ANALYTICS FOR BETTER CUSTOMER EXPERIENCE</title>
		<link>https://www.aiuniverse.xyz/implementation-of-big-data-analytics-for-better-customer-experience/</link>
					<comments>https://www.aiuniverse.xyz/implementation-of-big-data-analytics-for-better-customer-experience/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 09 Jul 2021 08:54:23 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Analytics]]></category>
		<category><![CDATA[better]]></category>
		<category><![CDATA[Big data]]></category>
		<category><![CDATA[Experience]]></category>
		<category><![CDATA[IMPLEMENTATION]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14841</guid>

					<description><![CDATA[<p>Source &#8211; https://www.analyticsinsight.net/ Organizations leverage big data analytics for a more effective customer experience. With digital transformation and innovation, access to a huge pile of information as well as data <a class="read-more-link" href="https://www.aiuniverse.xyz/implementation-of-big-data-analytics-for-better-customer-experience/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/implementation-of-big-data-analytics-for-better-customer-experience/">IMPLEMENTATION OF BIG DATA ANALYTICS FOR BETTER CUSTOMER EXPERIENCE</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.analyticsinsight.net/</p>



<h2 class="wp-block-heading">The organization leverages big data analytics for effective customer experience.</h2>



<p>With digital transformation and innovation, vast amounts of information, including customers’ buying behaviour and preferences, have become easily available. Unlike traditional CRM information, data is now increasingly collected from blogs and social media posts through smartphones and other digital sources. This varied, high-volume information is known as big data.</p>



<p>Modern technology provides tools to all industries that enhance operations, maximizing revenue and minimizing waste. As a result, many industries now have access to up-to-date information about customer experience, financial transactions, and product acceptance, and decision-makers can view, analyse, and respond to it in real time.</p>



<p>Unlike traditional data, big data is unstructured and distributed, and traditional SQL databases cannot manage and control it effectively. Big data analytics is therefore required to evaluate and measure the huge volume of information collected from customers.</p>



<p>Big data analytics is a form of advanced analytics that combines multifaceted applications with elements such as predictive models and statistical algorithms, powered by analytics systems.</p>



<p>Organizations can use big data analytics to make informed, data-driven decisions that result in improved business outcomes: more effective and efficient marketing, customer personalization, and new revenue opportunities.</p>



<p>Here is how to make use of big data analytics for a better customer experience.</p>



<h4 class="wp-block-heading">Build a <strong>data centre</strong></h4>



<p>Collecting all information and data and assembling them into one place creates a clear, enterprise-wide&nbsp;<a href="https://www.analyticsinsight.net/how-to-tell-a-story-with-data/">data</a>&nbsp;layer that is visible to all departments. This enables organizations to make informed choices for customers and employees. Big data analytics can simplify and transform the relationship between customers and the company: it helps the organization know its customers better, which helps bridge gaps in its marketing strategies.</p>
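


<p>As a minimal illustration, the Python sketch below merges CRM records with aggregated social-media signals into a single customer-level view using pandas; all identifiers and figures are hypothetical placeholders, not a prescribed schema:</p>



<pre class="wp-block-code"><code>import pandas as pd

# Hypothetical sources: a CRM export and a social-media activity feed.
crm = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "lifetime_value": [120.0, 340.5, 89.9],
})
social = pd.DataFrame({
    "customer_id": [1, 1, 3],
    "post_sentiment": [0.8, -0.2, 0.5],
})

# Aggregate the social signals to one row per customer, then join them
# onto the CRM records to form a single, enterprise-wide data layer.
signals = social.groupby("customer_id", as_index=False).mean()
data_layer = crm.merge(signals, on="customer_id", how="left")
print(data_layer)</code></pre>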



<h4 class="wp-block-heading">Identify customers effectively through data</h4>



<p>Building a strong data platform and using big data analytics gives access to all the information about customers, which leads to effective customer identification. It also helps in collecting customer feedback, which is useful for product innovation and market strategy. Through big data analytics, organizations can build customer loyalty and respond to complaints, and it also helps in tracking marketing trends.</p>



<h4 class="wp-block-heading">Advancing talent recruitment and retention</h4>



<p>Access to functional data helps fuel improvements in hiring, sourcing candidates more effectively and efficiently. This reduces hiring time, decreases costs, and enhances both the candidate’s and the company’s experience. Big data analytics also supports the growth of a company’s digital and analytical models. It is important to put more employees at the forefront of big data analytics so that access to data allows them to make effective decisions for the organization as well as its customers.</p>



<h4 class="wp-block-heading">Access to personal information</h4>



<p>Big data analytics helps in collecting personal information (gathered through big data) about a customer, which makes it possible to interact with the customer in a meaningful and productive way. Customers can therefore expect this level of personalization in brand communications.</p>



<h4 class="wp-block-heading">Track customer behaviour and motivation</h4>



<p>Big data analytics can be used to gauge customer sentiment and behaviour. It also makes it easy to track customer experience and reviews, enabling an organization to make informed decisions about customer behaviour.</p>
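


<p>As a rough sketch of the idea, the snippet below scores review sentiment with a toy word list; the word lists and reviews are illustrative placeholders, and a production system would use a trained sentiment model instead:</p>



<pre class="wp-block-code"><code>POSITIVE = {"great", "love", "excellent", "fast"}
NEGATIVE = {"slow", "broken", "bad", "refund"}

def sentiment_score(review):
    """Count positive words minus negative words in a review."""
    words = review.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

reviews = [
    "Great service and fast delivery",
    "Broken on arrival, want a refund",
]
for review in reviews:
    print(sentiment_score(review), review)</code></pre>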



<p>No matter the industry or organization, big data analytics is used to improve market strategies by tapping the huge volumes of data in social media posts, blogs, and other sources. With more data available than ever before, big data analytics is used to improve and enhance the customer experience.</p>
<p>The post <a href="https://www.aiuniverse.xyz/implementation-of-big-data-analytics-for-better-customer-experience/">IMPLEMENTATION OF BIG DATA ANALYTICS FOR BETTER CUSTOMER EXPERIENCE</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/implementation-of-big-data-analytics-for-better-customer-experience/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>HOW BIG DATA HAS MADE CLINICAL TRIALS FASTER, BETTER AND CHEAPER</title>
		<link>https://www.aiuniverse.xyz/how-big-data-has-made-clinical-trials-faster-better-and-cheaper/</link>
					<comments>https://www.aiuniverse.xyz/how-big-data-has-made-clinical-trials-faster-better-and-cheaper/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 18 Jun 2021 05:52:36 +0000</pubDate>
				<category><![CDATA[Big Data]]></category>
		<category><![CDATA[better]]></category>
		<category><![CDATA[Big data]]></category>
		<category><![CDATA[CHEAPER]]></category>
		<category><![CDATA[Clinical Trials]]></category>
		<category><![CDATA[faster]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14398</guid>

					<description><![CDATA[<p>Source &#8211; https://www.analyticsinsight.net/ Implementing&#160;big data&#160;has revolutionized clinical trials, making them more efficient, accurate, and affordable. Technological developments have spurred the healthcare community to improve its research. Over the <a class="read-more-link" href="https://www.aiuniverse.xyz/how-big-data-has-made-clinical-trials-faster-better-and-cheaper/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-big-data-has-made-clinical-trials-faster-better-and-cheaper/">HOW BIG DATA HAS MADE CLINICAL TRIALS FASTER, BETTER AND CHEAPER</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.analyticsinsight.net/</p>



<h2 class="wp-block-heading">Implementing&nbsp;<strong>big data</strong>&nbsp;has revolutionized clinical trials, making it efficient, accurate, and cheaper.</h2>



<p>Technological developments have spurred the healthcare community to improve its research, and over the past few years clinical research has seen significant growth. Researchers and healthcare specialists are implementing big data tools and technologies to accelerate research procedures and obtain accurate results cost-effectively. The need for faster results is driven mainly by the increasing demand for a deeper understanding of various diseases and viruses and for identifying effective treatments for these ailments.</p>



<p>Implementing&nbsp;big data&nbsp;in clinical trials has been a transformative move. The fields of healthcare and medicine are evolving, becoming better and cheaper.</p>



<p>Routinely collected data (RCD), meaning any data gathered during a medical test or for a population health record, falls under big data. Everyday actions such as using a Fitbit, purchasing medicines over the counter, or booking a virtual appointment with a doctor leave electronic footprints. These footprints are collected to improve healthcare and medical services.</p>



<h4 class="wp-block-heading"><strong>Clinical Trials and Instant Analytics</strong></h4>



<p>Big data analytics uses the latest technologies to study real-time data and draw insights from them. Researchers use these insights to facilitate improved and accurate results of the trials. With the help of predictive analytics and data analysis tools, healthcare specialists can detect early signs of diseases and aid in monitoring the collected data. These tools analyze the data thoroughly and continuously, not just after the trials are completed.</p>
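


<p>A minimal sketch of such continuous monitoring is shown below; the readings and the three-sigma alert threshold are illustrative assumptions, not values or rules from any real trial:</p>



<pre class="wp-block-code"><code>from statistics import mean, stdev

# Running baseline of a hypothetical safety metric (e.g., a tolerated
# drug exposure level); all values are placeholders.
baseline = [98.1, 97.9, 98.4, 98.0, 98.2]

def is_anomalous(value, history, z_threshold=3.0):
    """Flag a reading that drifts beyond z_threshold standard deviations."""
    mu, sigma = mean(history), stdev(history)
    return abs(value - mu) / sigma &gt; z_threshold

# Check each new reading as it streams in, not just after the trial ends.
for reading in [98.3, 101.7]:
    if is_anomalous(reading, baseline):
        print("ALERT: review reading", reading)
    else:
        baseline.append(reading)</code></pre>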



<p>Healthcare practitioners can track and detect drug exposure levels, the immunity a medicine provides, the tolerability and safety of the treatment, and other factors crucial to patient safety. Big data-powered strategies boost the speed of clinical trials and improve the accuracy of the results.</p>



<p>Other benefits of big data in clinical trials are:</p>



<ul class="wp-block-list"><li>Improved patient analytics</li><li>Boosts the sales and marketing</li><li>Reduces drug pricing and improves promotion analytics</li><li>Enhances efficiency in trials</li></ul>



<h4 class="wp-block-heading"><strong>The Reformation in Clinical Trials</strong></h4>



<p>Real-time data analysis boosts the outcome of clinical trials. Different problems, like drug accountability, protocol compliance, consent, and other complexities, can be easily resolved with the help of data analysis. The RCD can be used in the daily care of patients to analyze the outcomes and avoid unmanageable, costly treatments and follow-ups.</p>



<p>Even though adoption has only just begun, the limitations of using RCD, the viability of treatments, the level of detail and formatting, patient privacy issues, and several other complexities are already being resolved. Several hindrances remain, but with innovative developments clinical trials will become still more efficient in the future.</p>



<p>Real-time data analysis reduces the chances of error and helps eradicate human mistakes. With the help of data analysis and big data tools, healthcare IT services will be able to predict problems, so solutions can be delivered faster than ever. Data analytics has also brought several other benefits, and some remarkable changes, to the healthcare industry.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-big-data-has-made-clinical-trials-faster-better-and-cheaper/">HOW BIG DATA HAS MADE CLINICAL TRIALS FASTER, BETTER AND CHEAPER</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-big-data-has-made-clinical-trials-faster-better-and-cheaper/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>BIG DATA ENGINEER VS AI ENGINEER: WHICH CAREER IS BETTER?</title>
		<link>https://www.aiuniverse.xyz/big-data-engineer-vs-ai-engineer-which-career-is-better/</link>
					<comments>https://www.aiuniverse.xyz/big-data-engineer-vs-ai-engineer-which-career-is-better/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 16 Jun 2021 05:01:04 +0000</pubDate>
				<category><![CDATA[Big Data]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[better]]></category>
		<category><![CDATA[Big data]]></category>
		<category><![CDATA[Career]]></category>
		<category><![CDATA[engineer]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14336</guid>

					<description><![CDATA[<p>Source &#8211; https://www.analyticsinsight.net/ Analytics Insight explains the difference between big data engineers and artificial intelligence engineers. ‘A domain for the nerds’ is what technology was called <a class="read-more-link" href="https://www.aiuniverse.xyz/big-data-engineer-vs-ai-engineer-which-career-is-better/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/big-data-engineer-vs-ai-engineer-which-career-is-better/">BIG DATA ENGINEER VS AI ENGINEER: WHICH CAREER IS BETTER?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.analyticsinsight.net/</p>



<h2 class="wp-block-heading">Analytics Insight explains the difference between big data engineers vs Artificial intelligence engineers.</h2>



<p>‘A domain for the nerds’ is what technology was called in the late 1900s. A lot has changed in the 21st century: in the digital world, we welcome hundreds of new artificial intelligence-powered tools and solutions every day. Owing to the drastic surge in the implementation of artificial intelligence, the technology market has opened its doors to more jobs. Big data, meanwhile, is bringing organizational change to companies. It was previously seen as useless content occupying most of the memory in data centers; as technology evolved, businesses realized the importance of big data and used it to make data-driven decisions.</p>



<p>Following the upsurge in big data and artificial intelligence, two profiles took center stage: the big data engineer and the artificial intelligence engineer. According to LinkedIn’s 2020 Emerging Jobs report, artificial intelligence engineering and data-related jobs continue to make a strong showing among the top emerging roles for 2020, with 74% annual growth over the past four years. Big data engineer and AI engineer are two data job roles that are often used interchangeably because of their overlapping skill sets, but they are actually different. In this article, Analytics Insight explains the difference between big data engineers and AI engineers and helps you choose the right career.</p>



<ul class="wp-block-list"><li>10 MUST-HAVE SKILLS FOR DATA ENGINEERING JOBS</li><li>RECRUITMENT: TOP 10 BEST WORKPLACES TO GROW YOUR BIG DATA CAREER</li><li>5 HABITS OF A SUCCESSFUL AI ENGINEER. DO YOU HAVE THESE?</li></ul>



<h4 class="wp-block-heading"><strong>Definition&nbsp;</strong></h4>



<p><strong>Big data engineer:</strong> Big data engineering is a branch of data science that deals with the practical applications of data collection and analysis. A big data engineer is in charge of designing and developing data pipelines. They work intensively to collect data from various sources and pass it on to analysts and data scientists for further processing. Even though the role is not directly connected to business teams or business decision-making, it centers on developing systems for better flow of, and access to, information.</p>



<p><strong>Artificial intelligence engineer:</strong> An artificial intelligence engineer works with algorithms, neural networks, and other tools to advance the field of artificial intelligence, identifying and solving AI problems. AI engineers develop techniques and apply them in commerce, science, and other fields. They must be able to extract data efficiently from a variety of sources, design algorithms, build and test machine learning models, and then deploy those models to create AI-powered applications capable of performing complex tasks.</p>



<h4 class="wp-block-heading"><strong>Roles and responsibilities</strong></h4>



<p><strong>Big data engineer:</strong>&nbsp;A big data engineer has to design, develop, construct, install, test, and maintain the complete data management and processing system. Their key role is to take raw data and make it usable for other professionals; without a big data engineer, a company cannot collect data from its various sources. Beyond collection, they also manage the collected data, handle its storage, and process it for further use. Some of the other routine responsibilities of a big data engineer are as follows:</p>



<ul class="wp-block-list"><li>Build highly scalable, robust, and fault-tolerant systems to manage high volumes of data.</li><li>To introduce new big data management tools and technologies to stay ahead in the race.</li><li>Explore various choices of data acquisitions and try out new ways to use existing data.</li><li>Create a complete solution by integrating a variety of programming languages and tools together.</li><li>Employ disaster recovery techniques in case of mishaps.</li></ul>



<p>Besides these basic responsibilities, big data engineers are expected to be well versed in a range of technologies. They should have in-depth knowledge of big data technology and communicate ideas within and beyond the team. To carry out these tasks, they should be experts in the following areas:</p>



<ul class="wp-block-list"><li>Basic knowledge about Java, data structuring, and big data.</li><li>Familiarity with NoSQL solutions, Cassandra, HIVE, CouchDB, and HBase.</li><li>Experience in analytics, OLAP technologies, and more.</li></ul>



<p><strong>Artificial intelligence engineer:</strong>&nbsp;Besides creating techniques, artificial intelligence engineers take on other organizational responsibilities as well. To integrate their work across the enterprise, they must be able to overcome the unique challenges of combining the logic of traditional business applications with the learned logic of machine learning models. Some of their other responsibilities are as follows:</p>



<ul class="wp-block-list"><li>Build artificial intelligence and machine learning models, then convert the machine learning models into application program interfaces (APIs) so that other applications can use them.</li><li>Help stakeholders understand the output yielding.</li><li>Set up and manage AI product infrastructure and the automation of the infrastructure used by an organization’s data science team.</li><li>Conduct statistical analysis and interpret the results to help organizations drive data decisions.</li></ul>



<h4 class="wp-block-heading"><strong>So, what should you choose as your career option?</strong></h4>



<p>According to the World Economic Forum, artificial intelligence was anticipated to create over 58 million jobs by the end of 2020, and now, in the middle of 2021, both artificial intelligence engineering and big data engineering are seeing sweeping demand in the job market. When choosing between the two careers, weigh your interests and preferences. If you are someone who is primarily interested in data and big data management, it is safe to say you are suited to work as a big data engineer. If you prefer coordinating with other teams and building intelligence on top of the collected data, then artificial intelligence engineering will suit you better.</p>
<p>The post <a href="https://www.aiuniverse.xyz/big-data-engineer-vs-ai-engineer-which-career-is-better/">BIG DATA ENGINEER VS AI ENGINEER: WHICH CAREER IS BETTER?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/big-data-engineer-vs-ai-engineer-which-career-is-better/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>To Predict Mortality After MI, Machine Learning Needs Better Intel</title>
		<link>https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/</link>
					<comments>https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 13 Mar 2021 06:40:08 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[better]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[MI]]></category>
		<category><![CDATA[Mortality]]></category>
		<category><![CDATA[needs]]></category>
		<category><![CDATA[predict]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13445</guid>

					<description><![CDATA[<p>Source &#8211; https://www.tctmd.com/ In order for AI-based algorithms to perform better, data sets need to become less crude, study author says. Squelching some of the mounting excitement <a class="read-more-link" href="https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/">To Predict Mortality After MI, Machine Learning Needs Better Intel</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.tctmd.com/</p>



<p>In order for AI-based algorithms to perform better, data sets need to become less crude, study author says.</p>



<p>Squelching some of the mounting excitement over artificial intelligence, a new study shows no improvement in predicting in-hospital mortality after acute MI with machine learning over standard logistic regression models.</p>



<p>“Existing models were not perfect, and our thought was using advanced models we could derive additional insights from these presumably rich data sets,” lead author Rohan Khera, MBBS (Yale School of Medicine, New Haven, CT), told TCTMD. “But we were unable to discern any additional information, suggesting that our current way of abstracting data into fixed fields, like we do in registries, does not capture the entirety of the patient phenotype. And patients still have a lot of features that we probably capture in our day-to-day clinical care that are not put into these structured fields in a registry.”</p>



<p>It’s not that the data show a problem with machine learning, echoed Ann Marie Navar, MD, PhD (UT Southwestern Medical Center, Dallas, TX), who co-authored an editorial accompanying the study. “It&#8217;s as much a reflection that our current statistical tools for more traditional risk prediction are actually pretty good,” she told TCTMD. “So it&#8217;s kind of hard to build a better mouse trap there.”</p>



<p>For the study, published online this week in&nbsp;<em>JAMA Cardiology</em>, Khera and colleagues compared the predictive values of several machine-learning-based models with logistic regression for in-hospital death among 755,402 patients who were hospitalized for acute MI between 2011 and 2016 and enrolled in the American College of Cardiology Chest Pain &#8211; MI Registry. Overall in-hospital mortality was 4.4%.</p>



<p>Model performance, including area under the receiver operating characteristic curve (AUROC), sensitivity, and specificity, was similar for logistic regression and all machine learning-based algorithms.</p>
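


<p>The kind of head-to-head comparison the study reports can be sketched in a few lines of Python. The snippet below is illustrative only: synthetic data stands in for the registry (with the study&#8217;s 4.4% event rate), and scikit-learn&#8217;s GradientBoostingClassifier stands in for XGBoost:</p>



<pre class="wp-block-code"><code>from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic cohort: ~4.4% positive class, mirroring in-hospital mortality.
X, y = make_classification(n_samples=20000, weights=[0.956], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# On tabular data with fixed fields, the two often score similarly.
for model in (LogisticRegression(max_iter=1000), GradientBoostingClassifier()):
    proba = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(type(model).__name__, round(roc_auc_score(y_te, proba), 3))</code></pre>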



<p>Notably, both the XGBoost and meta-classifier models showed near-perfect calibration in independent validation, reclassifying 27% and 25% of patients, respectively, whom logistic regression had deemed low risk as moderate-to-high risk, a categorization more consistent with observed events.</p>



<p>“The general conclusion that we draw is that our data streams have to become better for us to be able to leverage them completely for all clinical applications,” Khera said. “Our current data are very crude—they&#8217;re manually abstracted into a fixed number of data fields—and our assumption that a model that does a little better at detecting relationships in these few variables will do better is probably not the case.”</p>



<p>If currently available models work, “why would you replace it with something else that has more computational power but requires more coding skill and everything involved?” Khera asked. “If both the skill set and the computational power required to develop such models are higher, it only makes sense to develop them if your application markedly improves the rate of predictions or the understanding of quality or new signatures of patients.”</p>



<p>This means that healthcare systems have work to do, he continued. “Hospitals and healthcare systems should band together to participate in rich data-sharing platforms that can allow us to aggregate this rich information from individual hospitals into a common consortium,” Khera said, noting that current electronic health record (EHR) research is often single institution based. “What registries offer at the other end of the spectrum is you could have a thousand hospitals contributing their data.”</p>



<p>Similarly, he called for national cardiovascular societies “to now go to the next level by incorporating these rich signals from the EHR directly into a higher dimensional registry rather than these manually extracted registries.”</p>



<p><strong>In the ‘Gray Area’</strong></p>



<p>In their editorial, Navar along with Matthew M. Engelhard, MD, PhD, and Michael J. Pencina, PhD (both Duke University School of Medicine, Durham, NC), write that “when working with images, text, or time series, machine learning is almost sure to add value, whereas when working with fewer, weakly correlated clinical variables, logistic regression is likely to do just as well. In the substantial gray area between these extremes, judgment and experimentation are required.”</p>



<p>This study falls in this category while also hinting at the potential benefits of machine learning. “When correctly applied, it might lead to more meaningful gains in calibration than discrimination,” they say. “This is an important finding, because the role of calibration is increasingly recognized as key for unbiased clinical decision-making, especially when threshold-based classification rules are used. The correctly applied caveat is also important; unfortunately, many developers of machine learning models treat calibration as an afterthought.”</p>
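


<p>For readers unfamiliar with the mechanics, the sketch below shows one standard way to check calibration: bin the predicted probabilities and compare each bin&#8217;s average prediction with its observed event rate. The data here are synthetic, not from the study:</p>



<pre class="wp-block-code"><code>from sklearn.calibration import calibration_curve
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

proba = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

# prob_pred: mean predicted risk per bin; prob_true: observed event rate.
prob_true, prob_pred = calibration_curve(y_te, proba, n_bins=10)

# A well-calibrated model keeps the two columns close to each other.
for observed, predicted in zip(prob_true, prob_pred):
    print(f"predicted {predicted:.2f}  observed {observed:.2f}")</code></pre>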



<p>Navar explained that the importance of calibration is dependent on how the model is being used. For example, if it is being deployed to find the patients within the top 10% highest risk in order to best dole out a targeted intervention, discrimination is more important, she said. “But if you have a model to tell somebody that their chance of a heart attack in the next few years is 20% or 10% or 15% and you&#8217;re giving that actual number to a patient, you kind of want to make sure that number is as close to right as possible.” Calibration is also vital for cost-effective models, Navar added.</p>



<p>In this case, for risk prediction, “a traditional modeling approach is really nice because you can see what is going on with all the different variables, you can cross that to what you know about the biology and the epidemiology of whatever it is that you&#8217;re looking at, and then providers can see it,” she said. “We can see how, if we&#8217;re using a model, blood pressure goes up, risk goes up; someone&#8217;s a smoker, risk goes up; and that&#8217;s not always so obvious if you just package up a machine-learning model and just deploy it to a physician without them being able to see what&#8217;s going on underneath the hood.”</p>



<p>For now, this advantage gives traditional models the “upper hand,” Navar said. “But that doesn&#8217;t mean that the insights from those machine learning models are wrong. It just means that the other models are a little bit easier to use.”</p>



<p>“Recent feats of machine learning in clinical medicine have seized our collective attention, and more are sure to follow,” the editorial concludes. “As medical professionals, we should continue building familiarity with these technologies and embrace them when benefits are likely to outweigh the costs, including when working with complex data. However, we must also recognize that for many clinical prediction tasks, the simpler approach—the generalized linear model—may be all that we need.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/">To Predict Mortality After MI, Machine Learning Needs Better Intel</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/to-predict-mortality-after-mi-machine-learning-needs-better-intel/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>INTELLIGENT PRESENT &#038; FUTURE: ARTIFICIAL INTELLIGENCE IS CHANGING OUR DAILY LIVES FOR THE BETTER</title>
		<link>https://www.aiuniverse.xyz/intelligent-present-future-artificial-intelligence-is-changing-our-daily-lives-for-the-better/</link>
					<comments>https://www.aiuniverse.xyz/intelligent-present-future-artificial-intelligence-is-changing-our-daily-lives-for-the-better/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 23 Feb 2021 10:08:32 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[better]]></category>
		<category><![CDATA[CHANGING]]></category>
		<category><![CDATA[DAILY]]></category>
		<category><![CDATA[Intelligent]]></category>
		<category><![CDATA[PRESENT]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13016</guid>

					<description><![CDATA[<p>Source &#8211; https://www.analyticsinsight.net/ If you analyze closely, artificial intelligence is everywhere around you. You carry it in your phone in the form of almost every social media <a class="read-more-link" href="https://www.aiuniverse.xyz/intelligent-present-future-artificial-intelligence-is-changing-our-daily-lives-for-the-better/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/intelligent-present-future-artificial-intelligence-is-changing-our-daily-lives-for-the-better/">INTELLIGENT PRESENT &#038; FUTURE: ARTIFICIAL INTELLIGENCE IS CHANGING OUR DAILY LIVES FOR THE BETTER</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.analyticsinsight.net/</p>



<p>If you look closely, artificial intelligence is all around you. You carry it in your phone in the form of almost every social media app; Alexa, Siri, and Google Assistant are AI-powered digital assistants; and high-end cars come with AI-enabled self-parking systems. These are some of the many examples of how this futuristic technology is impacting our daily lives.&nbsp;</p>



<p>Not just for entertainment, AI is used in almost every industry to speed up common tasks. In the ongoing pandemic, AI has gone as far as to detect potential COVID-19 cases within a particular radius using bots. Let’s take a look at how artificial intelligence has affected every aspect of life, from leisure to work.</p>



<h4 class="wp-block-heading"><strong>Business Front&nbsp;</strong></h4>



<p>At work, artificial intelligence has improved workforce morale and business efficiency. If a company is looking to hire employees, artificial intelligence programs help identify ideal profiles to create an interview pool. Business giant JP Morgan and fast-food company McDonald’s use Pymetrics, AI-powered software that collects data to screen potential candidates for hiring. Pymetrics assesses profiles on the basis of the person’s resume and objective behavioral data, checked against the requirements of the company.&nbsp;</p>



<h4 class="wp-block-heading"><strong>Social Networking&nbsp;</strong></h4>



<p>Facebook is notorious for its controversies. Artificial intelligence helped Facebook’s algorithm detect all types of content that violated its hate speech policy; as a result of that AI program, 97% of hate-fueling content was removed without being reported by users. Beyond this, the ads you see on any social media platform are backed by AI and machine learning.</p>



<h4 class="wp-block-heading"><strong>Academics</strong></h4>



<p>Coronavirus drastically changed everyone’s daily routine. One of the changes was online school, college, and university classes. Distance learning has put significant pressure on educational institutes to retain student attention during virtual tutoring. Affective Spotlight, a Microsoft tool, uses artificial intelligence to gauge attention levels during class, analysing participants’ body movements, facial expressions, and body language and assigning them numerical values.&nbsp;</p>



<h4 class="wp-block-heading"><strong>Healthcare&nbsp;</strong></h4>



<p>Staying with the ongoing global pandemic, the healthcare sector has had to make many rapid changes to deliver better patient care. Many hospitals and clinics adopted telemedicine practices, deploying AI-powered bots for first-level patient interaction (making appointments), providing initial diagnoses, reminding patients to take medicines on time, educating them about common tests and procedures, and more. AI is also helping the vaccine drive by identifying asymptomatic COVID-19 patients and digitizing patient files for better digital tracking.</p>



<h4 class="wp-block-heading"><strong>Gambling&nbsp;</strong></h4>



<p>Gambling involves taking big risks. Rdentify uses artificial intelligence to monitor live chat and identify problem gamblers. Its AI technology can monitor live chats and online gambling habits, providing players with a risk-percentage scoreboard for every game. The machine learning system then uses the risk percentage to suggest a good time for the player to leave the game and also highlights whether a person is addicted to gambling. Rdentify has been adopted by many AAMS safe online casinos. The application’s scoring system syncs with the operator’s CRM, which helps calculate risks in real time, so grave gambling risks are reported to customer support immediately. To help control growing gambling problems in many parts of the world, Rdentify has partnered with AgeChecked, a verification and monitoring service.</p>



<p>Artificial intelligence makes our lives better in many ways, be it leisure or work, and it has a bright future in human life as scientists continue to refine this ever-developing technology. We already see chatbots, virtual assistants, and home robots making simple decisions for us, from telling Siri to make a grocery list to using Rdentify to stop gamblers from burning a hole in their wallets.</p>
<p>The post <a href="https://www.aiuniverse.xyz/intelligent-present-future-artificial-intelligence-is-changing-our-daily-lives-for-the-better/">INTELLIGENT PRESENT &#038; FUTURE: ARTIFICIAL INTELLIGENCE IS CHANGING OUR DAILY LIVES FOR THE BETTER</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/intelligent-present-future-artificial-intelligence-is-changing-our-daily-lives-for-the-better/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Using Machine Learning To Create Better Gene Therapies</title>
		<link>https://www.aiuniverse.xyz/using-machine-learning-to-create-better-gene-therapies/</link>
					<comments>https://www.aiuniverse.xyz/using-machine-learning-to-create-better-gene-therapies/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 13 Feb 2021 06:12:28 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[better]]></category>
		<category><![CDATA[Create]]></category>
		<category><![CDATA[Gene]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Therapies]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=12861</guid>

					<description><![CDATA[<p>Source &#8211; https://www.technologynetworks.com/ In their machine learning-based capsid diversification strategy, the team focused on a 28 amino acid peptide within a segment of the AAV2 VP3 capsid <a class="read-more-link" href="https://www.aiuniverse.xyz/using-machine-learning-to-create-better-gene-therapies/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/using-machine-learning-to-create-better-gene-therapies/">Using Machine Learning To Create Better Gene Therapies</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.technologynetworks.com/</p>



<p><em>In their machine learning-based capsid diversification strategy, the team focused on a 28-amino-acid peptide within a segment of the AAV2 VP3 capsid protein that exposes the AAV capsid to neutralizing antibodies produced by individuals and thus can be the cause of an immune response against the virus. More purple-colored portions of this peptide are buried deeper in the capsid, while yellow parts are exposed on the virus&#8217;s surface. Credit: Wyss Institute at Harvard University (original by Drew Bryant).</em></p>



<p>Adeno-associated viruses (AAVs) have become promising vehicles for delivering gene therapies to defective tissues in the human body because they are non-pathogenic and can transfer therapeutic DNA into target cells. However, while the first gene therapy products approved by the Food and Drug Administration (FDA) use AAV vectors and others are likely to follow, AAV vectors still have not reached their full potential to meet gene therapeutic challenges.<br><br>First, currently used AAV capsids &#8211; the spherical protein structures enveloping the virus&#8217;s single-stranded DNA genome, which can be modified to encode therapeutic genes &#8211; are limited in their ability to home in specifically on the tissue affected by a disease, and their wider distribution throughout the human body causes them to be diluted. Second, patients&#8217; immune systems, after having been exposed to a similar AAV virus, can produce neutralizing antibodies that, even at low levels, can destroy AAVs upon re-exposure (neutralization), blocking the delivery of their therapeutic DNA payloads.<br><br>To overcome this neutralization problem, researchers are engineering enhanced AAV capsids that they hope can evade the immune system. Currently used methods, including &#8220;directed evolution&#8221; strategies that fast-track the evolution of a protein under laboratory conditions, can only create a limited diversity of capsids, most of which still resemble the naturally occurring AAV variants known as serotypes. It remains difficult to generate sufficient diversity using this approach without losing other desired functions of the capsid, such as stability or the ability to bind to specific cell types.<br><br>Now, a new study initiated by Wyss Core Faculty member&nbsp;George Church&#8217;s Synthetic Biology team at Harvard&#8217;s Wyss Institute for Biologically Inspired Engineering, and driven by a collaboration with Google Research, has applied a computational deep learning approach to design highly diverse capsid variants from the AAV2 serotype across DNA sequences encoding a key protein segment that plays a role in immune recognition as well as in infection of target tissues. AAV2 is the most-studied serotype and was used in the first FDA-approved gene therapy, a treatment for a blinding disease.<br><br>Starting from a relatively small collection of capsid data, the team trained multiple machine learning methods and used them to design 200,000 virus variants, 110,689 of which produced viable AAV viruses. Between any two naturally occurring AAV serotypes, 12 amino acids within this segment are expected to differ. The team&#8217;s effort produced more than 57,000 variants that exhibited much higher diversity than this, some containing up to 29 combined substituted or additionally inserted amino acids. The findings are published in Nature Biotechnology.<br><br>&#8220;Our approach achieves the highest functional diversity of any capsid library thus far. It unlocks vast areas of functional but previously unreachable sequence space, with many potential applications for generating improved viral vectors, like AAVs with much reduced immunogenicity and much improved target tissue selectivity, and also for highly efficient gene therapies,&#8221; said last author Eric Kelsic, Ph.D., who started the project with Church and co-founded the startup Dyno Therapeutics, where he is now CEO. Dyno Therapeutics&#8217; mission is to develop advanced gene therapy delivery vehicles by employing cutting-edge artificial intelligence (AI) approaches.<br><br>Using multiple design strategies, the team first generated smaller data sets on which they could train several machine learning models. These were collections of AAV capsids with variable numbers of mutations introduced into a 28-amino-acid segment of the AAV2 VP3 protein, which forms part of the capsid and exposes it to neutralizing antibodies. A high-throughput method enabling the synthesis of mutated capsid sequences, together with in vitro experiments testing which ones efficiently produced viable, stable capsids, provided a highly effective test bed for the overall approach. The results of this first experimental study were then used as training data for three alternative machine learning models, which generated the much larger numbers of diverse capsid variants tested in a final validation experiment.<br><br>A central bottleneck in the creation of diverse AAV capsids and variants that can evade neutralization is producing capsids that remain stable: most variants fail to assemble into functional capsids or to package their AAV genomes. &#8220;The deep neural network models that we deployed with our Google collaborators accurately predicted capsid viability across extremely diverse variants. Reaching this level of diversity in the capsid segment is an important milestone that we can build on to find immune-evading capsids for gene therapy,&#8221; said co-first author Sam Sinai, Ph.D., a former graduate student of Church who joined Kelsic&#8217;s team at the Wyss Institute and is a co-founder leading the machine learning team at Dyno Therapeutics. &#8220;And we can take similar approaches to create AAV capsids with much improved tissue selectivity.&#8221;<br><br>In 2019, a former Wyss team including Kelsic, Sinai, and their mentor Church published a related approach in Science in which they mutated, one by one, each of the 735 amino acids within the entire AAV2 capsid in different ways. What they called a &#8220;wide&#8221; search resulted in a large AAV library that identified changes affecting AAV2&#8217;s viability and its &#8220;homing&#8221; potential to specific organs in mice, as well as a previously unknown accessory protein that binds to cell membranes and was hidden within the capsid-encoding DNA sequence. In that study, the researchers used a simple experimental model to optimize the virus&#8217;s tissue-targeting ability.<br><br>&#8220;This new study involving machine learning models developed with Google Research nicely complements our earlier work in that it focuses on a small, but very important, region of the AAV capsid with an unprecedented resolution,&#8221; said co-corresponding author Church. &#8220;It shows that neural networks combined with the high-throughput synthetic testing developed in our lab are changing the way we design gene delivery vehicles and protein drugs.&#8221; Church is the lead of the Wyss Institute&#8217;s Synthetic Biology platform, where the project was started, and Professor of Genetics at Harvard Medical School and of Health Sciences and Technology at Harvard and MIT.</p>
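


<p>To make the modeling idea concrete, here is a deliberately toy Python sketch: one-hot encode 28-residue peptides and fit a small neural network to predict capsid viability. The sequences and labels below are random placeholders, not the study&#8217;s data, and the model is far simpler than the ones the team deployed:</p>



<pre class="wp-block-code"><code>import numpy as np
from sklearn.neural_network import MLPClassifier

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
rng = np.random.default_rng(0)

def one_hot(peptide):
    """Encode a peptide as a flat one-hot vector (length x 20)."""
    vec = np.zeros((len(peptide), len(AMINO_ACIDS)))
    for i, aa in enumerate(peptide):
        vec[i, AMINO_ACIDS.index(aa)] = 1.0
    return vec.ravel()

# Random 28-residue peptides with random viability labels (placeholders).
peptides = ["".join(rng.choice(list(AMINO_ACIDS), size=28)) for _ in range(500)]
viable = rng.integers(0, 2, size=500)

X = np.array([one_hot(p) for p in peptides])
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300).fit(X, viable)
print("predicted viability prob:", model.predict_proba(X[:1])[0, 1])</code></pre>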



<p>&#8220;This work gives a glimpse into the future as artificial intelligence approaches, such as machine learning, are opening up vast new design spaces that enable the development of entirely new drugs and drug delivery approaches for combating innumerable challenges to human health. It also highlights the Wyss Institute&#8217;s commitment to computational problem-solving in areas where new therapies are desperately needed,&#8221; said Wyss Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School and Boston Children&#8217;s Hospital, and Professor of Bioengineering at SEAS.<br><br><strong>Reference:</strong>&nbsp;Bryant DH, Bashir A, Sinai S, et al. Deep diversification of an AAV capsid protein by machine learning.&nbsp;<em>Nat Biotechnol</em>. 2021.&nbsp;doi:10.1038/s41587-020-00793-4.<br><br>This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source.</p>
<p>The post <a href="https://www.aiuniverse.xyz/using-machine-learning-to-create-better-gene-therapies/">Using Machine Learning To Create Better Gene Therapies</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/using-machine-learning-to-create-better-gene-therapies/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How industry can build better AI for the military</title>
		<link>https://www.aiuniverse.xyz/how-industry-can-build-better-ai-for-the-military/</link>
					<comments>https://www.aiuniverse.xyz/how-industry-can-build-better-ai-for-the-military/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 10 Jun 2019 10:49:21 +0000</pubDate>
				<category><![CDATA[AI-ONE]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[better]]></category>
		<category><![CDATA[build]]></category>
		<category><![CDATA[industry]]></category>
		<category><![CDATA[Military]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=3698</guid>

					<description><![CDATA[<p>Source:- c4isrnet.com As AI becomes more prominent in the national security community, officials are grappling with where to use it most effectively. During a panel discussion at the <a class="read-more-link" href="https://www.aiuniverse.xyz/how-industry-can-build-better-ai-for-the-military/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-industry-can-build-better-ai-for-the-military/">How industry can build better AI for the military</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source:- c4isrnet.com</p>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">As AI becomes more prominent in the national security community, officials are grappling with where to use it most effectively.</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">During a panel discussion at the C4ISRNET conference June 6, leaders discussed the role of industry building AI that will be used by the military.</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">After studying small and big companies creating AI technology, Col. Stoney Trent, the chief of operations at the Pentagon’s Joint Artificial Intelligence Center, said he found commercial groups do not have the same motivations that exist in the government.</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">“Commercial groups are poorly incentivized for rigorous testing. For them that represents a business risk,” Trent said. Because of this, he the government needs to work with the commercial sector to create these technologies.</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">“What the Defense Department has to offer in this space is encouragement, an incentive structure for better testing tools and methods that allows us to understand how a product is going to perform when we are under conditions of national consequence because I can’t wait,” Trent said. “Hopefully, the nation will be at peace long enough to not have a high bandwidth of experiences with weapons implementations, but when that happens, we need them to absolutely work. That’s a quality of commercial technology development.”</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">For this to take place, the Department of Defense needs to help create the right environment.</p>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">“All of this is predicated on the Pentagon doing things as well,” said Kara Frederick, associate fellow for the technology and national security program at the Center for a New American Security. “Making an environment conducive to the behaviors that you are seeking to encourage. That environment can be the IT environment, common standards for data processing, common standards for interactions with industry, I think would help.”</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">Panelists said national security leaders also need to weigh the risks of relying more on AI technology, one of which is non-state actors using AI for nefarious purposes.</p>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">Trent said he sees AI as the new arms race but noted that in this arena, destruction may be easier than creation.</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">“AI is the modern-day armor anti-armor arms race,” Trent said. “The Joint AI Center, one of the important features of it is that it does offer convergence for best practices, data sources, data standards, etc. The flip side is we fully understand there are a variety of ways you can undermine artificial intelligence and most of those are actually easier than developing good resilient AI.”</p>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">Frederick said part of this problem stems from the structure of the AI community.</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">“I think what’s so singular about the AI community, especially the AI research community, is that its so open,” Frederick said. “Even at Facebook, we open source some of these algorithms and we put it our there for people to manipulate. [There is this] idea that non-state actors, especially those without strategic intent or ones that we can’t pin strategic intent to, could get a hold of some of these ways to code in certain malicious inputs [and] we need to start being serious about it.”</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">However, before tackling any of these problems, leaders need to first decide when it is appropriate to use AI</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">Rob Monto, lead of the Army’s Advanced Concepts and Experimentation office, described this process as an evolution that takes place between AI and its users.</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">“AI is like electricity,” he said. “It can be anywhere and everywhere. You can either get electrocuted by it or you target specific applications for it. You need to know what you want the AI to do, and then you spend months and years building out. If you don’t have your data set available, you do that upfront architecture and collection of information. Then you train your algorithms and build that specifically to support that specific use case…AI is for targeted applications to aid decisions, at least in the military space, to aid the user.”</p>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">Once the decision is made how and where to use AI, there are other technologies that must make advances to meet AI. One the biggest challenges, said Chad Hutchinson, director of engineering at the Crystal Group., is the question of hardware and characteristics such as thermal performance.</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">“AI itself is pushing the boundaries of what the hardware can do,” Hutchinson said.</p>
</div>
<div class=" mco-body-item mco-body-type-text">
<p class="element element-paragraph">Hardware technology is not the only obstacle in AI’s path. These issues could stem from policy or human resource shortfalls.</p>
<p class="element element-paragraph">“What we find is the non-technology barriers are far more significant than the technology barriers,” Trent said.</p>
</div>
</div>
</div>
</div>
</div>
<p>The post <a href="https://www.aiuniverse.xyz/how-industry-can-build-better-ai-for-the-military/">How industry can build better AI for the military</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-industry-can-build-better-ai-for-the-military/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
