<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Brings Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/brings/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/brings/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Sat, 03 Jul 2021 10:10:34 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Machine Learning Algorithm Brings Predictive Analytics to Cell Study</title>
		<link>https://www.aiuniverse.xyz/machine-learning-algorithm-brings-predictive-analytics-to-cell-study/</link>
					<comments>https://www.aiuniverse.xyz/machine-learning-algorithm-brings-predictive-analytics-to-cell-study/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 03 Jul 2021 10:10:32 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[algorithm]]></category>
		<category><![CDATA[Analytics]]></category>
		<category><![CDATA[Brings]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Predictive]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14746</guid>

					<description><![CDATA[<p>Source &#8211; https://healthitanalytics.com/ A new machine learning algorithm system uses predictive analytics to determine which transcription factors are active in individual cells. Scientists at the University of <a class="read-more-link" href="https://www.aiuniverse.xyz/machine-learning-algorithm-brings-predictive-analytics-to-cell-study/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/machine-learning-algorithm-brings-predictive-analytics-to-cell-study/">Machine Learning Algorithm Brings Predictive Analytics to Cell Study</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://healthitanalytics.com/</p>



<p>A new machine learning algorithm system uses predictive analytics to determine which transcription factors are active in individual cells.</p>



<p>Scientists at the University of Illinois Chicago have introduced a new system that uses a machine learning algorithm and predictive analytics to identify which transcription factors are most likely to be active in individual cells. The system was created to give researchers a more efficient method of identifying the regulators of genes.</p>



<p>Transcription factors are proteins that bind to DNA and control which genes are active inside a cell. Understanding and manipulating these signals in a cell is crucial to the biomedical field. Manipulating signals within a cell has also proven to be an effective way to discover new treatments for illnesses.</p>



<p>However, there are hundreds of transcription factors inside a human cell. It could take years of research, and lots of trial and error, to determine the most active factor.</p>






<p>&#8220;One of the challenges in the field is that the same genes may be turned ‘on’ in one group of cells but turned ‘off’ in a different group of cells within the same organ,&#8221; Jalees Rehman, UIC professor in the department of medicine and the department of pharmacology and regenerative medicine at the College of Medicine, said in a press release.</p>



<p>&#8220;Being able to understand the activity of transcription factors in individual cells would allow researchers to study activity profiles in all the major cell types of major organs such as the heart, brain or lungs,&#8221; Rehman continued.</p>



<p>The system developed by the University of Illinois Chicago is named BITFAM, standing for Bayesian Inference Transcription Factor Activity Model. The machine learning algorithm system operates by “combining new gene expression profile data gathered from single cell RNA sequencing with existing biological data on transcription factor target genes,” UIC stated in a press release.</p>



<p>With all the information, the system will run multiple computer-based simulations to find the best fit and predict the activity for every transcription factor in the cell.</p>
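<p>As a rough illustration of the idea described above (this is not the actual BITFAM code; the gene names and numbers are toy values), inferring transcription factor activity from known target genes and observed expression can be sketched as a simple fitting problem:</p>

```python
import numpy as np

# Toy prior knowledge: which genes each transcription factor (TF) targets.
# Rows = genes, columns = TFs (1 = known target). Hypothetical data.
targets = np.array([
    [1, 0],
    [1, 0],
    [0, 1],
    [1, 1],
    [0, 1],
], dtype=float)

# Observed single-cell expression for the same five genes (toy values).
expression = np.array([0.9, 0.8, 0.1, 0.6, 0.2])

# Least-squares fit: find TF activities a such that targets @ a ~ expression.
# BITFAM uses Bayesian inference instead, but the fitting intuition is similar.
activities, *_ = np.linalg.lstsq(targets, expression, rcond=None)
print(activities)  # a higher value means the TF is inferred to be more active
```

<p>Here the first TF comes out far more active than the second, because the genes it targets are the ones switched on in the cell.</p>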



<p>The system was tested on cells from tissue in the lung, heart, and brain by Rehman and fellow UIC researcher Yang Dai, UIC associate professor in the department of bioengineering at the College of Medicine and the College of Engineering.</p>



<p>&#8220;Our approach not only identifies meaningful transcription factor activities but also provides valuable insights into underlying transcription factor regulatory mechanisms,&#8221; Shang Gao, first author of the study and a doctoral student in the department of bioengineering, said in a press release.</p>



<p>&#8220;For example, if 80% of a specific transcription factor&#8217;s targets are turned on inside the cell, that tells us that its activity is high. By providing data like this for every transcription factor in the cell, the model can give researchers a good idea of which ones to look at first when exploring new drug targets to work on that type of cell,&#8221; Gao continued.</p>
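<p>Gao&#8217;s 80% example can be expressed as a small calculation (a hypothetical sketch of the heuristic, not part of BITFAM itself):</p>

```python
# Score a TF by the fraction of its known target genes detected as "on"
# in a given cell. Gene names here are hypothetical.
def tf_activity_score(target_genes, expressed_genes):
    """Fraction of a TF's targets that are expressed in the cell."""
    targets = set(target_genes)
    on = targets & set(expressed_genes)
    return len(on) / len(targets)

# Toy data: 4 of this TF's 5 targets are on -> score 0.8 (high activity).
score = tf_activity_score(
    ["g1", "g2", "g3", "g4", "g5"],
    ["g1", "g2", "g3", "g4", "g9"],
)
print(score)  # 0.8
```

<p>Ranking every transcription factor in the cell by such a score gives researchers an idea of which ones to examine first.</p>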



<p>According to the researchers, the machine learning algorithm system is available to the public and could be applied widely. Users can combine the system with additional analysis methods that may be better suited for their own studies. This could include finding new drug targets.</p>



<p>&#8220;This new approach could be used to develop key biological hypotheses regarding the regulatory transcription factors in cells related to a broad range of scientific hypotheses and topics. It will allow us to derive insights into the biological functions of cells from many tissues,&#8221; Dai said.</p>



<p>Rehman explained that an application relevant to his own lab is using the new machine learning system to focus on transcription factors that drive disease in certain cells.</p>



<p>“For example, we would like to understand if there is transcription factor activity that distinguished a healthy immune cell response from an unhealthy one, as in the case of conditions such as COVID-19, heart disease or Alzheimer&#8217;s disease where there is often an imbalance between healthy and unhealthy immune responses,&#8221; Rehman said.</p>
<p>The post <a href="https://www.aiuniverse.xyz/machine-learning-algorithm-brings-predictive-analytics-to-cell-study/">Machine Learning Algorithm Brings Predictive Analytics to Cell Study</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/machine-learning-algorithm-brings-predictive-analytics-to-cell-study/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Tiny machine learning brings AI to IoT devices</title>
		<link>https://www.aiuniverse.xyz/tiny-machine-learning-brings-ai-to-iot-devices/</link>
					<comments>https://www.aiuniverse.xyz/tiny-machine-learning-brings-ai-to-iot-devices/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 02 Apr 2021 06:13:41 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Brings]]></category>
		<category><![CDATA[devices]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Tiny]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13861</guid>

					<description><![CDATA[<p>Source &#8211; https://www.edn.com/ One advantage that the IoT brought to design was the ability for a small local device to access the network’s virtually-unlimited computing power.  The Amazon <a class="read-more-link" href="https://www.aiuniverse.xyz/tiny-machine-learning-brings-ai-to-iot-devices/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/tiny-machine-learning-brings-ai-to-iot-devices/">Tiny machine learning brings AI to IoT devices</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.edn.com/</p>



<p>One advantage the IoT brought to design was the ability for a small local device to tap the network&#8217;s virtually unlimited computing power. The Amazon Echo is a classic example: a low-cost local device that provides powerful speech-recognition AI and an immense application library by way of its Internet connection. Now, some of that AI is moving into the local device to help minimize bandwidth and latency concerns by employing an efficient form of machine learning (ML) for smaller devices.</p>



<p>An example of what can be accomplished by placing AI in edge devices can be found in the article &#8220;AI helps turn gas sensor into electronic nose.&#8221; In that instance, the ML that generates the sensor&#8217;s algorithms takes place during the design cycle, and the local device simply runs the resulting algorithm. This is a first step in bringing AI to the edge, but there are more to come.</p>



<p>To reach its full potential, AI at the edge will need to be self-adaptive, meaning the edge device will have to implement ML locally. Exactly how this is to be done with the limited compute power edge devices typically have available is currently the subject of considerable research and development. Providing a forum for information and idea exchange on local machine learning is the goal of the tinyML Foundation.</p>



<p><em>[Figure: edge devices capable of learning their tasks without excessive developer effort. Source: TensorFlow]</em></p>



<p>The foundation held its first industry event – the tinyML Summit – in 2019 and generated considerable interest along with participation by more than 90 companies. That event revealed three essential trends:</p>



<ul class="wp-block-list"><li>Tiny ML-capable hardware is currently becoming “good enough” for many commercial applications with new and even better architectures on the horizon.</li><li>Algorithms, networks, and models have seen significant size reduction, with many sized down to 100 kBytes and below.</li><li>There is growing momentum demonstrated by technical progress and ecosystem development.</li></ul>
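<p>The size reductions noted above come largely from techniques such as 8-bit quantization, which stores each weight in one byte instead of four. A back-of-the-envelope sketch with hypothetical layer shapes shows why that matters on a microcontroller:</p>

```python
# Rough model-size estimate for a small keyword-spotting network
# (layer shapes are hypothetical), before and after 8-bit quantization.
layers = [
    (49 * 10, 64),   # flattened spectrogram input -> dense, 64 units
    (64, 64),        # dense -> dense
    (64, 12),        # dense -> 12 output classes
]
params = sum(i * o + o for i, o in layers)  # weights + biases per layer

float32_kb = params * 4 / 1024   # 4 bytes per float32 weight
int8_kb = params * 1 / 1024      # 1 byte per quantized int8 weight

print(f"{params} params: {float32_kb:.0f} kB as float32, {int8_kb:.0f} kB as int8")
```

<p>Even this modest network only drops under the 100-kByte mark once quantized, which is why quantization and similar compression techniques are central to tiny ML.</p>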



<p>This result demonstrated that ML is not only coming to the edge; in some cases it is already there.</p>



<p>COVID-19 prevented a 2020 event, but for 2021 the tinyML Foundation created a free online event that recently concluded but should be available as an archive for registered attendees. In addition, the organization has developed a series of lectures called the tinyML Talks that are available on YouTube and other platforms.</p>



<p>The trend is clearly gaining traction. The organization’s sponsors now span the range from major hardware players such as Arm, Cypress Semiconductor, and Samsung to software start-ups focusing on low-power AI applications. Most are focused on either vision or audio (voice recognition) systems for now, but smart sensors are gaining ground as a viable application as well.</p>



<p>This trend bodes well for IoT developers. Creating compact, low-power devices with reasonable cost that perform complex tasks can be a developer&#8217;s nightmare using conventional programming techniques. Yet depending on connectivity to network-based AI processing for the device&#8217;s performance has its own drawbacks. Home networks are already becoming clogged with demands from streaming media and communications; adding a host of network-hogging smart devices can overload the typical home connection. The latency of network communications can also be an issue, as can be the total failure of device operation when the network is down.</p>



<p>Moving the AI to the edge – at least for basic functionality – solves most of these concerns. With ML in the edge device, developers can craft their systems to learn how to meet customer demands without the developers needing to exhaustively analyze use cases in advance. Having AI in the edge device reduces the need for network bandwidth, eliminates network latency issues, and ensures operation in the network’s absence. The efforts to expand tiny ML technology will help speed the movement of AI into IoT devices.</p>
<p>The post <a href="https://www.aiuniverse.xyz/tiny-machine-learning-brings-ai-to-iot-devices/">Tiny machine learning brings AI to IoT devices</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/tiny-machine-learning-brings-ai-to-iot-devices/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Artificial intelligence brings new vision to healthcare</title>
		<link>https://www.aiuniverse.xyz/artificial-intelligence-brings-new-vision-to-healthcare/</link>
					<comments>https://www.aiuniverse.xyz/artificial-intelligence-brings-new-vision-to-healthcare/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 19 Mar 2021 06:32:45 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[applications]]></category>
		<category><![CDATA[Brings]]></category>
		<category><![CDATA[Healthcare]]></category>
		<category><![CDATA[Vision]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13612</guid>

					<description><![CDATA[<p>Source &#8211; https://www.aa.com.tr/ Artificial intelligence applications support doctors&#8217; diagnostic decisions, automate certain tasks, says Turkish social media expert ANKARA A Turkish social media expert said artificial intelligence <a class="read-more-link" href="https://www.aiuniverse.xyz/artificial-intelligence-brings-new-vision-to-healthcare/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-brings-new-vision-to-healthcare/">Artificial intelligence brings new vision to healthcare</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.aa.com.tr/</p>



<p>Artificial intelligence applications support doctors&#8217; diagnostic decisions, automate certain tasks, says Turkish social media expert</p>



<p><strong>ANKARA</strong></p>



<p>A Turkish social media expert said artificial intelligence (AI) brings a new vision to the healthcare sector.</p>



<p>&#8220;Artificial intelligence brings revolutionary developments in the field of health, as in all areas of life,&#8221; Deniz Unay told Anadolu Agency.</p>



<p>&#8220;Machine learning and assisted artificial intelligence have features that can develop an entire health system within the framework of a new vision,&#8221; Unay said.</p>



<p>Unay said AI applications support doctors&#8217; diagnostic decisions and automate certain tasks, underlining AI&#8217;s rise in healthcare applications.</p>



<p>AI can enable better and more accurate detection of symptoms, analysis of the side effects of treatments, and processing of the large amounts of data produced by healthcare facilities, he said.</p>



<p>However, he cautioned that AI in the medical field is not yet fully mature and is still being developed, especially for automated robotic-surgery applications.</p>



<p>&#8220;Artificial intelligence software and robots, which will assist doctors in many areas, enable faster and safer health services,&#8221; he said, adding that AI developments in medicine are expected to be used widely in both general health practice and drug development.</p>



<p>Everyone in the field of health, from specialist doctors to first-aid workers, will start to benefit from artificial intelligence technology soon, he added.</p>



<p>The revenue from AI systems in healthcare worldwide in 2021 exceeded $6.6 billion, according to data from Germany-based statistics company Statista.</p>



<p>The investment amount is expected to increase significantly in line with the increasing desire to use artificial intelligence and robots in the health sector, he said, adding that the increase will lead to easier use of health services by large groups.</p>



<p>&#8220;Great importance is given to the use of artificial intelligence in the health sector to spread health services to wider areas. Health services offered in various regions are expected to increase, especially with the technological infrastructure,&#8221; he continued.</p>



<p>With the spread of medical instruments, doctors are forced to weigh ever more data, he added.</p>



<p>He went on to say that AI is mostly used in medical imaging and interpretation of radiology.</p>



<p>Underlining that some cancers, such as lung or breast cancer, are tough to identify in images produced by scanners, he said programs can identify abnormalities that cannot be detected with the naked eye to detect early tumors more reliably and target better treatments.</p>



<p>Until recently, AI in healthcare was limited to research or predictive analytics, he said. Much focus is now on developing technologies that can improve robot-assisted surgery, he added.</p>



<p>&#8220;There are already significant uses of artificial intelligence that prove how it can improve the techniques used for several years in the field of robotic surgery, especially in the field of microsurgery,&#8221; Unay noted.</p>



<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-brings-new-vision-to-healthcare/">Artificial intelligence brings new vision to healthcare</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/artificial-intelligence-brings-new-vision-to-healthcare/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>AWS Brings Conversational AI to Alexa</title>
		<link>https://www.aiuniverse.xyz/aws-brings-conversational-ai-to-alexa/</link>
					<comments>https://www.aiuniverse.xyz/aws-brings-conversational-ai-to-alexa/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 11 Jun 2019 10:45:11 +0000</pubDate>
				<category><![CDATA[AI-ONE]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Alexa]]></category>
		<category><![CDATA[AWS]]></category>
		<category><![CDATA[Brings]]></category>
		<category><![CDATA[Conversational]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=3717</guid>

					<description><![CDATA[<p>Source:- nojitter.com If ever there was a song that described the state of voice assistants today, it’s Elvis Presley’s “A Little Less Conversation.” Queue up the record: A little <a class="read-more-link" href="https://www.aiuniverse.xyz/aws-brings-conversational-ai-to-alexa/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/aws-brings-conversational-ai-to-alexa/">AWS Brings Conversational AI to Alexa</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source:- nojitter.com</p>
<div>If ever there was a song that described the state of voice assistants today, it&#8217;s Elvis Presley&#8217;s &#8220;A Little Less Conversation.&#8221; Cue up the record:</div>
<div></div>
<div>A little less conversation, a little more action, please</div>
<div>All this aggravation ain&#8217;t satisfactioning me</div>
<div></div>
<div>The song came to mind last week while attending the Amazon Web Services (AWS) re:MARS event for machine learning, automation, robotics, and space, and hearing about Alexa Conversations, an update to the company&#8217;s voice agent aimed at making a conversation more interactive rather than a series of discrete statements.</div>
<div></div>
<div>Within the context of digital assistants, Elvis&#8217; words mean that sometimes when it comes to conversations, less is more. At the event, Eric Posen, principal product manager at Atom Tickets, supported this statement by discussing the process of using Alexa to purchase movie tickets through Atom.</div>
<div></div>
<div>The interactions would go something like this:</div>
<div></div>
<ul>
<li>Movie-goer: Alexa, launch Atom App.</li>
<li>Alexa: What do you want to do?</li>
<li>Movie-goer: Alexa, buy tickets.</li>
<li>Alexa: For which day?</li>
<li>Movie-goer: Alexa, today.</li>
<li>Alexa: For what movie?</li>
<li>Movie-goer: Spiderman – Far From Home.</li>
<li>Alexa: What ZIP code are you in?</li>
</ul>
<div></div>
<div>And the conversation goes on and on. The full flow can be seen in the below image. You can see the tie to what Elvis meant when singing about wanting less conversation and more action to reach a desired level of &#8220;satisfactioning.&#8221; Faster, more direct interactions are obviously better. Prior to this update, the process could make using Alexa burdensome, tiring, and more difficult than using an app, Posen admitted. Most users typically choose the path of least resistance, so if the app is easier to use than Alexa, guess what? Bye-bye Alexa, at least for this kind of use case.</div>
<div>
<div>The goal behind Alexa Conversations is to remove many of these steps by making discussions with the voice agent more natural. This is done by taking advantage of things Alexa knows and predicting what the user will want ahead of the request. Based on past history, Alexa should be able to guess where a user wants to see a movie, at what time, and how many tickets to buy. This can significantly cut the flow down and turn a long conversation into a very short one, which is likely faster than using the app.</div>
<div></div>
<div>Posen showed the “after” slide as he walked through the interaction using contextual AI. Once asked, Alexa responds with:</div>
<div></div>
<ul>
<li>Alexa: Spiderman – Far From Home is playing at Arclight Hollywood in premium or standard format. Which format would you like?</li>
<li>Movie-goer: Standard</li>
<li>Alexa: There is a 9:30 p.m. showing, is that what you want?</li>
<li>Movie-goer: Yes</li>
<li>Alexa: Your total is $20, should I book it?</li>
<li>Movie-goer: Yes</li>
</ul>
<div></div>
<div>Three words. That’s all it took to book tickets. This should meet the criteria of a little less conversation, a little more action… and raise a customer’s satisfactioning level.</div>
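<div>The contrast between the two dialogs can be sketched as a slot-filling problem: the assistant only asks about slots it cannot pre-fill from context. This is a hypothetical illustration of the idea, not the Alexa Conversations API:</div>

```python
# Sketch of contextual slot-filling: slots predicted from the user's
# history no longer generate questions. Slot names are hypothetical.
SLOTS = ["movie", "day", "zip_code", "theater", "format", "showtime"]

def remaining_questions(history, confirmed):
    """Slots the assistant still has to ask about, after merging
    predictions from past behaviour with what the user confirmed."""
    filled = {**history, **confirmed}
    return [s for s in SLOTS if s not in filled]

# Without context, every slot becomes a question (six turns).
assert len(remaining_questions({}, {})) == 6

# With predictions from history, only format and showtime need confirming,
# which is roughly the three-word dialog described above.
history = {"movie": "Spiderman - Far From Home", "day": "today",
           "zip_code": "90028", "theater": "ArcLight Hollywood"}
print(remaining_questions(history, {}))  # ['format', 'showtime']
```

<div>Shrinking the question list is exactly what turns the long &#8220;before&#8221; flow into the short &#8220;after&#8221; one.</div>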
<div>
<div>In addition to providing a better customer experience, the dialog is much simpler to program. AWS has done much of the heavy lifting in the area of AI training, so programmers have less work to do. For Atom Tickets, the conversational AI took about one-third less code than previous Alexa programs, Posen estimated.</div>
<div></div>
<div>OpenTable and Uber presenters, also in this session, reiterated the challenge of using Alexa for simple conversations. Any time an interaction requires multiple steps, there is a heavy burden in having the speaker say &#8220;Alexa&#8221; to initiate each interaction and have the voice assistant perform tasks. In the case of OpenTable, the phone is still its biggest competitor, as people tend to call when requests aren&#8217;t straightforward. The conversational agent is better able to handle these requests because it uses all of its inherent knowledge.</div>
<div></div>
<div>During the session, AWS highlighted an initiative called “Night Out” where all three of these services come together to create a complete evening. The user merely tells Alexa that he or she wants to book a night out for two and Alexa comes up with a recommended combination of movie and dinner and can even prearrange an Uber to take them. The user just needs to wait for the recommendations and say yes or no to them or perhaps even just a single yes to a series of proposed events.</div>
<div></div>
<div>I do believe we are at an inflection point for the voice interface to become real. Alexa, as well as its peers Apple Siri, Microsoft Cortana, and others, have made it much easier to speak a command to complete simple tasks. Now that people have grown accustomed to this, it’s time for the voice assistants to start tackling bigger problems. Alexa Conversations enables larger problems to be solved, with less effort and less programming. Seems like a win for all parties involved. Maybe next year, I’ll be able to ask Alexa to write this post for me!</div>
</div>
</div>
<p>The post <a href="https://www.aiuniverse.xyz/aws-brings-conversational-ai-to-alexa/">AWS Brings Conversational AI to Alexa</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/aws-brings-conversational-ai-to-alexa/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
