<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Develop Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/develop/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/develop/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Tue, 29 Jun 2021 10:50:35 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Alakai, Physical Sciences to Further Develop Machine Learning Tools Under DHS SBIR Program</title>
		<link>https://www.aiuniverse.xyz/alakai-physical-sciences-to-further-develop-machine-learning-tools-under-dhs-sbir-program/</link>
					<comments>https://www.aiuniverse.xyz/alakai-physical-sciences-to-further-develop-machine-learning-tools-under-dhs-sbir-program/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 29 Jun 2021 10:50:34 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Alakai]]></category>
		<category><![CDATA[Develop]]></category>
		<category><![CDATA[Further]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Physical]]></category>
		<category><![CDATA[sciences]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14639</guid>

					<description><![CDATA[<p>Source- https://blog.executivebiz.com/ The Department of Homeland Security has awarded funding worth $1 million each to Alakai Defense Systems and Physical Sciences Inc. to further develop their machine learning platforms <a class="read-more-link" href="https://www.aiuniverse.xyz/alakai-physical-sciences-to-further-develop-machine-learning-tools-under-dhs-sbir-program/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/alakai-physical-sciences-to-further-develop-machine-learning-tools-under-dhs-sbir-program/">Alakai, Physical Sciences to Further Develop Machine Learning Tools Under DHS SBIR Program</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://blog.executivebiz.com/</p>



<p>The Department of Homeland Security has awarded funding worth $1 million each to Alakai Defense Systems and Physical Sciences Inc. to further develop their machine learning platforms to help improve the detection of explosives, narcotics, chemical agents and other threats as part of the second phase of the Small Business Innovation Research program.</p>



<p>“Our impetus for developing these machine-learning modules stems from the Transportation Security Administration’s operational needs for threat signature fusion, the ability to learn, detect and classify new threats without being explicitly programmed, and, ultimately, increase accuracy of detection,” Thoi Nguyen, program manager for the Next Generation Explosive Trace Detection program at DHS’ science and technology directorate, said in a statement published Friday.</p>



<p>Alakai will continue to develop its Agnostic Machine Learning Platform for Spectroscopy, designed to detect hazardous chemicals from spectroscopic instruments, as part of the two-year SBIR Phase II contract.</p>



<p>PSI will use the SBIR funding to continue to work on its deep learning algorithm meant to detect and classify opioids, narcotics and trace explosives for optical spectroscopic platforms.</p>



<p>DHS said it expects the awardees to come up with a prototype for demonstration and evaluation for Phase III funding. Under the third phase, the companies will seek private funding to bring their technologies to market.</p>



<p>The post <a href="https://www.aiuniverse.xyz/alakai-physical-sciences-to-further-develop-machine-learning-tools-under-dhs-sbir-program/">Alakai, Physical Sciences to Further Develop Machine Learning Tools Under DHS SBIR Program</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/alakai-physical-sciences-to-further-develop-machine-learning-tools-under-dhs-sbir-program/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>DHS Awards $2M for Small Businesses to Develop Machine Learning for Detection Technologies</title>
		<link>https://www.aiuniverse.xyz/dhs-awards-2m-for-small-businesses-to-develop-machine-learning-for-detection-technologies/</link>
					<comments>https://www.aiuniverse.xyz/dhs-awards-2m-for-small-businesses-to-develop-machine-learning-for-detection-technologies/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 26 Jun 2021 09:37:56 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Awards]]></category>
		<category><![CDATA[Businesses]]></category>
		<category><![CDATA[detection]]></category>
		<category><![CDATA[Develop]]></category>
		<category><![CDATA[DHS]]></category>
		<category><![CDATA[Machine learning]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14579</guid>

					<description><![CDATA[<p>Source &#8211; https://www.hstoday.us/ The Department of Homeland Security (DHS) Small Business Innovation Research (SBIR) Program recently awarded funding to two small businesses to develop non-contact, inexpensive machine learning training <a class="read-more-link" href="https://www.aiuniverse.xyz/dhs-awards-2m-for-small-businesses-to-develop-machine-learning-for-detection-technologies/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/dhs-awards-2m-for-small-businesses-to-develop-machine-learning-for-detection-technologies/">DHS Awards $2M for Small Businesses to Develop Machine Learning for Detection Technologies</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.hstoday.us/</p>



<p>The Department of Homeland Security (DHS) Small Business Innovation Research (SBIR) Program recently awarded funding to two small businesses to develop non-contact, inexpensive machine learning training and classification technologies. Integrated machine learning platforms can significantly reduce time, redundancy and cost, and improve accuracy in detecting threats such as explosives, chemical agents and narcotics.</p>



<p>“S&amp;T embraces the significant advances in artificial intelligence and machine learning capabilities and their ability to enhance threat detection,” said Kathryn Coulter Mitchell, DHS Senior Official Performing the Duties of the Under Secretary for Science and Technology. “The SBIR Program provides the opportunity for S&amp;T to partner with innovative small businesses and develop machine learning tools critical to addressing threat detection needs. I am looking forward to seeing the technologies that will be developed by these SBIR efforts.”</p>



<p>Physical Sciences Inc. (PSI), based in Andover, MA, and Alakai Defense Systems, Inc. (Alakai), based in Largo, FL, each received approximately $1 million in SBIR Phase II funding to develop technologies that can rapidly and accurately identify unknown spectrometer signals as safe or threatening. The DHS SBIR Program, managed by Program Director Dusty Lang and administered at the DHS Science and Technology Directorate (S&amp;T), selected PSI and Alakai for Phase II after each demonstrated feasibility in Phase I with its compact, accurate and rapid-classification Machine Learning Module for Detection Technologies solution.</p>
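<p>The classification task described above, matching an unknown spectrometer signal against known signatures and labelling it safe or threatening, can be illustrated with a toy nearest-centroid sketch. This is purely illustrative: the peak positions, class names and method below are invented for the example and are not PSI’s or Alakai’s actual algorithms.</p>

```python
import numpy as np

rng = np.random.default_rng(42)
wavenumbers = np.linspace(0.0, 1.0, 200)  # normalized spectral axis

def spectrum(peak_centers, noise=0.05):
    """Synthetic spectrum: a sum of Gaussian peaks plus sensor noise."""
    clean = sum(np.exp(-((wavenumbers - c) ** 2) / 0.002) for c in peak_centers)
    return clean + rng.normal(0.0, noise, wavenumbers.size)

# Hypothetical signature library: peak positions for "safe" and "threat" materials.
classes = {"safe": [0.2, 0.7], "threat": [0.4, 0.55, 0.9]}

# Build a reference centroid per class from repeated noisy measurements.
centroids = {name: np.mean([spectrum(peaks) for _ in range(20)], axis=0)
             for name, peaks in classes.items()}

def classify(unknown):
    # Nearest-centroid match: label an unknown signal safe or threatening.
    return min(centroids, key=lambda name: np.linalg.norm(unknown - centroids[name]))

print(classify(spectrum(classes["threat"])))
```

<p>Real systems would of course use learned models over far richer spectra; the sketch only shows the shape of the signal-to-label problem.</p>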



<p>Under Phase II, PSI will continue to develop its deep-learning algorithm for the detection and classification of trace explosives, opioids and narcotics on surfaces for optical spectroscopic systems. PSI will extend the algorithm’s capabilities from infrared reflectance spectroscopy to include Raman spectroscopy, and will deliver a proposed operational module prototype with a classification accuracy of greater than 90 percent.</p>



<p>During its Phase II effort, Alakai will continue development of the Agnostic Machine Learning Platform for Spectroscopy (AMPS), which rapidly and accurately detects trace quantities of hazardous and related chemicals from a variety of spectroscopic instruments.</p>



<p>“Our impetus for developing these machine-learning modules stems from the Transportation Security Administration’s operational needs for threat signature fusion, the ability to learn, detect and classify new threats without being explicitly programmed, and, ultimately, increase accuracy of detection,” said Thoi Nguyen, DHS S&amp;T Program Manager for the Next Generation Explosive Trace Detection (NGETD) Program. “With experienced industrial partners like Alakai and PSI, and our strong collaboration with TSA, we hope these efforts will contribute to wider applications of machine learning across the Homeland Security mission space.”</p>



<p>At the completion of the 24-month Phase II contract, the SBIR awardees will have developed a prototype to demonstrate the advancement of the technology, opening the door to potential Phase III funding.</p>



<p>Under Phase III, SBIR performers will seek to secure funding from private and/or non-SBIR government sources, with the eventual goal to commercialize and bring to market the technologies from Phases I and II.</p>
<p>The post <a href="https://www.aiuniverse.xyz/dhs-awards-2m-for-small-businesses-to-develop-machine-learning-for-detection-technologies/">DHS Awards $2M for Small Businesses to Develop Machine Learning for Detection Technologies</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/dhs-awards-2m-for-small-businesses-to-develop-machine-learning-for-detection-technologies/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>French Researchers Develop the First Artificial Intelligence Capable of Creating Human Genomes Sequences</title>
		<link>https://www.aiuniverse.xyz/french-researchers-develop-the-first-artificial-intelligence-capable-of-creating-human-genomes-sequences/</link>
					<comments>https://www.aiuniverse.xyz/french-researchers-develop-the-first-artificial-intelligence-capable-of-creating-human-genomes-sequences/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 22 Feb 2021 05:58:30 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Creating]]></category>
		<category><![CDATA[Develop]]></category>
		<category><![CDATA[Genomes]]></category>
		<category><![CDATA[human]]></category>
		<category><![CDATA[researchers]]></category>
		<category><![CDATA[Sequences]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=12988</guid>

					<description><![CDATA[<p>Source &#8211; https://www.gilmorehealth.com/ Artificial intelligence (AI) has made it possible for the first time to create fully artificial human genome sequences that are indistinguishable from the DNA <a class="read-more-link" href="https://www.aiuniverse.xyz/french-researchers-develop-the-first-artificial-intelligence-capable-of-creating-human-genomes-sequences/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/french-researchers-develop-the-first-artificial-intelligence-capable-of-creating-human-genomes-sequences/">French Researchers Develop the First Artificial Intelligence Capable of Creating Human Genomes Sequences</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.gilmorehealth.com/</p>



<p>Artificial intelligence (AI) has made it possible for the first time to create fully artificial human genome sequences that are indistinguishable from the DNA of real donors. A European team has created entire sequences of human DNA using this AI. Their work was published in the journal PLOS Genetics.</p>



<h2 class="wp-block-heading">An algorithm that can generate artificial human genomes</h2>



<p>“Generative neural networks have been used effectively in many different fields over the past decade, including photorealistic imaging,” say the authors of the new work. Applying a similar concept to genetic data, the researchers trained their neural networks on the sequences of 2,500 people stored in databases. The system had to generate sequences with similar characteristics; the researchers then mixed these creations with real ones to see whether they could tell the difference. Through training, the artificial genomes came to faithfully reproduce features of the real genomes, such as allele frequencies (how common the different versions of a gene are). One of the biggest challenges of the work was to verify their reliability, said Aurélien Decelle, co-author and researcher at the University of Paris-Saclay. “So we spent some time studying the statistical properties of the generated sequences,” he explains.</p>
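<p>As a toy illustration of the verification step Decelle describes (a deliberately crude stand-in, nothing like the paper’s generative networks): fit per-site allele frequencies from a set of binary haplotypes, sample artificial genomes from the fitted model, and check that the generated sequences reproduce the real allele frequencies.</p>

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a biobank: 2,500 binary haplotypes over 100 variant sites.
site_freqs = rng.uniform(0.05, 0.5, size=100)
real = rng.random((2500, 100)) < site_freqs

# Simplest possible "generative model": independent per-site allele frequencies
# estimated from the real data (the actual study used trained neural networks).
fitted = real.mean(axis=0)

# Sample artificial genomes from the fitted model.
artificial = rng.random((2500, 100)) < fitted

# Verification step: do the artificial sequences reproduce the real
# allele frequencies within sampling error?
gap = np.abs(artificial.mean(axis=0) - fitted).max()
print(f"largest per-site allele-frequency gap: {gap:.3f}")
```

<p>The real verification compared many more statistics (linkage, population structure, rare variants), but the principle is the same: measure the generated data against the source distribution.</p>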



<h2 class="wp-block-heading">Only sequences, not whole genomes</h2>



<p>These “realistic” and “high-quality” genomes are a first, the researchers note in the paper. This type of neural network has already been used in genetics to generate short sequences “on the order of tens or hundreds of base pairs” (the building blocks of our DNA, of which there are about 3 billion in humans), explains Flora Jay, who co-led this work at the University of Paris-Saclay. “But the generation of such long sequences (about 10,000 variants comprising several million base pairs) and in the context of population genetics is new and represents a major step forward,” she adds.</p>



<p>As a result, these artificial genomes “are indistinguishable from the other genomes in the biobank that we used for our algorithm, except for one detail: they do not belong to any real donor,” Luca Pagani, co-author of the study, explains in a press release.</p>



<p>However, the process still needs to be perfected. “One of the main drawbacks is that these models cannot yet be used to create whole artificial genomes due to computational limitations,” and they must be limited to bits and pieces, the authors explain. In addition, very rare alleles are difficult to represent with the algorithm. The final challenge is to “closely monitor the originality of the generated data, i.e., whether they are sufficiently different from the genomes of real donors,” Flora Jay says, adding that this is an ongoing research topic.</p>



<h2 class="wp-block-heading">Human genome study without concerns for privacy</h2>



<p>Far from being a mere technical achievement, this type of artificial intelligence can solve the ethical problems associated with genetic databases. “In population genetics, researchers need to regularly compare the data they produce to some reference genomes or sometimes even to a large reference panel. Ideally, these genomes should reflect genetic diversity,” says Flora Jay. Artificial genomes could perform this function reliably and safely.</p>



<p>“Existing genomic databases are an invaluable resource for biomedical research, but they are not publicly available or are protected by lengthy and exhaustive application procedures due to legitimate ethical concerns,” explains author Burak Yelmen. “Artificial genomes can help us overcome this problem within a safe ethical framework.” Looking ahead, Flora Jay predicts that these artificial genomes “will contribute to applications as diverse as understanding our evolutionary past or medical epidemiology by incorporating greater genetic diversity”.</p>



<p>The post <a href="https://www.aiuniverse.xyz/french-researchers-develop-the-first-artificial-intelligence-capable-of-creating-human-genomes-sequences/">French Researchers Develop the First Artificial Intelligence Capable of Creating Human Genomes Sequences</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/french-researchers-develop-the-first-artificial-intelligence-capable-of-creating-human-genomes-sequences/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Liverpool scientists deploy Artificial Intelligence to develop model that predicts the next pandemic</title>
		<link>https://www.aiuniverse.xyz/liverpool-scientists-deploy-artificial-intelligence-to-develop-model-that-predicts-the-next-pandemic/</link>
					<comments>https://www.aiuniverse.xyz/liverpool-scientists-deploy-artificial-intelligence-to-develop-model-that-predicts-the-next-pandemic/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 18 Feb 2021 04:42:28 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Develop]]></category>
		<category><![CDATA[Liverpool]]></category>
		<category><![CDATA[Pandemic]]></category>
		<category><![CDATA[Predicts]]></category>
		<category><![CDATA[scientists]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=12888</guid>

					<description><![CDATA[<p>Source &#8211; https://www.timesnownews.com/ A team of scientists at the UK&#8217;s Liverpool University has used artificial intelligence (AI) to work out where the next novel coronavirus could emerge. <a class="read-more-link" href="https://www.aiuniverse.xyz/liverpool-scientists-deploy-artificial-intelligence-to-develop-model-that-predicts-the-next-pandemic/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/liverpool-scientists-deploy-artificial-intelligence-to-develop-model-that-predicts-the-next-pandemic/">Liverpool scientists deploy Artificial Intelligence to develop model that predicts the next pandemic</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.timesnownews.com/</p>



<p>A team of scientists at the UK&#8217;s Liverpool University has used artificial intelligence (AI) to work out where the next novel coronavirus could emerge.</p>



<h2 class="wp-block-heading">KEY HIGHLIGHTS</h2>



<ul class="wp-block-list"><li>The COVID-19 pandemic was the first natural calamity of such magnitude to strike mankind in almost a century.</li><li>Mankind had simply not provided for such an eventuality and was caught off guard on almost all counts of preparedness.</li><li>With climate change being real and the threat of pandemics looming large, it would certainly help to know whether a disease is going to acquire pandemic proportions.</li></ul>



<p>Rapid globalisation has turned the entire Earth into one huge village, and the same speedy connectivity and communication ensured the rapid advance of the COVID-19 pandemic, which began with a strain of the novel coronavirus that first emerged in Wuhan, China in late 2019. Now, as per a science paper published in Nature Communications, &#8220;The spread of influenza can be modelled and forecast using a machine-learning-based analysis of anonymized mobile phone data. The mobility map, presented in Nature Communications this week, is shown to accurately forecast the spread of influenza in New York City and Australia.&#8221;</p>



<p>The year 2020 dawned with the world bracing to handle a possible crisis and by the end of the year, global deaths reached nearly 2 million.</p>



<p>To cut a long story short, mankind has now been through so much in terms of mental agony, pain, loss, death, long-lasting illness and economic downslide on account of this pandemic &#8211; despite rapid advances in science &#8211; that it has begun to dread the prediction by environmentalists and scientists that we have just entered a pandemic era and that more such pandemics are likely to come.<br><br><strong>Predicting the onset of a pandemic:</strong><br>According to a report in the&nbsp;<em>BBC</em>, a team of scientists has used artificial intelligence (AI) to work out where the next novel coronavirus could emerge.</p>



<p>The researchers are reportedly combining learnings from fundamental biology with machine learning tools.</p>



<p>This is not mere conjecture; the scientists are building on what they gained from similar experiments in the past. Their computer algorithm predicted many more potential hosts of new virus strains than had previously been detected.&nbsp;The findings have been published in the journal&nbsp;<em>Nature Communications.&nbsp;</em></p>



<p>According to this report in&nbsp;<em>Nature Communications</em>, the spread of viral diseases through a population depends on interactions between infected and uninfected people. Models that predict how a disease will spread across a city or country currently make use of data that are sparse and imprecise, such as commuter surveys or internet search data.</p>



<p>Dr Marcus Blagrove, a virologist from the University of Liverpool, UK, who was involved in the study, emphasises the need to know where the next coronavirus might come from.</p>



<p>&#8220;One way they&#8217;re generated is through recombination between two existing coronaviruses &#8211; so two viruses infect the same cell and they recombine into a &#8216;daughter&#8217; virus that would be an entirely new strain.&#8221;</p>
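<p>The recombination mechanism described in the quote can be sketched as a toy simulation (illustrative only, not part of the study): two genomes co-infecting the same cell swap material at a random crossover point, yielding a &#8216;daughter&#8217; strain.</p>

```python
import random

def recombine(parent_a, parent_b, rng=random):
    """Single-crossover recombination: the daughter genome takes the prefix
    of one parent and the suffix of the other, as can happen when two
    coronaviruses infect the same cell."""
    assert len(parent_a) == len(parent_b)
    cut = rng.randrange(1, len(parent_a))  # random crossover point
    return parent_a[:cut] + parent_b[cut:]

parent_a = "AAAAAAAAAA"  # stand-ins for two parent virus genomes
parent_b = "GGGGGGGGGG"
daughter = recombine(parent_a, parent_b)
print(daughter)  # a mix of both parents: an entirely new strain
```

<p>Real coronavirus recombination involves template switching during replication and can occur at multiple points, but the single-crossover picture captures why the daughter is a genuinely new combination.</p>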



<p>Scientists say that to get the prediction algorithm right, the first step was to look for species that were able to harbour several viruses at once. Lead researcher Dr Maya Wardeh, who is also from the University of Liverpool, successfully deployed existing biological knowledge to teach the algorithm to search for patterns that made this more likely to happen.</p>



<p>This step concluded that many more mammals were potential hosts for new coronaviruses than previous surveillance work &#8211; screening animals for viruses &#8211; had shown.</p>
<p>The post <a href="https://www.aiuniverse.xyz/liverpool-scientists-deploy-artificial-intelligence-to-develop-model-that-predicts-the-next-pandemic/">Liverpool scientists deploy Artificial Intelligence to develop model that predicts the next pandemic</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/liverpool-scientists-deploy-artificial-intelligence-to-develop-model-that-predicts-the-next-pandemic/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How robots will help in the house of the future</title>
		<link>https://www.aiuniverse.xyz/how-robots-will-help-in-the-house-of-the-future/</link>
					<comments>https://www.aiuniverse.xyz/how-robots-will-help-in-the-house-of-the-future/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 14 Oct 2020 06:19:05 +0000</pubDate>
				<category><![CDATA[Robotics]]></category>
		<category><![CDATA[Automation]]></category>
		<category><![CDATA[Develop]]></category>
		<category><![CDATA[Future]]></category>
		<category><![CDATA[Intelligence Amplification]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=12201</guid>

					<description><![CDATA[<p>Source: engineerlive.com The use of robots in factories is now the norm, but the same cannot be said in the home, where movement patterns are less predictable. <a class="read-more-link" href="https://www.aiuniverse.xyz/how-robots-will-help-in-the-house-of-the-future/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-robots-will-help-in-the-house-of-the-future/">How robots will help in the house of the future</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: engineerlive.com</p>



<p>The use of robots in factories is now the norm, but the same cannot be said of the home, where movement patterns are less predictable. As society ages, this could change.&nbsp;</p>



<p>The UN has predicted that by 2050 over 1.5 billion people will be over 65, more than double today’s figure. With this in mind, the Toyota Research Institute (TRI) in California has been quietly working to develop helper robots that enable older people to stay independent for longer. The company is currently building a mock home at its Los Altos HQ, complete with a kitchen, dining area and bathroom. Items can be moved around to test ideas on different interior layouts.&nbsp;</p>



<p>The philosophy is based on what the scientists call Intelligence Amplification (IA): using technology to aid humans by amplifying their abilities. It draws on the Japanese idea of Ikigai, the notion that every person’s life should have meaning and purpose. Gill Pratt, CEO of TRI, explained, “Studies of Ikigai teach us that we feel most fulfilled when our lives incorporate work that we love and that helps society. To enable more people to achieve their Ikigai, TRI is pursuing new forms of ‘automation with a human touch’ (called Jidoka in the Toyota Production System) to develop capabilities that amplify, rather than replace, human ability with the goal of bringing deep happiness and fulfilment to all people.”</p>



<p>Max Bajracharya, VP of Robotics added, “TRI robotics research is focused on the home because it is in that environment that robots can provide the greatest assistance in achieving human fulfilment. It is also one of the most complex environments for robots to master. Our work is focused on two key challenges: teaching robots from human behaviour and using simulation to both train and validate robot behaviours. Collectively, we think of this idea as fleet learning, where when one machine learns something, they all learn something. We believe this is going to be the key to making robots in human environments practical.”</p>
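<p>The fleet-learning idea in the quote can be sketched in a few lines (hypothetical class names, not TRI’s software): every robot writes newly learned skills to a shared store, so a skill learned by one machine is immediately available to the whole fleet.</p>

```python
class Fleet:
    """Shared skill store: when one robot learns something, they all learn it."""
    def __init__(self):
        self.skills = {}  # skill name -> behaviour

    def robot(self):
        return Robot(self)

class Robot:
    def __init__(self, fleet):
        self.fleet = fleet

    def learn(self, name, behaviour):
        self.fleet.skills[name] = behaviour  # publish to the whole fleet

    def perform(self, name, *args):
        return self.fleet.skills[name](*args)

fleet = Fleet()
r1, r2 = fleet.robot(), fleet.robot()
r1.learn("wipe_surface", lambda area: f"wiped {area}")
# r2 never learned the skill directly, yet can perform it.
print(r2.perform("wipe_surface", "kitchen counter"))
```

<p>In practice the shared artefact would be a trained model or policy rather than a function, and synchronization across a real fleet is far harder, but the data flow is the same.</p>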



<p>Several interesting ideas are being trialled, including a ceiling-mounted ‘gantry robot’ that would swoop down to load a dishwasher, wipe surfaces or clear clutter, then simply fold away when not in use. Soft grippers, which could also have industrial uses, have been examined as well.&nbsp;</p>



<h3 class="wp-block-heading">Where emotions come into this</h3>



<p>The team has also considered what is beneficial and what may not be. Steffi Paepcke, a TRI User Experience Team Lead, said, “We rely heavily on observational research techniques such as contextual inquiries. Before Covid-19, we went to Japan to work with our research partners to visit the homes of older adults and observe them going about their daily lives, making note of friction points, challenges and opportunities. We observed that cooking is a beloved activity for many, though it can get more strenuous over time. Sharing meals and feeding loved ones also can serve as a focal point for social connection, so giving elderly people a fully automated cooking robot or pre-cooked meals might be physically beneficial but emotionally detrimental.”</p>



<p>The researchers hope the project will translate into consumer products one day.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-robots-will-help-in-the-house-of-the-future/">How robots will help in the house of the future</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-robots-will-help-in-the-house-of-the-future/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Engineers Develop New Machine-Learning Method Capable of Cutting Energy Use</title>
		<link>https://www.aiuniverse.xyz/engineers-develop-new-machine-learning-method-capable-of-cutting-energy-use/</link>
					<comments>https://www.aiuniverse.xyz/engineers-develop-new-machine-learning-method-capable-of-cutting-energy-use/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 28 Sep 2020 07:32:34 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Reinforcement Learning]]></category>
		<category><![CDATA[Develop]]></category>
		<category><![CDATA[ENGINEERS]]></category>
		<category><![CDATA[Machine learning]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=11805</guid>

					<description><![CDATA[<p>Source:unite.ai Engineers at Swiss Center for Electronics and Microtechnology have developed a new machine-learning method capable of cutting energy use, as well as allowing artificial intelligence (AI) <a class="read-more-link" href="https://www.aiuniverse.xyz/engineers-develop-new-machine-learning-method-capable-of-cutting-energy-use/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/engineers-develop-new-machine-learning-method-capable-of-cutting-energy-use/">Engineers Develop New Machine-Learning Method Capable of Cutting Energy Use</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source:unite.ai</p>



<p>Engineers at the Swiss Center for Electronics and Microtechnology (CSEM) have developed a new machine-learning method capable of cutting energy use, as well as allowing artificial intelligence (AI) to complete tasks that were once considered too sensitive.&nbsp;</p>



<h3 class="wp-block-heading"><strong>Reinforcement Learning Limitations</strong></h3>



<p>Reinforcement learning, in which a computer continuously improves by learning from its past experiences, is a major branch of artificial intelligence. However, the technique is often difficult to apply to real-life systems such as climate control, which cannot tolerate the drastic temperature swings that trial-and-error learning would bring on.&nbsp;</p>



<p>This is exactly the issue the CSEM engineers set out to address with their new approach. They demonstrated that simplified theoretical models can be used to train computers first, before turning to real-life systems. By the time the machine learning process reaches the real system, it has already done its trial and error on the theoretical model, so the real-life system is spared any drastic fluctuations, solving the example issue with climate-control technology. </p>



<p>Pierre-Jean Alet is head of smart energy systems research at CSEM, as well as co-author of the study.&nbsp;</p>



<p>“It’s like learning the driver’s manual before you start a car,” Alet says. “With this pre-training step, computers build up a knowledge base they can draw on so they aren’t flying blind as they search for the right answer.”</p>



<h3 class="wp-block-heading"><strong>Energy Cuts</strong></h3>



<p>One of the most important results of the new method is that it can cut energy use by over 20%. The engineers tested the method on the heating, ventilation and air conditioning (HVAC) system of a 100-room building.&nbsp;</p>



<p>The engineers relied on three steps. First, they trained a computer on a “virtual model” constructed from simple equations that roughly describe the building’s behavior. They then fed the computer real building data, temperature, weather conditions and other variables, which made the training more accurate. Finally, they let the computer run the reinforcement-learning algorithms, which eventually produced the best control strategy for the HVAC system.</p>
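<p>The three-step idea can be sketched with a toy example: pre-train a controller on a crude “virtual model”, then continue training on dynamics that stand in for the real building. This is tabular Q-learning on an invented thermostat problem, not the authors’ hybrid method; every equation, constant and name below is illustrative only.</p>

```python
import random

random.seed(0)
TARGET = 21  # desired room temperature (illustrative)

def step(temp, heat, drift):
    """One-step thermostat dynamics; `drift` differs between the simplified
    virtual model and the 'real' building it stands in for."""
    new = temp + (1 if heat else 0) - drift * (temp - 15)
    return max(10, min(30, new))

def train(q, drift, episodes, alpha=0.1, gamma=0.9, eps=0.1):
    """Standard tabular Q-learning; reward is negative distance from TARGET."""
    for _ in range(episodes):
        temp = random.randint(10, 30)
        for _ in range(50):
            s = round(temp)
            if random.random() < eps:
                a = random.randint(0, 1)
            else:
                a = max((0, 1), key=lambda x: q[(s, x)])
            temp = step(temp, a, drift)
            s2, reward = round(temp), -abs(temp - TARGET)
            q[(s, a)] += alpha * (reward + gamma * max(q[(s2, 0)], q[(s2, 1)]) - q[(s, a)])
    return q

q = {(s, a): 0.0 for s in range(10, 31) for a in (0, 1)}
q = train(q, drift=0.10, episodes=300)  # steps 1-2: pre-train on the virtual model
q = train(q, drift=0.12, episodes=50)   # step 3: brief fine-tuning on the "real" system

# The pre-trained policy should heat when cold and idle when warm.
print(max((0, 1), key=lambda a: q[(15, a)]))  # 1 (heat at 15 C)
print(max((0, 1), key=lambda a: q[(28, a)]))  # 0 (idle at 28 C)
```

The point of the pre-training phase is that by the time the controller touches the “real” dynamics, it already has a sensible policy, so exploration no longer causes the drastic fluctuations described above.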



<p>The new method developed by the CSEM engineers could have big implications for machine learning. Many applications that were once thought to be “untouchable” by reinforcement learning, like those with large fluctuations, could now be approached in a new manner. This would result in lower energy usage, lower financial costs and many other benefits.&nbsp;</p>



<p>The research was published in the journal IEEE Transactions on Neural Networks and Learning Systems, titled “A hybrid learning method for system identification and optimal control.” </p>



<p>The authors include: Baptiste Schubnel, Rafael E. Carrillo, Pierre-Jean Alet and Andreas Hutter.&nbsp;</p>
<p>The post <a href="https://www.aiuniverse.xyz/engineers-develop-new-machine-learning-method-capable-of-cutting-energy-use/">Engineers Develop New Machine-Learning Method Capable of Cutting Energy Use</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/engineers-develop-new-machine-learning-method-capable-of-cutting-energy-use/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Doctors develop new data mining method to detect young people with emerging psychosis</title>
		<link>https://www.aiuniverse.xyz/doctors-develop-new-data-mining-method-to-detect-young-people-with-emerging-psychosis/</link>
					<comments>https://www.aiuniverse.xyz/doctors-develop-new-data-mining-method-to-detect-young-people-with-emerging-psychosis/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 15 Sep 2020 07:23:32 +0000</pubDate>
				<category><![CDATA[Data Mining]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[data mining]]></category>
		<category><![CDATA[Develop]]></category>
		<category><![CDATA[doctors]]></category>
		<category><![CDATA[emerging psychosis]]></category>
		<category><![CDATA[Natural language processing]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=11587</guid>

					<description><![CDATA[<p>Source: news-medical.net Doctors have developed a new data mining method to detect many young people with emerging psychosis. The new methods, based on advanced data mining to <a class="read-more-link" href="https://www.aiuniverse.xyz/doctors-develop-new-data-mining-method-to-detect-young-people-with-emerging-psychosis/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/doctors-develop-new-data-mining-method-to-detect-young-people-with-emerging-psychosis/">Doctors develop new data mining method to detect young people with emerging psychosis</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: news-medical.net</p>



<p>Doctors have developed a new data mining method to detect young people at risk of emerging psychosis.</p>



<p>The new method, based on advanced data mining to pick up early risk signs from schools, hospitals and general practitioners, will be presented at the ECNP virtual congress and is in press with a peer-reviewed journal.</p>



<p>Psychosis is a condition that causes a person to lose touch with reality, producing hallucinations or delusions.</p>



<p>There are a variety of possible causes, including migration and social stress, trauma, and substance abuse. Psychosis represents a significant care burden, affecting about 20 million people and costing Europe around €94 billion every year (2011 estimate).</p>



<p>Clinical experience has shown that the best way to manage psychosis is to stop it from developing in the first place. Over the last 25 years doctors have developed ways of detecting young people at risk of psychosis and predicting which of them might go on to develop the disorder, and so have been able to take steps to lower that risk.</p>



<p>However, the way clinicians detected these young people was not systematic and may have missed many at-risk individuals. Now doctors in the UK have developed new data mining methods that can potentially detect most people at risk of developing psychosis.</p>



<p>This, in turn, would allow clinicians to offer them preventive psychological interventions that can halve their risk of developing full-blown psychosis.</p>



<p>&#8220;We have developed a data mining method (using Natural Language Processing) to search medical records for those at risk of progressing to psychosis. Many medical records are fairly unstructured, with information on mental health hidden in sections that do not allow systematic research.</p>



<p>Our data-mining system does a more complete search of the records of people who have been referred to hospital (secondary care), looking for keywords such as weight loss, insomnia, cocaine and guilt. We look for 14 different terms, which we then evaluate for the risk of psychosis.</p>



<p>At that point patients might be invited for a one-to-one interview. We have found that prevention can halve the risk of psychosis developing&#8221;.</p>
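<p>The record-scanning step described in the quote can be sketched as a simple keyword search. This is a toy illustration; the published system uses proper Natural Language Processing and its own validated list of 14 terms, of which only a few examples are reproduced here.</p>

```python
import re

# Illustrative subset of risk-related terms (the actual system evaluates 14).
RISK_TERMS = ["weight loss", "insomnia", "cocaine", "guilt", "paranoia"]

def risk_terms_found(clinical_note):
    """Scan a free-text referral note for risk keywords; whole-word matching
    via \\b avoids partial hits (e.g. 'guilty' would not match 'guilt')."""
    text = clinical_note.lower()
    return [t for t in RISK_TERMS
            if re.search(r"\b" + re.escape(t) + r"\b", text)]

note = "Patient reports insomnia and significant weight loss; denies cocaine use."
print(risk_terms_found(note))  # ['weight loss', 'insomnia', 'cocaine']
```

Note that the naive matcher flags “denies cocaine use” as a cocaine mention; handling negation like this is exactly where real clinical NLP pipelines go beyond keyword search.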



<p>The system was evaluated on 92,151 patients over a long follow-up period. The researchers confirmed that the method works well at detecting young people at risk, although Professor Fusar-Poli cautioned that &#8220;these results need further replication in other countries before they can enter clinical routine, but they look very promising.</p>



<p>Replication will be facilitated by international research consortia such as the ECNP-funded Prevention of Mental Disorders and Mental Health Promotion Network&#8221;</p>



<p>Prof. Fusar-Poli suggested that detecting these young people is the first step towards prevention. Preventive interventions in this group can translate into several benefits:</p>



<p>&#8220;This translates into real benefits. Although the initial cost of establishing specialised services to detect young people at risk of psychosis is greater, intervening before the onset of psychosis is associated with fewer treatments and fewer days in hospital, in addition to tangible social and health benefits, meaning that the NHS saved around £1000 per patient diagnosed.</p>



<p>Our detection systems can extend these benefits to many other young people who might be at risk of psychosis&#8221;<br>Professor Fusar-Poli will present the work while chairing a session on the prevention of mental disorders at the ECNP congress.</p>



<p>&#8220;We have been working with the ECNP special group on Prevention of Mental Disorders and Mental Health Promotion, and with the EU-Funded European Brain Research Area  to set up a Europe-wide system of advance warning for young people at risk of psychosis. It is essential that we bring the best expertise to bear on this problem, and we can all learn from the experience of others&#8221;</p>



<p>Commenting, Professor Andreas Meyer-Lindenberg (Mannheim), member of the ECNP executive board said:<br>&#8220;This work is an excellent example of the transformative role of artificial intelligence and big data processing in psychiatry. While much attention in this field has been focused on biological data and biomarkers, this result shows the gains that can be made if the wealth of written information that clinicians produce in their daily work is mined using innovative approaches.&#8221;</p>
<p>The post <a href="https://www.aiuniverse.xyz/doctors-develop-new-data-mining-method-to-detect-young-people-with-emerging-psychosis/">Doctors develop new data mining method to detect young people with emerging psychosis</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/doctors-develop-new-data-mining-method-to-detect-young-people-with-emerging-psychosis/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Researchers develop artificial intelligence platform to combat infectious diseases</title>
		<link>https://www.aiuniverse.xyz/researchers-develop-artificial-intelligence-platform-to-combat-infectious-diseases/</link>
					<comments>https://www.aiuniverse.xyz/researchers-develop-artificial-intelligence-platform-to-combat-infectious-diseases/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 12 Sep 2020 11:53:51 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[COVID-19]]></category>
		<category><![CDATA[Develop]]></category>
		<category><![CDATA[platform]]></category>
		<category><![CDATA[researchers]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=11559</guid>

					<description><![CDATA[<p>Source: expresscomputer.in As the COVID-19 crisis continues to develop, researchers around the world are attempting to find the most effective treatment to combat the poorly understood virus <a class="read-more-link" href="https://www.aiuniverse.xyz/researchers-develop-artificial-intelligence-platform-to-combat-infectious-diseases/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/researchers-develop-artificial-intelligence-platform-to-combat-infectious-diseases/">Researchers develop artificial intelligence platform to combat infectious diseases</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: expresscomputer.in</p>



<p>As the COVID-19 crisis continues to develop, researchers around the world are attempting to find the most effective treatment to combat the poorly understood virus behind this disease.</p>



<p>Traditionally, when dangerous new bacterial and viral infections emerge, the response is to develop a treatment that combines several different drugs. However, this process is laborious and time-consuming: drug combinations are often chosen sub-optimally, and dose selection is a matter of trial and error. Such a costly and inefficient way of developing a treatment is problematic when a rapid response is crucial to tackling a global pandemic and resources need to be conserved.</p>



<p>With this in mind, Professor Dean Ho from the National University of Singapore (NUS) led a multidisciplinary team of researchers to come up with a pioneering artificial intelligence (AI) platform known as ‘IDentif.AI’ (Identifying Infectious Disease Combination Therapy with Artificial Intelligence) to dramatically increase the efficiency of this development.</p>



<p>Their results were published in Advanced Therapeutics on 16 April 2020.</p>



<p><strong>Drawbacks of traditional drug screening</strong><br>Conventional selection of drugs for treatment involves examining virus or bacteria growth in response to different candidate treatments. The drugs are given to the bacteria or viruses at increasing dosages until maximal prevention of their growth is observed, and additional drugs are then combined to amplify the effect. However, these methods become ineffective when several drugs are studied simultaneously as candidates. Moreover, these approaches often yield positive outcomes in vitro that are not reproduced in human studies.</p>



<p>“If 10 or more drugs are examined, it is virtually impossible to study the effects of all the possible drug combinations and dosages needed to identify the best possible combination using traditional methods,” explained Prof Ho, Director of The N.1 Institute for Health and Institute for Digital Medicine (WisDM) at NUS.</p>



<p>Furthermore, in traditional screening, if a drug from a pool of candidate therapies is shown to have no apparent effect on the pathogen, this drug will generally no longer be considered. “However, if this drug is systematically combined with more drugs, each at the correct doses, this could very well result in the best possible combination. Unfortunately, this remarkable level of required precision cannot be arbitrarily derived,” added Prof Ho, who is also the Head of the NUS Department of Biomedical Engineering.</p>



<p><strong>Using artificial intelligence to optimise drug therapies</strong><br>To avoid the drawbacks of traditional drug combination therapy development, Prof Ho and his team, together with collaborators from Shanghai Jiao Tong University, harnessed the processing power of AI.</p>



<p>The research team carefully selected 12 drugs that are viable candidates for treating an infection of lung cells caused by the vesicular stomatitis virus (VSV). They then used IDentif.AI to markedly reduce the number of experiments needed to interrogate the full range of combinations and optimal dosages of these 12 drugs.</p>



<p>“Using IDentif.AI, we took three days to identify multiple optimal drug regimens out of billions of possible combinations that reduced the VSV infection to 1.5 per cent with no apparent adverse impact. This speed and accuracy in discovering new drug combination therapies is completely unprecedented,” said Prof Ho.</p>
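<p>The “billions of possible combinations” figure is easy to reproduce with back-of-the-envelope arithmetic. The 6-level dose grid below is an illustrative assumption, not the dose resolution used in the study, but it shows why exhaustive screening of 12 drugs is hopeless and an AI-guided search is valuable.</p>

```python
# Illustrative count: 12 candidate drugs, each on a hypothetical grid of
# 6 dose levels (e.g. zero plus five increasing doses). The grid size is
# an assumption for illustration only.
drugs = 12
dose_levels = 6
regimens = dose_levels ** drugs
print(regimens)            # 2176782336 -- on the order of billions

# Even at 1,000 assays per day, exhaustive screening would take millennia:
days = regimens / 1000
print(round(days / 365))   # 5964 years
```

Against that baseline, identifying effective regimens in three days of experiments, as the team reports, means the platform explores only a vanishingly small, carefully chosen fraction of the space.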



<p>Importantly, the team saw that when the top-ranked drug combination was optimally dosed, it was seven times more effective compared to sub-optimal doses. This shows the critical importance of ideal drug and dose identification.</p>



<p>Similarly, when a single drug was substituted out from the top-ranked drug combination, and this new combination was administered at sub-optimal doses, the combination was 14 times less effective.</p>



<p>“There is a notion in drug discovery that if you discover the right molecule, the work is done. Our results with IDentif.AI prove that it is critically important to think about how the drug is developed into a combination and subsequently administered. How do you combine it with the right drugs? How do you dose this drug properly? Answering these questions can dramatically increase efficacy at the clinical stage of drug development,” shared Prof Ho.</p>



<p>In addition to validating IDentif.AI, this study also included insights by a team of experts in operations research and healthcare economics from NUS Business School and KPMG Global Health and Life Sciences Centre of Excellence, as well as global health security and surveillance experts from EpiPointe LLC and MRIGlobal. They concluded that strategies such as IDentif.AI, which can rapidly optimise drug repurposing under austere economic conditions amidst pandemics, could play a key role in improving patient outcomes compared to standard approaches.</p>



<p><strong>Using IDentif.AI against COVID-19 and more</strong><br>Having proved the effectiveness of IDentif.AI to rapidly provide treatments for infectious diseases, the team is currently setting their sights on COVID-19.</p>



<p>Prof Ho said, “As the development of vaccines and antibody therapies for COVID-19 are ongoing, we will need a rapid therapeutic strategy that addresses the virus which may evolve over time. Our strength is that we can perform one experiment and come out with a list of drug combinations for treatment within days. And in time, if patients do not respond well to the first combinations of drugs, we can derive new combinations within days to re-optimise their care. Our platform is useful to address the possibility that patients will need different drug combinations depending on when treatment was initiated, and if downstream infection with a different strain occurs.”</p>



<p>Furthermore, IDentif.AI could be immediately deployed to address any other infectious diseases in the future. Prof Ho concluded, “When an aggressive pathogen hits, a rapid response is needed, and this response may need to evolve quickly as the pathogen evolves. Now, with IDentif.AI, we will be ready.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/researchers-develop-artificial-intelligence-platform-to-combat-infectious-diseases/">Researchers develop artificial intelligence platform to combat infectious diseases</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/researchers-develop-artificial-intelligence-platform-to-combat-infectious-diseases/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Microsoft targets its fastest Azure AI instance to date at large neural networks</title>
		<link>https://www.aiuniverse.xyz/microsoft-targets-its-fastest-azure-ai-instance-to-date-at-large-neural-networks/</link>
					<comments>https://www.aiuniverse.xyz/microsoft-targets-its-fastest-azure-ai-instance-to-date-at-large-neural-networks/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 20 Aug 2020 07:24:07 +0000</pubDate>
				<category><![CDATA[Microsoft Azure Machine Learning]]></category>
		<category><![CDATA[Azure]]></category>
		<category><![CDATA[chipmaker]]></category>
		<category><![CDATA[Develop]]></category>
		<category><![CDATA[ENGINEERING]]></category>
		<category><![CDATA[GPT-3]]></category>
		<category><![CDATA[Mellanox]]></category>
		<category><![CDATA[Microsoft]]></category>
		<category><![CDATA[NETWORKS]]></category>
		<category><![CDATA[OpenAI]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=11070</guid>

					<description><![CDATA[<p>SOURCE:-siliconangle Microsoft Corp. today previewed a new Azure instance for training artificial intelligence models that targets the emerging class of advanced, ultra-large neural networks being pioneered by <a class="read-more-link" href="https://www.aiuniverse.xyz/microsoft-targets-its-fastest-azure-ai-instance-to-date-at-large-neural-networks/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/microsoft-targets-its-fastest-azure-ai-instance-to-date-at-large-neural-networks/">Microsoft targets its fastest Azure AI instance to date at large neural networks</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>SOURCE:-siliconangle</p>



<p>Microsoft Corp. today previewed a new Azure instance for training artificial intelligence models that targets the emerging class of advanced, ultra-large neural networks being pioneered by the likes of OpenAI.</p>



<p>The instance, called the ND A100 v4, is being touted by Microsoft as its most powerful AI-optimized virtual machine to date.</p>



<p>The ND A100 v4 aims to address an important new trend in AI development. Engineers usually develop a separate machine learning model for every use case they seek to automate, but recently, a shift has started toward building one big, multipurpose model and customizing it for multiple use cases. One notable example of such an AI is the OpenAI research group’s GPT-3 model, whose 175 billion learning parameters allow it to perform tasks as varied as searching the web and writing code.</p>



<p>Microsoft is one of OpenAI’s top corporate backers. The company has also adopted the multipurpose AI approach internally, disclosing in the instance announcement today that such large AI models are used to power features across Bing and Outlook.</p>



<p>The ND A100 v4 is aimed at helping other companies train their own supersized neural networks by providing eight of Nvidia Corp.’s latest A100 graphics processing units per instance. Customers can link multiple ND A100 v4 instances together to create an AI training cluster with up to “thousands” of GPUs.</p>



<p>Microsoft didn’t specify exactly how many GPUs are supported. But even at the low end of the possible range, assuming a cluster with a graphics card count in the low four figures, the performance is likely not far behind that of a small supercomputer. Earlier this year, Microsoft built an Azure cluster for OpenAI that qualified as one of the world’s top five supercomputers, and that cluster had 10,000 GPUs.</p>



<p>In the new ND A100 v4 instance, the ability to cluster GPUs is enabled by a dedicated 200-gigabit-per-second InfiniBand network link provisioned to each chip. These connections allow the graphics cards to communicate with each other across instances. The speed at which GPUs can share data is a big factor in how fast they can process it, and Microsoft says the ND A100 v4 offers 16 times more GPU-to-GPU bandwidth than any other major public cloud.</p>



<p>The InfiniBand connections are powered by networking gear supplied by Nvidia’s Mellanox unit. To support the eight onboard GPUs, the new instance also packs a central processing unit from Advanced Micro Devices Inc.’s second-generation Epyc series of server processors.</p>



<p>The end result is what the company describes as a big jump in AI training performance. “Most customers will see an immediate boost of 2x to 3x compute performance over the previous generation of systems based on Nvidia V100 GPUs with no engineering work,” Ian Finder, a senior program manager at Azure, wrote in a blog post. He added that some customers may see performance improve by up to 20 times in some cases.</p>



<p>Microsoft’s decision to use Nvidia chips and Mellanox gear to power the instance shows how the chipmaker is already reaping dividends from its $6.9 billion acquisition of Mellanox, which closed this year. Microsoft’s own investments in AI and related development have likewise helped it win customers. Today’s debut of the new AI instance was preceded by Tuesday’s announcement that the U.S. Energy Department has partnered with the tech giant to develop AI disaster response tools on Azure.</p>
<p>The post <a href="https://www.aiuniverse.xyz/microsoft-targets-its-fastest-azure-ai-instance-to-date-at-large-neural-networks/">Microsoft targets its fastest Azure AI instance to date at large neural networks</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/microsoft-targets-its-fastest-azure-ai-instance-to-date-at-large-neural-networks/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>10 INDUSTRY BEST PRACTICES FOR DATA SCIENCE LEADERS TO FOLLOW</title>
		<link>https://www.aiuniverse.xyz/10-industry-best-practices-for-data-science-leaders-to-follow/</link>
					<comments>https://www.aiuniverse.xyz/10-industry-best-practices-for-data-science-leaders-to-follow/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 21 Jul 2020 05:26:14 +0000</pubDate>
				<category><![CDATA[Data Mining]]></category>
		<category><![CDATA[Big data]]></category>
		<category><![CDATA[data mining]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[Develop]]></category>
		<category><![CDATA[Internet of Things]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=10330</guid>

					<description><![CDATA[<p>Source: analyticsinsight.net The number of data science and big data projects is growing, but very little has been spoken and leveraged to understand the industry best practices <a class="read-more-link" href="https://www.aiuniverse.xyz/10-industry-best-practices-for-data-science-leaders-to-follow/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/10-industry-best-practices-for-data-science-leaders-to-follow/">10 INDUSTRY BEST PRACTICES FOR DATA SCIENCE LEADERS TO FOLLOW</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: analyticsinsight.net</p>



<p>The number of data science and big data projects is growing, but little has been said about the industry best practices Data Science leaders should follow. The complex world of big data, characterised by its many V’s, further complicates the process, and the proliferation of open-source technologies adds still more layers of complexity. To demystify these complexities, here are the 10 industry best practices for Data Science leaders to follow:</p>



<h4 class="wp-block-heading"><strong>Leverage Open Source Data</strong></h4>



<p>Open-source tools are an important part of the data science technology stack, yet much of the data they expose resides in silos. Experienced data scientists will have a better understanding of how to evaluate and manage open-source tools by looking at code activity, package metadata, release history, and project contributors.</p>



<h4 class="wp-block-heading"><strong>Embrace the Changing Data Landscape</strong></h4>



<p>The increased use of internet-connected smart devices has changed how live data flows across organisations. The Internet of Things (IoT) quickly creates large amounts of sensor data and is touted as one of the contributing factors in creating this category of big data. The rise of data science is primarily a result of big data, that is, data beyond traditional structured records, such as text, machine-generated, and geospatial data. Big data and data science go hand in hand; thus, it is imperative for organisations to embrace the changing data landscape. Over time, data scientists will need to collaborate to develop processes and eliminate data redundancy, besides working with IT to understand how to put projects into production, assess the required resources, and meet security standards.</p>



<h4 class="wp-block-heading"><strong>Integrate Data Management</strong></h4>



<p>Data science leaders must integrate the CRISP-DM data mining process to gain clarity across its phases: business understanding, data understanding, data preparation, modelling, evaluation and deployment.</p>



<p>To solve business problems, data science teams should understand how to speak the language of the business units they work with. It’s essential that common terms and acronyms are used in presentations with their respective lines of business. This will help establish common ground in defining and evaluating success.</p>



<h4 class="wp-block-heading"><strong>Developing the PoC Phase</strong></h4>



<p>Nearly half of data science projects never make it to production. One way to help ensure models ultimately make it into the hands of end-users and bring value to the business is to involve IT and software developers early in the process, especially for security protocols to be met early on.</p>



<h4 class="wp-block-heading"><strong>Promoting Collaborative Efforts</strong></h4>



<p>Data scientists should not work in silos. Those scattered across the organization should meet regularly to discuss processes, tools, and projects, while those in centralized structures should meet regularly with business managers. Through regular communication, data scientists will learn more quickly, grow their skill sets, make a better case for the resources they need, and provide more value to the organization overall.</p>



<h4 class="wp-block-heading"><strong>Data Risk Mitigation</strong></h4>



<p>Data science and machine learning are increasingly used to help make decisions that impact people’s lives through credit scoring, job and college applicant scoring, and even potential healthcare outcomes. When implemented thoughtfully, machine learning can improve human decision-making and reduce racial disparity. On the other hand, when machine learning models are implemented without regard for bias or fairness, they can enforce and exacerbate human biases.</p>



<p>The most important steps data scientists can take are to understand biases in their data and understand how their models make decisions. Fortunately, several new open-source tools are available to help data scientists do this, such as FairLearn, InterpretML, and LIME.</p>
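<p>Tools like FairLearn, InterpretML, and LIME offer rich diagnostics, but the most basic check they automate can be sketched in a few lines: compare selection rates across groups and compute the disparate impact ratio (the “four-fifths rule” of U.S. hiring guidance flags ratios below 0.8). The group labels, data, and threshold below are illustrative only.</p>

```python
def selection_rates(outcomes):
    """outcomes: list of (group, selected) pairs; returns rate per group."""
    totals, hits = {}, {}
    for group, selected in outcomes:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + (1 if selected else 0)
    return {g: hits[g] / totals[g] for g in totals}

def disparate_impact(outcomes, disadvantaged, advantaged):
    """Ratio of selection rates; values below ~0.8 are commonly flagged."""
    rates = selection_rates(outcomes)
    return rates[disadvantaged] / rates[advantaged]

# Hypothetical model decisions: group A selected 30%, group B 20%.
decisions = ([("A", 1)] * 30 + [("A", 0)] * 70 +
             [("B", 1)] * 20 + [("B", 0)] * 80)
print(round(disparate_impact(decisions, "B", "A"), 2))  # 0.67, below 0.8
```

A check like this only measures one narrow notion of fairness (demographic parity); the dedicated libraries add many more metrics plus per-prediction explanations of how the model decides.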



<h4 class="wp-block-heading"><strong>Understanding Ethics and Data Governance</strong></h4>



<p>As modern data science becomes more and more ingrained into day-to-day business practices, politics, and society, it’s important that questions around bias and fairness be on the minds of every data scientist, business leader, and academic.</p>



<p>A failure to proactively address these areas poses a strategic risk to enterprises and institutions across competitive, financial, and even legal dimensions. We see an opportunity for data professionals to exert leadership within their organizations and drive change.</p>



<h4 class="wp-block-heading"><strong>Putting Strict Controls</strong></h4>



<p>Democratization means that business analysts will try to use more advanced technology. Make sure controls are in place before a model is put into production. This might include confirming the validity of a model.</p>



<h4 class="wp-block-heading"><strong>Acting on Data</strong></h4>



<p>Analytics without action won’t yield measurable impact. Even if you aren’t ready to operationalize your analysis, it makes sense to start implementing a process to take action, even if it’s manual action. You’ll be building a more analytically-driven culture for when you want to build more operational intelligence.</p>



<h4 class="wp-block-heading"><strong>Building a Centre of Excellence</strong></h4>



<p>A CoE can be a great way to make sure that the infrastructure and analytics you implement are coherent. CoEs can help you disseminate information, provide training, and establish or maintain governance.</p>



<h4 class="wp-block-heading"><strong>Monitor the Data Structure</strong></h4>



<p>Data can get stale. Models can get stale. It’s important to revisit any kind of analysis where action is taking place on a periodic basis to make sure that your data is still relevant and that your model still makes sense.</p>
<p>The post <a href="https://www.aiuniverse.xyz/10-industry-best-practices-for-data-science-leaders-to-follow/">10 INDUSTRY BEST PRACTICES FOR DATA SCIENCE LEADERS TO FOLLOW</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/10-industry-best-practices-for-data-science-leaders-to-follow/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
