<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>How Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/how/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/how/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Mon, 28 Jun 2021 09:09:28 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>How Artificial Intelligence Is Taking Over Our Gadgets</title>
		<link>https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/</link>
					<comments>https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 28 Jun 2021 09:09:26 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Gadgets]]></category>
		<category><![CDATA[How]]></category>
		<category><![CDATA[TAKING]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14614</guid>

					<description><![CDATA[<p>Source &#8211; https://www.bangkokpost.com/ AI is moving from data centers to devices, making everything from phones to tractors faster and more private. These newfound smarts also come with <a class="read-more-link" href="https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/">How Artificial Intelligence Is Taking Over Our Gadgets</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.bangkokpost.com/</p>



<p><em>AI is moving from data centers to devices, making everything from phones to tractors faster and more private. These newfound smarts also come with pitfalls.</em></p>



<p>If you think of AI as something futuristic and abstract, start thinking different.</p>



<p>We&#8217;re now witnessing a turning point for artificial intelligence, as more of it comes down from the clouds and into our smartphones and automobiles. While it&#8217;s fair to say that AI that lives on the &#8220;edge&#8221; &#8212; where you and I are &#8212; is still far less powerful than its datacenter-based counterpart, it&#8217;s potentially far more meaningful to our everyday lives.</p>



<p>One key example: This fall, Apple&#8217;s Siri assistant will start processing voice on iPhones.</p>



<p>Right now, even your request to set a timer is sent as an audio recording to the cloud, where it is processed, triggering a response that&#8217;s sent back to the phone.</p>



<p>By processing voice on the phone, says Apple, Siri will respond more quickly. This will only work on the iPhone XS and newer models, which have a compatible built-for-AI processor the company calls a &#8220;neural engine.&#8221;</p>



<p>People might also feel more secure knowing that their voice recordings aren&#8217;t being sent to unseen computers in faraway places.</p>



<p>Google actually led the way with on-phone processing: In 2019, it introduced a Pixel phone that could transcribe speech to text and perform other tasks without any connection to the cloud.</p>



<p>One reason Google decided to build its own phones was that the company saw potential in creating custom hardware tailor-made to run AI, says Brian Rakowski, product manager of the Pixel group at Google.</p>



<p>These so-called edge devices can be pretty much anything with a microchip and some memory, but they tend to be the newest and most sophisticated of smartphones, automobiles, drones, home appliances, and industrial sensors and actuators.</p>



<p>Edge AI has the potential to deliver on some of the long-delayed promises of AI, like more responsive smart assistants, better automotive safety systems, new kinds of robots, even autonomous military machines.</p>



<p>The challenges of making AI work at the edge &#8212; that is, making it reliable enough to do its job and then justifying the additional complexity and expense of putting it in our devices &#8212; are monumental.</p>



<p>Existing AI can be inflexible, easily fooled, unreliable and biased. In the cloud, it can be trained on the fly to get better &#8212; think about how Alexa improves over time. When it&#8217;s in a device, it must come pre-trained, and be updated periodically.</p>



<p>Yet the improvements in chip technology in recent years have made it possible for real breakthroughs in how we experience AI, and the commercial demand for this sort of functionality is high.</p>



<p><strong>From swords to plowshares</strong></p>



<p>Shield AI, a contractor for the Department of Defense, has put a great deal of AI into quadcopter-style drones which have already carried out &#8212; and continue to be used in &#8212; real-world combat missions.</p>



<p>One mission is to help soldiers scan for enemy combatants in buildings that must be cleared.</p>



<p>The DoD has been eager to use the company&#8217;s drones, says Shield AI&#8217;s co-founder, Brandon Tseng, because even if they fail, they can be used to reduce human casualties.</p>



<p>&#8220;In 2016 and early 2017, we had early prototypes with something like 75% reliability, something you would never take to market, and the DoD were saying, &#8216;We&#8217;ll take that overseas and use that in combat right now&#8217;,&#8221; Mr. Tseng says.</p>



<p>When he protested that the system wasn&#8217;t ready, the response from within the military was that anything was better than soldiers going through a door and being shot.</p>



<p>In a combat zone, you can&#8217;t count on a fast, robust, wireless cloud connection, especially now that enemies often jam wireless communication and GPS signals. When on a mission, processing and image recognition must occur on the company&#8217;s drones themselves.</p>



<p>Shield AI uses a small, efficient computer made by Nvidia, designed for running AI on devices, to create a quadcopter drone no bigger than a typical camera-wielding consumer model.</p>



<p>The Nova 2 can fly long enough to enter a building, and use AI to recognize and examine dozens of hallways, stairwells and rooms, cataloging objects and people it sees along its way.</p>



<p>Meanwhile, in the town of Salinas, Calif., birthplace of <em>The Grapes of Wrath</em> author John Steinbeck and an agricultural center to this day, a robot the size of an SUV is spending this year&#8217;s growing season raking the earth with its 12 robotic arms.</p>



<p>Made by FarmWise Labs Inc., the robot trundles along fields of celery as if it were any other tractor. Underneath its metal shroud, it uses computer vision and an edge AI system to decide, in less than a second, whether a plant is a food crop or a weed, and directs its plow-like claws to avoid or eradicate the plant accordingly.</p>



<p>FarmWise&#8217;s huge, diesel robo-weeder can generate its own electricity, enabling it to carry a veritable supercomputer&#8217;s worth of processing power &#8212; four GPUs and 16 CPUs which together draw 500 watts of electricity.</p>



<p>In our everyday lives, features like voice transcription that work regardless of whether we have a connection, or how good that connection is, could change how we prefer to interact with our mobile devices.</p>



<p>Getting always-available voice transcription to work on Google&#8217;s Pixel phone &#8220;required a lot of breakthroughs to run on the phone as well as it runs on a remote server,&#8221; says Mr. Rakowski.</p>



<p>Google has almost unlimited resources to experiment with AI in the cloud, but getting those same algorithms, for everything from voice transcription and power management to real-time translation and image processing, to work on phones required the introduction of custom microprocessors like the Pixel Neural Core, he adds.</p>



<p><strong>Turning cats into pure math</strong></p>



<p>What nearly all edge AI systems have in common is that, as pre-trained AI, they are only performing &#8220;inference,&#8221; says Dennis Laudick, vice president of marketing for AI and machine learning at Arm Holdings, which licenses chip designs and instructions to companies such as Apple, Samsung, Qualcomm, Nvidia and others.</p>



<p>Generally speaking, machine-learning AI consists of four phases:</p>



<p>Data is captured or collected: Say, for example, in the form of millions of cat pictures.</p>



<p>Humans label the data: Yes, these are cat photos.</p>



<p>AI is trained with the labeled data: This process selects for models that identify cats.</p>



<p>Then the resulting pile of code is turned into an algorithm and implemented in software: Here&#8217;s a camera app for cat lovers!</p>



<p>(Note: If this doesn&#8217;t exist yet, consider it your million-dollar idea of the day.)</p>



<p>The last bit of the process &#8212; something like that cat-identifying software &#8212; is the inference phase.</p>
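<p>To make the training-versus-inference split concrete, here is a minimal, hypothetical sketch (the toy feature vectors and the logistic-regression model are illustrative assumptions, not anything described in the article): the training loop runs once, as it would in the cloud, while the device would ship only the frozen weights and a single prediction function.</p>

```python
import math

# Hypothetical toy data: 2-feature vectors labeled cat (1) / not-cat (0).
X = [(0.9, 0.8), (0.8, 0.9), (0.1, 0.2), (0.2, 0.1)]
y = [1, 1, 0, 0]

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Phase 3 (training, normally done in the cloud): logistic regression
# fit by plain stochastic gradient descent over the labeled data.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(2000):
    for (x1, x2), label in zip(X, y):
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - label
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

# Phase 4 (inference, the only part that ships on the device):
# the frozen weights and one dot product, no labels, no training loop.
def predict(x1, x2):
    return sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5

print(predict(0.85, 0.9))   # cat-like features
```

<p>The point of the sketch is the asymmetry: training touches labels and iterates thousands of times, while on-device inference is one fixed arithmetic pass. That is why pre-trained edge models must be updated periodically rather than learning in place.</p>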



<p>The software on many smart surveillance cameras, for example, is performing inference, says Eric Goodness, a research vice president at technology-consulting firm Gartner.</p>



<p>These systems can already identify how many patrons are in a restaurant, whether any are engaging in undesirable behavior, or whether the fries have been in the fryer too long.</p>



<p>It&#8217;s all just mathematical functions, ones so complicated that it would take a monumental effort by humans to write them, but which machine-learning systems can create when trained on enough data.</p>



<p><strong>Robot pratfalls</strong></p>



<p>While all of this technology has enormous promise, making AI work on individual devices, whether or not they can connect to the cloud, comes with a daunting set of challenges, says Elisa Bertino, a professor of computer science at Purdue University.</p>



<p>Modern AI, which is primarily used to recognize patterns, can have difficulty coping with inputs outside of the data it was trained on. Operating in the real world only makes it tougher &#8212; just consider the classic example of a Tesla that brakes when it sees a stop sign on a billboard.</p>



<p>To make edge AI systems more competent, one edge device might gather some data but then pair with another, more powerful device, which can integrate data from a variety of sensors, says Dr. Bertino.</p>



<p>If you&#8217;re wearing a smartwatch with a heart-rate monitor, you&#8217;re already witnessing this: The watch&#8217;s edge AI pre-processes the weak signal of your heart rate, then passes that data to your smartphone, which can further analyze that data &#8212; whether or not it&#8217;s connected to the internet.</p>
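<p>A minimal, hypothetical sketch of that two-tier pattern (the window size, threshold, and sample readings are invented for illustration): the &#8220;watch&#8221; stage smooths the raw sensor stream, and the &#8220;phone&#8221; stage flags sustained elevation, all without a network call.</p>

```python
# Hypothetical two-tier edge pipeline: a "watch" smooths the raw,
# noisy heart-rate signal; a "phone" then flags anomalies offline.

def watch_preprocess(raw_bpm, window=5):
    """On-watch: moving-average filter over the raw sensor readings."""
    smoothed = []
    for i in range(len(raw_bpm)):
        chunk = raw_bpm[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

def phone_analyze(smoothed, threshold=100):
    """On-phone: flag indices with sustained elevated heart rate."""
    return [i for i, bpm in enumerate(smoothed) if bpm > threshold]

# One noisy single-sample spike (150), then a genuine sustained rise.
raw = [72, 74, 71, 150, 73, 108, 112, 115, 118, 120]
alerts = phone_analyze(watch_preprocess(raw))
print(alerts)   # prints [6, 7, 8, 9]
```

<p>Note how the smoothing suppresses the lone spurious spike while the later sustained rise still triggers alerts: the weak-signal cleanup happens on the less powerful device, and the decision logic on the stronger one.</p>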



<p>The overwhelming majority of AI algorithms are still trained in the cloud. They can also be retrained using more or fresher data, which lets them continually improve.</p>



<p>Down the road, says Mr. Goodness, edge AI systems will begin to learn on their own &#8212; that is, they&#8217;ll become powerful enough to move beyond inference and actually gather data and use it to train their own algorithms.</p>



<p>AI that can learn all by itself, without connection to a cloud superintelligence, might eventually raise legal and ethical challenges.</p>



<p>How can a company certify an algorithm that&#8217;s been off evolving in the real world for years after its initial release, asks Dr. Bertino.</p>



<p>And in future wars, who will be willing to let their robots decide when to pull the trigger? Whoever does might end up with an advantage &#8212; but also all the collateral damage that happens when, inevitably, AI makes mistakes.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/">How Artificial Intelligence Is Taking Over Our Gadgets</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>HOW MUCH DOES ARTIFICIAL INTELLIGENCE COST IN 2021?</title>
		<link>https://www.aiuniverse.xyz/how-much-does-artificial-intelligence-cost-in-2021/</link>
					<comments>https://www.aiuniverse.xyz/how-much-does-artificial-intelligence-cost-in-2021/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 11 Jun 2021 05:15:14 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[2021]]></category>
		<category><![CDATA[COST]]></category>
		<category><![CDATA[evaluation]]></category>
		<category><![CDATA[How]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14197</guid>

					<description><![CDATA[<p>Source &#8211; https://www.analyticsinsight.net/ An evaluation of the cost of&#160;artificial intelligence&#160;in 2021 Technology is developing at an exponential speed. These technological developments are changing every aspect of our <a class="read-more-link" href="https://www.aiuniverse.xyz/how-much-does-artificial-intelligence-cost-in-2021/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-much-does-artificial-intelligence-cost-in-2021/">HOW MUCH DOES ARTIFICIAL INTELLIGENCE COST IN 2021?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.analyticsinsight.net/</p>



<h2 class="wp-block-heading">An evaluation of the cost of&nbsp;<strong>artificial intelligence</strong>&nbsp;in 2021</h2>



<p>Technology is developing at an exponential speed. These technological developments are changing every aspect of our lives. Artificial intelligence is the chief factor that is driving technological developments.</p>



<p>Almost every business organization in the world is embracing&nbsp;artificial intelligence.&nbsp;The benefits reaped from AI in business processes are ground-breaking. AI solutions are changing traditional business processes and are driving them towards digital transformation. AI has helped companies amplify their growth, reach more audiences and work with agility.</p>



<p>With the growing technological developments, the cost of implementing them will also fluctuate. So the question is, how much does&nbsp;artificial intelligence cost?</p>



<p>Well, the answer is, it depends. The cost of artificial intelligence depends on a few crucial factors. To determine the&nbsp;<a href="https://www.analyticsinsight.net/ai-can-be-the-protector-of-privacy-and-respect-decision-making/">AI</a>&nbsp;costs, we have to understand the factors leading to the current pricing.</p>



<ul class="wp-block-list"><li><strong>The type of software the company is building</strong>: Artificial intelligence refers to any device or application programmed to make decisions based on the information it consumes, thus mimicking human intelligence. Chatbots and voice assistants that understand questions uttered in natural language, CT scan machines that detect tumors and other diseases in our bodies, and security cameras that identify people in live video footage all fall under artificial intelligence. But their costs vary based on their complexity, performance, and purpose.</li><li><strong>The level of intelligence of the AI application</strong>: AI solutions implemented in business operations are described as narrow artificial intelligence, meaning they are programmed to perform a particular task. An AI application can be considered highly intelligent if it performs tasks with little to no human instruction. The cost of implementing such technologies depends on the level of intelligence the company expects from the application.</li><li><strong>The performance of the AI algorithm</strong>: Adequate algorithm performance is one of the major cost factors to consider, because accurate predictions require several rounds of tuning, which raises the cost of implementing AI solutions. The higher the accuracy and efficiency of the AI’s predictions, the higher the cost.</li><li><strong>The complexity of the AI solution: </strong>Any discussion of the cost of artificial intelligence must include the cost of building proper AI software, with a cloud-driven back-end, ETL/streaming tools, voice assistants, cloud dashboards, and other services. Building an efficient AI system is of the utmost importance, since it will be the brain of the company’s technological stack, pushing data and drawing insights to support informed decisions.</li><li><strong>The amount of data the AI application will consume: </strong>AI systems perform according to the data loaded into them, and they can consume both structured and unstructured data. When it comes to cost, however, structured data is cheaper to work with, especially if there is enough of it to boost the AI algorithm’s accuracy.</li></ul>



<h4 class="wp-block-heading"><strong>So, how much does AI cost?</strong></h4>



<p>It is a common misconception that leveraging artificial intelligence technologies might cost a fortune. It might have been true in the past, but the scenario has changed with modern technology.</p>



<p>Earlier, only giant tech companies like Facebook, Microsoft, and Google could afford to develop AI-powered software and applications. Thanks to the advent of affordable tools, libraries, and frameworks, AI is now within reach of small-scale organizations and startups, and the cost of implementation has fallen.</p>



<p>Prices of AI solutions based on the different portfolios:</p>



<ul class="wp-block-list"><li>Prototype development starts from US$2,500</li><li>Developing a Minimum Viable Product (MVP) based on the client’s data starts from US$8,000 and can cost up to US$15,000</li><li>The cost of implementing complete AI solutions may vary from US$20,000 to US$1,000,000</li></ul>



<h4 class="wp-block-heading"><strong>The Future is AI</strong></h4>



<p>We use AI technology in our daily lives to satisfy our personal and professional needs. It not only helps businesses by providing creative insights from data analysis and solutions to complex business problems, but it also helps us by automating our daily monotonous activities.</p>



<p>Almost every industrial sector in the world is leveraging AI and machine learning to simplify business processes and expand its markets. According to reports, the global AI market is estimated to reach US$176 billion by 2027.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-much-does-artificial-intelligence-cost-in-2021/">HOW MUCH DOES ARTIFICIAL INTELLIGENCE COST IN 2021?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-much-does-artificial-intelligence-cost-in-2021/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How Artificial Intelligence Can Slow the Spread of COVID-19</title>
		<link>https://www.aiuniverse.xyz/how-artificial-intelligence-can-slow-the-spread-of-covid-19/</link>
					<comments>https://www.aiuniverse.xyz/how-artificial-intelligence-can-slow-the-spread-of-covid-19/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 03 Mar 2021 09:17:22 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[COVID-19]]></category>
		<category><![CDATA[How]]></category>
		<category><![CDATA[Slow]]></category>
		<category><![CDATA[Spread]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13196</guid>

					<description><![CDATA[<p>Source &#8211; https://knowledge.wharton.upenn.edu/ A new machine learning approach to COVID-19 testing has produced encouraging results in Greece. The technology, named Eva, dynamically used recent testing results collected <a class="read-more-link" href="https://www.aiuniverse.xyz/how-artificial-intelligence-can-slow-the-spread-of-covid-19/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-artificial-intelligence-can-slow-the-spread-of-covid-19/">How Artificial Intelligence Can Slow the Spread of COVID-19</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://knowledge.wharton.upenn.edu/</p>



<p>A new machine learning approach to COVID-19 testing has produced encouraging results in Greece. The technology, named Eva, dynamically used recent testing results collected at the Greek border to detect and limit the importation of asymptomatic COVID-19 cases among arriving international passengers between August and November 2020, which helped contain the number of cases and deaths in the country.</p>



<p>The findings of the project are explained in a paper titled “Deploying an Artificial Intelligence System for COVID-19 Testing at the Greek Border,” authored by Hamsa Bastani, a Wharton professor of operations, information and decisions and affiliated faculty at Analytics at Wharton; Kimon Drakopoulos and Vishal Gupta from the University of Southern California; Jon Vlachogiannis from investment advisory firm Agent Risk; Christos Hadjicristodoulou from the University of Thessaly; and Pagona Lagiou, Gkikas Magiorkinis, Dimitrios Paraskevis and Sotirios Tsiodras from the University of Athens.</p>



<p>The analysis showed that Eva on average identified 1.85 times more asymptomatic, infected travelers than conventional, random surveillance testing would have achieved. During the peak travel season of August and September, its detection rates were two to four times higher than those of random testing.</p>



<p>“Our work paves the way for leveraging [artificial intelligence] and real-time data for public health goals, such as border control during a pandemic,” the paper stated. With the rapid spread of a new coronavirus strain, Eva also holds the promise of maximizing the already overburdened testing infrastructure in most countries.</p>



<p>“The main issue was, given the fixed budget for tests, whether we could conduct the tests in a smarter way with dynamic surveillance to identify more infected travelers,” said Bastani. One of the biggest challenges governments face in dealing with COVID-19 is the inability of the testing infrastructure at their national borders to realistically check every arriving passenger. Such comprehensive testing would be both costly and time-consuming, which is why most countries screen either arriving passengers from specific countries or conduct random testing for COVID-19.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow"><p>“The main issue was, given the fixed budget for tests, whether we could conduct the tests in a smarter way with dynamic surveillance to identify more infected travelers.”–Hamsa Bastani</p></blockquote>



<p>Eva also allowed Greece to identify when a country was exhibiting a spike in COVID-19 infections a median of nine days earlier than what would have been possible with machine learning-based algorithms using only publicly available data.</p>



<p>The underlying technology of Eva is a “contextual bandit algorithm,” a machine-learning framework built for “sequential decision-making,” taking into account various practical challenges like time-varying information and port-specific testing budgets, Bastani explained. The algorithm balances the need to maintain high-quality surveillance estimates of COVID-19 prevalence across countries and the allocation of limited testing results to catch likely infected travelers. Eva is the first instance of that technology being applied to address a public health challenge, although such algorithms have found use in online advertising and A/B testing, she added.</p>
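<p>Eva&#8217;s actual algorithm is considerably more sophisticated, handling time-varying prevalence and port-specific budgets, but the core bandit idea can be sketched with Thompson sampling over per-country positivity rates. In the hypothetical sketch below, the country labels, prevalence numbers, and daily testing budget are all invented for illustration; this is not the paper&#8217;s implementation.</p>

```python
import random

random.seed(7)

# Invented per-country asymptomatic prevalence (unknown to the algorithm).
true_prevalence = {"A": 0.002, "B": 0.010, "C": 0.030}

# Beta(1, 1) prior over each country's test-positivity rate,
# stored as [positives + 1, negatives + 1].
posterior = {c: [1, 1] for c in true_prevalence}

TESTS_PER_DAY = 100
caught = 0
for day in range(60):
    for _ in range(TESTS_PER_DAY):
        # Thompson sampling: draw a plausible prevalence from each
        # country's posterior, then spend the test where the draw is highest.
        pick = max(posterior, key=lambda c: random.betavariate(*posterior[c]))
        is_positive = random.random() < true_prevalence[pick]
        posterior[pick][0 if is_positive else 1] += 1
        caught += is_positive

# Each posterior doubles as a live surveillance estimate for that country.
estimates = {c: a / (a + b) for c, (a, b) in posterior.items()}
print(caught, estimates)
```

<p>The balance described above falls out naturally from this structure: countries with uncertain or high estimated prevalence keep receiving tests (exploration), while the running posteriors serve as the surveillance estimates that steer the next allocation (exploitation).</p>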



<p><strong>Overcoming Data Challenges</strong></p>



<p>Eva is an advancement over conventional border control policies because it does not rely on publicly reported data, which has a number of issues.</p>



<p>Publicly reported data is of “poor quality” chiefly because different countries follow different reporting protocols and testing strategies. It is common to focus testing resources on symptomatic patients, but the resulting prevalence rate may not be reflective of the asymptomatic population that is likely to travel. There is often also a reporting delay due to poor infrastructure, said Bastani. “We can tell, based on the data we’re actively collecting at borders, that a country’s COVID cases are spiking typically nine days before you will see that reflected in the public data.”</p>



<p>&#8220;Testing is usually targeted towards symptomatic individuals rather than asymptomatic individuals,&#8221; Bastani said in an interview with the Wharton Business Daily radio show on SiriusXM last July, as the Greek deployment was getting underway. &#8220;You can imagine tourists who are coming in are probably asymptomatic.&#8221; That underscores the criticality of not relying on publicly reported data, but using data that accurately reflects the prevalence of asymptomatic COVID-19 travelers across countries.</p>



<p>Eva’s algorithm overcomes the poor quality of public data by dynamically collecting testing results at the Greek border, thereby maintaining high-quality surveillance estimates of the prevalence in each country. “By adaptively adjusting border policies nine days earlier, Eva prevented additional infected travelers from arriving,” the paper noted, referring to the Greece deployment. “That is a long period of time in which a lot of high-risk people would probably have come in and infected other citizens,” said Bastani.</p>



<p>It is common for border control policies to use publicly reported data, but such data are often unreliable and inconsistent across countries, said Bastani. The inconsistencies arise from censorship of testing data by some countries, and even varying definitions of a COVID-19 death, she added. She pointed to the recent discovery of undercounting of COVID-19 deaths in nursing homes in New York City as an example of flawed data. “That issue is exacerbated when you compare death counts in different countries because in some places they’re accounting very accurately and in other places they’re not.”</p>



<p>Greece is the first country to design border controls based on the dynamic random surveillance testing approach that Eva uses. The model specifies the infrastructure required to collect COVID-19 test results, using those to form estimates and to inform future testing decisions in a dynamic feedback loop.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow"><p>“No country should just be relying on public data; they should be actively monitoring who is coming to their borders, testing at least a subset of them, and using that to make informed decisions about border control.”–Hamsa Bastani</p></blockquote>



<p>In using the Eva model, Greece required every individual or family planning to enter the country to fill out a digitized &#8220;Passenger Locator Form&#8221; 24 hours before arrival, providing basic information such as the other countries they had visited in the past year. Everyone who submitted a form received a QR code that allowed tracking. Eva&#8217;s algorithm processes the information in the forms to identify those who need to be tested for COVID-19. Greece&#8217;s border control authorities processed an average of 38,500 forms each day; some 18% of those who submitted forms never actually arrived.</p>



<p><strong>Keeping COVID-19 at Bay</strong></p>



<p>Eva&#8217;s targeted testing, which allowed for adaptive border control policies, helped Greece keep its case count &#8220;very low pretty much all of the summer,&#8221; said Bastani. The country was able to maintain some economic activity, unlike many others that had to shut down completely, she noted. Greece imposed a second lockdown and travel restrictions in November after a spike in COVID-19 cases.</p>



<p>The Greek government acknowledged Eva’s accomplishments in a press conference last July. “The AI system developed by Bastani, Drakopoulos, Gupta, and Vlachogiannis has been an asset both for preparing the opening of the country to visitors from all over the world, as well as for allowing flexibility in decision-making regarding our COVID-19 strategy,” said Nikos Hardalias, Greece’s civil protection and deputy minister for crisis management, who heads the COVID-19 Response Taskforce for the country.</p>



<p><strong>Free-to-use Technology</strong></p>



<p>Eva is an open-source technology, which means Bastani and her team will provide it free of cost to any country that might want it. They have made presentations to COVID task forces in several countries in the European Union. Adapting it to other countries would involve designing passenger locator forms that are appropriate for different immigration processes and dovetailing back-end resources such as testing labs.</p>



<p>Bastani made a strong pitch for governments to capture private data such as that generated by the passenger locator forms used in the Greece deployment, and customize them to suit their specific situations. “No country should just be relying on public data; they should be actively monitoring who is coming to their borders, testing at least a subset of them, and using that to make informed decisions about border control,” she said. “That said, if a country doesn’t have the resources to do that, it’s probably better to use a policy that mimics another country that is doing that rather than relying only on public data.”</p>



<p>Bastani and her colleagues are working on refining Eva to incorporate more passenger-specific information than they used in the Greece deployment. Europe’s General Data Protection Regulation limited the scope of data they could use with Eva; they used only anonymized and aggregated data with limited demographic information. Other countries with less stringent data protection regulations could gather a wider range of data, such as on occupation, Bastani said. “We know that certain occupations carry a much higher COVID-19 risk than others.”</p>



<p>Eva could also be trained to incorporate pooling to mitigate constraints faced by testing labs, she added. Overloaded labs could share their samples with other labs that may have spare capacity at any given point in time, she explained. In much the same way, Eva could also use dynamic data to help determine optimal staffing levels at labs and other locations in the testing infrastructure, she added.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-artificial-intelligence-can-slow-the-spread-of-covid-19/">How Artificial Intelligence Can Slow the Spread of COVID-19</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-artificial-intelligence-can-slow-the-spread-of-covid-19/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How Can Artificial Intelligence Help Medicine?</title>
		<link>https://www.aiuniverse.xyz/how-can-artificial-intelligence-help-medicine/</link>
					<comments>https://www.aiuniverse.xyz/how-can-artificial-intelligence-help-medicine/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 25 Feb 2021 05:21:46 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Artificial]]></category>
		<category><![CDATA[Can]]></category>
		<category><![CDATA[How]]></category>
		<category><![CDATA[Intelligence]]></category>
		<category><![CDATA[medicine]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13070</guid>

					<description><![CDATA[<p>Source &#8211; https://www.healthtechzone.com/ Thanks to the technology that we have at our hands these days, our everyday lives are much easier. We are able to complete multiple <a class="read-more-link" href="https://www.aiuniverse.xyz/how-can-artificial-intelligence-help-medicine/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-can-artificial-intelligence-help-medicine/">How Can Artificial Intelligence Help Medicine?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.healthtechzone.com/</p>



<p>Thanks to the technology we have at our fingertips these days, our everyday lives are much easier. We are able to complete multiple tasks without even leaving the comfort of our homes, stay updated on the latest news, and much more. Medicine is one of the areas that has progressed immensely due to technological advancements, and we couldn&#8217;t be happier about it.</p>



<p>People receive proper care and accurate diagnostics, and the devices in use make every service both effective and efficient. In the past couple of years, there has been talk of implementing artificial intelligence in the medical sector as a way to improve the industry and make it near-perfect. We want to discuss how AI can help this sector, but we are also going to take a look at one industry where this technology is used to its fullest potential.</p>



<p><strong>Where Is AI Used Best?</strong></p>



<p>One of the industries that has managed to incorporate this technology and use its full potential is the online casino industry. Casino sites use automated, algorithm-driven systems, often described as AI, to protect their players and to enforce fair play. Let us explain how.</p>



<p>In order for every player to have equal chances of winning, online casinos use Random Number Generators (RNGs). An RNG produces an unpredictable outcome for each round of a game, giving all players equal odds. Casimba Casino is a good example of an online casino that features this alongside the security software we are about to explain.</p>
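<p>As a rough sketch of how such a draw might work (a toy illustration, not any casino&#8217;s actual code), a game outcome can be drawn with a cryptographically secure RNG so that every symbol is equally likely:</p>

```python
import secrets

def spin_reel(symbols: list[str]) -> str:
    """Draw one reel symbol with a cryptographically secure RNG,
    so every symbol (and therefore every player) gets equal odds."""
    return symbols[secrets.randbelow(len(symbols))]

reel = ["cherry", "bell", "seven", "bar"]
outcome = spin_reel(reel)
```

<p>Using `secrets` rather than an ordinary pseudo-random generator matters here: its output cannot be predicted by a player observing earlier spins.</p>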



<p>The security system at the aforementioned casino and many other casino sites is SSL encryption. This software encrypts all the data players submit, turning it into ciphertext that unwanted third parties cannot read. Strictly speaking, neither RNGs nor SSL encryption are artificial intelligence in themselves; both are algorithmic safeguards that ensure safety and fair play.</p>



<p><strong>How Will AI Help Medicine?</strong></p>



<p>Through the use of algorithms, medicine can reap great benefits. Medical sites can use SSL certificates to keep their patients’ data safe and out of harm’s way. Machine learning can also make an impact in areas such as radiology, pathology, cardiology, and ophthalmology. How? The algorithms keep learning and can analyze data from many patients far faster than a doctor can. By comparing a new case against previous diagnoses, such a system can aid the doctor in pinpointing the right treatment for that particular case.</p>
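<p>To illustrate the comparison step (with entirely made-up cases and feature names, not real clinical data), a minimal nearest-neighbour sketch might look like this:</p>

```python
import math

# Hypothetical past cases: features plus the diagnosis each received.
past_cases = [
    ({"age": 64, "bp": 155, "cholesterol": 260}, "hypertension"),
    ({"age": 35, "bp": 118, "cholesterol": 180}, "healthy"),
    ({"age": 58, "bp": 150, "cholesterol": 300}, "hyperlipidemia"),
]

def closest_diagnosis(patient: dict) -> str:
    """Return the diagnosis of the most similar past case,
    measured by Euclidean distance over the shared features."""
    def distance(case: dict) -> float:
        return math.dist(
            [case[k] for k in patient], [patient[k] for k in patient]
        )
    _, label = min(past_cases, key=lambda c: distance(c[0]))
    return label
```

<p>Real systems would normalize features and use far richer models, but the idea is the same: surface the most similar precedents so the doctor decides faster.</p>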



<p>These AI systems would aid in practices like diagnosis, treatment protocol development, patient monitoring, drug development, personalized medicine, and care.</p>



<p>Being efficient in this line of work is very important. Additionally, AI has the potential to be less prone to errors, which is a big advantage, especially when it comes to determining the diagnosis and the right treatment.</p>



<p>The only problem is that AI is still in its development stages, and authorities do not yet trust it enough to give it such an important role. While basic artificial intelligence is used in some sectors, many believe it is still too early to incorporate it fully. But as technology keeps evolving, we do not doubt that AI will help medicine become much more effective.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-can-artificial-intelligence-help-medicine/">How Can Artificial Intelligence Help Medicine?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-can-artificial-intelligence-help-medicine/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>What is Artificial Intelligence? How Does AI Work?</title>
		<link>https://www.aiuniverse.xyz/what-is-artificial-intelligence-how-does-ai-work/</link>
					<comments>https://www.aiuniverse.xyz/what-is-artificial-intelligence-how-does-ai-work/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 19 Feb 2021 05:41:11 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial]]></category>
		<category><![CDATA[How]]></category>
		<category><![CDATA[Intelligence]]></category>
		<category><![CDATA[What]]></category>
		<category><![CDATA[work]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=12931</guid>

					<description><![CDATA[<p>Source &#8211; https://www.business2community.com/ “Depending on who you ask, AI is either man’s greatest invention since the discovery of fire”, as Google’s CEO said at Google’s I/O 2017 <a class="read-more-link" href="https://www.aiuniverse.xyz/what-is-artificial-intelligence-how-does-ai-work/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/what-is-artificial-intelligence-how-does-ai-work/">What is Artificial Intelligence? How Does AI Work?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.business2community.com/</p>



<p>“Depending on who you ask, AI is either man’s greatest invention since the discovery of fire”, as Google’s CEO suggested at the Google I/O 2017 keynote, or a technology that might one day make man superfluous. What’s inarguable is that major companies have embraced AI as if it were one of the most important discoveries ever made. In the US, Amazon, Apple, Microsoft, Facebook, IBM, SAS, and Adobe have all infused AI and machine learning throughout their operations, while in China the big four – Baidu, Alibaba, Tencent, Xiaomi – are coordinating with the government, each working on unique and almost siloed AI initiatives.</p>



<p>In her article Understanding Three Types of Artificial Intelligence, Anjali UJ explains: “The term AI was coined by John McCarthy, an American computer scientist in 1956.” She describes the following three types of AI:</p>



<ol class="wp-block-list"><li>Narrow Artificial Intelligence: AI that has been trained for a narrow task.</li><li>Artificial General Intelligence: AI containing generalized cognitive abilities, which understand and reason the environment the way humans do.</li><li>Artificial Super Intelligence: AI that surpasses human intelligence and allows machines to mimic human thought.</li></ol>



<p>AI is not a new technology; in reality, it’s decades old. In his MIT Technology Review article Is AI Riding a One-Trick Pony?, James Somers states, “Just about every AI advance you’ve heard of depends on a breakthrough that’s three decades old.” Recent advances in chip technology, along with improvements in hardware, software, and electronics, have turned AI’s enormous potential into reality.</p>



<h2 class="wp-block-heading"><strong>Neural Nets</strong></h2>



<p>AI is founded on Artificial Neural Networks (ANN) or just “Neural Nets”, which are non-linear statistical data modelling tools used when the true nature of a relationship between input and output is unknown. In his article Machine Learning Applications for Data Center Optimization, Jim Gao describes neural nets as “a class of machine learning algorithms that mimic cognitive behavior via interactions between artificial neurons.” Neural nets search for patterns and interactions between features to automatically generate a best-fit model.</p>
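<p>A tiny, hand-built example shows what “interactions between artificial neurons” means in practice. The two-layer network below is a didactic sketch with fixed weights rather than a trained model; it computes XOR, a non-linear relationship that no single neuron can capture on its own:</p>

```python
def step(x: float) -> int:
    """Threshold activation: fire (1) only if the input is positive."""
    return 1 if x > 0 else 0

def neuron(inputs, weights, bias) -> int:
    """One artificial neuron: a weighted sum of its inputs passed
    through a non-linear activation -- the basic unit nets stack."""
    return step(sum(w * i for w, i in zip(weights, inputs)) + bias)

def tiny_net(x1: int, x2: int) -> int:
    """Two layers of neurons computing XOR."""
    h_or = neuron((x1, x2), (1, 1), -0.5)    # fires if either input is on
    h_and = neuron((x1, x2), (1, 1), -1.5)   # fires only if both are on
    return neuron((h_or, h_and), (1, -1), -0.5)  # "OR but not AND"
```

<p>In a real network these weights would be learned from data rather than chosen by hand, which is exactly what training does.</p>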



<p>Neural nets do not require the user to predefine a model’s feature interactions. Speech recognition, image processing, chatbots, recommendation systems, and autonomous software agents are common examples of machine learning. There are three types of training in neural networks: supervised, which is the most common, as well as unsupervised training and reinforcement learning. AI can be broken down into three areas:</p>



<h2 class="wp-block-heading"><strong>Machine Learning</strong></h2>



<p>A branch of computer science, machine learning explores the composition and application of algorithms that learn from data. These algorithms build models based on inputs and use those results to predict or determine actions and results, rather than following strict instructions.</p>



<p>Supervised learning’s goal is to learn a general rule that maps inputs to outputs and the computer is provided with example inputs as well as the desired outputs. With unsupervised learning, however, labeled data isn’t provided to the learning algorithm and it must find the input’s structure on its own. In reinforcement learning, the computer utilizes trial and error to solve a problem. Like Pavlov’s dog, the computer is rewarded for good actions it performs and the goal of the program is to maximize reward.</p>
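<p>The reward-maximizing loop of reinforcement learning can be sketched in a few lines. In this toy two-armed bandit (the payoff probabilities are hypothetical and hidden from the agent), the program tries actions, is “rewarded” for good ones, and gradually learns to prefer the better lever:</p>

```python
import random

random.seed(0)  # reproducible toy run

true_reward = {"lever_a": 0.2, "lever_b": 0.8}  # hidden from the agent
estimates = {"lever_a": 0.0, "lever_b": 0.0}
counts = {"lever_a": 0, "lever_b": 0}

for _ in range(500):
    # Explore a random lever 10% of the time, otherwise exploit
    # the lever with the best reward estimate so far.
    if random.random() < 0.1:
        action = random.choice(list(true_reward))
    else:
        action = max(estimates, key=estimates.get)
    reward = 1 if random.random() < true_reward[action] else 0
    counts[action] += 1
    # Incremental average: nudge the estimate toward the observed reward.
    estimates[action] += (reward - estimates[action]) / counts[action]

best_action = max(estimates, key=estimates.get)
```

<p>Nothing told the agent which lever was better; like Pavlov’s dog, it inferred that purely from the rewards it collected.</p>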



<h2 class="wp-block-heading"><strong>Deep learning</strong></h2>



<p>A subset of machine learning, deep learning utilizes multi-layered neural nets to perform classification tasks directly from image, text, and/or sound data. In some cases, deep learning models are already exceeding human-level performance. Google Meet’s ability to transcribe a human voice during a live conference call is an example of deep learning’s impressive capabilities.</p>



<p>ML and deep learning are useful for personalized marketing, customer recommendations, spam filtering, fraud detection, network security, optical character recognition (OCR), computer vision, voice recognition, predictive asset maintenance, sentiment analysis, language translation, and online search, among others.</p>



<h2 class="wp-block-heading"><strong>7 Patterns of AI</strong></h2>



<p>In her Forbes article The Seven Patterns of AI, Kathleen Walch lays out a theory that, regardless of the application, all AI applications share seven commonalities. These are “hyperpersonalization, autonomous systems, predictive analytics and decision support, conversational/human interactions, patterns and anomalies, recognition systems, and goal-driven systems.” Walch adds that, while each pattern might require its own programming and pattern recognition, the patterns can be combined with one another, and each follows its own fairly standard set of rules.</p>



<p>The ‘Hyperpersonalization Pattern’ can be boiled down to the slogan, ‘Treat each customer as an individual’. ‘Autonomous systems’ will reduce the need for manual labor. Predictive analytics portends “some future value for data, predicting behavior, predicting failure, assisted problem resolution, identifying and selecting best fit, identifying matches in data, optimization activities, giving advice, and intelligent navigation,” says Walch. The ‘Conversational Pattern’ includes chatbots, which allow humans to communicate with machines via voice, text, or image.</p>



<p>The ‘Patterns and Anomalies’ type utilizes machine learning to discern patterns in data and it attempts to discover higher-order connections between data points, explains Walch. The recognition pattern helps identify and determine objects within image, video, audio, text, or other highly unstructured data notes Walch. The ‘Goal-Driven Systems Pattern’ utilizes the power of reinforcement learning to help computers beat humans on some of the most complex games imaginable, including&nbsp;<em>Go&nbsp;</em>and&nbsp;<em>Dota 2</em>, a complicated multiplayer online battle arena video game.</p>



<h2 class="wp-block-heading"><strong>Conclusion</strong></h2>



<p>A few years ago, the AI hype had reached such a fever pitch that companies just had to add ‘AI’, ‘ML’, or ‘Deep Learning’ to their pitch decks, and funding flooded through the door. Even so, businesses continue to invest in AI-powered solutions like AIOps to reduce IT operations costs. Today, investors are a little wiser to the fact that not all that glitters is AI gold, and a lot of companies who pitched themselves as AI experts really didn’t know the difference between a neural net and a&nbsp;<em>k</em>-means algorithm.</p>



<p>Jumping head-first into AI is a recipe for disaster. Only “1 in 3 AI projects are successful and it takes more than 6 months to go from concept to production, with a significant portion of them never making it to production—creating an AI dilemma for organizations,” says Databricks. Not only is AI old, but it is also a difficult technology to implement. Anyone delving into AI needs a strong understanding of the technology: what it is, where it came from, and what limitations might hold it back. Although AI is exceptional technology, the waters are deep, and it is far from the panacea that many software companies claim it is. AI has endured not one but two AI winters. CEOs looking to make a substantial investment in AI should be well aware of the old saying that ‘a fool and his money are easily parted’, as that fool could be an AI fool, too.</p>
<p>The post <a href="https://www.aiuniverse.xyz/what-is-artificial-intelligence-how-does-ai-work/">What is Artificial Intelligence? How Does AI Work?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/what-is-artificial-intelligence-how-does-ai-work/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How has a career in data analytics changed?</title>
		<link>https://www.aiuniverse.xyz/how-has-a-career-in-data-analytics-changed/</link>
					<comments>https://www.aiuniverse.xyz/how-has-a-career-in-data-analytics-changed/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 14 Jun 2019 10:15:04 +0000</pubDate>
				<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Analytics]]></category>
		<category><![CDATA[Career]]></category>
		<category><![CDATA[changed]]></category>
		<category><![CDATA[data]]></category>
		<category><![CDATA[How]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=3832</guid>

					<description><![CDATA[<p>Source:- .siliconrepublic.com We spoke to EY’s Eoin O’Reilly to find out more about how the data analytics role has changed in recent years. At this stage, referring to <a class="read-more-link" href="https://www.aiuniverse.xyz/how-has-a-career-in-data-analytics-changed/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-has-a-career-in-data-analytics-changed/">How has a career in data analytics changed?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source:- siliconrepublic.com</p>
<p>We spoke to EY’s Eoin O’Reilly to find out more about how the data analytics role has changed in recent years.</p>
<p>At this stage, referring to data as ‘the new oil’ has been rendered a cliché, but it nevertheless rings enduringly true. In an increasingly digital world, data can completely change the way organisations do business and, as such, is deeply valuable to C-level executives.</p>
<p>This has inspired a lot of change in the field, particularly in terms of the types of duties someone working in data analytics has and how they are perceived within the broader scheme of the company. Yet what specifically has changed?</p>
<p>To find out, we chatted to Eoin O’Reilly, a partner at EY Ireland and leader of the company’s analytics and emerging tech business. For the Irish hub, that translates to roughly 130 people, having grown sharply from a team of only six in 2014.</p>
<p>His team essentially applies advanced techniques and AI to business problems to help its clients ensure they’re functioning at the highest possible level. It aids clients in their analytics strategies and helps them think about where analytics can be applied. EY also applies those kinds of techniques to more traditional services such as audit and tax, services that can be augmented and improved with the use of innovative technologies.</p>
<p>“It’s fascinating, actually. Those traditional services are all being disrupted. So how we use analytics and AI in those areas is becoming increasingly important for our clients and, increasingly, a way that we differentiate our services,” O’Reilly explains.</p>
<p>One way O’Reilly notes that roles in data analytics have changed is how they are perceived. They used to be, as he puts it, “lower-level” positions. In all likelihood, data analytics was once widely thought of as an esoteric and highly technical pocket of the large engine of a company.</p>
<p>“What we’re seeing in the market is that analytics and innovation [are] now seen as strategically important to organisations. We’re starting to see leadership roles in that arena. I think the traditional analytics professional was very focused on the tech part of the job, so building the models, applying science to data, but I think that’s probably changed a little bit. Now, people are seeing that a career in analytics is much wider. It might start in that technical domain but you have an opportunity to grow.”</p>
<p>As such, data analytics professionals now need to have a totally different set of skills. On top of the requisite upskilling to keep up with the breathless pace of technological advancement, your career in data analytics may very well now involve storytelling.</p>
<p>“How do [data professionals] tell a story about data to senior organisations, make it real? How do they collaborate in an organisation? How do they work with traditional skills in finance, supply chain and operations to really bring analytics to life? Analytics skills on their own don’t mean that you’re going to have a successful analytics programme,” said O’Reilly.</p>
<p>People working in this space will have a deep – and in many ways unprecedented – connection to the business side of an organisation. Not only does that require commercial acumen, but communication. Ultimately, many professionals working in data analytics will have to explain what they do to people without a data background, and do so in a sufficiently accessible way.</p>
<p>“It’s still a scientific discipline so the technical skills are still important. That should never be watered down. But I think if you can match these three Cs – creativity, communication and collaboration – you’ve got a really good standout analytics professional.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-has-a-career-in-data-analytics-changed/">How has a career in data analytics changed?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-has-a-career-in-data-analytics-changed/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
