<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>computer chips Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/computer-chips/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/computer-chips/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Tue, 31 Mar 2020 09:20:18 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>A startup is building computer chips using human neurons</title>
		<link>https://www.aiuniverse.xyz/a-startup-is-building-computer-chips-using-human-neurons/</link>
					<comments>https://www.aiuniverse.xyz/a-startup-is-building-computer-chips-using-human-neurons/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 31 Mar 2020 09:20:12 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[computer chips]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[DeepMind]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=7867</guid>

					<description><![CDATA[<p>Source: fortune.com One of the most promising approaches to artificial intelligence is to try to mimic how the human brain works in software. But now an Australian <a class="read-more-link" href="https://www.aiuniverse.xyz/a-startup-is-building-computer-chips-using-human-neurons/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/a-startup-is-building-computer-chips-using-human-neurons/">A startup is building computer chips using human neurons</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: fortune.com</p>



<p>One of the most promising approaches to artificial intelligence is to try to mimic how the human brain works in software.</p>

<p>But now an Australian startup has gone a step further. It’s actually building miniature disembodied brains, using real, biological neurons embedded on a specialized computer chip.</p>

<p>Cortical Labs, based in Melbourne, is hoping to teach these hybrid mini-brains to perform many of the same tasks that software-based artificial intelligence can, but at a fraction of the energy consumption. Currently, the company is working to get its mini-brains—which so far are approaching the processing power of a dragonfly brain—to play the old Atari arcade game <em>Pong</em>, Hon Weng Chong, the company’s cofounder and chief executive officer, said.</p>

<p>The benchmark is significant because <em>Pong</em> was among the early Atari games that DeepMind—the London-based A.I. company known for its work with artificial neural networks, software that in some ways mimics the functioning of human neurons—first used to demonstrate the performance of its A.I. algorithms in 2013. That demonstration helped lead to Google’s purchase of DeepMind the following year.</p>

<p>Cortical Labs uses two methods to create its hardware: It either extracts mouse neurons from embryos or it uses a technique in which human skin cells are transformed back into stem cells and then induced to grow into human neurons, Chong said.</p>

<p>These neurons are then embedded in a nourishing liquid medium on top of a specialized metal-oxide chip containing a grid of 22,000 tiny electrodes that enable programmers to provide electrical inputs to the neurons and also sense their outputs.</p>

<p>Right now, Cortical Labs is using mouse neurons for its <em>Pong</em> research.</p>

<p>“What we are trying to do is show we can shape the behavior of these neurons,” Chong said.</p>

<p>Although it is starting with <em>Pong</em>, a task Chong said he thinks Cortical Labs will be able to master by the end of the year, he added that the company’s hybrid chips could eventually be the key to delivering the kinds of complex reasoning and conceptual understanding that today’s A.I. can’t produce.</p>

<p>The company’s method, if it proves scalable, also offers a potential solution to one of the most vexing problems facing deep learning: It is extremely energy intensive.</p>

<p>AlphaGo, the deep-learning system DeepMind created to play Go and which beat the world’s best human player in that ancient strategy game in 2016, consumed one megawatt of power while playing the game, enough to power about 100 homes for a day, according to an estimate by technology company Ceva. By contrast, the human brain consumes about 20 watts of power, or 50,000 times less energy than AlphaGo used.</p>

<p>Karl Friston, a neuroscientist at University College London renowned for his work on brain imaging, as well as the theoretical underpinnings of how biological systems, including collections of neurons, self-organize, saw a demonstration of Cortical Labs’ technology earlier this year and said he is impressed with the company’s work.</p>

<p>Aspects of Cortical Labs’ system are based on Friston’s work and the research of some of his students, but the neuroscientist has no affiliation with the Australian startup.</p>

<p>Friston said he always assumed his ideas about how neurons organize would be used to build more efficient neuromorphic computer chips—hardware that tries to mimic how the brain processes information much more closely than today’s standard computer chips do. The idea of trying to integrate biological neurons with semiconductors is not, Friston said, an idea he’d anticipated.</p>

<p>“But to my surprise and delight they have gone straight for the real thing,” he said of Cortical Labs’ use of real biological neurons. “What this group has been able to do is, to my mind, the right way forward to making these ideas work in practice.”</p>

<p>Using real neurons avoids several other difficulties that software-based neural networks have. For instance, to get artificial neural networks to start learning well, their programmers usually have to engage in a laborious process of manually adjusting the initial coefficients, or weights, that will be applied to each type of data point the network processes. Another challenge is to get the software to balance how much it should be trying to explore new solutions to a problem versus relying on solutions the network has already discovered that work well.</p>

<p>“All these problems are completely eluded if you have a system that is based on biological neurons to begin with,” Friston said.</p>

<p>Chong, a former medical doctor who had founded a previous health technology company, began researching ways to create hybrid biologic-computer intelligence systems about two years ago, along with his cofounder and chief technology officer, Andy Kitchen.</p>

<p>Chong said the pair were interested in the idea of artificial general intelligence (AGI for short)—A.I. that has the flexibility to perform almost any kind of task as well or better than humans. “Everyone is racing to build AGI, but the only true AGI we know of is biological intelligence, human intelligence,” Chong said. He noted the pair figured the only way to get human-level intelligence was to use human neurons.</p>
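The explore-versus-exploit balance mentioned above is a standard problem in machine learning. As a rough illustration only (unrelated to Cortical Labs' actual system; all names here are made up), an epsilon-greedy bandit captures the tension in a few lines: most of the time pick the option that has paid off best so far, and occasionally try a random one in case something better exists.

```python
import random

def epsilon_greedy_bandit(true_rewards, steps=5000, epsilon=0.1, seed=1):
    """Estimate per-arm value while balancing exploration and exploitation."""
    rng = random.Random(seed)
    n_arms = len(true_rewards)
    counts = [0] * n_arms        # how many times each arm was pulled
    values = [0.0] * n_arms      # running mean reward per arm
    for _ in range(steps):
        if rng.random() < epsilon:                  # explore: random arm
            arm = rng.randrange(n_arms)
        else:                                       # exploit: best arm so far
            arm = max(range(n_arms), key=values.__getitem__)
        reward = true_rewards[arm] + rng.gauss(0, 0.1)  # noisy payoff
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
    return values, counts

vals, counts = epsilon_greedy_bandit([0.2, 0.5, 0.9])
```

With a small epsilon, the agent converges on the highest-paying arm while still spending a fixed fraction of its pulls exploring; set epsilon to zero and it can lock onto an early lucky arm and never recover.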



<p>Mouse neurons, which Cortical Labs is also experimenting with, have long been used as proxies for human neurons by neuroscientists because there were long-established methods for extracting and culturing them. (The ability to engineer human neurons from skin cells has only been perfected in the past decade.) Recently, scientists at the Allen Institute for Brain Science in Seattle have found differences in the proteins that coat mouse and human neurons, which may mean they have different electrical properties and that mouse neurons may not actually be good stand-ins for human ones.</p>

<p>Chong said he and Kitchen took inspiration from the work of Takuya Isomura, a researcher at the RIKEN Center for Brain Science outside Tokyo who has studied under Friston. Isomura had shown in 2015 how cultured cortical neurons overlaid on an electrode grid could learn to overcome the “cocktail party” effect, separating an individual audio signal, such as a person’s voice, from the cacophony of background noise.</p>

<p>Cortical Labs, which was founded formally only last June, has received about $610,000 in seed funding from Blackbird Ventures, a prominent Australian venture capital firm.</p>

<p>It is not the only company working on biological computing. A startup called Koniku, based in San Rafael, Calif., has developed a 64-neuron silicon chip, built using mouse neurons, that can sense certain chemicals. The company wants to use the chips in drones that it will sell to militaries and law enforcement for detecting explosives.</p>

<p>Meanwhile, researchers at the Massachusetts Institute of Technology have taken a different approach—using a specialized strain of bacteria in a hybrid chip to compute and store information.</p>
<p>The post <a href="https://www.aiuniverse.xyz/a-startup-is-building-computer-chips-using-human-neurons/">A startup is building computer chips using human neurons</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/a-startup-is-building-computer-chips-using-human-neurons/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Google experiments with AI to design its in-house computer chips</title>
		<link>https://www.aiuniverse.xyz/google-experiments-with-ai-to-design-its-in-house-computer-chips/</link>
					<comments>https://www.aiuniverse.xyz/google-experiments-with-ai-to-design-its-in-house-computer-chips/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 19 Feb 2020 06:02:17 +0000</pubDate>
				<category><![CDATA[Google AI]]></category>
		<category><![CDATA[AI experiment]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[computer chips]]></category>
		<category><![CDATA[Google]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=6881</guid>

					<description><![CDATA[<p>Source: zdnet.com Alphabet&#8217;s Google unit is trying out artificial intelligence programs to advance its internal development of dedicated chips to accelerate its software, according to Google&#8217;s head of AI <a class="read-more-link" href="https://www.aiuniverse.xyz/google-experiments-with-ai-to-design-its-in-house-computer-chips/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/google-experiments-with-ai-to-design-its-in-house-computer-chips/">Google experiments with AI to design its in-house computer chips</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: zdnet.com</p>



<p>Alphabet&#8217;s Google unit is trying out artificial intelligence programs to advance its internal development of dedicated chips to accelerate its software, according to Google&#8217;s head of AI research, Jeff Dean. </p>



<p>&#8220;We are using it internally for a few chip design projects,&#8221; said Dean in an interview with&nbsp;<em>ZDNet</em>&nbsp;Monday, following a keynote talk he gave at the International Solid State Circuits Conference, an annual technical symposium held in San Francisco.&nbsp;</p>



<p>Google has over the course of several years developed a family of AI hardware, its Tensor Processing Unit, or TPU, chip, for processing AI in its server computers.&nbsp;</p>



<p>Using AI to design those chips would represent a kind of virtuous cycle, where AI makes chips better, and then those improved chips boost the power of the AI algorithms, and so on.&nbsp;</p>



<p>During his keynote, Dean described to the audience how a machine learning program can be used to make some decisions about how to lay out the circuits of a computer chip, with the resulting design matching or beating the work of a human chip designer.&nbsp;</p>



<p>In the traditional &#8220;place and route&#8221; task, chip designers use software to determine the layout in a chip of the circuits that form the chip&#8217;s operations, analogous to designing the floor plan of a building. A number of variables come into play to find an optimal layout that fulfills several objectives, including delivering chip performance, but also avoiding unnecessary complexity that can drive up the cost to manufacture the chip. That balancing act requires a lot of human heuristics about how best to pursue design. Now, AI algorithms may be able to experiment in ways that can be competitive with those heuristics.&nbsp;</p>



<p>In one example, Dean told the audience that a deep learning neural network, after only twenty-four hours on the problem, found a better solution than human designers had produced after six to eight weeks. The design reduced the total wiring needed in the chip.</p>



<p>The deep learning program is akin to the AlphaZero program developed by Google&#8217;s DeepMind unit to conquer the game of Go. Like AlphaZero, the chip design program is a form of what&#8217;s called reinforcement learning. In order to achieve a goal, the program tries various steps to see which ones lead to better results. Rather than pieces on a game board, the moves are choices of how to place the right circuit layout in the total chip design. </p>



<p>Unlike in Go, however, the solution &#8220;space,&#8221; the number of possible circuit layouts, is vastly larger. And, as mentioned above, numerous objectives have to be accommodated, rather than the single objective in Go of winning the game.&nbsp;</p>
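Google has not published the details of its system here, but the objective such an agent optimizes can be sketched in miniature: place connected blocks on a grid so that total Manhattan wirelength (the negative of the reward) is minimized. The toy below uses plain random search rather than a learned policy, and names such as `place_by_search` are illustrative only.

```python
import random

def wirelength(placement, nets):
    """Total Manhattan distance across all connected block pairs."""
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1]) for a, b in nets)

def place_by_search(blocks, nets, grid=4, iters=2000, seed=0):
    """Sample random placements, score each, and keep the best one found."""
    rng = random.Random(seed)
    cells = [(x, y) for x in range(grid) for y in range(grid)]
    best, best_cost = None, float("inf")
    for _ in range(iters):
        placement = dict(zip(blocks, rng.sample(cells, len(blocks))))
        cost = wirelength(placement, nets)   # lower cost = higher "reward"
        if cost < best_cost:
            best, best_cost = placement, cost
    return best, best_cost

# Three blocks wired in a chain: the optimum is two adjacent steps (cost 2).
layout, cost = place_by_search(["a", "b", "c"], [("a", "b"), ("b", "c")])
```

A reinforcement learner replaces the random sampling with a policy that is updated from the reward signal, which matters once the search space is far too large to sample blindly, as it is for real chips.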



<p>Dean, talking with <em>ZDNet</em>, described the internal efforts as being in the early stages of understanding the utility of the technology. &#8220;We&#8217;re getting our designers to experiment with it and see how they start to make use of it in their workflows,&#8221; said Dean. </p>



<p>&#8220;We&#8217;re trying to understand how it&#8217;s useful, and what areas does it improve on.&#8221;&nbsp;</p>



<p>Google&#8217;s foray into AI design comes amidst a renaissance in chip production, as companies large and small design dedicated silicon to run machine learning faster. Dedicated AI hardware can lead to larger and more efficient machine learning software projects, according to some machine learning scientists. </p>



<p>The diversity created by AI hardware startup companies, such as Cerebras Systems and Graphcore, can be expected to continue apace, said Dean, even as Google expands its own efforts.&nbsp;</p>



<p>Dean said the variety that&#8217;s emerging is intriguing.&nbsp;</p>



<p>&#8220;I&#8217;m not sure if they&#8217;re all going to survive, but it&#8217;s pretty interesting because many of them are taking very different design points in the design space,&#8221; Dean said of the startups. &#8220;Just as one distinction, some are accelerating models that are very small, that can fit in on-chip SRAM,&#8221; he said, meaning, the size of the machine learning model is so small it doesn&#8217;t need external memory.&nbsp;</p>



<p>&#8220;And if your model fits in SRAM, those things are going to be very effective, but if your model doesn&#8217;t, that&#8217;s not the chip for you.&#8221;</p>



<p>Asked if the chips will converge on some standard design, Dean suggested diversity is more likely, at least for the time being.&nbsp;</p>



<p>&#8220;I do think there&#8217;s going to be more heterogeneity in the kind of approaches used, not less,&#8221; he said, &#8220;because if you look at the explosion in machine learning research, and uses of machine learning in lots of different kinds of problems, it&#8217;s going to be a large enough set of things in the world that you&#8217;re not going to want just one design, you&#8217;re going to want five or six — not a thousand, but five or six different design points.&#8221;</p>



<p>Added Dean, &#8220;It&#8217;ll be interesting to see which ones hold up, in terms of, are they generally useful for a lot of things, or are they very specialized and accelerate one kind of thing but don&#8217;t do well on others.&#8221;</p>



<p>As for Google&#8217;s own efforts beyond the TPU, Dean indicated there&#8217;s an appetite for more and more dedicated silicon at Google. Asked if the trend to AI hardware at Google &#8220;has legs,&#8221; meaning, can extend beyond its current offerings, Dean replied, &#8220;Oh, yeah.&#8221;</p>



<p>&#8220;Definitely there&#8217;s growing use of machine learning across Google products, both data-center-based services, but also much more of our stuff is running on device on the phone,&#8221; said Dean. The Google Translate application is an example of a sophisticated program, now covering seventy different languages, that can run on a phone even in airplane mode, he noted, when there&#8217;s no connection back to the data center.</p>



<p>The family of Google silicon for AI has already broadened, he indicated. The &#8220;Edge TPU,&#8221; for example, is a designation that covers &#8220;different design points,&#8221; said Dean, including low-power applications on the one hand and high-performance applications at the heart of the data center on the other. Asked if the variety could broaden still further, Dean replied, &#8220;I think it could.&#8221;</p>



<p>&#8220;Even within non-data-center things, you&#8217;re already seeing a distinction of higher power environments like autonomous vehicles, things that don&#8217;t have to be at the 1-watt level, they can be fifty or a hundred watts,&#8221; he said. &#8220;So you want different parts for that versus something on a phone.&#8221; At the same time, there will be ultra-low-power applications like sensors in agriculture that do some AI processing without sending any data to the cloud. Equipped with AI, such a sensor can assess whether there is any data of interest being picked up, say, via a camera, and stream those individual data points back to the cloud for analysis.</p>
<p>The post <a href="https://www.aiuniverse.xyz/google-experiments-with-ai-to-design-its-in-house-computer-chips/">Google experiments with AI to design its in-house computer chips</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/google-experiments-with-ai-to-design-its-in-house-computer-chips/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>AI Chip Startup Cerebras Reveals &#8216;World&#8217;s Fastest AI Supercomputer&#8217;</title>
		<link>https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/</link>
					<comments>https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 28 Nov 2019 09:54:38 +0000</pubDate>
				<category><![CDATA[AI-ONE]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[AI software]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[computer chips]]></category>
		<category><![CDATA[supercomputer]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5451</guid>

					<description><![CDATA[<p>Source: crn.com Artificial intelligence chip startup Cerebras Systems claims it has the &#8220;world&#8217;s fastest AI supercomputer,&#8221; thanks to its large Wafer Scale Engine processor that comes with <a class="read-more-link" href="https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/">AI Chip Startup Cerebras Reveals &#8216;World&#8217;s Fastest AI Supercomputer&#8217;</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: crn.com</p>



<p>Artificial intelligence chip startup Cerebras Systems claims it has the &#8220;world&#8217;s fastest AI supercomputer,&#8221; thanks to its large Wafer Scale Engine processor that comes with 400,000 compute cores.</p>



<p>The Los Altos, Calif.-based startup introduced its CS-1 system at the Supercomputing conference in Denver last week after raising more than $200 million in funding from investors, most recently with an $88 million Series D round that was raised in November 2018, according to Andrew Feldman, the founder and CEO of Cerebras who was previously an executive at AMD.</p>



<p>The CS-1 system is based on the startup&#8217;s Wafer Scale Engine chip, which it says is &#8220;the only trillion transistor wafer scale processor in existence,&#8221; measuring 56.7 times larger and containing 78 times more compute cores than the largest GPU.</p>






<p>With Wafer Scale Engine&#8217;s 400,000 compute cores and 18 GB of on-chip memory, the startup said the CS-1 can deliver compute performance with less space and power than any other system, representing one-third of a standard data center rack while replacing the need for hundreds of thousands of GPUs.</p>



<p>&#8220;The CS-1 is the industry’s fastest AI computer, and because it is easy to install, quick to bring up and integrates with existing AI models in TensorFlow and PyTorch, it delivers value the day it is deployed,&#8221; Feldman said in a recent statement. &#8220;Depending on workload, the CS-1 delivers hundreds or thousands of times the performance of legacy alternatives at one-tenth the power draw and one-tenth the space per unit compute.&#8221;</p>



<p>Cerebras is targeting deep learning workloads, both for training and inference. The startup said the large size of its Wafer Scale Engine allows it to &#8220;process information more quickly&#8221; than other AI accelerators like GPUs, reducing training work from months to minutes. Inference, meanwhile, &#8220;is thousands of times faster,&#8221; with the Wafer Scale Engine capable of making a single image classification in microseconds (a microsecond is one-thousandth of a millisecond).</p>



<p>Among Cerebras&#8217; dozens of customers are the U.S. Department of Energy&#8217;s Argonne National Laboratory and Lawrence Livermore National Laboratory. Argonne, in particular, is using the CS-1 to accelerate neural networks for cancer studies, study the properties of black holes and treat traumatic brain injury.</p>



<p>The startup doesn&#8217;t have plans to sell its chips or systems through channel partners for now, according to Cerebras&#8217; spokesperson.</p>



<p>Marc Fertik, vice president of technology solutions at Elk Grove Village, Ill.-based Ace Computers, No. 261 on CRN&#8217;s 2019 Solution Provider 500 list, said with the CS-1 likely costing a &#8220;fortune,&#8221; it wouldn&#8217;t make sense for Cerebras to work with resellers until it starts selling less expensive systems at higher volumes.</p>



<p>&#8220;As soon as you move down in price point to increase your volume, you need to build your channel, because that&#8217;s when your business development and sales force can&#8217;t do it alone,&#8221; he said.</p>



<p>Even then, he said, prebuilt systems aren&#8217;t for everyone in the channel, citing Nvidia&#8217;s GPU-accelerated DGX deep learning system as an example. While some partners have built practices around the platform, others, like Ace Computers, would rather focus on building their own GPU-accelerated systems using parts from server vendors such as Supermicro, according to Fertik.</p>



<p>&#8220;We have military guys that don’t care about the extra software support, but they care about the hardware cost,&#8221; he said. &#8220;We have successfully convinced them and sold them multiple times something that is 30 percent cheaper.&#8221;</p>
<p>The post <a href="https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/">AI Chip Startup Cerebras Reveals &#8216;World&#8217;s Fastest AI Supercomputer&#8217;</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>New leap for artificial intelligence: computer chips that can smell</title>
		<link>https://www.aiuniverse.xyz/new-leap-for-artificial-intelligence-computer-chips-that-can-smell/</link>
					<comments>https://www.aiuniverse.xyz/new-leap-for-artificial-intelligence-computer-chips-that-can-smell/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 29 Aug 2017 10:49:30 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[computer chips]]></category>
		<category><![CDATA[neurotechnology device]]></category>
		<category><![CDATA[OSHIORENOYA AGABI]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=825</guid>

					<description><![CDATA[<p>Source &#8211; livemint.com Arusha (Tanzania): Nigerian neuroscientist Oshiorenoya Agabi may have found a way to solve one of life’s puzzling dilemmas: how to make air travel pleasant again. What <a class="read-more-link" href="https://www.aiuniverse.xyz/new-leap-for-artificial-intelligence-computer-chips-that-can-smell/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/new-leap-for-artificial-intelligence-computer-chips-that-can-smell/">New leap for artificial intelligence: computer chips that can smell</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; <strong>livemint.com</strong></p>
<p><b>Arusha (Tanzania): </b>Nigerian neuroscientist Oshiorenoya Agabi may have found a way to solve one of life’s puzzling dilemmas: how to make air travel pleasant again.</p>
<p>What if you could skip tedious airport security lines, while a special device able to sniff out explosives works silently in the background?</p>
<p>This is only one of the possible uses of what Agabi says is the world’s first neurotechnology device developed by his Silicon Valley-based start-up Koniku and unveiled at the TEDGlobal conference in Tanzania on Sunday.</p>
<p>While those in the field of Artificial Intelligence (AI) are working furiously to create machines that can mimic the brain, or—like tech entrepreneur Elon Musk—implant computers in our brains, Agabi has found a way to merge lab-grown neurons with electronic circuitry.</p>
<p>As many grapple with the finite processing power of silicon, the 38-year-old said he had looked to the brain, which is “the most powerful processor the universe has ever seen.”</p>
<p>To simulate the power of just 204 brain neurons would require a supercomputer, he said.</p>
<p>“Instead of copying a neuron, why not just take the biological cell itself and use it as it is? That thought is radical. The consequence of this is mind-boggling,” he said.</p>
<p>So he and a team of geneticists, physicists, bio-engineers, molecular biologists and others set about doing just that, focusing on the problems that were particularly hard for silicon devices to solve.</p>
<p>This includes detecting volatile chemicals and explosives or even illnesses such as cancer.</p>
<p>Agabi said the Koniku Kore device is “a world first” and able to do just that, essentially through breathing in and smelling the air.</p>
<p>He said “major brands”, including those in the travel industry, had signed up and the start-up’s current revenues of $8 million (€7 million) were expected to leap to $30 million by 2018.</p>
<p>One of the main challenges was finding a way to keep the neurons alive, a secret Agabi did not wish to expand on, saying only they could be kept alive for two years in a lab environment and two months in the device.</p>
<p>As AI improves in leaps and bounds, scientists are trying, with growing success, to make machines more like our brains, able to learn and understand their surroundings: a prospect that is terrifying for many.</p>
<p>Musk, who has repeatedly warned about the perils of AI making humans obsolete, is working on a new project to implant “neural lace” brain-interface technology to prevent humans becoming like a “house cat” to potential machine masters.</p>
<p>However, Agabi, who grew up in Lagos where he helped his mother sell food on the streets, believes the future of AI lies in making machines more alive.</p>
<p>He believes his company could build a cognitive humanoid system based on synthetic living neurons in the next five to seven years.</p>
<p>“It’s not science fiction,” he told <i>AFP.</i> “We want to build a brain of biological neurons—an autonomous system that has intelligence. We do not want to build a human brain.”</p>
<p>Agabi earned a bachelor’s degree in theoretical physics in Lagos before taking an interest in neuroscience and bio-engineering for his PhD in London.</p>
<p>He spoke at the opening session of the four-day TED Global conference, putting African ideas, innovation and creativity in the spotlight with a variety of speakers who each get an 18-minute window to get across their message of choice.</p>
<p>TED—originally known as Technology, Entertainment and Design—has built a global following for its online videos of inspiring talks devoted to “ideas worth spreading.”</p>
<p>The annual international version is taking place in Africa for the first time in a decade with a new crop of “TED Fellows” from the continent to take to the stage.</p>
<p>“This gathering couldn’t come a moment too soon,” said TEDGlobal co-curator Emeka Okafor. “Africa has experienced spectacular economic, demographic and creative growth, but both opportunity and danger are rising at an exponential rate. Our conference will gather the idea catalysts, problem-solvers and change-makers already hard at work here charting Africa’s own path to modernity.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/new-leap-for-artificial-intelligence-computer-chips-that-can-smell/">New leap for artificial intelligence: computer chips that can smell</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/new-leap-for-artificial-intelligence-computer-chips-that-can-smell/feed/</wfw:commentRss>
			<slash:comments>4</slash:comments>
		
		
			</item>
	</channel>
</rss>
