<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>artificial brain Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/artificial-brain/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/artificial-brain/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Thu, 10 Sep 2020 09:59:54 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Artificial Brain Gives Robots Unprecedented Sensing Capabilities</title>
		<link>https://www.aiuniverse.xyz/artificial-brain-gives-robots-unprecedented-sensing-capabilities/</link>
					<comments>https://www.aiuniverse.xyz/artificial-brain-gives-robots-unprecedented-sensing-capabilities/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 10 Sep 2020 09:59:37 +0000</pubDate>
				<category><![CDATA[Robotics]]></category>
		<category><![CDATA[applications]]></category>
		<category><![CDATA[artificial brain]]></category>
		<category><![CDATA[COVID 19]]></category>
		<category><![CDATA[researchers]]></category>
		<category><![CDATA[Robots]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=11491</guid>

					<description><![CDATA[<p>Source: designnews.com Robots have come a long way in their functionality, but there are still many sensing capabilities that can’t be achieved by these systems that compare <a class="read-more-link" href="https://www.aiuniverse.xyz/artificial-brain-gives-robots-unprecedented-sensing-capabilities/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-brain-gives-robots-unprecedented-sensing-capabilities/">Artificial Brain Gives Robots Unprecedented Sensing Capabilities</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: designnews.com</p>



<p>Robots have come a long way in their functionality, but these systems still lack many of the sensing capabilities that let humans interact with their environments.</p>



<p>To solve this issue, researchers at the National University of Singapore (NUS) have created a complex artificial brain system called NeuTouch that mimics human neural networks to provide neuromorphic processing for robotic systems. This should provide them with more sophisticated sensing functionality, including what’s needed to pick up, hold, and manipulate objects in a way that mimics human interactions.</p>



<p>The current problem with robotic systems is that they depend on visual processing rather than the sense of touch that humans rely on to handle and manipulate objects, said Benjamin C.K. Tee, an assistant professor at NUS Materials Science and Engineering, who co-led the development of NeuTouch with Assistant Professor Harold Soh from NUS Computer Science.</p>



<p>“Robots need to have a sense of touch in order to interact better with humans, but robots today still cannot feel objects very well,” he told <em>Design News</em>. “Touch sensing allows robots to perceive objects based on their physical properties, e.g., surface texture, weight, and stiffness. Such tactile sensing capability augments the robot’s perception of the physical world with information beyond what standard vision and auditory modalities can provide.”</p>



<p><strong>Building a Complete System</strong></p>



<p>The new solution builds on technology Tee and fellow researchers created last year when they developed an artificial nervous system that can give robots and prosthetic devices a sense of touch on par with or even better than human skin.</p>



<p>This system, called Asynchronous Coded Electronic Skin (ACES), can detect touches more than 1,000 times faster than the human sensory nervous system, as well as identify the shape, texture, and hardness of objects 10 times faster than the blink of an eye, <em>Design News</em> reported at the time.</p>



<p>NeuTouch can process sensory data from ACES using neuromorphic technology, which is an area of computing that emulates the neural structure and operation of the human brain. To do this, researchers integrated Intel’s Loihi neuromorphic research chip into the system, Tee said.</p>



<p>By using ACES, NeuTouch can mimic the function of the fast-adapting (FA) mechanoreceptors of a human fingertip, which capture dynamic pressure, or dynamic skin deformations, Tee said.</p>
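<p>The article does not describe NeuTouch's encoding scheme at this level of detail, but FA-like behavior is often sketched as delta modulation: emit a spike event only when pressure has changed by more than a threshold since the last event, so a static press goes silent while dynamic deformation produces a spike train. The function name, threshold, and event format below are hypothetical, not part of the NUS system.</p>

```python
def fa_spikes(pressure, threshold=0.5):
    """Delta-modulation sketch of a fast-adapting (FA) tactile channel.

    Emits (sample_index, polarity) events only when the pressure signal
    has moved by more than `threshold` since the last emitted event, so
    a constant press is silent while dynamic deformation spikes.
    """
    events = []
    reference = pressure[0]
    for i, p in enumerate(pressure[1:], start=1):
        while p - reference > threshold:       # rising edge -> ON events
            reference += threshold
            events.append((i, +1))
        while reference - p > threshold:       # falling edge -> OFF events
            reference -= threshold
            events.append((i, -1))
    return events

# A press-and-hold: events fire only while the pressure is changing,
# which is the FA property the paragraph above describes.
print(fa_spikes([0, 1, 2, 3, 3, 3, 3]))
```

Note that a sustained press produces no further events once the signal stops changing, exactly the "dynamic pressure" selectivity attributed to FA mechanoreceptors.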



<p>“FA responses are crucial for dexterous manipulation tasks that require rapid detection of object slippage, object hardness, and local curvature,” he told <em>Design News</em>.</p>



<p><strong>Testing for Results</strong></p>



<p>To test the system, researchers fitted a robotic hand with ACES and used it to read braille, passing the tactile data to Loihi via the cloud to convert the micro bumps felt by the hand into a semantic meaning.</p>



<p>In these experiments, Loihi achieved over 92 percent accuracy in classifying the Braille letters, while using 20 times less power than a normal microprocessor.</p>



<p>In other tests, researchers demonstrated how they could improve the robot’s perception capabilities by combining both vision and touch data in a spiking neural network. They tasked a robot equipped with both artificial skin and vision sensors to classify various opaque containers containing differing amounts of liquid. They also tested the system’s ability to identify rotational slip, which is important for stable grasping.</p>
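<p>The paper's actual spiking-network architecture is not given in this article. As a loose illustration of what multimodal fusion buys, the sketch below sums weighted spike counts from a tactile channel and a visual channel into per-class evidence; the weights, channel layout, and class count are invented for the example, whereas in the real system a trained spiking network plays this role.</p>

```python
import numpy as np

# Toy readout weights for 3 container classes over 4 tactile and 4
# visual spike-count channels. Hand-picked for illustration only.
W_TOUCH = np.array([[1.0, 0.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0, 0.0],
                    [0.0, 0.0, 1.0, 1.0]])
W_VISION = np.array([[0.5, 0.0, 0.0, 0.0],
                     [0.0, 0.5, 0.0, 0.0],
                     [0.0, 0.0, 0.5, 0.5]])

def classify(touch_counts, vision_counts):
    """Fuse modalities by summing each one's weighted spike counts,
    then pick the class with the most combined evidence."""
    evidence = W_TOUCH @ touch_counts + W_VISION @ vision_counts
    return int(np.argmax(evidence))

# Touch alone leaves classes 0 and 1 tied; the visual evidence breaks
# the tie, which is the kind of gain fusion provides.
touch = np.array([3.0, 3.0, 0.0, 0.0])
vision = np.array([0.0, 4.0, 1.0, 0.0])
print(classify(touch, vision))   # -> 1
```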



<p>In both tests, the spiking neural network that used both vision and touch data was able to classify objects and detect object slippage with 10 percent more accuracy than a system that used only vision.</p>



<p>Moreover, NeuTouch could also classify the sensory data while it was being accumulated, unlike the conventional approach, in which data is classified only after it has been fully gathered.</p>
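<p>This classify-while-accumulating idea can be sketched in a few lines: update per-class evidence after every incoming spike event and keep a running best guess, rather than waiting for the full observation window. The channel-to-class vote table below is hypothetical and stands in for a trained readout.</p>

```python
from collections import Counter

def classify_while_accumulating(event_stream, class_votes):
    """Anytime-classification sketch: after every incoming spike event,
    update per-class evidence and record the current best guess, instead
    of classifying once at the end of the window."""
    evidence = Counter()
    running = []
    for channel in event_stream:
        evidence[class_votes[channel]] += 1   # one more vote for a class
        running.append(evidence.most_common(1)[0][0])
    return running

# Hypothetical vote table: channels 0-1 support "empty", 2-3 "full".
votes = {0: "empty", 1: "empty", 2: "full", 3: "full"}
print(classify_while_accumulating([0, 2, 2, 3, 2], votes))
```

The running guess can be read out at any point, which is what lets the system respond before the data is fully gathered.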



<p>The tests also demonstrated the efficiency of neuromorphic technology; Loihi processed the sensory data 21 percent faster than a top-performing graphics processing unit (GPU) while using more than 45 times less power.</p>



<p>Researchers published a paper on their work online and presented their findings at the Robotics: Science and Systems conference.</p>



<p><strong>Applications and Post-COVID-19 Uses</strong></p>



<p>Some applications for NeuTouch include integrating the system into robot grippers to detect slip, which is key to manipulating fragile objects safely and with stability, such as in factory or supply-chain settings, Tee told <em>Design News</em>.</p>



<p>“Accurate detection of slip will allow the robot controller to re-grasp the object and remedy poor initial grasp locations,” he told us. “This feature can be applied to develop more intelligent robots to take over mundane operations such as packing of items in warehouses, which robotic arms can easily adapt to unfamiliar items and apply the appropriate amount of strength to manipulate the items without slippage.”</p>



<p>The system can also be used to create autonomous robots “capable of deft manipulation in (unstructured) physical spaces, since the robots have the ability to feel and better perceive their surroundings,” he added.</p>



<p>Moving forward, researchers plan to continue their work to develop the artificial skin for applications in the logistics and food manufacturing industries, where there is a high demand for robotic automation, Tee told <em>Design News</em>.</p>



<p>This type of functionality will become especially critical in a post-COVID-19 world for creating applications that avoid human contact by letting robots do the work, he said.</p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-brain-gives-robots-unprecedented-sensing-capabilities/">Artificial Brain Gives Robots Unprecedented Sensing Capabilities</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/artificial-brain-gives-robots-unprecedented-sensing-capabilities/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>New breakthrough by NUS researchers gives robots intelligent sensing abilities to carry out complex tasks</title>
		<link>https://www.aiuniverse.xyz/new-breakthrough-by-nus-researchers-gives-robots-intelligent-sensing-abilities-to-carry-out-complex-tasks/</link>
					<comments>https://www.aiuniverse.xyz/new-breakthrough-by-nus-researchers-gives-robots-intelligent-sensing-abilities-to-carry-out-complex-tasks/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 18 Jul 2020 06:31:01 +0000</pubDate>
				<category><![CDATA[Robotics]]></category>
		<category><![CDATA[artificial brain]]></category>
		<category><![CDATA[Development]]></category>
		<category><![CDATA[NUS Computer Science]]></category>
		<category><![CDATA[NUS researchers]]></category>
		<category><![CDATA[Robots]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=10277</guid>

					<description><![CDATA[<p>Source: indiaeducationdiary.in Picking up a can of soft drink may be a simple task for humans, but this is a complex task for robots – it has <a class="read-more-link" href="https://www.aiuniverse.xyz/new-breakthrough-by-nus-researchers-gives-robots-intelligent-sensing-abilities-to-carry-out-complex-tasks/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/new-breakthrough-by-nus-researchers-gives-robots-intelligent-sensing-abilities-to-carry-out-complex-tasks/">New breakthrough by NUS researchers gives robots intelligent sensing abilities to carry out complex tasks</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: indiaeducationdiary.in</p>



<p>Picking up a can of soft drink may be a simple task for humans, but this is a complex task for robots – it has to locate the object, deduce its shape, determine the right amount of strength to use, and grasp the object without letting it slip. Most of today’s robots operate solely based on visual processing, which limits their capabilities. In order to perform more complex tasks, robots have to be equipped with an exceptional sense of touch and the ability to process sensory information quickly and intelligently.</p>



<p>A team of computer scientists and materials engineers from NUS has recently demonstrated an exciting approach to make robots smarter. They developed a sensory integrated artificial brain system that mimics biological neural networks, which can run on a power-efficient neuromorphic processor, such as Intel’s Loihi chip. This novel system integrates artificial skin and vision sensors, equipping robots with the ability to draw accurate conclusions about the objects they are grasping based on the data captured by the vision and touch sensors in real-time.</p>



<p>“The field of robotic manipulation has made great progress in recent years. However, fusing both vision and tactile information to provide a highly precise response in milliseconds remains a technology challenge. Our recent work combines our ultra-fast electronic skins and nervous systems with the latest innovations in vision sensing and AI for robots so that they can become smarter and more intuitive in physical interactions,” said Assistant Professor Benjamin Tee from NUS Materials Science and Engineering. He co-leads this project with Assistant Professor Harold Soh from NUS Computer Science.</p>



<p>The findings of this cross-disciplinary work were presented at the renowned Robotics: Science and Systems conference in July 2020. The research paper is available online.</p>



<p><strong>Human-like sense of touch for robots</strong></p>



<p>Enabling a human-like sense of touch in robotics could significantly improve current functionality, and even lead to new uses. For example, on the factory floor, robotic arms fitted with electronic skins could easily adapt to different items, using tactile sensing to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping.</p>



<p>In the new robotic system, the NUS team applied an advanced artificial skin known as Asynchronous Coded Electronic Skin (ACES) developed by Asst Prof Tee and his team in 2019. This novel sensor detects touches more than 1,000 times faster than the human sensory nervous system. It can also identify the shape, texture and hardness of objects 10 times faster than the blink of an eye.</p>
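<p>The article does not specify ACES's signalling scheme, but asynchronous tactile readout is commonly sketched as an address-event stream: each taxel (tactile pixel) emits timestamped events on its own schedule, and the receiver merges them in time order rather than polling every taxel on a fixed frame clock. Everything below, including the taxel names, is illustrative.</p>

```python
import heapq

def merge_async_events(*taxel_streams):
    """Merge per-taxel (timestamp, taxel_id) event streams into one
    time-ordered stream. Each taxel fires independently; nothing here
    polls the whole skin on a fixed frame clock. Each input stream must
    already be sorted by timestamp, as heapq.merge requires."""
    return list(heapq.merge(*taxel_streams))

# Three taxels firing independently (timestamps in ms).
taxel_a = [(1.0, "A"), (5.5, "A")]
taxel_b = [(2.1, "B")]
taxel_c = [(0.4, "C"), (2.0, "C")]
print(merge_async_events(taxel_a, taxel_b, taxel_c))
# -> [(0.4, 'C'), (1.0, 'A'), (2.0, 'C'), (2.1, 'B'), (5.5, 'A')]
```

Because only changing taxels emit events, the downstream processor sees touches as soon as they happen instead of waiting for the next frame, which is one way an asynchronous skin can beat a frame-based readout on latency.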



<p>“Making an ultra-fast artificial skin sensor solves about half the puzzle of making robots smarter. They also need an artificial brain that can ultimately achieve perception and learning as another critical piece in the puzzle,” added Asst Prof Tee, who is also from the NUS Institute for Health Innovation &amp; Technology.</p>



<p><strong>A human-like brain for robots</strong></p>



<p>To break new ground in robotic perception, the NUS team explored neuromorphic technology – an area of computing that emulates the neural structure and operation of the human brain – to process sensory data from the artificial skin. As Asst Prof Tee and Asst Prof Soh are members of the Intel Neuromorphic Research Community (INRC), it was a natural choice to use Intel’s Loihi neuromorphic research chip for their new robotic system.</p>
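<p>The basic unit that neuromorphic chips such as Loihi implement in hardware is the spiking neuron. A minimal software sketch of a leaky integrate-and-fire neuron, with arbitrary parameter values rather than anything from the NUS system, shows the event-driven character: the membrane potential decays between input spikes, and the neuron fires only when inputs arrive close enough together.</p>

```python
import math

def lif_neuron(spike_times, weight=0.6, tau=20.0, threshold=1.0):
    """Leaky integrate-and-fire neuron (toy parameters, in ms).

    The membrane potential decays exponentially between input spikes,
    integrates each arriving spike, and emits an output spike (then
    resets) whenever it crosses threshold.
    """
    v, last_t, out = 0.0, 0.0, []
    for t in spike_times:
        v *= math.exp(-(t - last_t) / tau)  # leak since the last input
        v += weight                         # integrate this input spike
        last_t = t
        if v >= threshold:
            out.append(t)                   # fire an output spike
            v = 0.0                         # reset the membrane
    return out

# Two closely spaced inputs push the neuron over threshold; a lone,
# late input decays away without producing an output spike.
print(lif_neuron([1.0, 2.0, 80.0]))   # -> [2.0]
```

Because computation only happens when a spike arrives, such neurons sit idle between events, which is the source of the power savings the article attributes to neuromorphic hardware.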



<p>In their initial experiments, the researchers fitted a robotic hand with the artificial skin, and used it to read braille, passing the tactile data to Loihi via the cloud to convert the micro bumps felt by the hand into a semantic meaning. Loihi achieved over 92 per cent accuracy in classifying the Braille letters, while using 20 times less power than a normal microprocessor.</p>



<p>Asst Prof Soh’s team improved the robot’s perception capabilities by combining both vision and touch data in a spiking neural network. In their experiments, the researchers tasked a robot equipped with both artificial skin and vision sensors to classify various opaque containers containing differing amounts of liquid. They also tested the system’s ability to identify rotational slip, which is important for stable grasping.</p>
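<p>Rotational slip means, loosely, the object turning within the grasp. As a frame-based toy (the real system works on spike events, and the paper's method is not described here), the sketch below estimates slip as the mean change in each contact point's angle about the contact centroid between two tactile snapshots.</p>

```python
import math

def rotational_slip(contacts_before, contacts_after):
    """Mean angular displacement of contact points about their centroid.

    A positive return value means the contact pattern rotated counter-
    clockwise between the two snapshots. Angle differences are wrapped
    into (-pi, pi] to avoid branch-cut artifacts near the +/-pi axis.
    """
    cx = sum(x for x, _ in contacts_before) / len(contacts_before)
    cy = sum(y for _, y in contacts_before) / len(contacts_before)
    deltas = []
    for (x0, y0), (x1, y1) in zip(contacts_before, contacts_after):
        d = math.atan2(y1 - cy, x1 - cx) - math.atan2(y0 - cy, x0 - cx)
        deltas.append((d + math.pi) % (2 * math.pi) - math.pi)  # wrap
    return sum(deltas) / len(deltas)

# A square contact pattern rotated by 0.1 rad about its centroid.
before = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]
theta = 0.1
after = [(x * math.cos(theta) - y * math.sin(theta),
          x * math.sin(theta) + y * math.cos(theta)) for x, y in before]
print(round(rotational_slip(before, after), 3))   # -> 0.1
```

A grasp controller would watch this quantity and tighten or re-grasp when it drifts from zero, which is the stable-grasping use the article describes.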



<p>In both tests, the spiking neural network that used both vision and touch data was able to classify objects and detect object slippage. The classification was 10 per cent more accurate than a system that used only vision. Moreover, using a technique developed by Asst Prof Soh’s team, the neural networks could classify the sensory data while it was being accumulated, unlike the conventional approach where data is classified after it has been fully gathered. In addition, the researchers demonstrated the efficiency of neuromorphic technology: Loihi processed the sensory data 21 per cent faster than a top performing graphics processing unit (GPU), while using more than 45 times less power.</p>



<p>This novel robotic system developed by NUS researchers comprises an artificial brain system that mimics biological neural networks, which can be run on a power-efficient neuromorphic processor such as Intel’s Loihi chip, and is integrated with artificial skin and vision sensors.</p>



<p>Asst Prof Soh shared, “We’re excited by these results. They show that a neuromorphic system is a promising piece of the puzzle for combining multiple sensors to improve robot perception. It’s a step towards building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations.”</p>



<p>“This research from the National University of Singapore provides a compelling glimpse to the future of robotics where information is both sensed and processed in an event-driven manner combining multiple modalities. The work adds to a growing body of results showing that neuromorphic computing can deliver significant gains in latency and power consumption once the entire system is re-engineered in an event-based paradigm spanning sensors, data formats, algorithms, and hardware architecture,” said Mr Mike Davies, Director of Intel’s Neuromorphic Computing Lab.</p>



<p>This research was supported by the National Robotics R&amp;D Programme Office (NR2PO), a set-up that nurtures the robotics ecosystem in Singapore through funding research and development (R&amp;D) to enhance the readiness of robotics technologies and solutions. Key considerations for NR2PO’s R&amp;D investments include the potential for impactful applications in the public sector, and the potential to create differentiated capabilities for our industry.</p>



<p><strong>Next steps</strong></p>



<p>Moving forward, Asst Prof Tee and Asst Prof Soh plan to further develop their novel robotic system for applications in the logistics and food manufacturing industries where there is a high demand for robotic automation, especially moving forward in the post-COVID era.</p>
<p>The post <a href="https://www.aiuniverse.xyz/new-breakthrough-by-nus-researchers-gives-robots-intelligent-sensing-abilities-to-carry-out-complex-tasks/">New breakthrough by NUS researchers gives robots intelligent sensing abilities to carry out complex tasks</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/new-breakthrough-by-nus-researchers-gives-robots-intelligent-sensing-abilities-to-carry-out-complex-tasks/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
