<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>autonomous robot Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/autonomous-robot/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/autonomous-robot/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Thu, 23 Jul 2020 07:15:16 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>What do AI users really think it&#8217;s capable of?</title>
		<link>https://www.aiuniverse.xyz/what-do-ai-users-really-think-its-capable-of/</link>
					<comments>https://www.aiuniverse.xyz/what-do-ai-users-really-think-its-capable-of/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 23 Jul 2020 07:15:06 +0000</pubDate>
				<category><![CDATA[Human Intelligence]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[autonomous robot]]></category>
		<category><![CDATA[Self-learning]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=10412</guid>

					<description><![CDATA[<p>Source: techradar.com It’s safe to say we’ve been distracted for the last few years. While a tumultuous political, social and economic landscape has seized Britain, in the <a class="read-more-link" href="https://www.aiuniverse.xyz/what-do-ai-users-really-think-its-capable-of/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/what-do-ai-users-really-think-its-capable-of/">What do AI users really think it&#8217;s capable of?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: techradar.com</p>



<p>It’s safe to say we’ve been distracted for the last few years. While a tumultuous political, social and economic landscape has seized Britain, in the background a technological revolution has been taking place. No longer a futuristic concept, artificial intelligence (AI) is making its way into our lives right now. </p>



<p>Self-learning machines are already found in devices and cloud services used by three in four global consumers. They’re also dictating which media we consume, how we communicate with each other and what our jobs entail. Could human intelligence soon be replaced?</p>



<p>Perhaps not. There are a huge number of misconceptions around AI, not to mention fear about its capabilities. The best way to define its true meaning, abilities and potential impact on the world is to speak to those using it today. I took part in a series of focus groups with data scientists, business leaders, academics and students, all of whom work closely with this technology. As those who are shaping how AI impacts our society, their views show whether we should embrace AI or fight back.</p>



<h3 class="wp-block-heading" id="a-change-is-coming-to-jobs-but-unemployment-won-x2019-t-rise">A change is coming to jobs, but unemployment won’t rise</h3>



<p>One topic was pervasive: job losses due to AI-driven automation. While it’s positive that most participants believed AI would create more jobs than it replaced, there was little agreement on the duration, severity or consequences of job losses resulting from AI in the short term. In particular, younger participants tended to be more pessimistic about their future prospects, anticipating a significant rise in AI-enabled inequality and a breakdown of social cohesion. Some feared the powerful technology being placed in the hands of a few could drive a much greater divide between those with power, wealth and influence and those without.</p>



<p>Yet, while there was some trepidation from this current and future workforce, the message from the boardroom was loud and clear: workers have little to fear from AI. The reason for this was simple – even in an AI-driven future, humans would remain a valuable commodity worth investing in. They would continue to deliver value that machines do not.</p>



<h3 class="wp-block-heading" id="autonomous-robot-assistants-aren-x2019-t-on-the-cards-x2026-yet">Autonomous robot assistants aren’t on the cards… yet</h3>



<p>As several of the professors and data scientists informed us, we are still a long way from the ‘general intelligence’ so often portrayed in science fiction. Despite the hype, most AIs are designed to be very good at solving one specific problem under very particular parameters. Introduce a new variable and the system breaks down, or an entirely new model needs to be created.</p>



<p>Time and time again, the respondents reminded us that human creativity, insight and contextual awareness were key to making AI work. Technical executives in the C-suite told us how they ensured any autonomous processes were closely monitored and supervised by human employees. AI solutions with hidden internal workings weren’t worth the risk, due to a lack of transparency and explainability.</p>



<p>These sorts of validation roles have started to emerge only recently. With time, however, more transparent processes where employees review, understand and resolve the decisions made by AI systems will be a massive source of employment. Like any piece of software, the quality of an AI’s insight depends on the quality of the data you feed into it, and it takes a human to judge whether that data is good enough.</p>



<h3 class="wp-block-heading" id="the-world-as-we-know-it-is-changing-for-good">The world as we know it is changing for good</h3>



<p>Technological revolutions are nothing new. Each generation is faced with a new set of technologies which upend stability in favour of progress. How many Uber drivers, YouTubers and app developers did you know at the start of the millennium? Just as the internet revolutionized life as we knew it, AI is powerful enough to cause seismic change across all industries. But, as one respondent in our focus groups put it, “AI will replace us just like computers did. That’s to say, it won’t.”</p>



<p>Today’s genuine AI users argue that public perceptions of AI often contain elements of sci-fi. In reality, the future belongs to the cyborg, rather than the android. This is a key distinction: rather than imitating humans and challenging us at our own game, our economy will be defined by humans able to work hand in glove with AI. In a team, humans and AI can develop simultaneously to make better decisions, improve productivity, and ultimately boost humanity to new heights – and that revolution has already started.</p>
<p>The post <a href="https://www.aiuniverse.xyz/what-do-ai-users-really-think-its-capable-of/">What do AI users really think it&#8217;s capable of?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/what-do-ai-users-really-think-its-capable-of/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>On Your Left! Adept new robot rolls sociably with pedestrians</title>
		<link>https://www.aiuniverse.xyz/on-your-left-adept-new-robot-rolls-sociably-with-pedestrians/</link>
					<comments>https://www.aiuniverse.xyz/on-your-left-adept-new-robot-rolls-sociably-with-pedestrians/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 12 Sep 2017 06:18:18 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Reinforcement Learning]]></category>
		<category><![CDATA[autonomous robot]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[mobile robots]]></category>
		<category><![CDATA[robotic design]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=1074</guid>

					<description><![CDATA[<p>Source &#8211; enidnews.com Pass on the left. Maintain a respectable distance from others. Stop or change course to avoid bumping into someone. For most humans, a stroll down <a class="read-more-link" href="https://www.aiuniverse.xyz/on-your-left-adept-new-robot-rolls-sociably-with-pedestrians/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/on-your-left-adept-new-robot-rolls-sociably-with-pedestrians/">On Your Left! Adept new robot rolls sociably with pedestrians</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; <strong>enidnews.com</strong></p>
<p>Pass on the left. Maintain a respectable distance from others. Stop or change course to avoid bumping into someone. For most humans, a stroll down a hallway or sidewalk is second nature.</p>
<p>Now, engineers at MIT have designed an autonomous robot with “sociably aware navigation” – the ability to keep pace with foot traffic while observing the unspoken rules of the pedestrian road.</p>
<p>The knee-high cousin of R2-D2 refined its sidewalk skills in the busy, winding hallways of MIT’s Stata Center, successfully keeping up with the average pace of pedestrians without running into – or over – anyone.</p>
<p>The researchers will detail the pedestrian-friendly robotic design in a paper they are scheduled to present at the IEEE Conference on Intelligent Robots and Systems in Vancouver in September. Their work was funded by Ford Motor Co.</p>
<p>New applications for these devices will multiply as the technology continues to improve, the lead researcher said.</p>
<p>“Socially aware navigation is a central capability for mobile robots operating in environments that require frequent interactions with pedestrians,” said Yu Fan Chen, the study’s lead author and a former MIT graduate student who led the team. “Small robots could operate on sidewalks for package and food delivery. Similarly, personal mobility devices could transport people in large, crowded spaces such as shopping malls, airports and hospitals.”</p>
<p>The seemingly simple task of navigating a busy hallway in a natural way entails a complex set of calculations. A robot must recognize its surroundings, know where it is, identify the optimal path to its destination and execute the path, taking into account the motion of people and predicting their goals and intentions.</p>
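<p>The four-part cycle described above (recognize surroundings, localize, plan a path, execute it) can be sketched in a few lines of Python. Everything here is illustrative, not the MIT system's code: the veer-sideways planner and all the names are assumptions made for the sketch.</p>

```python
import math

def plan_heading(pose, goal, obstacles, avoid_radius=0.5):
    """Motion planning (toy): head toward the goal, but veer
    sideways when a pedestrian is closer than avoid_radius."""
    heading = math.atan2(goal[1] - pose[1], goal[0] - pose[0])
    for ox, oy in obstacles:
        dx, dy = ox - pose[0], oy - pose[1]
        if math.hypot(dx, dy) < avoid_radius:
            # steer 90 degrees away from the nearby pedestrian
            heading = math.atan2(dy, dx) + math.pi / 2
    return heading

def navigation_step(pose, goal, obstacles, speed=1.2):
    """One cycle of the pipeline: perception and localization results
    arrive as inputs (obstacles, pose); planning picks a heading; the
    returned (vx, vy) velocity command is what control would execute."""
    heading = plan_heading(pose, goal, obstacles)
    return (speed * math.cos(heading), speed * math.sin(heading))

# With no one nearby, the robot drives straight at the goal:
print(navigation_step((0.0, 0.0), (10.0, 0.0), []))   # (1.2, 0.0)
```

<p>The real difficulty the article goes on to describe sits almost entirely in the planning step, since the obstacles here are moving people whose goals must be predicted.</p>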
<p>The MIT team outfitted the device with off-the-shelf sensors, including webcams, a depth sensor and a lidar sensor, a remote sensing method that uses pulsed laser to measure ranges.</p>
<p>Its localization skills are rooted in open-source algorithms that map the environment and determine the robot’s position. The researchers controlled the robot using the standard methods used to drive autonomous vehicles.</p>
<p>But motion planning was a challenge, since the paths of individual pedestrians are difficult to predict. “Once you figure out where you are in the world and know how to follow trajectories, which trajectories should you be following?” said team member Michael Everett.</p>
<p>A robot programmed to calculate everyone’s expected trajectories and then plot its own optimal course would spend a lot of time on the side of a walkway, calculating.</p>
<p>Alternatively, a robot could be programmed with a reactive approach, quickly computing new paths to avoid collisions. But since pedestrians are unpredictable – they wander and weave and stop to pet cute dogs – reactive robots tend to be awkward, colliding with people or going to excessive lengths to avoid them.</p>
<p>Instead, the MIT team used reinforcement learning, a machine learning approach, performing computer simulations to train a robot to take certain paths given the speed and trajectory of other objects in the environment.</p>
<p>They also incorporated pedestrian social norms during this offline training. For example, they encouraged the robot in simulations to proceed through a hallway on the right and penalized the device when it went forward on the left.</p>
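<p>The reward shaping just described – rewarding progress while penalizing left-side passing – might look something like the following toy function. The article does not publish the MIT team's actual reward terms, so the specific penalties and weights below are assumptions made purely to illustrate the idea.</p>

```python
def social_reward(progress, collided, passed_on_left, min_separation):
    """Toy per-step reward for simulated training. All weights are
    illustrative assumptions, not the MIT team's values."""
    r = progress                  # reward forward progress toward goal
    if collided:
        r -= 10.0                 # large penalty for hitting a pedestrian
    if passed_on_left:
        r -= 1.0                  # social-norm penalty: pass on the right
    if min_separation < 0.3:      # metres to the nearest pedestrian
        r -= 0.5                  # discourage crowding people
    return r
```

<p>Summed over an episode, a policy that habitually passes on the left scores strictly worse than one that passes on the right, which is how the simulated training can bake a social norm into the learned behaviour.</p>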
<p>Once the robot was trained in simulation, the researchers programmed it to pursue the optimal paths when it recognized a similar scenario in the real world.</p>
<p>The researchers also enabled the robot to assess its environment and adjust its course every tenth of a second. That way, the robot could roll through the hallway at a typical walking speed of 1.2 meters per second with no pauses to reprogram its route.</p>
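<p>A fixed-rate sense–plan–act loop like the one just described – a fresh plan every tenth of a second at walking pace – can be sketched as follows. The <code>robot</code> interface here is hypothetical; only the 0.1-second period and 1.2 m/s speed come from the article.</p>

```python
import time

REPLAN_PERIOD = 0.1   # seconds: adjust course every tenth of a second
WALK_SPEED = 1.2      # m/s: typical pedestrian pace from the article

def run_loop(robot, goal, steps=30):
    """Sense, re-plan, and command on every tick, sleeping only the
    remainder of the period so the rate holds a steady 10 Hz and the
    robot never pauses to recompute its route."""
    for _ in range(steps):
        t0 = time.monotonic()
        obstacles = robot.sense()                       # perception
        cmd = robot.plan(goal, obstacles, WALK_SPEED)   # re-plan
        robot.execute(cmd)                              # control
        time.sleep(max(0.0, REPLAN_PERIOD - (time.monotonic() - t0)))
```

<p>Sleeping only the leftover slice of each period, rather than a fixed 0.1 s, keeps the cycle rate steady even when sensing and planning take a variable amount of time.</p>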
<p>“This way, we think our robot looks more natural, and is anticipating what people are doing,” said Everett.</p>
<p>Next up for the MIT team: exploring how robots might handle crowds in a pedestrian environment.</p>
<p>The post <a href="https://www.aiuniverse.xyz/on-your-left-adept-new-robot-rolls-sociably-with-pedestrians/">On Your Left! Adept new robot rolls sociably with pedestrians</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/on-your-left-adept-new-robot-rolls-sociably-with-pedestrians/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		
			</item>
		<item>
		<title>This robot can follow rules of pedestrians</title>
		<link>https://www.aiuniverse.xyz/this-robot-can-follow-rules-of-pedestrians/</link>
					<comments>https://www.aiuniverse.xyz/this-robot-can-follow-rules-of-pedestrians/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 01 Sep 2017 09:47:10 +0000</pubDate>
				<category><![CDATA[Human Intelligence]]></category>
		<category><![CDATA[Reinforcement Learning]]></category>
		<category><![CDATA[autonomous robot]]></category>
		<category><![CDATA[Intelligent Robots]]></category>
		<category><![CDATA[mobile robots]]></category>
		<category><![CDATA[open-source algorithms]]></category>
		<category><![CDATA[robot]]></category>
		<category><![CDATA[robotic design]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=890</guid>

					<description><![CDATA[<p>Source &#8211; economictimes.indiatimes.com Engineers at Massachusetts Institute of Technology (MIT) have designed an autonomous robot that can keep pace with foot traffic while observing the general social codes <a class="read-more-link" href="https://www.aiuniverse.xyz/this-robot-can-follow-rules-of-pedestrians/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/this-robot-can-follow-rules-of-pedestrians/">This robot can follow rules of pedestrians</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; <strong>economictimes.indiatimes.com</strong></p>
<p>Engineers at Massachusetts Institute of Technology (MIT) have designed an autonomous robot that can keep pace with foot traffic while observing the general social codes that pedestrians follow to avoid oncoming obstacles while keeping up a steady walking pace.</p>
<p>In drive tests, the robot, which resembles a knee-high kiosk on wheels, successfully avoided collisions while keeping up with the average flow of pedestrians, said the researchers who have detailed their robotic design in a paper scheduled to be presented at the IEEE Conference on Intelligent Robots and Systems to be held in Vancouver, Canada, in September.</p>
<p>&#8220;Socially aware navigation is a central capability for mobile robots operating in environments that require frequent interactions with pedestrians,&#8221; said lead author of the study Yu Fan (Steven) Chen.</p>
<p>&#8220;For instance, small robots could operate on sidewalks for package and food delivery. Similarly, personal mobility devices could transport people in large, crowded spaces, such as shopping malls, airports, and hospitals,&#8221; Chen said.</p>
<p>In order for a robot to make its way autonomously through a heavily trafficked environment, it must solve four main challenges &#8212; localisation (knowing where it is in the world), perception (recognising its surroundings), motion planning (identifying the optimal path to a given destination), and control (physically executing its desired path).</p>
<p>Chen and his colleagues used standard approaches to solve the problems of localisation and perception.</p>
<p>For perception, they outfitted the robot with off-the-shelf sensors, such as webcams, a depth sensor, and a high-resolution lidar sensor.</p>
<p>For localisation, they used open-source algorithms to map the robot&#8217;s environment and determine its position. To control the robot, they employed standard methods used to drive autonomous ground vehicles.</p>
<p>For motion planning, the researchers used reinforcement learning, a type of machine learning approach, in which they performed computer simulations to train the robot to take certain paths, given the speed and trajectory of other objects in the environment.</p>
<p>The team also incorporated social norms into this offline training phase, encouraging the robot in simulations to pass on the right and penalising it when it passed on the left.</p>
<p>&#8220;We want it to be travelling naturally among people and not be intrusive,&#8221; study co-author Michael Everett said.</p>
<p>The post <a href="https://www.aiuniverse.xyz/this-robot-can-follow-rules-of-pedestrians/">This robot can follow rules of pedestrians</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/this-robot-can-follow-rules-of-pedestrians/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		
			</item>
	</channel>
</rss>
