<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Autonomous vehicles Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/autonomous-vehicles/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/autonomous-vehicles/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Thu, 24 Dec 2020 06:16:04 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>HOW IS ARTIFICIAL INTELLIGENCE TRANSFORMING THE LIVES OF PEOPLE WITH DISABILITIES?</title>
		<link>https://www.aiuniverse.xyz/how-is-artificial-intelligence-transforming-the-lives-of-people-with-disabilities/</link>
					<comments>https://www.aiuniverse.xyz/how-is-artificial-intelligence-transforming-the-lives-of-people-with-disabilities/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 24 Dec 2020 06:15:13 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[AI Model]]></category>
		<category><![CDATA[Autonomous vehicles]]></category>
		<category><![CDATA[AWS]]></category>
		<category><![CDATA[Facebook]]></category>
		<category><![CDATA[Google]]></category>
		<category><![CDATA[Microsoft]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=12472</guid>

					<description><![CDATA[<p>Source: analyticsinsight.net Leveraging Artificial Intelligence to Create Impressive Products for Disabled People Technology is an excellent way to enhance the lives of people with disabilities. With the advent <a class="read-more-link" href="https://www.aiuniverse.xyz/how-is-artificial-intelligence-transforming-the-lives-of-people-with-disabilities/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-is-artificial-intelligence-transforming-the-lives-of-people-with-disabilities/">HOW IS ARTIFICIAL INTELLIGENCE TRANSFORMING THE LIVES OF PEOPLE WITH DISABILITIES?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: analyticsinsight.net</p>



<h3 class="wp-block-heading">Leveraging Artificial Intelligence to Create Impressive Products for Disabled People</h3>



<p>Technology is an excellent way to enhance the lives of people with disabilities. With the advent of artificial intelligence, several avenues of research have opened up that focus on enhancing the lives of people with impairment.</p>



<p>For instance, Facebook has designed an AI tool that can help blind people “see” again. The AI model describes the images in a blind person’s Facebook feed, so someone using a screen reader gets an idea of what is going on in each picture. This means people with visual impairment no longer have to hear a screen reader say merely “Photo by John Doe.” Similarly, Google’s ‘Look to Speak’ app uses machine learning and computer vision to let users control their devices with their eyes.</p>



<p>Similarly, OrCam, a Jerusalem-based company, has developed an AI-based device called OrCam Read. This handheld device can read full pages or screens of text aloud from any printed or digital surface, including newspapers, books, product labels, and computer and smartphone screens. Through this device, OrCam aims to help people with reading challenges, such as dyslexia, mild to moderate vision loss, and reading fatigue, as well as those who read large volumes of text.</p>



<p>Even giants like Microsoft have started a five-year program called ‘AI for Accessibility,’ with an investment of US$25 million, aiming to put AI in the hands of developers to make the world more accessible by providing AI solutions for the specially-abled. Artificial intelligence not only assists people with physical disabilities but is also helping people struggling with learning problems and mental health issues. For example, Microsoft’s Windows Hello uses biometric login (fingerprint, face, or iris), which can work for people with physical disabilities or those with dyslexia who might struggle to remember passwords. AI chatbots like Woebot and Wysa make consultation for mental health woes available 24/7, beyond therapist hours.</p>



<p>Meanwhile, people suffering from epilepsy can have seizures triggered by blinking lights and animations. This is why accessiBe, a web accessibility platform, enables epileptic users to disable various types of animation, such as GIFs and videos, so that they can browse the web without complications. Voiceitt is an app for people with speech impediments, including both those who need it temporarily after strokes and brain injuries, and those with longer-term conditions like cerebral palsy, Parkinson’s, and Down’s syndrome. The app uses machine learning to pick up a speaker’s unique speech patterns, recognize any mispronunciations, and correct them before producing an audio or text output. Livio AI, developed by Starkey, an AI medical device company, is a hearing aid that enhances the hearing experience by quieting external noise from the environment and tracking health-related data so that patients can seek help during emergencies.</p>



<p>Thanks to artificial intelligence, autonomous vehicles also promise to provide people with disabilities more mobility than ever before. Once self-driving vehicles are fully integrated into society, they can be a resourceful asset for people with different disabilities, including motor impairment, who would no longer be dependent on other people or on public transport.</p>



<p>Further, most existing testing methods are highly ineffective at pinpointing learning disabilities like dyslexia or dyscalculia. Artificial intelligence can help teachers and healthcare professionals diagnose early signs of such conditions and support students accordingly. For instance, Australian startup Dystech has developed a screening app for early detection of such learning disorders.</p>



<p>Built on Amazon Web Services (AWS), Dystech employs artificial intelligence and machine learning to screen users for dyslexia or dysgraphia. For the former, the AI is trained on datasets of audio recordings from both dyslexic and non-dyslexic adults and children; during assessment, users read aloud words that appear on the screen while being recorded on their smart device. For dysgraphia, the app screens a photo of handwritten text. After the 10-minute screening test, the app informs users of their likelihood of having dyslexia or dysgraphia.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-is-artificial-intelligence-transforming-the-lives-of-people-with-disabilities/">HOW IS ARTIFICIAL INTELLIGENCE TRANSFORMING THE LIVES OF PEOPLE WITH DISABILITIES?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-is-artificial-intelligence-transforming-the-lives-of-people-with-disabilities/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>The Fifth Industrial Revolution: where mind meets machine</title>
		<link>https://www.aiuniverse.xyz/the-fifth-industrial-revolution-where-mind-meets-machine/</link>
					<comments>https://www.aiuniverse.xyz/the-fifth-industrial-revolution-where-mind-meets-machine/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 10 Aug 2020 08:01:14 +0000</pubDate>
				<category><![CDATA[Internet of things]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Autonomous vehicles]]></category>
		<category><![CDATA[Internet of Things]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=10784</guid>

					<description><![CDATA[<p>Source: thenational.ae As far as revolutions go, the one we are living through now seems quiet. Though its effects are profound and touch everybody, not least during <a class="read-more-link" href="https://www.aiuniverse.xyz/the-fifth-industrial-revolution-where-mind-meets-machine/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/the-fifth-industrial-revolution-where-mind-meets-machine/">The Fifth Industrial Revolution: where mind meets machine</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: thenational.ae</p>



<p>As far as revolutions go, the one we are living through now seems quiet. Though its effects are profound and touch everybody, not least during these days of Covid-19, the Fourth Industrial Revolution is enabled by technologies that are based on computing and the internet, which largely do their work in the background of our lives. Main applications include the Internet of Things (&#8220;smart&#8221; toasters and refrigerators), artificial intelligence, autonomous vehicles and medicine tailored to an individual&#8217;s DNA.</p>



<p>Two things are certain: the speed of this revolution is unprecedented, and the impact is relevant to more and more people in increasingly diverse ways. It’s difficult to imagine a world without the Fourth Industrial Revolution&#8217;s technologies. Yet, as surely as four follows three and five follows four, there will be further industrial revolutions.</p>



<p>Signs of the next one are already emerging, and it is set to be just as life-changing as its predecessors. But to understand what&#8217;s in store for the Fifth Industrial Revolution, we must first look back at where we have been.</p>



<p>The original Industrial Revolution, beginning in the late 18th century, mechanised industries with steam engines and replaced agricultural societies. Technologies of this period paved the way toward the use of oil and gas in the late 1800s, when the combustion engine appeared, truly driving industries into the Second Industrial Revolution. Aircraft and automobiles were central to this revolution.</p>



<p>The Third Industrial Revolution, beginning in the 1960s, was characterised by computers and electronics. This enabled some of the earliest journeys into space on less computing power than we carry in our hands today. And now here we are, in the midst of the Fourth Revolution.</p>



<p>It is worth noting that the first of the revolutions lasted about 200 years. The second lasted about 100, while the third only about 50. It is easy to see the pattern here.</p>



<p>One trend is particularly important in understanding what comes next: the intimacy of technology. Steam engines were important and impressively large industrial tools; they were housed in massive factories, and hundreds of people laboured around them. Then, with the combustion engine and the telephone of the second revolution, we became closely connected to these technologies and to one another. The third revolution was about miniaturising technology and personal computing. During the fourth, we are hyper-connected through our smart devices to most of the planet.</p>



<p>The Fifth Industrial Revolution will make that connection closer and more seamless, until it feels unmediated. The smart devices onto which we tap and into which we speak will disappear. Brain-computer interfaces will replace them.</p>



<p>The fifth will stand on the shoulders of the fourth: technology of diminishing size will be fundamental, and digital networks will be essential. We are already finding that the rate at which we can type into our smart devices is a frustrating few bytes at most, while our imagination is orders of magnitude greater.</p>



<p>Can we connect our brains – and our minds – to machines? The short answer is yes, and we have done so for some time. The longer answer is more complicated, but more interesting.</p>



<p>Until a few years ago, machines were connected to the brain and the nervous system principally for medical purposes – for example, to treat Parkinson’s disease or repair spinal cord injuries. Most recently, research has focused on other, non-therapeutic uses, and some of the most high-profile investment in such technology comes from Facebook, Google, Amazon and Elon Musk&#8217;s Neuralink. This is where the Fifth Industrial Revolution is in the making.</p>



<p>Mr Musk founded Neuralink in 2016. It has since developed technologies that can record and stimulate signals from thousands of sites in the brain. Artificial intelligence is an important component of these achievements, and new announcements from Neuralink are expected later this month. Facebook has recently acquired Ctrl-Labs, a New York-based start-up that developed a bracelet that detects the intention to move and allows users to manipulate objects on a screen by thought alone. Machine learning is a fundamental ingredient in achieving this.</p>



<p>Bryan Johnson, another tech pioneer, has founded Kernel and recently announced the ability to decode a person’s brain activity and identify the speech or song they are hearing. Mr Johnson aims to usher in a &#8220;neuro-quantified era&#8221; to characterise thoughts and emotions, both conscious and subconscious. Investors seem to be enthusiastic: they funded Kernel with more than $50 million in early July.</p>



<p>The direction of travel is clear. The science and technology are progressing quickly, for therapeutic and lifestyle or commercial applications. The demand is growing and the underlying Fourth Industrial Revolution technologies are going to make this a reality.</p>



<p>We might communicate with others by thought alone, check in at the airport using a mind-reading bracelet, or do our mind-supported shopping – perhaps to guarantee our safety from infectious viruses. Eventually, regulation will help to make such devices accessible, safe and mainstream. And our use of these technologies will lay the foundations for yet another revolution. What might the Sixth Industrial Revolution hold?</p>
<p>The post <a href="https://www.aiuniverse.xyz/the-fifth-industrial-revolution-where-mind-meets-machine/">The Fifth Industrial Revolution: where mind meets machine</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/the-fifth-industrial-revolution-where-mind-meets-machine/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>The Surprising Way Artificial Intelligence Is Transforming Transportation</title>
		<link>https://www.aiuniverse.xyz/the-surprising-way-artificial-intelligence-is-transforming-transportation/</link>
					<comments>https://www.aiuniverse.xyz/the-surprising-way-artificial-intelligence-is-transforming-transportation/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 28 Nov 2019 11:39:11 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[AI safety]]></category>
		<category><![CDATA[Autonomous vehicles]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[transforming]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5467</guid>

					<description><![CDATA[<p>Source: forbes.com While our growing dependencies on mobile phones stand to threaten road safety and increase rates of distracted driving, other technology innovations can work in safety’s <a class="read-more-link" href="https://www.aiuniverse.xyz/the-surprising-way-artificial-intelligence-is-transforming-transportation/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/the-surprising-way-artificial-intelligence-is-transforming-transportation/">The Surprising Way Artificial Intelligence Is Transforming Transportation</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: forbes.com</p>



<p>While our growing dependence on mobile phones threatens road safety and increases rates of distracted driving, other technology innovations can work in safety’s favor. Developments in 5G networks, autonomous vehicles, and artificial intelligence are poised to transform the way we drive and the safety of our roads.</p>



<p>5G will have a positive impact on road maintenance, with faster data collection creating new possibilities around automation. Today, road crews have to physically go on-site to inspect a problem and determine what next steps are required. But through new video and sensor data, road maintenance crews will receive alerts of life-threatening hazards faster than ever. Connected vehicles equipped with dash cams will generate crowdsourced footage of potential debris and other hazards so that crews can act fast to alert drivers in the area and find safe solutions. Smartphone sensors can already produce similar data and offer insights in the interim. In addition, departments will be able to rank the urgency of various jobs by analyzing data from each location.</p>



<p>According to a report, 94% of vehicle accidents in the US involve human error and are potentially avoidable. With autonomous vehicle technology especially, there’s the potential to essentially eliminate human error from the risk equation, decreasing the number of collisions and improving overall road safety. To achieve full autonomy, the onboard computers on self-driving cars need to make use of cameras and radar sensors to generate a 3D view of the vehicle’s surroundings. One of the challenges to this lies in getting the information needed to make split-second decisions in real-time. Eventually, 5G and artificial intelligence will be leveraged in tandem to give these vehicles a more accurate view of the road, making cars more functional and safe.</p>



<p>Artificial intelligence is also ushering in a new chapter for smartphones. Even though most of us don’t realize it, artificial intelligence powers many features of mobile apps today, including map apps and virtual assistants like Google Assistant, Cortana, and Siri. With mobile apps running telematics in the background, drivers gain access to the latest technologies in driver safety, artificial intelligence, and 5G in a single device. Drivers can also use voice commands to look for gas stations, perform internet searches, and communicate with friends and family instead of physically handling their phones while driving. What’s more, artificial intelligence paired with telematics gives drivers access to real-time information on fuel usage, vehicle location, driver behavior, and speed.</p>
<p>The post <a href="https://www.aiuniverse.xyz/the-surprising-way-artificial-intelligence-is-transforming-transportation/">The Surprising Way Artificial Intelligence Is Transforming Transportation</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/the-surprising-way-artificial-intelligence-is-transforming-transportation/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Ola ‘acquihires’ artificial intelligence start-up Pikup.ai</title>
		<link>https://www.aiuniverse.xyz/ola-acquihires-artificial-intelligence-start-up-pikup-ai/</link>
					<comments>https://www.aiuniverse.xyz/ola-acquihires-artificial-intelligence-start-up-pikup-ai/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 14 Aug 2019 17:39:57 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[acquihires]]></category>
		<category><![CDATA[Autonomous vehicles]]></category>
		<category><![CDATA[Technology Centre]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4343</guid>

					<description><![CDATA[<p>Source: financialexpress.com Indian ride hailing giant Ola has ‘acquihired’ Pikup.ai, an artificial intelligence start-up from Bengaluru, for an undisclosed amount. As part of the deal, Ola will <a class="read-more-link" href="https://www.aiuniverse.xyz/ola-acquihires-artificial-intelligence-start-up-pikup-ai/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/ola-acquihires-artificial-intelligence-start-up-pikup-ai/">Ola ‘acquihires’ artificial intelligence start-up Pikup.ai</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: financialexpress.com</p>



<p>Indian ride hailing giant Ola has ‘acquihired’ Pikup.ai, an artificial intelligence start-up from Bengaluru, for an undisclosed amount.</p>



<p>As part of the deal, Ola will hire Pikup.ai’s team following the acquisition. Pikup.ai was co-founded by Inder Singh and Ritwik Saikia. The start-up uses technologies like artificial intelligence, computer vision and sensor fusion to provide seamless AI-powered solutions for businesses.</p>



<p>Earlier this year, Ola announced its intent to set up an Advanced Technology Centre in the San Francisco Bay area, to focus on developing next-generation technologies in mobility like Electric, Connected and Autonomous Vehicles. Ola expects this acquisition to deliver innovations that continue to improve safety and transform customer experience.</p>



<p>Talking about the deal, Ankit Bhati, co-founder &amp; CTO, Ola, said: “As we advance on our mission to build mobility for a billion people, we are investing in futuristic technology solutions that will shape the future of mobility in India and the world”. Ola has said that it is increasing its focus on using advanced analytics and deep technology to build futuristic mobility solutions for India and other markets.</p>



<p>The cab aggregator aims to use the rich data it has and its expertise in technologies like machine learning and artificial intelligence to identify deep insights that can lead to improved mobility outcomes. This will include investments in early-stage businesses, acquisitions as well as acquihires across Artificial Intelligence, Machine Learning, Computer Vision and other emerging areas of deep technology.</p>



<p>Commenting on the deal, Inder Singh, co-founder of Pikup.ai, said: “We are looking forward to joining Ola on its mission to build mobility for a billion people and are very excited about building meaningful technology solutions that have a deep impact on the lives of millions, every single day”.</p>
<p>The post <a href="https://www.aiuniverse.xyz/ola-acquihires-artificial-intelligence-start-up-pikup-ai/">Ola ‘acquihires’ artificial intelligence start-up Pikup.ai</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/ola-acquihires-artificial-intelligence-start-up-pikup-ai/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Social media data mining to be used to fix wireless UK blackspots with 5G rollout</title>
		<link>https://www.aiuniverse.xyz/social-media-data-mining-to-be-used-to-fix-wireless-uk-blackspots-with-5g-rollout/</link>
					<comments>https://www.aiuniverse.xyz/social-media-data-mining-to-be-used-to-fix-wireless-uk-blackspots-with-5g-rollout/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 03 Aug 2019 09:41:08 +0000</pubDate>
				<category><![CDATA[Data Mining]]></category>
		<category><![CDATA[accurate]]></category>
		<category><![CDATA[Autonomous vehicles]]></category>
		<category><![CDATA[data mining]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[UK]]></category>
		<category><![CDATA[wireless blackspots]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4243</guid>

					<description><![CDATA[<p>Source: telemediaonline.co.uk Ranplan Wireless is collaborating with the University of Warwick on a project funded by Innovate UK’s Geospatial Commission to identify wireless blackspots to support the rollout of 5G and help improve urban and rural coverage. The COCKPIT-5G project will use crowd blackspot intelligence sourcing and social media techniques along <a class="read-more-link" href="https://www.aiuniverse.xyz/social-media-data-mining-to-be-used-to-fix-wireless-uk-blackspots-with-5g-rollout/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/social-media-data-mining-to-be-used-to-fix-wireless-uk-blackspots-with-5g-rollout/">Social media data mining to be used to fix wireless UK blackspots with 5G rollout</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: telemediaonline.co.uk</p>



<p>Ranplan Wireless is collaborating with the University of Warwick on a project funded by Innovate UK’s Geospatial Commission to identify wireless blackspots to support the rollout of 5G and help improve urban and rural coverage.</p>



<p>The COCKPIT-5G project will use crowd blackspot intelligence sourcing and social media techniques along with sophisticated real-time natural language processing to curate consumer data and build up an accurate connectivity map of the UK. </p>



<p>With the start of the 5G rollout in the UK, new cell sites are being planned to address the problem of service blackspots with poor or no signal, and to meet the demand for new services supporting autonomous vehicles, artificial intelligence and the growing digital economy.</p>



<p>Enhancing 5G coverage in all areas of the UK advances the UK Government’s ambition to become a global leader in 5G communication technologies.</p>



<p>The COCKPIT-5G project leverages cutting-edge advances in social media viral campaigns and natural language machine learning to automatically build a database of blackspots and their geospatial and contextual information by understanding the consumer experience in real-time.</p>



<p>“The aim of the project is to use customer-centric data to improve network deployment efficiency and increase user satisfaction,” says Jie Zhang, Chief Scientific Officer at Ranplan Wireless. “The vision is for 5G wireless networks to self-regulate, as this is the future of managing complex ‘on demand’ connectivity in dense environments. Being able to identify coverage blackspots also means that operators can more precisely determine where to place additional small cells to ensure quality of service and save on CAPEX.”</p>



<p>Dr Weisi Guo, from the School of Engineering at the University of Warwick, says:&nbsp;“COCKPIT-5G will enable the UK to be world-leaders in new technologies by getting 5G coverage in blackspots. Businesses will be better connected, which in turn improves manufacturing, and we can enhance AI and the development of autonomous vehicles.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/social-media-data-mining-to-be-used-to-fix-wireless-uk-blackspots-with-5g-rollout/">Social media data mining to be used to fix wireless UK blackspots with 5G rollout</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/social-media-data-mining-to-be-used-to-fix-wireless-uk-blackspots-with-5g-rollout/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>New robot rolls with the rules of pedestrian conduct</title>
		<link>https://www.aiuniverse.xyz/new-robot-rolls-with-the-rules-of-pedestrian-conduct/</link>
					<comments>https://www.aiuniverse.xyz/new-robot-rolls-with-the-rules-of-pedestrian-conduct/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 01 Sep 2017 09:40:10 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Human Intelligence]]></category>
		<category><![CDATA[Aeronautical]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[algorithms]]></category>
		<category><![CDATA[astronautical engineering]]></category>
		<category><![CDATA[Autonomous vehicles]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[Robots]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=887</guid>

					<description><![CDATA[<p>Source &#8211; news.mit.edu Just as drivers observe the rules of the road, most pedestrians follow certain social codes when navigating a hallway or a crowded thoroughfare: Keep to <a class="read-more-link" href="https://www.aiuniverse.xyz/new-robot-rolls-with-the-rules-of-pedestrian-conduct/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/new-robot-rolls-with-the-rules-of-pedestrian-conduct/">New robot rolls with the rules of pedestrian conduct</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; <strong>news.mit.edu</strong></p>
<p>Just as drivers observe the rules of the road, most pedestrians follow certain social codes when navigating a hallway or a crowded thoroughfare: Keep to the right, pass on the left, maintain a respectable berth, and be ready to weave or change course to avoid oncoming obstacles while keeping up a steady walking pace.</p>
<p>Now engineers at MIT have designed an autonomous robot with “socially aware navigation” that can keep pace with foot traffic while observing these general codes of pedestrian conduct.</p>
<p>In drive tests performed inside MIT’s Stata Center, the robot, which resembles a knee-high kiosk on wheels, successfully avoided collisions while keeping up with the average flow of pedestrians. The researchers have detailed their robotic design in a paper that they will present at the IEEE Conference on Intelligent Robots and Systems in September.</p>
<p>“Socially aware navigation is a central capability for mobile robots operating in environments that require frequent interactions with pedestrians,” says Yu Fan “Steven” Chen, who led the work as a former MIT graduate student and is the lead author of the study. “For instance, small robots could operate on sidewalks for package and food delivery. Similarly, personal mobility devices could transport people in large, crowded spaces, such as shopping malls, airports, and hospitals.”</p>
<p>Chen’s co-authors are graduate student Michael Everett, former postdoc Miao Liu, and Jonathan How, the Richard Cockburn Maclaurin Professor of Aeronautics and Astronautics at MIT.</p>
<p><strong>Social drive</strong></p>
<p>In order for a robot to make its way autonomously through a heavily trafficked environment, it must solve four main challenges: localization (knowing where it is in the world), perception (recognizing its surroundings), motion planning (identifying the optimal path to a given destination), and control (physically executing its desired path).</p>
<p>Chen and his colleagues used standard approaches to solve the problems of localization and perception. For the latter, they outfitted the robot with off-the-shelf sensors, such as webcams, a depth sensor, and a high-resolution lidar sensor. For the problem of localization, they used open-source algorithms to map the robot’s environment and determine its position. To control the robot, they employed standard methods used to drive autonomous ground vehicles.</p>
<p>“The part of the field that we thought we needed to innovate on was motion planning,” Everett says. “Once you figure out where you are in the world, and know how to follow trajectories, which trajectories should you be following?”</p>
<p>That’s a tricky problem, particularly in pedestrian-heavy environments, where individual paths are often difficult to predict. As a solution, roboticists sometimes take a trajectory-based approach, in which they program a robot to compute an optimal path that accounts for everyone&#8217;s desired trajectories. These trajectories must be inferred from sensor data, because people don&#8217;t explicitly tell the robot where they are trying to go.</p>
<p>“But this takes forever to compute. Your robot is just going to be parked, figuring out what to do next, and meanwhile the person’s already moved way past it before it decides ‘I should probably go to the right,’” Everett says. “So that approach is not very realistic, especially if you want to drive faster.”</p>
<p>Others have used faster, “reactive-based” approaches, in which a robot is programmed with a simple model, using geometry or physics, to quickly compute a path that avoids collisions.</p>
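<p>A minimal reactive check in this spirit assumes constant velocities and uses plain geometry to ask whether the robot and a pedestrian will ever come within a safety radius. The names and the radius are illustrative, not the authors&#8217; code:</p>

```python
def closest_approach(rel_pos, rel_vel):
    """Minimum future distance between two constant-velocity agents.
    rel_pos and rel_vel are the pedestrian's position and velocity
    relative to the robot."""
    px, py = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    if v2 == 0.0:                               # no relative motion
        return (px * px + py * py) ** 0.5
    t = max(0.0, -(px * vx + py * vy) / v2)     # time of closest approach
    cx, cy = px + t * vx, py + t * vy
    return (cx * cx + cy * cy) ** 0.5

def will_collide(rel_pos, rel_vel, safety_radius=0.5):
    """Flag a future pass closer than the safety radius."""
    return closest_approach(rel_pos, rel_vel) < safety_radius
```

<p>The appeal of such checks is speed: a handful of arithmetic operations per pedestrian, rather than a full trajectory optimization.</p>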
<p>The problem with reactive-based approaches, Everett says, is the unpredictability of human nature — people rarely stick to a straight, geometric path, but rather weave and wander, veering off to greet a friend or grab a coffee. In such an unpredictable environment, these robots tend either to collide with people or to avoid them so conservatively that they look as though they are being pushed around.</p>
<p>“The knock on robots in real situations is that they might be too cautious or aggressive,” Everett says. “People don’t find them to fit into the socially accepted rules, like giving people enough space or driving at acceptable speeds, and they get more in the way than they help.”</p>
<p><strong>Training days</strong></p>
<p>The team found a way around such limitations, enabling the robot to adapt to unpredictable pedestrian behavior while continuously moving with the flow and following typical social codes of pedestrian conduct.</p>
<p>They used reinforcement learning, a machine-learning approach in which they ran computer simulations to train the robot to take certain paths, given the speed and trajectory of other objects in the environment. The team also incorporated social norms into this offline training phase, rewarding the robot in simulation for passing on the right and penalizing it when it passed on the left.</p>
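<p>Such norm-aware reward shaping might look like the following sketch, where the progress term, penalty values, and distance threshold are all hypothetical constants chosen for illustration:</p>

```python
def social_reward(dist_before, dist_after, min_pedestrian_dist, passed_on_left):
    """Reward progress toward the goal; penalize near-collisions and
    violations of the pass-on-the-right norm."""
    reward = dist_before - dist_after       # progress toward the goal
    if min_pedestrian_dist < 0.2:           # came too close to a person
        reward -= 0.25
    if passed_on_left:                      # violated the social norm
        reward -= 0.1
    return reward
```

<p>During training, the policy gradually learns that detouring slightly to the right costs less total reward than cutting past a pedestrian on the left.</p>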
<p>“We want it to be traveling naturally among people and not be intrusive,” Everett says. “We want it to be following the same rules as everyone else.”</p>
<p>The advantage to reinforcement learning is that the researchers can perform these training scenarios, which take extensive time and computing power, offline. Once the robot is trained in simulation, the researchers can program it to carry out the optimal paths, identified in the simulations, when the robot recognizes a similar scenario in the real world.</p>
<p>The researchers enabled the robot to assess its environment and adjust its path every one-tenth of a second. In this way, the robot can continue rolling through a hallway at a typical walking speed of 1.2 meters per second, without pausing to reprogram its route.</p>
<p>“We’re not planning an entire path to the goal — it doesn’t make sense to do that anymore, especially if you’re assuming the world is changing,” Everett says. “We just look at what we see, choose a velocity, do that for a tenth of a second, then look at the world again, choose another velocity, and go again. This way, we think our robot looks more natural, and is anticipating what people are doing.”</p>
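<p>That sense-and-re-plan cycle can be sketched as a receding-horizon loop: score a handful of candidate velocities, commit to the best one for a tenth of a second, then repeat. In the real system a learned policy does the scoring; the discrete candidate set and simple cost function below are purely illustrative:</p>

```python
import math

def choose_velocity(position, goal, obstacles, speed=1.2):
    """Pick the heading whose next 0.1 s step ends closest to the goal
    without coming within 0.3 m of any obstacle."""
    best, best_cost = (0.0, 0.0), math.inf
    for k in range(16):                          # 16 candidate headings
        heading = k * math.pi / 8
        vx, vy = speed * math.cos(heading), speed * math.sin(heading)
        nx, ny = position[0] + 0.1 * vx, position[1] + 0.1 * vy
        if any(math.hypot(nx - ox, ny - oy) < 0.3 for ox, oy in obstacles):
            continue                             # candidate collides; skip it
        cost = math.hypot(goal[0] - nx, goal[1] - ny)
        if cost < best_cost:
            best, best_cost = (vx, vy), cost
    return best                                  # (0, 0) if every candidate collides

def run(position, goal, obstacles, steps=50):
    """Re-plan at 10 Hz: choose a velocity, move for 0.1 s, repeat."""
    for _ in range(steps):
        vx, vy = choose_velocity(position, goal, obstacles)
        position = (position[0] + 0.1 * vx, position[1] + 0.1 * vy)
    return position
```

<p>Because the plan extends only one tenth-of-a-second step ahead, pedestrians who change course are simply folded into the next cycle&#8217;s obstacle list rather than invalidating a long precomputed path.</p>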
<p><strong>Crowd control</strong></p>
<p>Everett and his colleagues test-drove the robot in the busy, winding halls of MIT’s Stata Building, where the robot was able to drive autonomously for 20 minutes at a time. It rolled smoothly with the pedestrian flow, generally keeping to the right of hallways, occasionally passing people on the left, and avoiding any collisions.</p>
<p>“We wanted to bring it somewhere where people were doing their everyday things, going to class, getting food, and we showed we were pretty robust to all that,” Everett says. “One time there was even a tour group, and it perfectly avoided them.”</p>
<p>Everett says going forward, he plans to explore how robots might handle crowds in a pedestrian environment.</p>
<p>“Crowds have a different dynamic than individual people, and you may have to learn something totally different if you see five people walking together,” Everett says. “There may be a social rule of, ‘Don’t move through people, don’t split people up, treat them as one mass.’ That’s something we’re looking at in the future.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/new-robot-rolls-with-the-rules-of-pedestrian-conduct/">New robot rolls with the rules of pedestrian conduct</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/new-robot-rolls-with-the-rules-of-pedestrian-conduct/feed/</wfw:commentRss>
			<slash:comments>8</slash:comments>
		
		
			</item>
	</channel>
</rss>
