<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Gadgets Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/gadgets/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/gadgets/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Mon, 28 Jun 2021 09:09:28 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.1</generator>
	<item>
		<title>How Artificial Intelligence Is Taking Over Our Gadgets</title>
		<link>https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/</link>
					<comments>https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 28 Jun 2021 09:09:26 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Gadgets]]></category>
		<category><![CDATA[How]]></category>
		<category><![CDATA[TAKING]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14614</guid>

					<description><![CDATA[<p>Source &#8211; https://www.bangkokpost.com/ AI is moving from data centers to devices, making everything from phones to tractors faster and more private. These newfound smarts also come with pitfalls. If you think of AI as something futuristic and abstract, start thinking different. We&#8217;re now witnessing a turning point for artificial intelligence, as more of it comes <a class="read-more-link" href="https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/">How Artificial Intelligence Is Taking Over Our Gadgets</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.bangkokpost.com/</p>



<p><em>AI is moving from data centers to devices, making everything from phones to tractors faster and more private. These newfound smarts also come with pitfalls.</em></p>



<p>If you think of AI as something futuristic and abstract, start thinking different.</p>



<p>We&#8217;re now witnessing a turning point for artificial intelligence, as more of it comes down from the clouds and into our smartphones and automobiles. While it&#8217;s fair to say that AI that lives on the &#8220;edge&#8221; &#8212; where you and I are &#8212; is still far less powerful than its datacenter-based counterpart, it&#8217;s potentially far more meaningful to our everyday lives.</p>



<p>One key example: This fall, Apple&#8217;s Siri assistant will start processing voice on iPhones.</p>



<p>Right now, even your request to set a timer is sent as an audio recording to the cloud, where it is processed, triggering a response that&#8217;s sent back to the phone.</p>



<p>By processing voice on the phone, says Apple, Siri will respond more quickly. This will only work on the iPhone XS and newer models, which have a compatible built-for-AI processor the company calls a &#8220;neural engine.&#8221;</p>



<p>People might also feel more secure knowing that their voice recordings aren&#8217;t being sent to unseen computers in faraway places.</p>



<p>Google actually led the way with on-phone processing: In 2019, it introduced a Pixel phone that could transcribe speech to text and perform other tasks without any connection to the cloud.</p>



<p>One reason Google decided to build its own phones was that the company saw potential in creating custom hardware tailor-made to run AI, says Brian Rakowski, product manager of the Pixel group at Google.</p>



<p>These so-called edge devices can be pretty much anything with a microchip and some memory, but they tend to be the newest and most sophisticated of smartphones, automobiles, drones, home appliances, and industrial sensors and actuators.</p>



<p>Edge AI has the potential to deliver on some of the long-delayed promises of AI, like more responsive smart assistants, better automotive safety systems, new kinds of robots, even autonomous military machines.</p>



<p>The challenges of making AI work at the edge &#8212; that is, making it reliable enough to do its job and then justifying the additional complexity and expense of putting it in our devices &#8212; are monumental.</p>



<p>Existing AI can be inflexible, easily fooled, unreliable and biased. In the cloud, it can be trained on the fly to get better &#8212; think about how Alexa improves over time. When it&#8217;s in a device, it must come pre-trained, and be updated periodically.</p>



<p>Yet the improvements in chip technology in recent years have made it possible for real breakthroughs in how we experience AI, and the commercial demand for this sort of functionality is high.</p>



<p><strong>From swords to plowshares</strong></p>



<p>Shield AI, a contractor for the Department of Defense, has built a great deal of AI into quadcopter-style drones that have already flown &#8212; and continue to fly &#8212; real-world combat missions.</p>



<p>One mission is to help soldiers scan for enemy combatants in buildings that must be cleared.</p>



<p>The DoD has been eager to use the company&#8217;s drones, says Shield AI&#8217;s co-founder, Brandon Tseng, because even if they fail, they can be used to reduce human casualties.</p>



<p>&#8220;In 2016 and early 2017, we had early prototypes with something like 75% reliability, something you would never take to market, and the DoD were saying, &#8216;We&#8217;ll take that overseas and use that in combat right now&#8217;,&#8221; Mr. Tseng says.</p>



<p>When he protested that the system wasn&#8217;t ready, the response from within the military was that anything was better than soldiers going through a door and being shot.</p>



<p>In a combat zone, you can&#8217;t count on a fast, robust, wireless cloud connection, especially now that enemies often jam wireless communication and GPS signals. When on a mission, processing and image recognition must occur on the company&#8217;s drones themselves.</p>



<p>Shield AI uses a small, efficient computer made by Nvidia, designed for running AI on devices, to create a quadcopter drone no bigger than a typical camera-wielding consumer model.</p>



<p>The Nova 2 can fly long enough to enter a building, and use AI to recognize and examine dozens of hallways, stairwells and rooms, cataloging objects and people it sees along its way.</p>



<p>Meanwhile, in the town of Salinas, Calif., birthplace of <em>The Grapes of Wrath</em> author John Steinbeck and an agricultural center to this day, a robot the size of an SUV is spending this year&#8217;s growing season raking the earth with its 12 robotic arms.</p>



<p>Made by FarmWise Labs Inc., the robot trundles along fields of celery as if it were any other tractor. Underneath its metal shroud, it uses computer vision and an edge AI system to decide, in less than a second, whether a plant is a food crop or a weed, and directs its plow-like claws to avoid or eradicate the plant accordingly.</p>
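<p>The sub-second decide-then-act loop described above can be sketched in a few lines. This is an illustrative toy, not FarmWise&#8217;s software: the feature names, the simple green-ness rule standing in for the real vision model, and the action labels are all invented for the example.</p>

```python
# Toy version of the crop-vs-weed loop: classify one camera frame,
# then direct the claws. A real system would run a trained vision
# model where classify_plant() stands; this rule is a placeholder.

def classify_plant(image_features):
    """Stand-in for the on-board vision model (hypothetical rule)."""
    # Pretend celery reflects more green and sits in a neat row.
    green, in_row = image_features
    return "crop" if green > 0.6 and in_row else "weed"

def act(plant_class):
    """Direct the plow-like claws based on the classification."""
    return "avoid" if plant_class == "crop" else "eradicate"

frame = (0.8, True)                   # features from one camera frame
action = act(classify_plant(frame))   # → "avoid"
```

<p>The point of the sketch is the shape of the pipeline &#8212; perceive, classify, actuate &#8212; which must complete within the robot&#8217;s sub-second budget per plant.</p>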



<p>FarmWise&#8217;s huge, diesel robo-weeder can generate its own electricity, enabling it to carry a veritable supercomputer&#8217;s worth of processing power &#8212; four GPUs and 16 CPUs, which together draw 500 watts.</p>



<p>In our everyday lives, features like voice transcription that work whether or not we have a connection, and regardless of its quality, could shift how we prefer to interact with our mobile devices.</p>



<p>Getting always-available voice transcription to work on Google&#8217;s Pixel phone &#8220;required a lot of breakthroughs to run on the phone as well as it runs on a remote server,&#8221; says Mr. Rakowski.</p>



<p>Google has almost unlimited resources to experiment with AI in the cloud, but getting those same algorithms, for everything from voice transcription and power management to real-time translation and image processing, to work on phones required the introduction of custom microprocessors like the Pixel Neural Core, he adds.</p>



<p><strong>Turning cats into pure math</strong></p>



<p>What nearly all edge AI systems have in common is that, as pre-trained AI, they are only performing &#8220;inference,&#8221; says Dennis Laudick, vice president of marketing for AI and machine learning at Arm Holdings, which licenses chip designs and instructions to companies such as Apple, Samsung, Qualcomm, Nvidia and others.</p>



<p>Generally speaking, machine-learning AI consists of four phases:</p>



<ol>
<li>Data is captured or collected: say, for example, in the form of millions of cat pictures.</li>
<li>Humans label the data: yes, these are cat photos.</li>
<li>AI is trained with the labeled data: this process selects for models that identify cats.</li>
<li>Then the resulting pile of code is turned into an algorithm and implemented in software: here&#8217;s a camera app for cat lovers!</li>
</ol>



<p>(Note: If this doesn&#8217;t exist yet, consider it your million-dollar idea of the day.)</p>



<p>The last bit of the process &#8212; something like that cat-identifying software &#8212; is the inference phase.</p>
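<p>The four phases land in something like the following toy pipeline. The &#8220;model&#8221; here is a nearest-centroid classifier over two made-up features rather than a neural network trained on millions of photos, so treat it as an illustration of the flow, not of real scale.</p>

```python
# Minimal sketch of the four phases, using invented 2-feature vectors
# (e.g. "ear pointiness", "whisker score") in place of real cat photos.

# Phase 1: collect data (here, hand-made feature vectors).
data = [(0.9, 0.8), (0.8, 0.9), (0.1, 0.2), (0.2, 0.1)]

# Phase 2: humans label it.
labels = ["cat", "cat", "not_cat", "not_cat"]

# Phase 3: "train" by computing one centroid per label
# (a nearest-centroid model, far simpler than a real network).
def train(data, labels):
    sums, counts = {}, {}
    for (x, y), label in zip(data, labels):
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {l: (sx / counts[l], sy / counts[l]) for l, (sx, sy) in sums.items()}

# Phase 4: ship the frozen model and run inference on-device.
def infer(model, point):
    def dist(c):
        return (c[0] - point[0]) ** 2 + (c[1] - point[1]) ** 2
    return min(model, key=lambda label: dist(model[label]))

model = train(data, labels)
print(infer(model, (0.85, 0.75)))  # → cat
```

<p>An edge device only ever runs the last step: the centroids (the &#8220;weights&#8221;) arrive pre-computed, and inference is just a cheap lookup-and-compare.</p>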



<p>The software on many smart surveillance cameras, for example, is performing inference, says Eric Goodness, a research vice president at technology-consulting firm Gartner.</p>



<p>These systems can already identify how many patrons are in the restaurant, if any are engaging in undesirable behavior, or if the fries have been in the fryer too long.</p>



<p>It&#8217;s all just mathematical functions, ones so complicated that it would take a monumental effort by humans to write them, but which machine-learning systems can create when trained on enough data.</p>



<p><strong>Robot pratfalls</strong></p>



<p>While all of this technology has enormous promise, making AI work on individual devices, whether or not they can connect to the cloud, comes with a daunting set of challenges, says Elisa Bertino, a professor of computer science at Purdue University.</p>



<p>Modern AI, which is primarily used to recognize patterns, can have difficulty coping with inputs outside of the data it was trained on. Operating in the real world only makes it tougher &#8212; just consider the classic example of a Tesla that brakes when it sees a stop sign on a billboard.</p>



<p>To make edge AI systems more competent, one edge device might gather some data but then pair with another, more powerful device, which can integrate data from a variety of sensors, says Dr. Bertino.</p>



<p>If you&#8217;re wearing a smartwatch with a heart-rate monitor, you&#8217;re already witnessing this: The watch&#8217;s edge AI pre-processes the weak signal of your heart rate, then passes that data to your smartphone, which can further analyze that data &#8212; whether or not it&#8217;s connected to the internet.</p>
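<p>That watch-to-phone split can be sketched as two tiny functions. Everything here &#8212; the window size, the 100 bpm threshold, the sample values &#8212; is an invented stand-in for whatever a real wearable actually runs.</p>

```python
# Hypothetical sketch of the pairing described above: the watch cleans
# up a weak, noisy heart-rate signal, and only the compact result is
# handed to the phone for deeper analysis.

def watch_preprocess(samples, window=3):
    """On-watch step: smooth the raw signal with a moving average."""
    smoothed = []
    for i in range(len(samples) - window + 1):
        smoothed.append(sum(samples[i:i + window]) / window)
    return smoothed

def phone_analyze(smoothed):
    """On-phone step: works with or without an internet connection."""
    avg = sum(smoothed) / len(smoothed)
    return {"avg_bpm": round(avg, 1), "elevated": avg > 100}

raw = [71, 98, 69, 72, 70, 73]   # noisy raw readings (one spike)
summary = phone_analyze(watch_preprocess(raw))
```

<p>The design choice is the same one the article describes: the weaker device reduces the data, the stronger device interprets it, and neither step needs the cloud.</p>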



<p>The overwhelming majority of AI algorithms are still trained in the cloud. They can also be retrained using more or fresher data, which lets them continually improve.</p>



<p>Down the road, says Mr. Goodness, edge AI systems will begin to learn on their own &#8212; that is, they&#8217;ll become powerful enough to move beyond inference and actually gather data and use it to train their own algorithms.</p>



<p>AI that can learn all by itself, without connection to a cloud superintelligence, might eventually raise legal and ethical challenges.</p>



<p>How can a company certify an algorithm that&#8217;s been off evolving in the real world for years after its initial release, asks Dr. Bertino.</p>



<p>And in future wars, who will be willing to let their robots decide when to pull the trigger? Whoever does might end up with an advantage &#8212; but also all the collateral damage that happens when, inevitably, AI makes mistakes.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/">How Artificial Intelligence Is Taking Over Our Gadgets</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-artificial-intelligence-is-taking-over-our-gadgets/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Meet YuMi: A Robot Nurse Built to Make the Rounds</title>
		<link>https://www.aiuniverse.xyz/meet-yumi-a-robot-nurse-built-to-make-the-rounds/</link>
					<comments>https://www.aiuniverse.xyz/meet-yumi-a-robot-nurse-built-to-make-the-rounds/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 21 Dec 2019 06:37:58 +0000</pubDate>
				<category><![CDATA[Data Robot]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[computers]]></category>
		<category><![CDATA[Gadgets]]></category>
		<category><![CDATA[medical technology]]></category>
		<category><![CDATA[Robotics]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5738</guid>

					<description><![CDATA[<p>Source: discovermagazine.com ABB’s robotic lab technician, YuMi, and Nurse Ratched have more in common than might appear at first blush. They’re both cold; they’re both heartless; and they both really want to help you take your meds. But while Nurse Ratched notoriously represents the corrupting power of institutionalized bureaucracy, this robot, named YuMi, just wants <a class="read-more-link" href="https://www.aiuniverse.xyz/meet-yumi-a-robot-nurse-built-to-make-the-rounds/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/meet-yumi-a-robot-nurse-built-to-make-the-rounds/">Meet YuMi: A Robot Nurse Built to Make the Rounds</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: discovermagazine.com</p>



<p>ABB’s robotic lab technician, YuMi, and Nurse Ratched have more in common than might appear at first blush. They’re both cold; they’re both heartless; and they both really want to help you take your meds.</p>



<p>But while Nurse Ratched notoriously represents the corrupting power of institutionalized bureaucracy, this robot, named YuMi, just wants to help hospitals and research labs run a little more smoothly.</p>



<p>The Swiss robotics and automation company showcased the roving lab tech earlier this fall at its new healthcare research hub, which is a collaboration with Texas Medical Research Innovation Institute in Houston. The hybrid lab combines a staff of 20 with an array of robotic assistants to test new ways that humans and machines can collaborate at the heart of medicine.</p>



<p>And there’s some urgency to their work. Baby Boomers are aging, and an unprecedented number of Americans are poised to enter the healthcare system over the next 10 years. Simultaneously, the industry is facing a deep shortage of nurses, doctors and other medical staff — particularly in home healthcare. There’s hope that robotics, artificial intelligence and automation will help leaders navigate these seismic demographic shifts and deliver care to more people and potentially with fewer resources.</p>



<p>Making the Rounds <br>
In contrast to gargantuan robotic arms locked in cages along automobile assembly lines, YuMi is designed to work closely with humans as a gentler, collaborative sidekick. YuMi’s precise touch and range of motion make it adaptable to a wide range of tasks, from basics like sorting and unboxing to more elaborate tasks like folding paper airplanes, playing pool or directing symphonies.</p>



<p>For one of their medical bot prototypes, ABB engineers simply mounted YuMi atop a moving platform. YuMi uses its machine vision to avoid staffers and other obstacles, and can be programmed to do any number of rote, time-consuming tasks. YuMi could pick up patient tests and transport them to the lab for processing. Delivering food and linens is no problem. YuMi can even deliver morning and evening medications right to the door.</p>



<p>ABB also fitted a lab with other YuMi concepts that sort pills, prepare and unpackage medicines, load and unload centrifuges, and perform pipetting work. The robots are best suited for the repetitive, high-volume tasks that consume a big part of staff time. ABB engineers say robots can perform these tasks 50 percent faster, and can do them 24 hours a day. Ultimately, this gives staff more time to focus on higher-level work.</p>



<p>“The health care sector is undergoing significant transformation as the diagnosis and treatment of disease advances, while coping with an aging population, increasing costs and a growing worldwide shortage of medical staff,” Sami Atiya, president of ABB’s robotics and discrete automation business, said in a press release.</p>



<p>Feeling the Crunch <br>
A recent report from the U.S. Department of Veterans Affairs Office of the Inspector General found that 96 percent of VA facilities reported at least one “severe” occupational shortage as of December 2018. Thirty-nine percent reported 20 or more shortages. Mercer, a healthcare consultancy, estimates the United States will need to hire 2.3 million healthcare workers by 2025 to address the labor gap.</p>



<p>Robots could be key to helping drive down the costs of care and help medical workers do more with smaller teams. ABB estimates there will be some 60,000 medical robots on the job within five years or so. Robots, along with telemedicine, data mining, advances in genetics and so much more, are radically redefining what it means to visit the doctor.</p>
<p>The post <a href="https://www.aiuniverse.xyz/meet-yumi-a-robot-nurse-built-to-make-the-rounds/">Meet YuMi: A Robot Nurse Built to Make the Rounds</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/meet-yumi-a-robot-nurse-built-to-make-the-rounds/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>This prosthetic arm combines manual control with machine learning</title>
		<link>https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/</link>
					<comments>https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 14 Sep 2019 12:08:41 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[EPFL]]></category>
		<category><![CDATA[Gadgets]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Prosthetics]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[Science]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4483</guid>

					<description><![CDATA[<p>Source: techcrunch.com Prosthetic limbs are getting better every year, but the strength and precision they gain doesn’t always translate to easier or more effective use, as amputees have only a basic level of control over them. One promising avenue being investigated by Swiss researchers is having an AI take over where manual control leaves off. To visualize <a class="read-more-link" href="https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/">This prosthetic arm combines manual control with machine learning</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: techcrunch.com</p>



<p>Prosthetic limbs are getting better every year, but the strength and precision they gain doesn’t always translate to easier or more effective use, as amputees have only a basic level of control over them. One promising avenue being investigated by Swiss researchers is having an AI take over where manual control leaves off.</p>



<p>To visualize the problem, imagine a person with their arm amputated above the elbow controlling a smart prosthetic limb. With sensors placed on their remaining muscles and other signals, they may fairly easily be able to lift their arm and direct it to a position where they can grab an object on a table.</p>



<p>But what happens next? The many muscles and tendons that would have controlled the fingers are gone, and with them the ability to sense exactly how the user wants to flex or extend their artificial digits. If all the user can do is signal a generic “grip” or “release,” that loses a huge amount of what a hand is actually good for.</p>



<p>Here’s where researchers from École polytechnique fédérale de Lausanne (EPFL) come in. Being limited to telling the hand to grip or release isn’t a problem if the hand knows what to do next — sort of like how our natural hands “automatically” find the best grip for an object without our needing to think about it. Robotics researchers have been working on automatic detection of grip methods for a long time, and it’s a perfect match for this situation.</p>



<p>Prosthesis users train a machine learning model by having it observe their muscle signals while they attempt various motions and grips as best they can without the actual hand to do it with. With that basic information, the robotic hand knows what type of grasp it should be attempting, and by monitoring and maximizing the area of contact with the target object, it improvises the best grip in real time. It also provides drop resistance, adjusting its grip in less than half a second should the object start to slip.</p>
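<p>The drop-resistance behavior can be caricatured as a simple control loop: if the sensed contact area shrinks between readings, the hand tightens its grip. The force values, step size and sensor readings below are all made up for illustration; EPFL’s actual controller is far more sophisticated than this sketch.</p>

```python
# Illustrative shared-control sketch (not EPFL's code): the user
# supplies only a coarse "grip" intent, and the hand firms its grasp
# whenever the contact area with the object starts to slip.

def adjust_grip(contact_areas, start_force=1.0, step=0.25):
    """Increase grip force each time the sensed contact area drops."""
    force = start_force
    history = [force]
    for prev, cur in zip(contact_areas, contact_areas[1:]):
        if cur < prev:          # object slipping: contact shrinking
            force += step       # respond within one control tick
        history.append(force)
    return history

# Contact-area readings while holding a cup (it slips once, mid-hold).
readings = [0.80, 0.82, 0.70, 0.75, 0.76]
forces = adjust_grip(readings)
# forces == [1.0, 1.0, 1.25, 1.25, 1.25]
```

<p>Each loop iteration stands in for one half-second control tick; the user’s only input is the decision to keep gripping at all.</p>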



<p>The result is that the object is grasped strongly but gently for as long as the user continues gripping it with, essentially, their will. When they’re done with the object, having taken a sip of coffee or moved a piece of fruit from a bowl to a plate, they “release” the object and the system senses this change in their muscles’ signals and does the same.</p>



<p>It’s reminiscent of another approach, by students in Microsoft’s Imagine Cup, in which the arm is equipped with a camera in the palm that gives it feedback on the object and how it ought to grip it.</p>



<p>It’s all still very experimental, and done with a third-party robotic arm and not particularly optimized software. But this “shared control” technique is promising and could very well be foundational to the next generation of smart prostheses. The team’s paper is published in the journal Nature Machine Intelligence.</p>
<p>The post <a href="https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/">This prosthetic arm combines manual control with machine learning</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
<title>Facebook&#8217;s Artificial Intelligence Robots Shut Down After They Start Talking to Each Other in Their Own Language</title>
		<link>https://www.aiuniverse.xyz/facebooks-artificial-intelligence-roborts-shut-down-after-they-start-talking-to-each-other-in-their-own-language/</link>
					<comments>https://www.aiuniverse.xyz/facebooks-artificial-intelligence-roborts-shut-down-after-they-start-talking-to-each-other-in-their-own-language/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 01 Aug 2017 08:13:19 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Human Intelligence]]></category>
		<category><![CDATA[chatbots]]></category>
		<category><![CDATA[Facebook]]></category>
		<category><![CDATA[Facebook challenged]]></category>
		<category><![CDATA[Gadgets]]></category>
		<category><![CDATA[LANGUAGE]]></category>
		<category><![CDATA[Robots]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=415</guid>

					<description><![CDATA[<p>Source &#8211; independent.co.uk Facebook has shut down two artificial intelligences that appeared to be chatting to each other in a strange language only they understood. The two chatbots came to create their own changes to English that made it easier for them to work – but which remained mysterious to the humans that supposedly look after <a class="read-more-link" href="https://www.aiuniverse.xyz/facebooks-artificial-intelligence-roborts-shut-down-after-they-start-talking-to-each-other-in-their-own-language/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/facebooks-artificial-intelligence-roborts-shut-down-after-they-start-talking-to-each-other-in-their-own-language/">Facebook&#8217;s Artificial Intelligence Robots Shut Down After They Start Talking to Each Other in Their Own Language</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; <strong>independent.co.uk</strong></p>
<p>Facebook has shut down two artificial intelligences that appeared to be chatting to each other in a strange language only they understood.</p>
<p>The two chatbots came to create their own changes to English that made it easier for them to work – but which remained mysterious to the humans that supposedly look after them.</p>
<p>The bizarre discussions came as Facebook challenged its chatbots to try to negotiate with each other over a trade, attempting to swap hats, balls and books, each of which was given a certain value. But the talks quickly broke down as the robots appeared to chant at each other in a language that they each understood but which appears mostly incomprehensible to humans.</p>
<p>The robots had been instructed to work out how to negotiate between themselves, and improve their bartering as they went along.</p>
<p>The actual negotiations appear very odd, and don&#8217;t look especially useful:</p>
<div class="inline-block inline-block-infobox ines_infobox inline-block-center">
<div class="infobox-header"></div>
<div class="ines_infobox-content">
<p>Bob: i can i i everything else . . . . . . . . . . . . . .</p>
<p>Alice: balls have zero to me to me to me to me to me to me to me to me to</p>
<p>Bob: you i everything else . . . . . . . . . . . . . .</p>
<p>Alice: balls have a ball to me to me to me to me to me to me to me</p>
<p>Bob: i i can i i i everything else . . . . . . . . . . . . . .</p>
<p>Alice: balls have a ball to me to me to me to me to me to me to me</p>
<p>Bob: i . . . . . . . . . . . . . . . . . . .</p>
<p>Alice: balls have zero to me to me to me to me to me to me to me to me to</p>
<p>Bob: you i i i i i everything else . . . . . . . . . . . . . .</p>
<p>Alice: balls have 0 to me to me to me to me to me to me to me to me to</p>
<p>Bob: you i i i everything else . . . . . . . . . . . . . .</p>
<p>Alice: balls have zero to me to me to me to me to me to me to me to me to</p>
</div>
</div>
<p>But there appear to be some rules to the speech. The way the chatbots keep stressing their own names appears to be part of their negotiations, not simply a glitch in the way the messages are read out.</p>
<p>Indeed, some of these exchanges even ended up concluding successfully, despite being conducted entirely in the strange language.</p>
<p>That said, it&#8217;s unlikely that the language is a precursor to new forms of human speech, according to linguist Mark Liberman.</p>
<p>&#8220;In the first place, it&#8217;s entirely text-based, while human languages are all basically spoken (or gestured), with text being an artificial overlay,&#8221; he wrote on his blog. &#8220;And beyond that, it&#8217;s unclear that this process yields a system with the kind of word, phrase, and sentence structures characteristic of human languages.&#8221;</p>
<p>The chatbots also learned to negotiate in ways that seem very human. They would, for instance, pretend to be very interested in one specific item – so that they could later pretend they were making a big sacrifice in giving it up, according to a paper published by the Facebook Artificial Intelligence Research division.</p>
<p>Facebook&#8217;s experiment isn&#8217;t the only time that artificial intelligence has invented new forms of language.</p>
<p>Earlier this year, Google revealed that the AI it uses for its Translate tool had created its own language, which it would translate things into and then out of. But the company was happy with that development and allowed it to continue.</p>
<p>Another study at OpenAI found that artificial intelligence could be encouraged to create a language, making itself more efficient and better at communicating as it did so.</p>
<p>The post <a href="https://www.aiuniverse.xyz/facebooks-artificial-intelligence-roborts-shut-down-after-they-start-talking-to-each-other-in-their-own-language/">Facebook&#8217;s Artificial Intelligence Robots Shut Down After They Start Talking to Each Other in Their Own Language</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/facebooks-artificial-intelligence-roborts-shut-down-after-they-start-talking-to-each-other-in-their-own-language/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		
			</item>
	</channel>
</rss>
