<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>SENSORS Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/sensors/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/sensors/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Thu, 25 Mar 2021 06:28:13 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Deep-learning algorithm designs soft robots with sensors</title>
		<link>https://www.aiuniverse.xyz/deep-learning-algorithm-designs-soft-robots-with-sensors/</link>
					<comments>https://www.aiuniverse.xyz/deep-learning-algorithm-designs-soft-robots-with-sensors/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 25 Mar 2021 06:28:10 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[algorithm]]></category>
		<category><![CDATA[deep-learning]]></category>
		<category><![CDATA[designs]]></category>
		<category><![CDATA[Robots]]></category>
		<category><![CDATA[SENSORS]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13779</guid>

					<description><![CDATA[<p>Source &#8211; https://www.theweek.in/ Soft robots collect more useful information about their surroundings Creating soft robots has been a long-running challenge in robotics. Their rigid counterparts have a <a class="read-more-link" href="https://www.aiuniverse.xyz/deep-learning-algorithm-designs-soft-robots-with-sensors/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/deep-learning-algorithm-designs-soft-robots-with-sensors/">Deep-learning algorithm designs soft robots with sensors</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.theweek.in/</p>



<p>Soft robots collect more useful information about their surroundings</p>



<p>Creating soft robots has been a long-running challenge in robotics. Their rigid counterparts have a built-in advantage: a limited range of motion. Rigid robots&#8217; finite array of joints and limbs usually makes for manageable calculations by the algorithms that control mapping and motion planning.</p>



<p>A team of MIT researchers developed a deep learning neural network to aid the design of soft-bodied robots.</p>



<p>Soft-bodied robots are able to interact with people more safely or slip into tight spaces with ease, but they are not so tractable. For robots to reliably complete their programmed duties, they need to know the whereabouts of all their body parts. That&#8217;s a tall task for a soft robot that can deform in a virtually infinite number of ways.</p>



<p>The algorithm developed by the MIT researchers can help engineers design soft robots that collect more useful information about their surroundings. The deep-learning algorithm suggests an optimised placement of sensors within the robot&#8217;s body, allowing it to better interact with its environment and complete assigned tasks. The advance is a step toward the automation of robot design. &#8220;The system not only learns a given task, but also how to best design the robot to solve that task,&#8221; says Alexander Amini. &#8220;Sensor placement is a very difficult problem to solve. So, having this solution is extremely exciting.&#8221;</p>



<p>Soft-bodied robots are flexible and pliant—they generally feel more like a bouncy ball than a bowling ball. &#8220;The main problem with soft robots is that they are infinitely dimensional,&#8221; says co-author Andrew Spielberg. &#8220;Any point on a soft-bodied robot can, in theory, deform in any way possible.&#8221; That makes it tough to design a soft robot that can map the location of its body parts. Past efforts have used an external camera to chart the robot&#8217;s position and feed that information back into the robot&#8217;s control program. But the researchers wanted to create a soft robot untethered from external aid.</p>



<p>&#8220;You can&#8217;t put an infinite number of sensors on the robot itself,&#8221; says Spielberg. &#8220;So, the question is: How many sensors do you have, and where do you put those sensors in order to get the most bang for your buck?&#8221; The team turned to deep learning for an answer.</p>



<p>The researchers developed a novel neural network architecture that both optimises sensor placement and learns to efficiently complete tasks. First, the researchers divided the robot&#8217;s body into regions called &#8220;particles.&#8221; Each particle&#8217;s rate of strain was provided as an input to the neural network. Through a process of trial and error, the network &#8220;learns&#8221; the most efficient sequence of movements to complete tasks, like gripping objects of different sizes. At the same time, the network keeps track of which particles are used most often, and it culls the lesser-used particles from the set of inputs for the network&#8217;s subsequent trials.</p>
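<p>The train-and-cull loop described above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors&#8217; code: particle usage is simulated with random scores, and the particle count and 25% cull fraction are made-up parameters (the real network derives usage from training signals).</p>

```python
import numpy as np

rng = np.random.default_rng(0)
n_particles = 64                 # body regions ("particles"), illustrative
keep = np.arange(n_particles)    # particle indices still fed to the network

def train_round(active):
    """Stand-in for one training round: returns a per-particle usage
    score (in the real system, how heavily each input is relied on)."""
    return rng.random(active.size)

for _ in range(5):
    usage = train_round(keep)
    n_keep = max(1, int(0.75 * keep.size))         # cull the 25% least used
    keep = keep[np.argsort(usage)[::-1][:n_keep]]  # keep highest-usage ids

print(keep.size)  # surviving candidate sensor locations: 64 -> 15
```

<p>The indices that survive the culling are the candidate sensor sites; retraining on only those inputs is what lets the network &#8220;suggest&#8221; a placement.</p>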



<p>Spielberg says their work could help to automate the process of robot design. In addition to developing algorithms to control a robot&#8217;s movements, &#8220;we also need to think about how we&#8217;re going to sensorize these robots, and how that will interplay with other components of that system,&#8221; he says. And better sensor placement could have industrial applications, especially where robots are used for fine tasks like gripping. &#8220;That&#8217;s something where you need a very robust, well-optimized sense of touch,&#8221; says Spielberg. &#8220;So, there&#8217;s potential for immediate impact.&#8221;</p>



<p>&#8220;Automating the design of sensorised soft robots is an important step toward rapidly creating intelligent tools that help people with physical tasks,&#8221; says co-author Daniela Rus. &#8220;The sensors are an important aspect of the process, as they enable the soft robot to &#8216;see&#8217; and understand the world and its relationship with the world.&#8221;</p>



<p>The research will be presented during April&#8217;s IEEE International Conference on Soft Robotics and will be published in the journal IEEE Robotics and Automation Letters.</p>



<p>The post <a href="https://www.aiuniverse.xyz/deep-learning-algorithm-designs-soft-robots-with-sensors/">Deep-learning algorithm designs soft robots with sensors</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/deep-learning-algorithm-designs-soft-robots-with-sensors/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>ARTIFICIAL INTELLIGENCE, IOT SENSORS TECH, ABOARD NASA’S PERSEVERANCE ROVER</title>
		<link>https://www.aiuniverse.xyz/artificial-intelligence-iot-sensors-tech-aboard-nasas-perseverance-rover/</link>
					<comments>https://www.aiuniverse.xyz/artificial-intelligence-iot-sensors-tech-aboard-nasas-perseverance-rover/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 25 Feb 2021 05:27:06 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[NASA’S]]></category>
		<category><![CDATA[PERSEVERANCE]]></category>
		<category><![CDATA[SENSORS]]></category>
		<category><![CDATA[Tech]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13076</guid>

					<description><![CDATA[<p>Source &#8211; https://www.analyticsinsight.net/ Apart from its Bull’s Eye Landing, What’s so Unique about Perseverance Rover Mission from NASA? Last Thursday, NASA’s Perseverance rover grabbed headlines all around <a class="read-more-link" href="https://www.aiuniverse.xyz/artificial-intelligence-iot-sensors-tech-aboard-nasas-perseverance-rover/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-iot-sensors-tech-aboard-nasas-perseverance-rover/">ARTIFICIAL INTELLIGENCE, IOT SENSORS TECH, ABOARD NASA’S PERSEVERANCE ROVER</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.analyticsinsight.net/</p>



<p>Apart from its Bull’s Eye Landing, What’s so Unique about Perseverance Rover Mission from NASA?</p>



<p>Last Thursday, NASA’s Perseverance rover grabbed headlines all around the world with its historic landing on the Martian surface. The rover, which was launched on July 30, 2020, from Cape Canaveral Air Force Station in Florida atop a ULA Atlas V 541 rocket, finally touched down at 3.44pm ET (8.44pm GMT). The mission objective is to search for signs of ancient life and to collect rock and soil samples for possible return to Earth, where they will be scanned for the presence of microbial life.</p>



<h3 class="wp-block-heading"><strong>Why Jezero?</strong></h3>



<p>The name Perseverance was suggested by Alex Mather, who won a K-12 public naming contest that drew 28,000 entries. Named after the human characteristic, it follows the same naming scheme as its predecessors: Curiosity, Spirit and Opportunity. The rover touched down at an ancient river delta site in Jezero Crater.</p>



<p>Jezero, which means lake in several Balkan languages, is believed to have been a water-filled lake nearly 4 billion years ago. It is reported that this Martian lake was as big and wet as Lake Tahoe on the Nevada–California border. The deposits in the crater are rich in clay minerals, which form in the presence of water, meaning life may once have existed there; such sediments on Earth have been known to store microscopic fossils. The rover will start at the base of the delta cliffs, move across the delta towards what was possibly a shoreline, then climb the 610-metre crater rim. It will take about two Earth years (one Martian year) to complete half this journey.</p>



<p>But the journey is not smooth sailing. The Jezero Crater is full of obstacles and dangers, including boulders, cliffs, sand dunes and depressions, any one of which could end the mission, whether during landing or as the rover drives across the surface.</p>



<h3 class="wp-block-heading"><strong>Perseverance vs Curiosity</strong></h3>



<p>One of the significant upgrades from Curiosity to Perseverance is its wheels, which have a gentler tread pattern and a larger diameter. The upgrade is meant to prevent the rover from getting stuck in fine Martian sand, as Curiosity did in 2014, and to protect it from sharp Martian rocks called ventifacts.</p>



<p>Both Curiosity and Perseverance share the same basic body (the WEB, or Warm Electronics Box) and the same type of power source (a radioisotope thermoelectric generator), and both landed using a spectacular overhead sky-crane strategy. According to Matt Wallace, the deputy mission manager at NASA’s Jet Propulsion Lab, Perseverance can drive three times faster than any previous Mars rover. Even so, its maximum speed is only 4.4 cm per second, about one-thirtieth of human walking speed.</p>



<p>The rover is powered by 4.8 kilograms of plutonium-238, which supplies 110 watts of electrical power continuously. A pair of lithium-ion batteries works in tandem with the RTG to handle peak demand.</p>



<p>The last NASA spacecraft to land on Mars was the InSight robotic probe, which touched down on 26 November 2018. While there was no live video, the landing was monitored through telemetry clocking the probe’s velocity. To date, NASA has succeeded in eight of its nine landing attempts, making the US the only country to achieve a lasting, successful touchdown. Close behind is China, whose Tianwen-1 mission, along with the UAE’s Hope orbiter, reached Martian orbit a few days earlier. While Tianwen-1’s rover will attempt to land on Mars in May, Hope will remain in orbit to study the Martian atmosphere.</p>



<p>The Soviet Union made the first impact on Mars on 27 November 1971, though that spacecraft was destroyed on arrival. The Soviets did manage a soft landing a few days later, but the lander went silent a little over a minute and a half into transmitting an image back to Earth.</p>



<h3 class="wp-block-heading"><strong>Interesting Technologies Aboard Perseverance</strong></h3>



<p>It is natural to assume that the rover will carry out a number of experiments during its mission. These experiments will leverage a blend of technologies like IoT sensors, robotics, cloud, artificial intelligence and more.</p>



<p>For instance, Perseverance features an automated hazard avoidance system called Terrain Relative Navigation (TRN). The TRN, or Landing Vision System (LVS), compares real-time images from its camera with an onboard map of the surface in Jezero Crater, Perseverance’s landing site. The map is created from high-definition orbital images of the crater area. If the rover is heading for a hazardous obstacle, it can fire its retrorockets and avoid the hazard.</p>
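<p>In spirit, terrain-relative navigation is a template-matching problem: find where the camera’s current view best fits inside the stored orbital map. The sketch below is an illustrative assumption, not NASA’s LVS code: it uses random synthetic “terrain” and brute-force sum-of-squared-differences matching.</p>

```python
import numpy as np

rng = np.random.default_rng(1)
orbital_map = rng.random((100, 100))   # stand-in for the onboard crater map
true_r, true_c = 40, 63
# camera frame: an exact 16x16 crop of the map at the (unknown) true pose
frame = orbital_map[true_r:true_r + 16, true_c:true_c + 16]

best, best_pos = np.inf, None
for r in range(orbital_map.shape[0] - 15):
    for c in range(orbital_map.shape[1] - 15):
        patch = orbital_map[r:r + 16, c:c + 16]
        score = np.sum((patch - frame) ** 2)   # SSD: 0 means perfect match
        if score < best:
            best, best_pos = score, (r, c)

print(best_pos)  # estimated position of the camera view within the map
```

<p>The best-matching offset is the localization fix; a real system must also handle scale, rotation, and lighting differences between descent images and the orbital map.</p>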



<p>Perseverance is equipped with a multi-functional instrument called SuperCam. SuperCam contains three separate spectrometers. One of them is called LIBS, or Laser-Induced Breakdown Spectroscopy. The rover also has a drill that uses rotary motion with or without percussion to penetrate the Martian surface to collect the precious samples. The drill is equipped with three different kinds of attachments that allow sample collection and surface analysis. SuperCam also has a microphone. The microphones onboard Perseverance will help scientists record the sounds of its tense “seven minutes of terror” touchdown sequence.</p>



<p>NASA installed a microphone on its Mars Polar Lander spacecraft, and built one into the Phoenix lander’s descent camera. Sadly, neither mic returned any data. Mars Polar Lander crashed during its touchdown attempt in December 1999, while Phoenix’s descent camera was never turned on due to concerns that its use could complicate the entry, descent and landing (EDL) process. Phoenix, which landed in May 2008, did find buried water ice during its successful surface mission.</p>



<p>The rover also has a ground-penetrating radar that can detect water up to 10 meters deep. It also carries multiple science experiments, including an electrically powered oxygen generator called MOXIE (Mars Oxygen In-Situ Resource Utilization Experiment), which converts CO2 from the atmosphere into oxygen, several scientific cameras, and a special sensor that keeps the rover from contacting the surface incorrectly, reducing the chance of damage.</p>



<p>During the course of its two-year mission, Perseverance will collect up to 43 samples of Martian rock and soil. These samples will be stowed in white tubes on the Martian surface to be returned to Earth on a future planned mission. Official sources report that the sample tubes will be placed in caches on the surface and their locations catalogued. Orbiter images will identify the sample locations to within one meter (3 ft), and the rover itself will improve that accuracy to within one centimeter. Perseverance also carries five “witness tubes,” which will be used to capture molecular and particulate contaminants during drilling sessions.</p>



<p>Collecting samples is a key part of planetary geology and other space sciences. Recently, Japan’s Hayabusa2 spacecraft collected samples from the asteroid Ryugu and returned them to Earth, and NASA’s OSIRIS-REx just successfully collected samples from the asteroid Bennu. Those samples will be returned to Earth in September 2023.</p>



<p>Its robotic arm will also use an artificial intelligence powered device called the Planetary Instrument for X-ray Lithochemistry, or PIXL. PIXL is a lunchbox-size instrument carried at the end of Perseverance’s 7-foot-long robotic arm. Using a coring drill on the end of the arm, the rover will collect core samples that will be left on the Martian surface for collection by a future mission. By pointing its tiny, powerful X-ray beam at a rock, PIXL can detect more than 20 chemical elements; the beam produces a telltale glow for each element present in about 10 seconds.</p>



<p>PIXL also has a partner called Scanning Habitable Environments with Raman and Luminescence for Organics and Chemicals, or SHERLOC. SHERLOC will seek out organic molecules and minerals, which helps inform the science teams of where to collect and cache samples. Its ultraviolet laser produces a different glow depending on the organic molecules and minerals it detects. And, elementarily, SHERLOC has its WATSON even on the Martian surface: WATSON, or Wide-Angle Topographic Sensor for Operations and eNgineering, is a camera that can take microscopic images of grains and textures in rock.</p>



<p>Before astronauts set foot on Mars, or even enter its atmosphere, it is crucial to understand the weather and environmental conditions they will face. Perseverance’s monitoring system, the Mars Environmental Dynamics Analyzer (MEDA), is a suite of sensors that will help scientists study weather, dust and radiation, and how they change over the Martian seasons.</p>



<p>Meanwhile, artificial intelligence is used aboard Perseverance for navigation on the planet’s surface. Heather Justice, robotic operations downlink lead at JPL, notes that the Perseverance rover uses autonomous technology similar to that found in self-driving cars on Earth. This technology, officially known as the vision compute element (VCE), enables something called “thinking while driving”. It helped Perseverance land itself on Mars and will help it avoid hazards on the Martian surface.</p>



<p>Perseverance also leverages technologies from HPE and Amazon Web Services (AWS). HPE’s specialized, second-generation Spaceborne Computer-2 (SBC-2) will mark the first time that broad AI and edge computing capabilities are available to researchers on the space station. SBC-2 follows the original Spaceborne Computer-1, which was sent to the International Space Station (ISS) in 2017 as part of a validation study to test it against the rigors of space aboard the orbiting laboratory. SBC-1 returned to Earth in 2019 after completing its mission. Both Spaceborne Computer-1 and Spaceborne Computer-2 are sponsored by the ISS National Lab.</p>



<p>Given the massive volume of data the Perseverance rover will generate, the Jet Propulsion Lab will store, process, and distribute it using AWS cloud services.</p>



<h3 class="wp-block-heading"><strong>Helicopter Ingenuity</strong></h3>



<p>The Perseverance rover will not be alone on Mars. Tagging along is a tiny helicopter drone called Ingenuity. Running on Linux, Ingenuity won’t actually do any science, but it will provide important feedback on flight operations in the thin Martian atmosphere. If successful, it will be the first powered flight on any planet other than Earth, and it could serve as the blueprint for future Mars missions.</p>



<p>Designed to last about 30 Martian sols, Ingenuity weighs about 2 kg, is 1.2 meters wide, and carries two computers. The drone is more of an experiment than a piece of equipment for scientific discovery; engineers want to test its ability to fly autonomously. It can travel almost 300 meters and hover 3 to 4.5 meters above the ground for 90 seconds.</p>



<h3 class="wp-block-heading"><strong>What’s Next?</strong></h3>



<p>Mars missions are difficult, with a success rate of only about 40% so far. That has not deterred NASA from planning future missions to the Red Planet and beyond. For instance, NASA has proposed sending a similar but more capable quadcopter drone to Titan, one of Saturn’s moons, in 2027. The mission, called Dragonfly, will be nuclear-powered, with the ability to fly many kilometers before landing to recharge its batteries.</p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-iot-sensors-tech-aboard-nasas-perseverance-rover/">ARTIFICIAL INTELLIGENCE, IOT SENSORS TECH, ABOARD NASA’S PERSEVERANCE ROVER</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/artificial-intelligence-iot-sensors-tech-aboard-nasas-perseverance-rover/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
