<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>environmental Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/environmental/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/environmental/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Thu, 25 Feb 2021 06:03:58 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.1</generator>
	<item>
		<title>Using Big Data to Measure Environmental Inclusivity in Cities</title>
		<link>https://www.aiuniverse.xyz/using-big-data-to-measure-environmental-inclusivity-in-cities/</link>
					<comments>https://www.aiuniverse.xyz/using-big-data-to-measure-environmental-inclusivity-in-cities/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 25 Feb 2021 06:03:56 +0000</pubDate>
				<category><![CDATA[Big Data]]></category>
		<category><![CDATA[Cities]]></category>
		<category><![CDATA[environmental]]></category>
		<category><![CDATA[Inclusivity]]></category>
		<category><![CDATA[Measure]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13094</guid>

					<description><![CDATA[<p>Source- https://eos.org/ Lower-income urban communities bear the brunt of environmental burdens, even in wealthy green cities around the world. The trouble with comparing cities, researchers have found, is you end up comparing apples and oranges—coasts and interiors, seasonal freezes and yearlong tropical humidity, strictly planned communities and suburban sprawl. It’s even more problematic than that <a class="read-more-link" href="https://www.aiuniverse.xyz/using-big-data-to-measure-environmental-inclusivity-in-cities/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/using-big-data-to-measure-environmental-inclusivity-in-cities/">Using Big Data to Measure Environmental Inclusivity in Cities</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source- https://eos.org/</p>



<p>Lower-income urban communities bear the brunt of environmental burdens, even in wealthy green cities around the world.</p>



<p>The trouble with comparing cities, researchers have found, is that you end up comparing apples and oranges—coasts and interiors, seasonal freezes and yearlong tropical humidity, strictly planned communities and suburban sprawl. It’s even more problematic than that because a city is not defined by a single uniform identity: each city comprises a unique blend of neighborhoods where social and environmental conditions can change from street to street.</p>



<p>Given this complexity, how can we possibly assess global progress toward Sustainable Development Goal 11 (SDG 11), making cities inclusive, safe, resilient and sustainable? Can we measure these concepts universally?</p>



<p>Now an international research team has developed an approach using publicly available big data. The tool—the Urban Environment and Social Inclusion Index (UESI)—can assess environmental conditions at the scale of individual neighborhoods.</p>



<p>Applying UESI to 164 cities spread across all continents (excluding Antarctica), the researchers found that most cities leave lower-income communities with higher shares of environmental burdens and lower shares of environmental benefits. Interestingly, stark inequalities are also seen in many cities with high overall environmental performance—wealthy cities that regularly receive plaudits for their green credentials.</p>



<p>“Copenhagen, Paris, and London, even if they’re doing really well overall on environmental indicators—so their air pollution levels are low, they have green space—they’re not providing amenities and environment benefits equally amongst all their citizens,” said Angel Hsu of the University of North Carolina at Chapel Hill, who led the research.</p>



<h3 class="wp-block-heading"><strong>Targets Are Cheap, Data Are Expensive</strong></h3>



<p>By 2050, two-thirds of the global population—6.5 billion people—will reside in urban areas. With that trend comes a number of challenges with consequences for both the planet and urban dwellers.</p>



<p>SDG 11 was developed to address our increasingly urban population and has 10 specific targets for 2030. Many of these have an environmental component. Targets include all city residents having access to affordable public transport (11.2), adequate waste management systems (11.6), and green spaces (11.7). There is also a crosscutting focus on protecting vulnerable communities exposed to urban hazards such as air pollution.</p>



<p>Few would disagree with these inclusive targets, but it is a challenge to quantify progress in the real world. “The word ‘inclusive’ in itself is very nebulous; what do we really mean by inclusivity? And then you also have to think about how we actually go about measuring it,” said Hsu.</p>



<p>Cost is another challenge. It is expensive to collect granular data at local levels, especially for cities in developing nations. As a result, reported data are inconsistent, and comparing urban areas in different parts of the world can be misleading.</p>



<p>“Rankings can lead to perverse incentives to hide data to avoid negative publicity,” said Jacqueline Klopp, codirector of the Center for Sustainable Urban Development at Columbia University, who was not involved in the UESI project. Klopp recalled how Dakar, Senegal, was labeled as having some of the worst air pollution in Africa, purely because local officials had made the progressive move of making air pollution data public.</p>



<h3 class="wp-block-heading"><strong>Big Data Solution</strong></h3>



<p>To tackle this information deficit, Hsu’s team at the Data-Driven Lab turned to the skies. The group’s UESI tool tracks surface conditions using satellite data, mainly from the Landsat program and the MODIS (Moderate Resolution Imaging Spectroradiometer) instrument aboard NASA’s Terra satellite. Crowdsourced transport data are collected from OpenStreetMap. By integrating these data with demographic information, UESI can evaluate neighborhoods around the world in a consistent fashion.</p>
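<p>As a rough illustration of the kind of neighborhood-level integration UESI performs, the sketch below joins a satellite-derived exposure indicator with demographic data and checks whether burdens concentrate in poorer areas. The neighborhood names, the numbers, and the simple split-by-income comparison are all invented for illustration and are not the actual UESI methodology:</p>

```python
# Hypothetical sketch: join a satellite-derived environmental indicator
# (here, PM2.5 exposure) with demographic data per neighborhood, then
# check whether burdens fall disproportionately on lower-income areas.

neighborhoods = [
    # (name, median_income, pm25_exposure)
    ("Riverside", 28_000, 18.2),
    ("Old Town", 55_000, 12.1),
    ("Hillcrest", 91_000, 8.4),
]

def burden_gap(rows):
    """Mean PM2.5 exposure of the poorer half of neighborhoods minus
    that of the wealthier half (positive = poorer areas breathe worse air)."""
    ranked = sorted(rows, key=lambda r: r[1])          # sort by income
    half = len(ranked) // 2
    poorer, wealthier = ranked[:half], ranked[half:]
    mean = lambda rs: sum(r[2] for r in rs) / len(rs)
    return mean(poorer) - mean(wealthier)

print(round(burden_gap(neighborhoods), 2))  # positive gap: 7.95
```

<p>A production index would normalize several indicators (air quality, green space, transport access) and use a proper concentration measure rather than a median split, but the join of environmental and demographic data per neighborhood is the core idea.</p>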



<p>In the study, published in <em>Frontiers in Sustainable Cities</em>, participating cities are placed within a four-quadrant plot. This plot indicates each city’s overall environmental performance as well as its environmental inclusivity.</p>
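<p>The quadrant placement itself can be sketched as a simple two-axis classification. The scores and the cutoff below are invented for illustration; the study’s actual index construction is more involved:</p>

```python
# Hypothetical sketch: place a city in one of four quadrants based on an
# environmental-performance score and an inclusivity (equity) score.

def quadrant(performance, inclusivity, cutoff=50.0):
    """Classify a city relative to a score cutoff on both axes."""
    high_perf = performance >= cutoff
    high_incl = inclusivity >= cutoff
    if high_perf and high_incl:
        return "high performance, high inclusivity"
    if high_perf:
        return "high performance, low inclusivity"
    if high_incl:
        return "low performance, high inclusivity"
    return "low performance, low inclusivity"

# Invented example scores, not actual UESI results:
print(quadrant(72.0, 64.0))  # high performance, high inclusivity
print(quadrant(68.0, 35.0))  # high performance, low inclusivity
```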



<p>Only a handful of cities scored highly on both environmental performance and environmental inclusivity, including Stockholm, Sweden; Darwin, Australia; Quito, Ecuador; and Freetown, Sierra Leone. No city with a population greater than 10 million people made it into this category.</p>



<p>More than half of the 164 cities fell into the category of good overall performance with lower-income neighborhoods nonetheless bearing a disproportionate share of environmental burdens. This group includes Seattle, Copenhagen, and Melbourne—all cities that regularly appear on “livable cities” lists.</p>



<p>Reasons for these inequalities are linked with local contexts. In inner-city communities, air pollution is often worse, and a lack of tree cover and green space can result in dangerously high summer temperatures through the urban heat island effect. There are exceptions to this pattern in emerging economies like China and India, however, where inner-city neighborhoods are occupied by wealthier communities.</p>



<p>Full results can be seen on the Data-Driven Lab website, and Hsu said that other locations can be added if city representatives provide basic demographic data.</p>



<p>“The UESI has some important advantages in helping to meet the challenges of measuring progress towards urban elements of the SDGs,” said David Simon, a development geography researcher at Royal Holloway, University of London, who was not involved in the new research. Simon stresses, however, that ground truthing will remain essential, given that many phenomena are invisible to remotely sensed imagery.</p>



<p>To extend the research, Hsu’s team plans to add higher-resolution pollution data and to develop more nuanced metrics, such as recognizing informal transport networks in developing countries.</p>



<p>The post <a href="https://www.aiuniverse.xyz/using-big-data-to-measure-environmental-inclusivity-in-cities/">Using Big Data to Measure Environmental Inclusivity in Cities</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/using-big-data-to-measure-environmental-inclusivity-in-cities/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Here&#8217;s how Google is putting AI to work in healthcare, environmental conservation, agriculture and more</title>
		<link>https://www.aiuniverse.xyz/heres-how-google-is-putting-ai-to-work-in-healthcare-environmental-conservation-agriculture-and-more/</link>
					<comments>https://www.aiuniverse.xyz/heres-how-google-is-putting-ai-to-work-in-healthcare-environmental-conservation-agriculture-and-more/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 16 Jul 2019 09:51:05 +0000</pubDate>
				<category><![CDATA[Google AI]]></category>
		<category><![CDATA[Agriculture]]></category>
		<category><![CDATA[conservation]]></category>
		<category><![CDATA[environmental]]></category>
		<category><![CDATA[Google]]></category>
		<category><![CDATA[Healthcare]]></category>
		<category><![CDATA[putting]]></category>
		<category><![CDATA[work]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4043</guid>

					<description><![CDATA[<p>Source:digit.in Earlier this year, Microsoft had invited us to its Bengaluru campus for a two-day briefing on how it&#8217;s incorporating artificial intelligence (AI) in many of its business solutions, including Azure, Power BI, Teams, and Office 365. In addition to letting a few of its business partners explain how these AI-enabled services help them, the <a class="read-more-link" href="https://www.aiuniverse.xyz/heres-how-google-is-putting-ai-to-work-in-healthcare-environmental-conservation-agriculture-and-more/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/heres-how-google-is-putting-ai-to-work-in-healthcare-environmental-conservation-agriculture-and-more/">Here&#8217;s how Google is putting AI to work in healthcare, environmental conservation, agriculture and more</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source:digit.in</p>



<p>Earlier this year, Microsoft invited us to its Bengaluru campus for a two-day briefing on how it&#8217;s incorporating artificial intelligence (AI) into many of its business solutions, including Azure, Power BI, Teams, and Office 365. In addition to letting a few of its business partners explain how these AI-enabled services help them, the Redmond-based software giant demonstrated its Garage-developed apps such as Kaizala, Seeing AI, and Soundscape.</p>



<p>In a style quite similar to Microsoft&#8217;s, Google invited us to its Roppongi Hills office in Tokyo earlier this week for a one-day briefing titled “Solve… with AI”, headed by Jeff Dean, a Senior Fellow and AI Lead at Google. While Microsoft&#8217;s briefing mostly revolved around solutions to IT business challenges, Google&#8217;s addressed solutions aimed at “social good”. Product leads from Google AI explained how the company&#8217;s technology is being put to use in areas like healthcare, environmental conservation, and agriculture, and Google invited a few of its business partners to add input and examples during the briefing.</p>



<h4 class="wp-block-heading"><strong>Introduction</strong></h4>



<p>The briefing began with Dean delivering the keynote address, in which he explained the basics of machine learning (ML), a subset of AI that involves training a computer to recognise patterns by example rather than programming it with specific rules. He explained how neural networks, built from relatively simple mathematical functions, can be trained to identify patterns that are too vast or complex for humans to spot, and how ML models are developed for this purpose.</p>
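<p>The idea of training by example rather than by rule can be shown with a toy sketch: a single artificial neuron learning the logical AND function from labelled examples. This is far simpler than the networks Dean described, but the principle is the same:</p>

```python
# Toy sketch: a single neuron learns AND from labelled examples,
# instead of being programmed with an explicit rule.

examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # learned weights
b = 0.0          # learned bias
lr = 0.1         # learning rate

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):                  # a few passes over the data
    for x, target in examples:
        error = target - predict(x)  # perceptron update rule
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in examples])  # [0, 0, 0, 1]
```

<p>No rule for AND is ever written down; the weights simply shift until the neuron’s outputs match the examples.</p>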



<p>Apart from employing them in its own products, Google offers ML tools, along with reference implementation information, to researchers and developers building AI-enabled software. Examples include the open-source TensorFlow software library, the Cloud ML platform, and the Cloud Vision, Cloud Translate, Cloud Speech, and Cloud Natural Language APIs. Google also incorporates ML models in its own offerings, including Search, Photos, Translate, Gmail, YouTube, and Chrome.</p>



<p>Dean used the example of an air quality monitoring tool called Air Cognizer to demonstrate how TensorFlow is used in everyday mobile app development. Air Cognizer, an app developed in India as part of Celestini Project India 2018, estimates the air quality of the surrounding area from a picture taken with the Android device’s camera. Dean noted that this was only one example of developers and researchers using Google’s machine learning tools to create AI-enabled apps and services. After his introduction, other Google AI team leaders took the stage one by one to talk about other areas in which Google’s ML efforts are making a difference.</p>



<h4 class="wp-block-heading"><strong>Healthcare</strong></h4>



<p>Lily Peng, Product Manager for Google Health, came on stage after Dean&#8217;s introduction to talk about how Google&#8217;s AI ventures help in the field of healthcare. “We believe that technology can have a big impact in medicine, helping democratize access to care, returning attention to patients and helping researchers make scientific discoveries,” she said during her presentation. She supported her statement by citing three specific areas in which Google&#8217;s ML models are seeing success: lung cancer screening, breast cancer metastases detection, and diabetic eye disease detection.</p>



<p>Google&#8217;s ML model can, according to the company, analyse CT scans and predict lung malignancies in cancer screening tests. In tests conducted by Google, the model detected 5 percent more cancer cases while reducing false positives by over 11 percent compared to radiologists. According to Google, early diagnosis can go a long way in treating the deadly disease, but over 80 percent of lung cancers are not caught early.</p>



<p>In breast cancer metastases detection, Google says its ML model can find 95 percent of cancer lesions in pathology slides (each of which can be up to 10 gigapixels in size), while pathologists, Google claims, generally detect only 73 percent. Google says the model also produces fewer false positives than doctors, and that it has found the combination of pathologists and AI to be more accurate than either alone.</p>



<p>Google says that, with the help of its sister company Verily, it&#8217;s becoming increasingly successful in treating diabetic retinopathy. The company is currently piloting the use of its ML model for detecting cases of diabetic retinopathy in India and Thailand. Google believes a shortage of doctors and specialised equipment in many places is one of the reasons the disease isn&#8217;t caught early, leading to lifelong blindness among patients.</p>



<h4 class="wp-block-heading"><strong>Environmental conservation</strong></h4>



<p>Julie Cattiau, a Product Manager at Google AI, explained how wildlife on the planet has decreased by 58 percent over the past half-century. According to her, Google&#8217;s AI technology is currently helping conservationists track the sound of humpback whales, an at-risk marine species, to keep them from being lost to extinction. In one bioacoustics project, Google has partnered with NOAA (the National Oceanic and Atmospheric Administration), which has collected over 19 years&#8217; worth of underwater audio data so far.</p>



<p>Google says it was able to train its neural network (or “whale classifier”) to identify the call of a humpback whale within that 19-year audio data set. During her presentation, Cattiau said this was a big challenge for the researchers, partly because the sound of a humpback whale can easily be mistaken for that of another type of whale or of passing ships. Google believes its AI technology was helpful here because listening for a whale call in a data set that vast would take a human being an inordinate amount of time.</p>
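<p>The underlying detection idea can be illustrated with a toy matched filter that slides a known call template across a noisy recording. Real bioacoustic classifiers like Google’s work on spectrograms with deep networks; the pure tone and the numbers below are invented for illustration:</p>

```python
import math
import random

# Toy sketch: find where a known call (here, a pure 440 Hz tone) occurs
# inside a longer noisy recording, using a sliding matched filter.

random.seed(0)
rate = 8000                                   # samples per second
call = [math.sin(2 * math.pi * 440 * t / rate) for t in range(rate // 10)]

signal = [random.gauss(0.0, 0.3) for _ in range(rate)]   # 1 s of noise
start = rate // 2                             # bury the call at 0.5 s
for i, s in enumerate(call):
    signal[start + i] += s

def best_match(sig, template, hop=80):
    """Return the offset where the template correlates most strongly."""
    best, best_score = 0, float("-inf")
    for off in range(0, len(sig) - len(template) + 1, hop):
        score = sum(sig[off + i] * template[i] for i in range(len(template)))
        if score > best_score:
            best, best_score = off, score
    return best

print(best_match(signal, call))               # recovers the 0.5 s offset: 4000
```

<p>Here the template is known exactly, which is what makes the search easy; the hard part of the real project is that humpback calls vary and resemble other sounds, which is why a learned classifier is needed instead of a fixed template.</p>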



<p>Topher White, the CEO of Rainforest Connection, was one of the many partners invited by Google to participate in the briefing. Using proprietary technology, Rainforest Connection fights illegal deforestation by listening for the sounds of chainsaws and logging trucks in rainforests across ten countries and alerting local authorities. Its system uses refurbished, solar-charged Android smartphones running Google TensorFlow to analyse audio data in real time from within a rainforest. According to White, deforestation is a bigger contributor to climate change than vehicle pollution.</p>



<p>Febriadi Pratama, the Co-Founder of Gringgo Indonesia Foundation, was another of the partners invited by Google for the briefing. The foundation, a recipient of the Google AI Impact Challenge, is currently using Google&#8217;s ML models and image recognition to identify types of waste material in the Indonesian city of Denpasar. Pratama said during his speech that the project was effectively helping the foundation collect plastic in a city with no formal system for waste management.</p>



<h2 class="wp-block-heading"><strong>Agriculture</strong></h2>



<p>Raghu Dharmaraju, Vice President of Products &amp; Programs at the Wadhwani Institute for Artificial Intelligence, was also one of the partners invited by Google to participate in the briefing. The institute uses a proprietary Android app, along with pheromone traps, to scan crop samples for signs of pests, which on a large farm in India can potentially wreck a farmer&#8217;s harvest. The app uses ML models developed by Google. In his presentation, Dharmaraju said the institute&#8217;s solution was notably effective at detecting pink bollworms in cotton crops in India.</p>



<h2 class="wp-block-heading"><strong>Flood forecasting</strong></h2>



<p>Sella Nevo, a Software Engineering Manager at Google AI, took the stage to talk about the company&#8217;s flood forecasting initiative. According to him, dated, low-resolution elevation maps make it hard to predict floods in any given area: SRTM (the Shuttle Radar Topography Mission), a common source of elevation data, offers data that&#8217;s nearly two decades old, he said during his presentation. In a pilot project started last year in Patna, Google produced high-resolution elevation maps using its ML models, fed with data taken from satellites and other sources, in order to forecast floods. It was then able to alert its users about a flood incident at Gandhi Ghat, with the alert sent out as a notification on smartphones.</p>



<p>“The number one issue is access to data, and we have tried to tackle that. With different types of data, we find different solutions. So, for the elevation maps, the data just doesn&#8217;t exist. So we worked on different algorithms to produce and create that data for stream gauge measurements. For various satellite data, we purchased and aggregated most of it,” Nevo told us in an interview. According to him, Google is trying to produce elevation maps that can be updated every year, unlike those derived from SRTM.</p>



<h2 class="wp-block-heading"><strong>Accessibility</strong></h2>



<p>Sagar Savla, a Product Manager at Google AI, took the stage to talk about Google&#8217;s Live Transcribe app. Available in 70 languages, the app helps deaf and hard-of-hearing users communicate with others by transcribing real-world speech to on-screen text. The app uses Google&#8217;s ML models to keep its transcription precise: for example, it can tell whether the user means “New Jersey” or “a new jersey” from the context of the sentence. Talking about the app and its development, Savla said he had used it with his grandmother, who, despite being hard of hearing, was able to join in on the conversation using Live Transcribe in Gujarati.</p>
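<p>Context-based disambiguation of this kind can be sketched with a tiny bigram language model that scores two candidate transcripts against a corpus. The corpus and sentences below are invented, and real systems use far larger neural language models, but the principle of letting surrounding words decide is the same:</p>

```python
import math
from collections import Counter

# Toy sketch: choose between two acoustically similar transcripts by
# scoring each with a bigram language model built from a tiny corpus.

corpus = (
    "i flew to new jersey last week . "
    "she bought a new jersey for the match . "
    "he moved to new jersey . "
    "i need a new jersey for football ."
).split()

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
vocab = len(unigrams)

def log_prob(words):
    """Add-one-smoothed bigram log-probability of a word sequence."""
    return sum(
        math.log((bigrams[(a, b)] + 1) / (unigrams[a] + vocab))
        for a, b in zip(words, words[1:])
    )

left = "he moved to new jersey".split()
right = "he moved to a new jersey".split()
print(log_prob(left) > log_prob(right))  # the corpus favours "to new jersey"
```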



<p>Julie Cattiau returned to the stage to talk about Project Euphonia, a Google initiative dedicated to building speech models trained to understand people with impaired speech. The initiative could in the future combine speech with computer vision, she said during her presentation: for example, people with speech impairments caused by neurological conditions could use gestures such as blinking to communicate with others. Cattiau said the company&#8217;s ML models are currently being trained to recognise more gestures.</p>



<h2 class="wp-block-heading"><strong>Cultural Preservation</strong></h2>



<p>Tarin Clanuwat, a Project Researcher at the ROIS-DS Center for Open Data in the Humanities, went on stage to talk about an ancient cursive Japanese script called Kuzushiji. Although there are millions of books and over a billion historical documents recorded in Kuzushiji, less than 0.01 percent of the population can read it fluently today, she said during her presentation. She fears this cultural heritage is at risk of becoming inaccessible owing to the script&#8217;s disuse in modern texts.</p>



<p>Google says Clanuwat and her fellow researchers trained an ML model to recognise Kuzushiji characters and transcribe them into modern Japanese. According to Google, the model takes approximately two seconds to transcribe an entire page and roughly an hour to transcribe an entire book. On test data, the model currently detects about 2,300 character types with an average accuracy of 85 percent. Clanuwat and her team are working to improve the model in order to preserve the cultural heritage captured in Kuzushiji texts.</p>



<h2 class="wp-block-heading"><strong>Summary</strong></h2>



<p>Google seems convinced it’s headed in the right direction when it comes to applying machine learning to social causes. We can expect Google to take on more such projects, training neural networks on data sets that hold clues to hitherto insoluble problems. At the same time, more developers and researchers should be able to incorporate Google’s open-source TensorFlow library in their own projects, as long as Google continues to provide support and reference material for it.</p>
<p>The post <a href="https://www.aiuniverse.xyz/heres-how-google-is-putting-ai-to-work-in-healthcare-environmental-conservation-agriculture-and-more/">Here&#8217;s how Google is putting AI to work in healthcare, environmental conservation, agriculture and more</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/heres-how-google-is-putting-ai-to-work-in-healthcare-environmental-conservation-agriculture-and-more/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Google AI helping scientists track endangered whales, releases whale soundtracks</title>
		<link>https://www.aiuniverse.xyz/google-ai-helping-scientists-track-endangered-whales-releases-whale-soundtracks/</link>
					<comments>https://www.aiuniverse.xyz/google-ai-helping-scientists-track-endangered-whales-releases-whale-soundtracks/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 07 Jun 2019 06:27:12 +0000</pubDate>
				<category><![CDATA[Google AI]]></category>
		<category><![CDATA[biggest humanitarian]]></category>
		<category><![CDATA[endangered whales]]></category>
		<category><![CDATA[environmental]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[NOAA]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=3587</guid>

					<description><![CDATA[<p>Source:- techau.com.au Whales are some of the most amazing creatures on the planet and some species are endangered. Google are helping fight the good fight and lending some of their AI smarts to help scientists track and study the remaining whales. Back in the 1960s, scientists first discovered that humpback whales actually sing songs, which evolve over <a class="read-more-link" href="https://www.aiuniverse.xyz/google-ai-helping-scientists-track-endangered-whales-releases-whale-soundtracks/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/google-ai-helping-scientists-track-endangered-whales-releases-whale-soundtracks/">Google AI helping scientists track endangered whales, releases whale soundtracks</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source:- techau.com.au</p>
<p>Whales are some of the most amazing creatures on the planet, and some species are endangered. Google is fighting the good fight, lending some of its AI smarts to help scientists track and study the remaining whales.</p>
<p>Back in the 1960s, scientists first discovered that humpback whales actually sing songs, which evolve over time. But there’s still so much we don’t understand. Why do humpbacks sing? What is the meaning of the patterns within their songs?</p>
<p>Scientists sift through an ocean of sound to find answers to these questions. But what if anyone could help make discoveries?</p>
<p>For the past year, Google AI has been partnering with NOAA’s Pacific Islands Fisheries Science Center to train an artificial intelligence model on its vast collection of underwater recordings. The project is helping scientists better understand whales’ behavioural and migratory patterns so they can better protect them.</p>
<p>The effort fits into Google’s AI for Social Good program, applying the latest in machine learning to the world’s biggest humanitarian and environmental challenges.</p>
<p>The post <a href="https://www.aiuniverse.xyz/google-ai-helping-scientists-track-endangered-whales-releases-whale-soundtracks/">Google AI helping scientists track endangered whales, releases whale soundtracks</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/google-ai-helping-scientists-track-endangered-whales-releases-whale-soundtracks/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
