<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>energy Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/energy/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/energy/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Tue, 13 Aug 2019 18:06:34 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Big data centres on regional areas</title>
		<link>https://www.aiuniverse.xyz/big-data-centres-on-regional-areas/</link>
					<comments>https://www.aiuniverse.xyz/big-data-centres-on-regional-areas/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 13 Aug 2019 18:06:30 +0000</pubDate>
				<category><![CDATA[Big Data]]></category>
		<category><![CDATA[cloud]]></category>
		<category><![CDATA[Digital Business]]></category>
		<category><![CDATA[Digital Transformation]]></category>
		<category><![CDATA[energy]]></category>
		<category><![CDATA[Infrastructure]]></category>
		<category><![CDATA[Schneider Electric]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4340</guid>

					<description><![CDATA[<p>Source: afr.com Australia’s data centres are concentrated in Sydney, Melbourne and Brisbane, but that is starting to change as storage and cloud providers and their customers see <a class="read-more-link" href="https://www.aiuniverse.xyz/big-data-centres-on-regional-areas/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/big-data-centres-on-regional-areas/">Big data centres on regional areas</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: afr.com</p>



<p>Australia’s data centres are concentrated in Sydney, Melbourne and Brisbane, but that is starting to change as storage and cloud providers and their customers see the advantages of situating them in regional Australia.<br>For example, Australian-owned cloud, data centre and connectivity provider iseek is set to open a data centre in Townsville shortly.</p>



<p>Founder and managing director Jason Gomersall says the decision to build in Townsville was prompted by a number of local organisations wanting to have a data centre nearby.</p>



<p>“In theory it doesn&#8217;t matter where it is but we&#8217;re finding government organisations in particular, and also corporates, are starting to care more and more about where their data is hosted and who it’s hosted with,” Gomersall says. “It’s just the security of knowing where it is and who’s got it.”<br>Regional data centres can also boost the resilience of Australia’s data by widening the geographic area over which it is spread, he says.</p>



<p>“The internet as such was designed to distribute information globally and does that very efficiently and effectively. Then ironically we go and then concentrate all our data in one geographic location. It seems a little bit counterintuitive to me.</p>



<p>“From a policy perspective, I think the government should be looking at how they sort of spread the data centre load around the nation.”</p>



<h4 class="wp-block-heading">Developing skills</h4>



<p>Gomersall says the cost of connecting regional data centres to capital cities has been an impediment in the past, but it is now being addressed with better infrastructure.</p>



<p>Regional data centres can also play a significant role in local job creation and keeping and developing IT skills in the regions.</p>



<p>In the short term the Townsville data centre will create a handful of jobs, but iseek is already starting to consider how it will expand that workforce. Data centres also create indirect jobs, such as those in the cloud and hosted services that run within them.</p>



<p>“If we take a five to 10-year view on this – and when you build a data centre you&#8217;re taking a 20-year-plus view – I see significant job creation,” Gomersall says.</p>



<p>“We&#8217;re going to create the skills and do the training to build those skills in those areas.”</p>



<p>Joe Craparotta, vice-president secure power at Schneider Electric, says improvements in Australia’s telecommunications infrastructure outside the major carriers have made regional data centres more feasible, while the rising cost of land in capital cities has made them more desirable.</p>



<p>Another advantage of housing them in cooler locations such as Toowoomba, where the $40 million Pulse Data Centre, supported by Schneider Electric, Telstra and the Queensland government, opened last year, is that they draw less power to keep cool.</p>



<p>“The running cost of the data centre can be a lot more efficient and a lot more optimised than a metro data centre,” Craparotta says. “There’s almost free cooling for a big chunk of the year.”</p>
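<p>As a rough illustration of that running-cost claim, the sketch below shows how power usage effectiveness (PUE, the ratio of total facility power to IT power) feeds into annual energy cost. All figures are hypothetical, not taken from the article.</p>

```python
# Illustrative only: how PUE (power usage effectiveness) translates into
# running cost. A site with "almost free cooling" has a lower PUE, so the
# same IT load costs less to run. All numbers here are hypothetical.

def annual_energy_cost(it_load_kw, pue, price_per_kwh):
    """Total facility energy cost per year for a given IT load and PUE."""
    facility_kw = it_load_kw * pue      # IT load plus cooling and overhead
    hours_per_year = 24 * 365
    return facility_kw * hours_per_year * price_per_kwh

metro = annual_energy_cost(it_load_kw=500, pue=1.8, price_per_kwh=0.25)
regional = annual_energy_cost(it_load_kw=500, pue=1.3, price_per_kwh=0.25)
print(f"Metro:    ${metro:,.0f}/yr")
print(f"Regional: ${regional:,.0f}/yr  (saving ${metro - regional:,.0f})")
```

<p>Under these assumed numbers, the cooler site saves more than half a million dollars a year on the same IT load, purely through the lower cooling overhead.</p>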



<h4 class="wp-block-heading">Close to power source</h4>



<p>Additionally, they often have access to reliable power, being close to the source of generation.</p>



<p>Craparotta says keeping IT skills in the regions will become more important as regional Australia becomes more digital and the use of the internet of things increases, creating further benefits for local industry.</p>



<p>“They have a facility locally they can rely on to help them digitise, either their farm – from paddock to plate – or the local university or school, or just the local businesses in general.”</p>



<p>While he is not suggesting that regional data centres will replace metro data centres, he does expect Australia will have a mix of both.</p>



<p>“I would expect it to be aligned with the population, so I would expect that there&#8217;s more regional data centres that are built over the next five years to accommodate the third of the population that is outside the metro areas,” Craparotta says.</p>



<p>“I’d predict in the next two to three years, we will see a much faster acceleration as the 35 per cent of Australia that lives outside the metro areas becomes more digital and more reliant on those services.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/big-data-centres-on-regional-areas/">Big data centres on regional areas</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/big-data-centres-on-regional-areas/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How to build disruptive strategic flywheels</title>
		<link>https://www.aiuniverse.xyz/how-to-build-disruptive-strategic-flywheels/</link>
					<comments>https://www.aiuniverse.xyz/how-to-build-disruptive-strategic-flywheels/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 25 Jun 2019 06:42:47 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[advanced technology]]></category>
		<category><![CDATA[energy]]></category>
		<category><![CDATA[flywheels]]></category>
		<category><![CDATA[Future]]></category>
		<category><![CDATA[mechanisms]]></category>
		<category><![CDATA[storage]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=3959</guid>

					<description><![CDATA[<p>Source:- strategy-business.com A large auto manufacturer asked a consulting firm to evaluate its competitive position in relation to ride-sharing startups building autonomous vehicles. Instead of viewing this as <a class="read-more-link" href="https://www.aiuniverse.xyz/how-to-build-disruptive-strategic-flywheels/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-to-build-disruptive-strategic-flywheels/">How to build disruptive strategic flywheels</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source:- strategy-business.com</p>
<p>A large auto manufacturer asked a consulting firm to evaluate its competitive position in relation to ride-sharing startups building autonomous vehicles. Instead of viewing this as a classic strategy project, with a business case, PowerPoint decks, and five-year projections, the firm created a “game” that the automaker could “play” against its competitors. An artificial intelligence (AI) system modeled the voluminous individual choices available to customers, companies, and other entities as digital twins (a digital twin is a computerized replica of a physical asset, process, consumer, actor, or other decision-making entity). The hundreds of thousands of simulations suggested many strategic bets, option-value bets, and “no-regret strategies,” or moves that made strategic and financial sense in a multitude of situations. The selection of those strategies, in turn, made the AI system smarter through learning mechanisms called reinforcement learning, which then further empowered humans to make better decisions. As time progressed, the company was able to choose precise market approaches, pricing, advertising, and customer strategies for multiple cities and communities.</p>
<p>Taken together, these actions created a flywheel, a concept borrowed from the power industry to describe a source of stabilization, energy storage, and momentum, and that was popularized in the strategy context by the author Jim Collins. Executives, instead of trusting instincts and prior assumptions, were able to harness the power of this strategic flywheel to verify hypotheses in simulation and in the real world. Doing so exponentially expanded the array of strategic choices and reduced the cost of experimentation. Rather than paralyzing decision makers with the abundance of options they created, the simulations produced clarifying insights. The result for this auto manufacturer has been a multibillion-dollar valuation of its new services, achieved in less than two years.</p>
<p>Games. AI. Continuous execution and adjustment. Thousands of scenarios to consider. This is not how strategy at blue-chip companies has been done in the past. But it is how business leaders are starting to do strategy now, and how we will need to do strategy in the future — that is, if we are to develop strategies that can both withstand and adapt to the increasing pace of change and disruption that is evident in all industries.</p>
<p>Strategy, the way companies create competitive advantage, has traditionally been a deterministic, linear, and rigid undertaking. The idea is that strategists develop a perfect vision of the future demands of the market, pick a direction or position, invest the full set of resources against it, and execute relentlessly. Strategic planning came into vogue in the late 1960s, and in its pure form was an overarching plan for growth, usually written up in a formal document and endorsed by the CEO. Two decades into the 21st century, this 20th-century tradition continues to be propagated by business schools, by internal planning groups, and by strategy consultants. Even the strategies that seem to work set generic goals and position statements, typically allocate investments on the basis of linear priorities or success metrics (such as return on investment), and create five-year pro forma plans, which are rarely rethought deeply.</p>
<p>But this approach is problematic. The world today is not so deterministic, and the future is highly uncertain. Market and consumer demands, competition, technology, suppliers, and regulations change continually, and the levels and speed of change are intensifying. As a result, the traditional strategic planning process — deterministic, annual, and linear — needs to evolve to become more probabilistic, continual, and multidimensional. In a word, it needs to become more resilient. Organizations can make that shift by adopting a more dynamic approach that leverages AI and advanced analytical techniques. They can then be more sensitive to external market changes, be more rigorous and analytical in evaluating choices and portfolio investments, and make decisions with speed and confidence. In the process, they can develop strategic and growth flywheels that continually reinforce and recalibrate their approach to markets, innovation, and competition.</p>
<h3>Clock speed and the flywheel effect</h3>
<p>Charles Fine, an MIT professor, introduced the idea of clock speed, the rate at which products or capabilities or business models evolve in different industries. Changes in consumer preferences, technological advances, and regulation are radically accelerating the clock speed of all industries to some degree, and hence the degree of disruption they feel (see the chart in this article).</p>
<p>Capabilities-driven strategy suggests that companies that have a clear way to play (WTP) that aligns with market demands, and that invest in a system of four to six differentiating capabilities that enable the company to excel at the WTP, are better positioned for success. But increasing clock speed changes the calculation. Today, the half-life of a competitive advantage may be fleeting. As industries are disrupted, players that have been successful within the context of one business cycle might need to rethink their differentiating capabilities, their investment portfolios, and possibly even their WTP more frequently and dynamically. Ford no longer just makes cars; it focuses instead on mobility solutions. Big oil companies are investing in renewable energy as a hedge against constraints on emissions. Amazon is competing with…everyone. As a result, it behooves organizations and managers to continually assess competitive moves, regulatory and technology evolution, and consumer preferences — and to adapt decisions in a dynamic fashion.</p>
<p>Using adaptation and experimentation as part of strategy was first suggested by Henry Mintzberg, a professor of management at McGill University. In Mintzberg’s words, companies should “let a thousand strategic flowers bloom&#8230;[using] an insightful style, to detect the patterns of success in these gardens of strategic flowers, rather than a cerebral style that favors analytical techniques to develop strategies in a hothouse.” In his book <em>The Fifth Discipline</em>, Peter Senge wrote of the potential to use computer simulations as “growth laboratories.” In fact, many of the most successful companies, including Amazon, Netflix, and Google, experiment scientifically (with test and control groups) and use their growth laboratories to learn from literally hundreds of thousands of experiments in a day. Today, through the use of advanced analytics and AI, the thousand flowers have scaled up exponentially.</p>
<p>Successful disruptors are able to exploit market trends by creating reinforcing feedback loops that give them an advantage over time. Consider the power of data network effects. The more data one has, the more one can personalize the customer experience; the more one personalizes the experience, the more customers are attracted; the more customers one has, the more data one gets. This effect leads consumers to flock disproportionately to a few leaders, thereby creating monopolies or oligopolies.</p>
<p>Let’s look at two remarkably simple examples of companies that have thrived in this age of higher clock speed. Jeff Bezos’s original “napkin” diagram (see “Constructing flywheels”), drawn well before Amazon became a leader in online retailing, describes a virtuous circle of broader product selection, better customer experience, more sellers, more traffic, lower cost structure, and lower prices, all reinforcing one another. A diagram describing Uber’s strategy shows a similar dynamic at work. Faster pickups generate more demand, which attracts more drivers, leading to better geographic coverage, less driver downtime, and lower prices. The components of the flywheels include positions or features that encourage reinforcement through causal effects, thereby increasing exponential and nonlinear adoption. One can assemble such flywheel approaches by thinking carefully about the most important features that are going to drive demand, and the causal linkages between them.</p>
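<p>The reinforcing dynamic behind these flywheels can be sketched as a toy simulation. Every coefficient below is hypothetical, chosen only to show how a causal loop (customers generate data, data improves the product, a better product attracts customers) compounds over time.</p>

```python
# Toy model of a reinforcing flywheel: customers generate data, data improves
# personalization, better personalization attracts more customers. All
# coefficients are hypothetical, chosen only to illustrate the compounding.

def simulate_flywheel(customers=1000.0, steps=10,
                      data_per_customer=1.0, attraction=0.05):
    history = [customers]
    data = 0.0
    for _ in range(steps):
        data += customers * data_per_customer  # more customers -> more data
        quality = data ** 0.5                  # diminishing returns on data
        customers += attraction * quality      # better product -> growth
        history.append(customers)
    return history

growth = simulate_flywheel()
# The per-step gain rises each period as the loop feeds back on itself.
print([round(c) for c in growth])
```

<p>In this toy model the customer base not only grows but grows faster each period: the hallmark of a causal feedback loop rather than a linear plan.</p>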
<h3>Resilient companies</h3>
<p>A well-defined corporate identity, which includes a chosen way to play, associated capabilities, and a portfolio of operations, is helpful in creating boundaries and guidelines for focus. But to thrive through new and disruptive business cycles, companies must continually evolve their capabilities system to match — or even shape — the demands of the market. The secret to more sustained success involves three vital steps. First is adapting by continually sensing market variables, and experimenting with new ideas or bets using a clear mental model for the “spread” (or range of scenarios and possible outcomes) of the decisions being made. Second is developing reinforcing causal feedback loops that provide disproportionate advantage as disruptive market trends take off, and testing, killing, or modifying ideas against this framework constantly. And third is building focus on a WTP and scaling an associated capabilities system, as stated in the capabilities-driven strategy, informed by the needs of the dynamic feedback loop to scale and mature the business model.</p>
<p>Netflix, which our colleagues discussed in their 2017 <em>s</em>+<em>b</em> article on digital disruption, started out competing with Blockbuster Video on the basis of customer convenience and fees. It sent DVDs out by mail and didn’t require return by a specified date, and replaced individual rental fees with a monthly subscription model. In 2007, Netflix disrupted its own business (and killed the rest of Blockbuster’s) by introducing streaming video on demand, which pushed it into competition with cable television. As it amassed customers, Netflix continually refined its analytic capabilities, crunching data to offer more fine-grained recommendations, which consumers could explore more quickly with the flexibility of streaming. This analytic capability fed naturally into the creation of appealing original content, which began with <em>House of Cards</em> in 2013 and expanded to more than 350 original series released in 2017. <em>Bird Box</em>, a Netflix-produced horror film starring Sandra Bullock, was viewed on 45 million accounts in the first seven days after its release in December 2018.</p>
<p>Netflix’s culture and strategy allowed resilience and adaptation all along the way. It gave employees the freedom to explore ideas and learn, was willing to pay at the top of the market for talent, and openly eschewed conventional processes in order to provide room for agility. In effect, Netflix has built three virtuous circles that function as flywheels. There’s a personalization circle: better personalization with AI leading to more customers, more viewing, more data, and, in turn, better personalization. There’s a decision frequency circle: The subscription model leads to more decisions per time unit, which leads to more data and better personalization. And there’s a content creation circle: With more customers, Netflix has more viewings, and hence a better understanding of individual customer preferences, and becomes a more attractive partner for content creators.</p>
<p>Netflix also maintained its identity all along and focused on the capabilities system that enabled the business model to focus and scale within each business cycle. The unlimited subscription model, at the core of the company’s first disruption, was intended to position Netflix to offer streaming when the technology caught up. Once it did, Netflix focused on building world-class capabilities in subscription model administration, online streaming, customer insights, and tailored content production.</p>
<p>Amazon is another great example of a company that has built a resilient strategy by dynamically shaping the market through high-velocity decision making, and creating a flywheel business model. Bezos started Amazon as an online retailer for books, outcompeting traditional brick-and-mortar retailers such as Barnes &amp; Noble; the company has since evolved into an online retailer offering anything that can be sold online and shipped.</p>
<p>The focus on technology and logistics helped keep costs low, and enabled Amazon to vastly increase product variety. That, in turn, began to make it the go-to online shopping portal. But Amazon tapped into deeper capabilities to create a flywheel effect with consumers. It mined data to understand consumer preferences, and shaped buying behavior and convenience by introducing such features as one-click ordering and free shipping through Amazon Prime. Over time, the company also expanded into businesses that had nothing to do with retailing: online content streaming; Amazon Web Services, which provides cloud computing services; and new physical products such as Kindle (e-reader), Fire (digital media player), and Echo (smart speaker assistant). Echo’s Alexa has become the technology backbone for interoperability for countless devices exploiting the Internet of Things. This has created another causal loop, as making more devices interoperable on Alexa led to more integration and convenience, leading to higher customer sales, driving more suppliers to integrate with Alexa. Through these many evolutions, Amazon has retained its founding identity of being a customer-driven and technology-led retailer. Bezos wrote in a famous letter to shareholders, “Staying in Day 1 requires you to experiment patiently, accept failures, plant seeds, protect saplings, and double down when you see customer delight.” Amazon’s capabilities system in supply chain and logistics; customer insights and preferences; and online, retail, and technology platform innovation have all been unparalleled.</p>
<p>Peloton, a fitness startup whose US$2,000 stationary bicycles and high-energy classes have gained a cult following since its founding in 2012, started out as a software player, but has since focused directly on every aspect of the value chain, including software, hardware, studio instructors, logistics, and even retail. As CEO John Foley tells the story of its evolution, it is clear that Peloton continued to adapt to market realities and opportunities against conventional thinking, creating a vertically integrated business that controlled all aspects of the customer’s experience. The dynamic feedback loops are evident in the fact that Peloton’s Net Promoter Score (its chief customer satisfaction metric) rose as every new aspect of the experience was controlled. Customer delight with the product and service led to highly efficient word-of-mouth marketing, which led to more customers and greater scale, which improved the company’s financial capacity to build an immersive experience. As it has grown and evolved, Peloton has built out its capabilities systems for software, logistics, and instruction.</p>
<h3>Advanced strategy tools</h3>
<p>As they work to build growth flywheels, companies can utilize the power of strategy flywheels. In fact, the forces that have led to disruption and accelerated clock speed are also providing businesses with the tools to develop resilient strategy. Automation, analytics, and AI have made huge advances in recent years. In business, we increasingly see machines perform manual or cognitive tasks; organize, analyze, synthesize, and act on large amounts of data; and make operational and management decisions, or at least recommend them. But what about strategy, which is generally regarded as a uniquely human endeavor? To be sure, AI alone can’t develop our strategy for us. But it can change the way we do strategy, and help organizations reimagine the future and develop their own flywheels. In fact, it is already starting to do so.</p>
<p>It’s helpful to consider the process of strategy formulation, planning, and execution as a game. Games, in general, have a fixed set of rules, are played among a small and known number of players, have well-defined and agreed-upon outcomes, and have known uncertainty (that is, the uncertainty stems from the choices available within the constraints of the game and is not environmentally introduced). In contrast, strategy, or more broadly how businesses operate, is subject to changing rules, is often played against a large and unknown number of players (e.g., disruptors from brand-new industries), has an outcome that is not clear or agreed upon (e.g., maximizing profits or being a socially responsible organization), and is subject to both known and unknown uncertainty (e.g., the emergence of new technology). But the building blocks for games and strategy are actually quite similar, as both involve setting policies, grappling with dynamic models amid the backdrop of environmental assumptions, and coping with randomness.</p>
<p>Looking at strategy through the lens of gaming can lead to a new approach to building dynamic strategic plans that can embed foresight and resilience. As shown in “Sense, think, act” below, the dynamic and resilient flywheel strategy has three components.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-to-build-disruptive-strategic-flywheels/">How to build disruptive strategic flywheels</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-to-build-disruptive-strategic-flywheels/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Researchers show glare of energy consumption in the name of deep learning</title>
		<link>https://www.aiuniverse.xyz/researchers-show-glare-of-energy-consumption-in-the-name-of-deep-learning/</link>
					<comments>https://www.aiuniverse.xyz/researchers-show-glare-of-energy-consumption-in-the-name-of-deep-learning/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 10 Jun 2019 10:21:40 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[consumption]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[energy]]></category>
		<category><![CDATA[glare]]></category>
		<category><![CDATA[name]]></category>
		<category><![CDATA[researchers]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=3680</guid>

					<description><![CDATA[<p>Source:- techxplore.com Wait, what? Creating an AI can be way worse for the planet than a car? Think carbon footprint. That is what a group at the University <a class="read-more-link" href="https://www.aiuniverse.xyz/researchers-show-glare-of-energy-consumption-in-the-name-of-deep-learning/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/researchers-show-glare-of-energy-consumption-in-the-name-of-deep-learning/">Researchers show glare of energy consumption in the name of deep learning</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source:- techxplore.com</p>
<p>Wait, what? Creating an AI can be way worse for the planet than a car? Think carbon footprint. That is what a group at the University of Massachusetts Amherst did. They set out to assess the energy consumption that is needed to train four large neural networks.</p>
<p>Their paper is currently attracting attention among tech watching sites. It&#8217;s titled &#8220;Energy and Policy Considerations for Deep Learning in NLP,&#8221; by Emma Strubell, Ananya Ganesh and Andrew McCallum.</p>
<p>This, said Karen Hao, artificial intelligence reporter for <i>MIT Technology Review</i>, was a life cycle assessment for training several common large AI models.</p>
<p>&#8220;Recent progress in hardware and methodology for training neural networks has ushered in a new generation of large networks trained on abundant data,&#8221; said the researchers.</p>
<p>What is your guess? That training an AI model would result in a &#8220;heavy&#8221; footprint? &#8220;Somewhat heavy?&#8221; How about &#8220;terrible?&#8221; The latter was the word chosen by <i>MIT Technology Review</i> on Thursday, June 6, reporting on the findings.</p>
<p>Deep learning involves processing very large amounts of data. (The paper specifically examined the model training process for natural-language processing, the subfield of AI that focuses on teaching machines to handle human language, said Hao.) Donna Lu in <i>New Scientist</i> quoted Strubell, who said, &#8220;In order to learn something as complex as language, the models have to be large.&#8221; What price making models obtain gains in accuracy? Roping in exceptionally large computational resources to do so is the price, causing substantial energy consumption.</p>
<p>Hao reported their findings, that &#8220;the process can emit more than 626,000 pounds of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car (and that includes manufacture of the car itself).&#8221;</p>
<p>These models are costly to train and develop — costly in the financial sense, due to the cost of hardware and electricity or cloud compute time, and costly in the environmental sense, due to their carbon footprint. The paper sought to bring this issue to the attention of NLP researchers &#8220;by quantifying the approximate financial and environmental costs of training a variety of recently successful neural network models for NLP.&#8221;</p>
<p>How they tested: To measure environmental impact, they trained each of the four models for one day, sampling its energy consumption throughout. They then estimated the total energy required to train each model by multiplying the average power draw by the total training time reported by each model&#8217;s developers. A carbon footprint was estimated from the average carbon emissions of power production in the US.</p>
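<p>That estimation approach, average power draw times training time times a grid carbon-intensity factor, can be sketched in a few lines. The sample power draw and duration below are hypothetical; the intensity figure is a commonly cited US-average value, not the paper&#8217;s exact constant.</p>

```python
# Sketch of the estimation method described above: energy (kWh) is average
# power draw multiplied by training time, and the footprint is energy times
# a grid carbon-intensity factor. Sample values are hypothetical; the
# intensity is a commonly cited US-average (~0.954 lb CO2e per kWh).

US_LB_CO2E_PER_KWH = 0.954

def training_footprint_lbs(avg_power_watts, training_hours,
                           intensity=US_LB_CO2E_PER_KWH):
    """Estimated lbs of CO2-equivalent emitted by one training run."""
    energy_kwh = avg_power_watts / 1000.0 * training_hours
    return energy_kwh * intensity

# e.g. a hypothetical model drawing 1.4 kW on average for 21 days:
print(round(training_footprint_lbs(1400, 21 * 24), 1))
```

<p>Scaling this up is what drives the headline numbers: large models are trained for far longer, on many accelerators at once, and often retrained many times during hyperparameter search.</p>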
<p>What did the authors recommend? Steps to reduce costs and &#8220;improve equity&#8221; in NLP research. Equity? The authors raise the issue.</p>
<p>&#8220;Academic researchers need equitable access to computation resources. Recent advances in available compute come at a high price not attainable to all who desire access. Most of the models studied in this paper were developed outside academia; recent improvements in state-of-the-art accuracy are possible thanks to industry access to large-scale compute.&#8221;</p>
<p>The authors pointed out that &#8220;Limiting this style of research to industry labs hurts the NLP research community in many ways.&#8221; Creativity is stifled. Good ideas are not enough if the research team lacks access to large-scale compute.</p>
<p>&#8220;Second, it prohibits certain types of research on the basis of access to financial resources. This even more deeply promotes the already problematic &#8216;rich get richer&#8217; cycle of research funding, where groups that are already successful and thus well-funded tend to receive more funding due to their existing accomplishments.&#8221;</p>
<p>The authors said, &#8220;Researchers should prioritize computationally efficient hardware and algorithms.&#8221; In this vein, they recommended a joint effort by industry and academia to promote research into more computationally efficient algorithms, and into hardware that requires less energy.</p>
<p>The post <a href="https://www.aiuniverse.xyz/researchers-show-glare-of-energy-consumption-in-the-name-of-deep-learning/">Researchers show glare of energy consumption in the name of deep learning</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/researchers-show-glare-of-energy-consumption-in-the-name-of-deep-learning/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>A new frontier for artificial intelligence: energy development in rural America</title>
		<link>https://www.aiuniverse.xyz/a-new-frontier-for-artificial-intelligence-energy-development-in-rural-america/</link>
					<comments>https://www.aiuniverse.xyz/a-new-frontier-for-artificial-intelligence-energy-development-in-rural-america/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 14 Sep 2018 04:54:11 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[america]]></category>
		<category><![CDATA[energy]]></category>
		<category><![CDATA[energy development]]></category>
		<category><![CDATA[Impact]]></category>
		<category><![CDATA[rural]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=2864</guid>

					<description><![CDATA[<p>Source-edf.org This spring, regulators in Wyoming were scrambling to figure out how to process 10,000 applications for oil and gas permits filed since oil prices began to <a class="read-more-link" href="https://www.aiuniverse.xyz/a-new-frontier-for-artificial-intelligence-energy-development-in-rural-america/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/a-new-frontier-for-artificial-intelligence-energy-development-in-rural-america/">A new frontier for artificial intelligence: energy development in rural America</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source-edf.org</p>
<p>This spring, regulators in Wyoming were scrambling to figure out how to process 10,000 applications for oil and gas permits filed since oil prices began to rise in early 2016. At the same time, new permits were streaming in at an unprecedented rate, raising concerns that some proposed energy projects would not get the scrutiny – or public input – they warranted.</p>
<p>It was exactly the kind of problem people in the artificial intelligence community love to solve; it just hadn’t been considered for environmental permitting before.</p>
<p>Besides making life easier for regulators, machine learning could help watchdog groups and local residents better understand what risks new energy projects pose to the environment and public health.</p>
<h4>A new AI application</h4>
<p>I decided to run the idea by some acquaintances in the Bay Area tech community, one of whom had recently used Natural Language Processing to sort through thousands of public comments on the controversial XL Pipeline project.</p>
<p>What if we could use trained algorithms to process oil and gas filings as well, I asked them, extracting information and themes that might otherwise fall through the cracks?</p>
<p>It could turbo-charge our analysis of applications filed under the National Environmental Policy Act and help us catch impacts before they happen. Other kinds of development proposals could benefit, too, were the application to catch on.</p>
<p>Two people – a programmer and graduate student at Stanford University, and the XL Pipeline researcher, also at Stanford – were immediately excited about the idea and agreed to help me get started.</p>
<h4>Scrub government websites? The project takes shape</h4>
<p>Environmental Defense Fund laid the groundwork for the AI project earlier this year, planning to use a trained computer program to assess future impacts on greater sage-grouse habitat in the western United States.</p>
<p>It quickly became clear that using artificial intelligence to analyze energy development projects could bring a new level of transparency to today’s outdated and overwhelmed permitting process. By combining software and algorithms trained to “scrub” data from government websites, critical and potentially overlooked filings can be located, downloaded and processed in a matter of hours.</p>
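The &#8220;locate, download and process&#8221; step described above could be sketched as a simple keyword screen over a downloaded filing's text. Everything here is hypothetical: the impact terms, the function name, and the sample text are illustrative, not part of any actual EDF tooling.

```python
import re

# Illustrative list of impact terms a reviewer might want flagged in a filing.
IMPACT_TERMS = ["groundwater", "sage-grouse", "habitat loss", "air quality"]

def flag_filing(text):
    """Return the impact terms mentioned in a filing's text, case-insensitively."""
    return [term for term in IMPACT_TERMS
            if re.search(re.escape(term), text, re.IGNORECASE)]

sample = "The proposed well pad may affect Groundwater recharge and sage-grouse leks."
print(flag_filing(sample))  # -> ['groundwater', 'sage-grouse']
```

A real pipeline would of course replace this with trained models rather than a fixed keyword list, but the shape is the same: pull filings down in bulk, scan each one, and surface the flagged impacts for human review.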
<p>It would also address another challenge: the lack of standard formats for permit applications, which makes such documents hard to identify, and it would flag potential problems with NEPA filings that might otherwise be missed.</p>
<p>Future impacts of projects – such as pollution of a groundwater basin or habitat loss for an imperiled animal – could be identified and compiled on a spreadsheet, registry or an online map for everyone to see.</p>
<p>For that, you also need resources.</p>
<h4>Grueling work, big gains</h4>
<p>Getting artificial intelligence treatment of NEPA filings off the ground is a question of manpower, plain and simple. That also explains why progress with our initial greater sage-grouse project has been slow.</p>
<p>It requires weeks of data entry to train algorithms, on top of high-level math and data processing skills – neither of which your typical watchdog group or local government office possesses. But if we stop and consider what we’d gain – a grasp of what’s really happening in America’s rapidly developing rural areas – we may see just how much such advances could be worth.</p>
<p>So I wasn’t surprised to get a call the other week from an entrepreneur on the East Coast who had heard about our project. He proposed a different approach to the problem, which would feed in reams of permit applications and then extract information we care about based on some simple criteria.</p>
<p>Exploring such methods of pulling from deep wells of data could sidestep the time-consuming work of training initial algorithms, saving an immense amount of time and resources. We think we&#8217;re onto something potentially big.</p>
<p>The post <a href="https://www.aiuniverse.xyz/a-new-frontier-for-artificial-intelligence-energy-development-in-rural-america/">A new frontier for artificial intelligence: energy development in rural America</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/a-new-frontier-for-artificial-intelligence-energy-development-in-rural-america/feed/</wfw:commentRss>
			<slash:comments>2</slash:comments>
		
		
			</item>
	</channel>
</rss>
