<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>software tools Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/software-tools/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/software-tools/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Tue, 17 Mar 2020 08:31:25 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.1</generator>
	<item>
		<title>Deep Learning Tool Spots Bugs In New Software Hours After Launch</title>
		<link>https://www.aiuniverse.xyz/deep-learning-tool-spots-bugs-in-new-software-hours-after-launch/</link>
					<comments>https://www.aiuniverse.xyz/deep-learning-tool-spots-bugs-in-new-software-hours-after-launch/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 17 Mar 2020 08:31:23 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[Microsoft]]></category>
		<category><![CDATA[researchers]]></category>
		<category><![CDATA[software tools]]></category>
		<category><![CDATA[Tools]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=7490</guid>

					<description><![CDATA[<p>Source: rtinsights.com Almost every update by an operating system provider inevitably comes with new bugs. The recent Windows 10 update, published by Microsoft earlier this week, significantly slowed PC boot time. While most software developers test updates well in advance, publishing them to millions of different machines can lead to significant differences in how the software performs. <a class="read-more-link" href="https://www.aiuniverse.xyz/deep-learning-tool-spots-bugs-in-new-software-hours-after-launch/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/deep-learning-tool-spots-bugs-in-new-software-hours-after-launch/">Deep Learning Tool Spots Bugs In New Software Hours After Launch</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: rtinsights.com</p>



<p>Almost every update by an operating system provider inevitably comes with new bugs. The recent Windows 10 update, published by Microsoft earlier this week, significantly slowed PC boot time.</p>



<p>While most software developers test updates well in advance, publishing them to millions of different machines can lead to significant differences in how the software performs.</p>



<p>To help ascertain more quickly whether an update has issues, researchers at Texas A&amp;M University, in collaboration with Intel Labs, have developed a deep learning model that finds performance bugs in a matter of hours instead of days.</p>



<p>“Updating software can sometimes turn on you when errors creep in and cause slowdowns. This problem is even more exaggerated for companies that use large-scale software systems that are continuously evolving,” said Abdullah Muzahid, assistant professor of computer science and engineering at Texas A&amp;M. “We have designed a convenient tool for diagnosing performance regressions that is compatible with a whole range of software and programming languages, expanding its usefulness tremendously.”</p>



<p>Due to the wide range of performance counters, it is difficult for even multiple staff members to notice issues on a global scale. That is where deep learning has an advantage: it can sift through millions of counters, spot patterns, and inform developers of any issues.</p>
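<p>The researchers&#8217; actual detector is a deep learning model; as a much simpler stand-in that illustrates the counter-sifting idea, here is a hypothetical z-score check over pre-update baselines (all counter names and numbers below are made up):</p>

```python
# Hypothetical sketch of counter-based regression detection.
# The actual research uses a deep learning model; this z-score
# baseline only illustrates flagging counters that shift after
# an update relative to their pre-update behavior.

def flag_regressions(baseline, current, threshold=3.0):
    """Flag counters whose post-update value deviates from the
    pre-update samples by more than `threshold` standard deviations."""
    flagged = []
    for name, samples in baseline.items():
        mean = sum(samples) / len(samples)
        var = sum((s - mean) ** 2 for s in samples) / len(samples)
        std = var ** 0.5 or 1e-9  # avoid division by zero
        if abs(current[name] - mean) / std > threshold:
            flagged.append(name)
    return flagged

baseline = {
    "boot_ms": [1200, 1250, 1190, 1230],            # pre-update boot times
    "cache_miss": [0.020, 0.021, 0.019, 0.020],     # pre-update miss rates
}
current = {"boot_ms": 2500, "cache_miss": 0.020}    # post-update sample
print(flag_regressions(baseline, current))          # → ['boot_ms']
```

<p>A real system would watch millions of such counters continuously rather than a handful of snapshots.</p>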



<p>With this, developers may be able to patch an update before users notice the issue. Many organizations already pay users to report zero-day bugs; this new tool could save them a lot of time and money.</p>
<p>The post <a href="https://www.aiuniverse.xyz/deep-learning-tool-spots-bugs-in-new-software-hours-after-launch/">Deep Learning Tool Spots Bugs In New Software Hours After Launch</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/deep-learning-tool-spots-bugs-in-new-software-hours-after-launch/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Leveraging Big Data for GRC Success</title>
		<link>https://www.aiuniverse.xyz/leveraging-big-data-for-grc-success/</link>
					<comments>https://www.aiuniverse.xyz/leveraging-big-data-for-grc-success/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 30 Dec 2019 10:56:03 +0000</pubDate>
				<category><![CDATA[Big Data]]></category>
		<category><![CDATA[Big data]]></category>
		<category><![CDATA[big data tools]]></category>
		<category><![CDATA[Development]]></category>
		<category><![CDATA[software tools]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5881</guid>

					<description><![CDATA[<p>Source: businessamlive.com The idea of data generating business value is not new. However, the effective use of data is becoming the foundation of competition. Business has always wanted to develop insights from information in order to make better, smarter, real-time, fact-based decisions. It is this demand for profundity of knowledge that has powered the growth of <a class="read-more-link" href="https://www.aiuniverse.xyz/leveraging-big-data-for-grc-success/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/leveraging-big-data-for-grc-success/">Leveraging Big Data for GRC Success</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: businessamlive.com</p>



<p>The idea of data generating business value is not new. However, the effective use of data is becoming the foundation of competition. Business has always wanted to develop insights from information in order to make better, smarter, real-time, fact-based decisions. It is this demand for depth of knowledge that has powered the growth of big data tools and platforms.</p>



<p>But just what is big data? According to Wikipedia, &#8220;big data is a field that treats ways to analyze, systematically extract information from, or otherwise deal with data sets that are too large or complex to be dealt with by traditional data-processing application software&#8221;.</p>



<p>Big data was originally associated with four key concepts: volume [the quantity of generated and stored data], variety [the type and nature of the data], velocity [the speed at which the data is generated and processed to meet the demands and challenges that lie in the path of growth and development] and veracity [the data quality and the data value].</p>



<p>Big data usually includes data sets with sizes beyond the ability of commonly used software tools to capture, curate, manage, and process within a tolerable elapsed time. It comprises unstructured, semi-structured, and structured data, though the main focus is usually on unstructured data.</p>



<p>Evolving technology has brought data analysis out of IT backrooms, and extended the potential of using data-driven results into every facet of an organization. However, while advances in software and hardware have facilitated the age of big data, technology is not the only consideration.</p>



<p>The diagram below by Guy Pearce explains it all: big data and other external data are fed into data analytics to generate reports for decision making. It’s that simple.</p>



<p>Big data is today transforming the world of GRC. A robust GRC culture represents how organizations govern, allocate resources, and set internal control practices to regulate their actions. Big data potentially transforms all of those areas. That&#8217;s why cultures built on big data and advanced analytics are increasingly synonymous with high-performance organizations.</p>



<p>The value of analytics is clear: finding insights in enormous amounts of previously untapped data. This helps management base their decisions and strategies on facts rather than fears.</p>



<p>With increasing complexities, businesses need dynamic solutions, new investments, and functional ideas. This can be supported by innovative technology solutions that integrate and automate various processes and controls. Such tools analyze the risk landscape and help management monitor risks continuously.</p>



<p>The effectiveness of GRC hinges on data. Being able to gather, analyze, and communicate information to the right stakeholders in the right format is critical. Hence, forward-thinking organizations are creating such an infrastructure within the organization.</p>



<p>Moreover, the vast and growing volume of unstructured and structured data today provides limitless opportunities to improve risk intelligence, support compliance, and augment customer relationships. The key elements of a digital GRC are shown in the diagram below.</p>



<p><strong>The many benefits of big data in GRC include:</strong></p>



<p>• Faster and more cost-effective transaction and fraud analysis;</p>



<p>• Improved continuous monitoring capabilities;</p>



<p>• Visual dashboards that compile data in new, more powerful ways;</p>



<p>• Integration of risk management, compliance, audit, and control management with business performance;</p>



<p>• Forward-looking (predictive) risk identification and assessment;</p>



<p>• Google-like search capability across historical data; and</p>



<p>• Comprehensive relationship analysis for third party vendor management.</p>



<p>More and more organizations are investing resources to ramp up their efforts to use big data and analytics to drive growth. Yet many companies feel they haven&#8217;t realized the full potential of their analytic capabilities. They feel exasperated that they aren&#8217;t doing more, faster.</p>



<p>To be sure, an organization can use a GRC platform to leverage big data as a one-stop solution for its data. That way, it is easier to understand the risks, the standards and internal controls governing them, and to build a better understanding of the corporate risk profile.</p>



<p>Being able to pull data from several sources, and then fuse that data into actionable intelligence via graphical dashboards and reports, is the key to driving operational efficiency and success. Innovative technology systems can deliver dynamic data visualizations that showcase trends and patterns in real time, helping executives make faster decisions.</p>



<p>However, the focus should not just remain on what has happened in the past and what is happening in the present, but also on what it means for the future; the blending of historical insight and predictive foresight paves the way for a risk-averse organization.</p>



<p>Big data and analytics are here to stay, and only those companies that understand the immeasurable potential of these tools, and effectively tap into various big data sources such as social media, location, multimedia, text, documents, surveillance, medical records, videos, e-commerce, emails, voice, audio transcripts, stock trades, transaction logs, geospatial data, and weblogs will be best positioned to enhance their GRC initiatives.</p>



<p>Leveraging the right systems, engaging the right teams, and taking a forward-looking approach to big data can help fast-track an organization’s journey to arrive at the ideal data-driven GRC culture, and push the envelope to achieve long-term success. So what’s old, is new again!</p>
<p>The post <a href="https://www.aiuniverse.xyz/leveraging-big-data-for-grc-success/">Leveraging Big Data for GRC Success</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/leveraging-big-data-for-grc-success/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>A MYTHIC APPROACH TO DEEP LEARNING INFERENCE</title>
		<link>https://www.aiuniverse.xyz/a-mythic-approach-to-deep-learning-inference/</link>
					<comments>https://www.aiuniverse.xyz/a-mythic-approach-to-deep-learning-inference/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 24 Aug 2018 06:19:04 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[deep learning architecture]]></category>
		<category><![CDATA[Mythic]]></category>
		<category><![CDATA[software tools]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=2780</guid>

					<description><![CDATA[<p>Source &#8211; nextplatform.com Another Hot Chips conference has ended with yet another deep learning architecture to consider. This one is actually quite a bit different in that it relies on analog computation inside flash memory for inference. Mythic, which was founded in 2012 by Dave Fick (also CTO) and a colleague from the Michigan Integrated Circuits <a class="read-more-link" href="https://www.aiuniverse.xyz/a-mythic-approach-to-deep-learning-inference/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/a-mythic-approach-to-deep-learning-inference/">A MYTHIC APPROACH TO DEEP LEARNING INFERENCE</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; nextplatform.com</p>
<p>Another Hot Chips conference has ended with yet another deep learning architecture to consider. This one is actually quite a bit different in that it relies on analog computation inside flash memory for inference.</p>
<p>Mythic, which was founded in 2012 by Dave Fick (also CTO) and a colleague from the Michigan Integrated Circuits lab, has managed to raise $55 million from a number of investors, including SoftBank, for its ultra-low-power approach to inference at the edge. What is unique here is not just the hardware approach but also that the inference chip puts server-class silicon to the test in terms of capability and certainly power consumption.</p>
<p>This week at Hot Chips we already talked about some different ways of thinking about MAC engines (including via Arm’s new machine learning processor) but Mythic takes the same ideas about trimming down MAC operation overhead and turns it on its head. Instead of reusing the weights (the highest overhead due to accesses versus computation) or reducing the weights by precision or compression, Mythic skips much of the MAC hit entirely. For most of you, that probably requires a denser explanation.</p>
<p>Neural networks are largely MAC (multiply-accumulate) based. There can be millions of these operations in a typical neural network and while they are not complicated at all, the real question is how many picojoules one needs to execute one of these operations. Getting the data to this unit is the hard part. The weight data for the neurons (or trained parameters) and the intermediate data (the input and output vectors) change each cycle and have different properties.</p>
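<p>For concreteness, a neuron&#8217;s core computation is just a dot product of inputs with weights; each step of the loop below is one multiply-accumulate (a minimal illustrative sketch, not Mythic&#8217;s implementation):</p>

```python
def mac_dot(weights, inputs):
    """A neuron's core computation: multiply each input by its
    weight and accumulate the products (one MAC per weight)."""
    acc = 0.0
    for w, x in zip(weights, inputs):
        acc += w * x  # one multiply-accumulate operation
    return acc

# Three weights, three inputs -> three MAC operations.
print(mac_dot([0.5, -1.0, 2.0], [1.0, 2.0, 3.0]))  # → 4.5
```

<p>The arithmetic itself is trivial; as the article notes, the cost is dominated by fetching the weights each cycle, not by the multiply-adds.</p>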
<p>The size of the weight matrix is thus far larger than other accesses, such as those for intermediate data. And while it has been tried, storing all the data in DRAM to make this more efficient isn&#8217;t the answer: yes, it is possible to fit large models, but that comes with a cost for reading weights and provides limited bandwidth to the weight data. There are other strategies to get around these inherent inefficiencies, such as reusing weights, but that doesn&#8217;t work well with small batch sizes, and compression or reduced weight precision comes with capability limitations.</p>
<p>That was a long way of explaining how Mythic came to non-volatile memory and removing the weight of the weights almost entirely (again, for edge inference).</p>
<p>Take a look below at these common neural net accelerator design points for enterprise and edge. The latter must be small and low power. On the DRAM versus non-DRAM side is the question of fitting the entire application on chip or not; it&#8217;s no big deal for server-side work, but for edge it&#8217;s not possible to add that and make use of sparsity and compression. This is a baseline.</p>
<p><img fetchpriority="high" decoding="async" class="aligncenter size-full wp-image-38215" src="http://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_Chart1.png" sizes="(max-width: 806px) 100vw, 806px" srcset="https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_Chart1.png 806w, https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_Chart1-768x376.png 768w" alt="" width="806" height="395" /></p>
<p>So Mythic introduces a new type of non-volatile memory that fits an entire application on a chip with a smaller power budget, on par with enterprise chips and with quite good reported performance.</p>
<p><img decoding="async" class="aligncenter size-full wp-image-38216" src="http://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_Chart2.png" sizes="(max-width: 962px) 100vw, 962px" srcset="https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_Chart2.png 962w, https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_Chart2-768x311.png 768w" alt="" width="962" height="390" /></p>
<p>Interestingly, Mythic says it is “resetting Moore’s Law” which is a slick way of saying they’re doing this at 40nm for reasons that we are sure are more nuanced than allowing improvements at 28nm versus pushing the 5 and under envelope.</p>
<p>At a lower level, this idea of matrix multiplication memory without reading weights and only paying for MAC is done with analog circuits—flash transistors that can be modeled as variable resistors representing the weights. This ultimately allows for those “free” accesses and also eliminates the need for large batch sizes, sparsity or compression or “nerfed” DNN models.</p>
<p><img decoding="async" class="aligncenter size-full wp-image-38217" src="http://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_circuits.png" sizes="(max-width: 988px) 100vw, 988px" srcset="https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_circuits.png 988w, https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_circuits-768x361.png 768w" alt="" width="988" height="465" /></p>
<p>In the variable resistor array above, each element is a flash transistor that can be programmed and read back for accuracy. Instead of trying to read individual cells, Mythic applies a set of voltages to the input vector and gets a set of currents as outputs, which are then run through ADCs to turn those currents into values.</p>
<p>In other words, the flash transistors generate currents that sum on the bitline, and the resulting summed current represents the answer to the question being asked. Mythic uses a digital approximation technique for the input.</p>
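<p>The physics can be modeled digitally: each cell&#8217;s conductance G acts as a stored weight, each input is a voltage V, Ohm&#8217;s law gives a per-cell current I = G&#183;V, and Kirchhoff&#8217;s current law sums the currents on each bitline. A hypothetical sketch of that matrix-vector multiply (not Mythic&#8217;s actual circuit model):</p>

```python
def analog_matvec(conductances, voltages):
    """Model of an in-memory matrix-vector multiply: each flash cell
    stores a weight as a conductance G; driving an input voltage V
    makes the cell contribute I = G * V (Ohm's law), and all currents
    sharing a bitline sum (Kirchhoff's current law)."""
    n_bitlines = len(conductances[0])
    currents = [0.0] * n_bitlines
    for i, v in enumerate(voltages):       # drive each input row
        for j in range(n_bitlines):        # each bitline accumulates
            currents[j] += conductances[i][j] * v
    return currents

G = [[0.1, 0.2],   # weights stored as cell conductances
     [0.3, 0.4]]
print(analog_matvec(G, [1.0, 2.0]))  # bitline currents ≈ [0.7, 1.0]
```

<p>In the real array no weight is ever &#8220;read&#8221; digitally; the multiply and the accumulate both happen in the analog domain, which is where the energy savings come from.</p>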
<p><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-38218" src="http://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_Tiles.png" sizes="auto, (max-width: 898px) 100vw, 898px" srcset="https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_Tiles.png 898w, https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/Mythic_Tiles-768x370.png 768w" alt="" width="898" height="433" /></p>
<p>Above you can see how Mythic takes the above array (the pink middle box) and packs it into tiles with one memory array and other logic that supports configuration and intermediate data storage (the SRAM), along with a RISC-V control CPU, a router, and the SIMD unit for matrix multiplies. It is shown here in a camera for on-the-fly AI.</p>
<p>The initial system can have a 50 million weight capacity and currently is designed with four lanes of PCIe. Fick says they can also make variants with up to 250 million weight capacity, 16 lanes of PCIe or USB as well as the use of an enhanced control processor (ARM for instance).</p>
<p>The full accounting of energy consumption is below. This is for everything—the digital logic, I/O, PCIe, etc., the entire process. It is followed by their ResNet results on what we assume to be a Tegra GPU.</p>
<p><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-38219" src="http://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/MythicEnergy_stacked.png" sizes="auto, (max-width: 919px) 100vw, 919px" srcset="https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/MythicEnergy_stacked.png 919w, https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/MythicEnergy_stacked-768x334.png 768w" alt="" width="919" height="400" /></p>
<p><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-38220" src="http://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/MythicEnergy_stacked2.png" sizes="auto, (max-width: 980px) 100vw, 980px" srcset="https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/MythicEnergy_stacked2.png 980w, https://3s81si1s5ygj3mzby34dq6qf-wpengine.netdna-ssl.com/wp-content/uploads/2018/08/MythicEnergy_stacked2-768x363.png 768w" alt="" width="980" height="463" /></p>
<p>The first-generation release of the software tools and the profiler will be available late this year. PCIe development boards with 1 and 4 IPUs will arrive in mid-2019, with volume shipments of both chips and PCIe boards with 1, 4, and 16 IPUs expected.</p>
<p>The post <a href="https://www.aiuniverse.xyz/a-mythic-approach-to-deep-learning-inference/">A MYTHIC APPROACH TO DEEP LEARNING INFERENCE</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/a-mythic-approach-to-deep-learning-inference/feed/</wfw:commentRss>
			<slash:comments>32</slash:comments>
		
		
			</item>
		<item>
		<title>7 Things Lawyers Should Know About Artificial Intelligence</title>
		<link>https://www.aiuniverse.xyz/7-things-lawyers-should-know-about-artificial-intelligence/</link>
					<comments>https://www.aiuniverse.xyz/7-things-lawyers-should-know-about-artificial-intelligence/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 12 May 2018 05:34:03 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[AI technologies]]></category>
		<category><![CDATA[computer software]]></category>
		<category><![CDATA[Lawyers]]></category>
		<category><![CDATA[smart technology]]></category>
		<category><![CDATA[software tools]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=2361</guid>

					<description><![CDATA[<p>Source &#8211; abovethelaw.com If you’re thinking about implementing artificial intelligence (AI) into your legal organization, congratulations on being a forward thinker. Although … it’s actually not as forward thinking as it may seem. AI is no longer the nebulous, otherworldly techno-universe that you may have once envisioned. It’s already here, today, giving us directions, telling us <a class="read-more-link" href="https://www.aiuniverse.xyz/7-things-lawyers-should-know-about-artificial-intelligence/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/7-things-lawyers-should-know-about-artificial-intelligence/">7 Things Lawyers Should Know About Artificial Intelligence</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; abovethelaw.com</p>
<p>If you’re thinking about implementing artificial intelligence (AI) in your legal organization, congratulations on being a forward thinker. Although … it’s actually not as forward thinking as it may seem. AI is no longer the nebulous, otherworldly techno-universe that you may have once envisioned. It’s already here, today, giving us directions, telling us jokes, recommending music, answering our questions, generally making our jobs – and lives – easier. Here are seven things to keep in mind about AI as you prepare to implement it in your legal organization:</p>
<p><b>1. It’s all about the data.</b><br />
Artificial intelligence is built on data. Therefore, the effectiveness of an AI solution can only be as good as the accuracy of the data it is relying on. That’s why you can’t just decide to “do AI.” You first have to identify the problem you are trying to solve, then take a look at the data (for example, electronic billing data or case and matter management data from a matter management system) to see if it’s “clean” or if it needs some data hygiene.</p>
<p><b>2. AI is not just one technology.</b><br />
If you’re searching for the next big thing in AI, you’re not going to find it. AI is not a singular thing. There’s no “killer” AI app for the legal industry. Instead, there are AI applications in many areas of the legal industry, and each of those applications might use a different AI-related technology.</p>
<p><b>3. It’s not magic, it’s just software.</b><br />
While the term artificial intelligence has a mystic, futuristic aura about it, in reality it’s basically just computer software. Sure, it’s software created by really smart technology engineers and product developers who integrate complicated algorithms to perform complex calculations … but in the end, it’s just one of the tools legal professionals have at their disposal to help them work more efficiently.</p>
<p><b>4. AI can help you run a business.</b><br />
It’s the business side of being a lawyer where AI technologies are most helpful. Legal organizations have the same kinds of business processes as any business, such as billing, pricing, and marketing. Most of those processes involve numbers and data (prices, margins, budgets, expenses, etc.). And remember, it’s all about the data, so all of these business processes can be analyzed and managed with the help of AI technologies.</p>
<p><b>5. AI does not replace humans, it assists them.</b><br />
Attorneys are not going to become obsolete, replaced by robots. Sure, AI solutions can take in the data and make predictions or suggest likely outcomes, but those predictions are of varying degrees of certainty. And the conclusions may be based on inaccurate data. That’s where human lawyers come in. They evaluate the data, draw upon past experiences that may or may not be part of the data, and generate their own answers, predictions, and advice – all informed by (not determined by) artificial intelligence.</p>
<p><b>6. Adopting AI means embracing change.</b><br />
If you intend to implement AI technologies in your legal organization, you must be ready for change. Not only will your processes and workflows need to change to incorporate AI into the business, but you’ll also likely be working with a whole new set of people. Whether they are part of your firm or outside consultants, expect to collaborate with data analysts, process engineers, pricing specialists, and other data-driven professionals.</p>
<p><b>7. Clients will drive your need for AI.</b><br />
Speaking of collaboration, AI can also be a catalyst for collaboration between a law firm and its clients. For starters, clients’ needs will often drive the adoption of AI solutions. And in many cases, it’s the clients who have the data needed to build effective AI solutions.</p>
<p>The post <a href="https://www.aiuniverse.xyz/7-things-lawyers-should-know-about-artificial-intelligence/">7 Things Lawyers Should Know About Artificial Intelligence</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/7-things-lawyers-should-know-about-artificial-intelligence/feed/</wfw:commentRss>
			<slash:comments>5</slash:comments>
		
		
			</item>
		<item>
		<title>Google Takes Machine Learning Chip to the Cloud</title>
		<link>https://www.aiuniverse.xyz/google-takes-machine-learning-chip-to-the-cloud/</link>
					<comments>https://www.aiuniverse.xyz/google-takes-machine-learning-chip-to-the-cloud/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 15 Feb 2018 05:04:52 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Cloud Computing]]></category>
		<category><![CDATA[Google]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[software tools]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=2018</guid>

					<description><![CDATA[<p>Source &#8211; electronicdesign.com Almost two years ago, Google disclosed that it had built a slab of custom silicon called the tensor processing unit to improve its StreetView software’s reading of street signs, the accuracy of its search engine algorithm, and the machine learning methods that it uses in dozens of other internet services. But the company never <a class="read-more-link" href="https://www.aiuniverse.xyz/google-takes-machine-learning-chip-to-the-cloud/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/google-takes-machine-learning-chip-to-the-cloud/">Google Takes Machine Learning Chip to the Cloud</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; electronicdesign.com</p>
<p>Almost two years ago, Google disclosed that it had built a slab of custom silicon called the tensor processing unit to improve its StreetView software’s reading of street signs, the accuracy of its search engine algorithm, and the machine learning methods that it uses in dozens of other internet services.</p>
<p>But the company never planned to keep its custom accelerator on the backend indefinitely. The goal had always been to hand the keys of the tensor processing unit – more commonly called the TPU – to software engineers via the cloud. On Monday, Google finally started offering its TPU chips to other companies, putting the custom silicon in more hands and its performance in front of more eyes.</p>
<p>The tensor processing unit is equipped with four custom chips connected together to provide 180 trillion operations per second for machine learning workloads. With all that computing power, software engineers can train neural networks used in machine learning in hours rather than days, and without having to build a private computing cluster, according to Google.</p>
<p>“Traditionally, writing programs for custom ASICs and supercomputers has required deeply specialized expertise,” wrote John Barrus, Google Cloud’s product manager for Cloud TPUs, and Zak Stone, product manager for TensorFlow and Cloud TPUs inside the Google Brain team, in a blog post.</p>
<p>To lower the bar for programming, Google is offering a set of software tools based on TensorFlow, a software framework for machine learning developed by Google. The company said it would make models available for object detection, image classification and language translation as a reference for customers. Google said that it would charge $6.50 per TPU per hour for those participating in the beta.</p>
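<p>At that quoted beta rate, cost scales linearly with device-hours; a quick back-of-the-envelope calculator (the job sizes below are hypothetical, only the $6.50 rate comes from the article):</p>

```python
def tpu_cost(num_tpus, hours, rate_per_tpu_hour=6.50):
    """Estimate Cloud TPU beta cost: the quoted price is $6.50
    per TPU per hour, so cost scales linearly with device-hours."""
    return num_tpus * hours * rate_per_tpu_hour

# Hypothetical job: 4 TPUs running for 12 hours of training.
print(f"${tpu_cost(4, 12):.2f}")  # → $312.00
```

<p>This is the trade-off Google is pitching: paying by the hour instead of building and operating a private training cluster.</p>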
<p>Google’s goal is to bait software engineers into using its cloud computing platform over Amazon’s and Microsoft’s clouds. But in the process, Google is starting to compete with other chip suppliers, particularly Nvidia, which is spending billions of dollars on chips festooned with massive amounts of memory and faster interconnects to hold its lead over the market for machine learning.</p>
<p>Moving its custom chips into the cloud also puts Google on a collision course with semiconductor startups targeting machine learning, including Groq, founded by former Google engineers who worked on the first TPU. Other companies close to releasing chips, like Graphcore and Wave Computing, could be hurt by Google’s bid for vertical integration, especially if other internet giants follow its lead.</p>
<p>Google said that the first customers to use the custom hardware include Lyft, which is working on autonomous vehicles that can be deployed in its ride sharing network, and which raised $1 billion in a financing round last year led by Google’s parent Alphabet. Another early user is Two Sigma, an investment management firm, wrote Barrus and Stone in the blog post.</p>
<p>There are a few questions we do not have answers to yet. It is not clear how much of Google’s internal training and inferencing runs on its custom chips, which since last year support both phases of machine learning. And Google is still facing the question of whether it can convince cloud customers to stop using Nvidia’s graphics processors for training and Intel’s central processing units for inferencing.</p>
<p>Last year, Google pulled the curtain off the performance of its first generation chip, which was only optimized for the inferencing phase of machine learning. It claimed that the accelerator ran thirteen times faster than chips based on Nvidia’s Kepler architecture. We point that out because Kepler is now three full generations behind Nvidia’s Volta architecture, which is built around custom cores that thrive on the intensive number-crunching involved in machine learning.</p>
<div class="article-content ">
<p>Nvidia estimates that it spent $3 billion building Volta. Its latest line of chips based on the architecture can deliver 125 trillion floating point operations per second for training and running neural networks. The chips can be packaged into miniature supercomputers that can be slid into a standard server or taken to oil rigs or construction sites to train algorithms locally instead of in a public cloud.</p>
<p>Google is not going to stop buying the latest chips for its cloud. The company said it would bolster its cloud with Nvidia’s Tesla V100 chips, which are actually more flexible than Google’s chips because they support a wider range of software libraries, including TensorFlow. It will also offer customers chips based on Intel’s Skylake architecture, which has been beefed up for machine learning.</p>
<p>Later this year, Google plans to hand customers the keys to supercomputers called TPU pods, which link together 64 custom chips to provide up to 11.5 petaflops of performance. In December, Google said that a single pod could train ResNet-50, a program for image classification used as a benchmark for machine learning chips, in less than a half hour – much faster than previous methods.</p>
</div>
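<p>As a sanity check, the pod figure quoted above is consistent with the per-unit figure: 64 TPU units at 180 teraflops each works out to roughly 11.5 petaflops. A quick back-of-the-envelope script (the variable names are ours, not Google's):</p>

```python
# Back-of-the-envelope check of the TPU pod figures quoted above.
TFLOPS_PER_TPU = 180e12   # 180 trillion operations/s per 4-chip TPU unit
TPUS_PER_POD = 64         # units linked together in a TPU pod

pod_flops = TPUS_PER_POD * TFLOPS_PER_TPU
print(f"{pod_flops / 1e15:.2f} petaflops")  # 11.52, matching the ~11.5 quoted
```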
<p>The post <a href="https://www.aiuniverse.xyz/google-takes-machine-learning-chip-to-the-cloud/">Google Takes Machine Learning Chip to the Cloud</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/google-takes-machine-learning-chip-to-the-cloud/feed/</wfw:commentRss>
			<slash:comments>4</slash:comments>
		
		
			</item>
		<item>
		<title>How Robots, IoT And Artificial Intelligence Are Transforming The Police</title>
		<link>https://www.aiuniverse.xyz/how-robots-iot-and-artificial-intelligence-are-transforming-the-police/</link>
					<comments>https://www.aiuniverse.xyz/how-robots-iot-and-artificial-intelligence-are-transforming-the-police/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 20 Sep 2017 07:39:24 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Big data]]></category>
		<category><![CDATA[Digital tools]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[Robots]]></category>
		<category><![CDATA[software tools]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=1209</guid>

					<description><![CDATA[<p>Source &#8211; forbes.com It’s happened. Arrests have been made thanks to the evidence collected from connected digital devices such as the Amazon dot and a Fitbit. This is just the tip of the transformation that law enforcement will experience because of the Internet of Things (IoT), artificial intelligence and robots. There are certainly benefits to applying this new <a class="read-more-link" href="https://www.aiuniverse.xyz/how-robots-iot-and-artificial-intelligence-are-transforming-the-police/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-robots-iot-and-artificial-intelligence-are-transforming-the-police/">How Robots, IoT And Artificial Intelligence Are Transforming The Police</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211;<strong> forbes.com</strong></p>
<p>It’s happened. Arrests have been made thanks to the evidence collected from connected digital devices such as the Amazon Echo Dot and a Fitbit. This is just the tip of the transformation that law enforcement will experience because of the Internet of Things (IoT), artificial intelligence and robots. There are certainly benefits to applying this new technology to help fight crime, but it also raises some challenging questions regarding our right to privacy and security breaches.</p>
<p><strong>Internet of Things Used to Help Fight Crime</strong></p>
<p>Law enforcement agencies across the world are getting trained on what to look for at crime scenes and how to handle digital evidence. Gaming consoles, Echo devices and even Fitbits have provided valuable information to help solve crimes. Most people don’t comprehend the power of these connected devices to contradict alibis and catch lies. As our reliance on these digital devices for entertainment and convenience continues to grow—watches, phones, televisions, pacemakers and more—there will be a longer trail for detectives to analyze when trying to solve a crime.</p>
<p>It’s commonplace now for officers to have body cams on when on patrol. These cameras can provide another set of eyes to sort through an interaction after the fact, and studies suggest they can improve self-awareness to prevent unacceptable behavior from officers and those they interact with. Knowing these interactions will be recorded is a big deterrent for bad behavior.</p>
<p>Some squad cars are equipped with GPS projectiles that can be shot via remote control and hook onto the back of an alleged perpetrator’s vehicle. These allow officers to know where a suspect is located and therefore prevent high-speed and dangerous car pursuits. Smart sensors have been developed that can be fixed to the inside of an officer’s gun to track how the gun is being used, including whether it has been unholstered or discharged. This information could prove valuable in criminal trials.</p>
<p><strong>Artificial Intelligence Aids in Predictive Policing</strong></p>
<p>Several law enforcement agencies have dabbled in predictive policing, including one of my customers, the UK police force in the city of Durham, England. They used a system called Hart (Harm Assessment Risk Tool) that classifies individuals and ranks the probability that they will commit another offense in the future. The system was fed data gathered between 2008 and 2013 and assesses people based on severity of the current crime, criminal history, flight risk and more. Although Hart’s forecasts were accurate a high percentage of the time, other studies warn against using such algorithms and predictive software tools because they flag minority defendants as high risk at double the rate of white defendants. One such study from ProPublica shows the human bias that is injected into such formulas, because the flawed judgement of humans was used to create the programs in the first place.</p>
<p>Agencies across the world are moving toward more data-driven approaches to solving crimes. Machine learning is particularly skilled at identifying patterns and can be quite useful when trying to discern a modus operandi (M.O.) of an offender. Digital tools can speed up this work and find connections that might take humans much longer to uncover. In the future, these types of algorithms might prove useful to detect serial crimes committed by the same individual or group.</p>
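<p>To make the idea concrete, linking crimes by M.O. can be as simple as grouping incident records that share an attribute signature; a real system would use far richer features and statistical models, but a minimal illustrative sketch (all field names and records below are invented):</p>

```python
from collections import defaultdict

# Hypothetical incident records; in practice these would come from case files.
incidents = [
    {"id": 1, "method": "window entry",   "time": "night", "target": "jewelry"},
    {"id": 2, "method": "door lock pick", "time": "day",   "target": "electronics"},
    {"id": 3, "method": "window entry",   "time": "night", "target": "jewelry"},
    {"id": 4, "method": "window entry",   "time": "night", "target": "jewelry"},
]

def group_by_mo(records, keys=("method", "time", "target")):
    """Group incidents sharing the same attribute signature (a crude M.O.)."""
    groups = defaultdict(list)
    for r in records:
        signature = tuple(r[k] for k in keys)
        groups[signature].append(r["id"])
    # Only signatures seen more than once suggest a possible serial pattern.
    return {sig: ids for sig, ids in groups.items() if len(ids) > 1}

linked = group_by_mo(incidents)
print(linked)  # {('window entry', 'night', 'jewelry'): [1, 3, 4]}
```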
<p><strong>Robo Cops Make their Debut</strong></p>
<p>There’s a new officer in Dubai to help fight crime, but although he wears a police cap, he’s 100% robot. Dubai police plan to have robotic officers make up a quarter of the force by 2030. The robot can speak six languages and is designed to read facial expressions, and it has a computer touch screen where people can report a crime. It is deployed mainly to tourist spots and is equipped with a camera that sends live images back to police headquarters to identify wanted suspects. Although the robo cop can help deter crime and relieve some tasks from its human counterparts, humans are still expected to make arrests.</p>
<p>Other robots are deployed around the world to collect evidence, investigate and detonate bombs, and handle crowd control, among other tasks. That hasn’t stopped more than a thousand robotics experts, including Elon Musk and Stephen Hawking, from warning against arming machines without human control.</p>
<p>As with any adoption of artificial intelligence and the Internet of Things, there are questions to ask and answer and concerns to address. Law enforcement agencies across the world are grappling with these and trying to find the right balance to take advantage of the benefits of this technology to fight and solve crime while preserving privacy and security.</p>
<p>Bernard Marr is a best-selling author &amp; keynote speaker on business, technology and big data. His new book is Data Strategy.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-robots-iot-and-artificial-intelligence-are-transforming-the-police/">How Robots, IoT And Artificial Intelligence Are Transforming The Police</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-robots-iot-and-artificial-intelligence-are-transforming-the-police/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
			</item>
		<item>
		<title>Aiming to Divorce the Cloud, Qualcomm Buys Machine Learning Start-Up</title>
		<link>https://www.aiuniverse.xyz/aiming-to-divorce-the-cloud-qualcomm-buys-machine-learning-start-up/</link>
					<comments>https://www.aiuniverse.xyz/aiming-to-divorce-the-cloud-qualcomm-buys-machine-learning-start-up/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 21 Aug 2017 08:54:39 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[cloud]]></category>
		<category><![CDATA[data centers]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Machine Learning Start-Up]]></category>
		<category><![CDATA[smartphone chips]]></category>
		<category><![CDATA[software tools]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=690</guid>

					<description><![CDATA[<p>Source &#8211; electronicdesign.com Last year, Qualcomm stopped short of including an accelerator core called the neural processing unit its Snapdragon silicon. Instead, it announced that it would publish software to bend its existing smartphone chips to the whims of machine learning. The strategy shift shows Qualcomm’s sense of urgency around machine learning, which has typically required <a class="read-more-link" href="https://www.aiuniverse.xyz/aiming-to-divorce-the-cloud-qualcomm-buys-machine-learning-start-up/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/aiming-to-divorce-the-cloud-qualcomm-buys-machine-learning-start-up/">Aiming to Divorce the Cloud, Qualcomm Buys Machine Learning Start-Up</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211;<strong> electronicdesign.com</strong></p>
<p>Last year, Qualcomm stopped short of including an accelerator core called the neural processing unit in its Snapdragon silicon. Instead, it announced that it would publish software to bend its existing smartphone chips to the whims of machine learning.</p>
<p>The strategy shift shows Qualcomm’s sense of urgency around machine learning, which has typically required powerful servers to understand speech and sensor readings. It also reflects the chipmaker’s plans, which it elaborated on Thursday when it acquired Dutch machine learning start-up Scyfer.</p>
<p>Scyfer’s software has been used to classify manufacturing defects for Tata Steel and predict store revenues for Dutch supermarket chain Albert Heijn. It runs machine learning techniques on storehouses of data generated by factory sensors, online shoppers, cameras, bank transactions, and medical imaging machines.</p>
<p>Making sense of such information usually happens in data centers. Shelves of graphics chips train models on, for instance, the identifying features of images of cats. Then, algorithms apply that model to a particular cat in a smartphone video. This is called inferencing, and it also typically occurs in the cloud.</p>
<p>Qualcomm wants to take inferencing out of the cloud, bringing it to gadgets. That would allow a smartphone to translate text written in a foreign language without calling back to the cloud, or a sensor to identify a chemical leaking from an oil refinery instead of streaming raw data to the cloud for the final word.</p>
<p>Matt Grob, Qualcomm’s executive vice president of technology, wrote in a Thursday blog post that “in many cases, inference running entirely in the cloud will have issues for real-time applications that are latency-sensitive and mission-critical like autonomous driving.”</p>
<p>Grob added that such “applications cannot afford the roundtrip time or rely on critical functions to operate when in variable wireless coverage.” The benefits, he said, include more privacy and lower latency when chatting with the cloud over the internet. Messages sent to the cloud would also be condensed, saving battery life and network bandwidth.</p>
<p>“The cloud remains of course very important and will complement on-device processing,” Grob said. That strategy fits Qualcomm, which has been trying to plant roots in markets like robotics and sensors. Last year, it gave the clearest sign yet of its ambitions, agreeing to pay $47 billion for NXP Semiconductors, the world’s largest maker of automotive chips.</p>
<p>To bring intelligence to everyday gadgets, other companies are already going for custom silicon. Last year, Intel bought vision chipmaker Movidius to enable intelligent security cameras and drones, while start-ups like ThinCI are cobbling together silicon engines for self-driving cars. Microsoft is customizing chips so that its HoloLens goggles can analyze what users have in their field of vision.</p>
<p>For years, Qualcomm teased plans to release silicon that resembles, in a limited way, how the human brain processes information. But last year, it appeared to shelve its neuromorphic chips in a pivot to software tools that make the most of conventional computing cores.</p>
<p>The software it released last year, called the Neural Processing Engine, carefully curates machine learning code that understands speech and translates text. It finds the right CPU, GPU, and DSP cores inside Qualcomm’s chips to run tasks as fast and efficiently as possible.</p>
<p>“Qualcomm’s solutions already have the power, thermal, and processing efficiency to run powerful AI algorithms on the actual device,” Grob said in the Thursday blog post. “The diversity in architecture is essential and you can’t rely on just one type of engine for all workloads.”</p>
<p>The tool helps optimize algorithms for longer battery life and faster processing. Last month, Facebook announced plans to apply the software to augmented reality features in its smartphone app. The tool moves Facebook’s code into graphics cores, making augmented reality objects appear more realistic in photos or live video.</p>
<p>Buying Scyfer bolsters Qualcomm’s machine learning research ranks; the company is also eyeing specialized hardware and neural network advances. The chipmaker adds Max Welling, one of Scyfer’s founders and a professor at the University of Amsterdam. He is a former student of Geoffrey Hinton, who helped build Google’s deep learning efforts and mentored scientists like Facebook’s Yann LeCun and Uber’s Zoubin Ghahramani.</p>
<p>Qualcomm didn’t disclose what it paid for the firm, which spun out of the University of Amsterdam in 2013.</p>
<p>The post <a href="https://www.aiuniverse.xyz/aiming-to-divorce-the-cloud-qualcomm-buys-machine-learning-start-up/">Aiming to Divorce the Cloud, Qualcomm Buys Machine Learning Start-Up</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/aiming-to-divorce-the-cloud-qualcomm-buys-machine-learning-start-up/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		
			</item>
	</channel>
</rss>
