<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Clouds Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/clouds/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/clouds/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Fri, 03 Jan 2020 07:39:02 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Big Data Brings Challenges Beyond the Capabilities of Traditional SIEMs</title>
		<link>https://www.aiuniverse.xyz/big-data-brings-challenges-beyond-the-capabilities-of-traditional-siems/</link>
					<comments>https://www.aiuniverse.xyz/big-data-brings-challenges-beyond-the-capabilities-of-traditional-siems/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 03 Jan 2020 07:39:01 +0000</pubDate>
				<category><![CDATA[Big Data]]></category>
		<category><![CDATA[applications]]></category>
		<category><![CDATA[Big data]]></category>
		<category><![CDATA[Clouds]]></category>
		<category><![CDATA[data analytics]]></category>
		<category><![CDATA[platforms]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5949</guid>

					<description><![CDATA[<p>Source: cpomagazine.com Data growth has taken the tech industry by storm – and there’s no sign of stopping it. The ubiquitousness of connected devices, applications, and social <a class="read-more-link" href="https://www.aiuniverse.xyz/big-data-brings-challenges-beyond-the-capabilities-of-traditional-siems/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/big-data-brings-challenges-beyond-the-capabilities-of-traditional-siems/">Big Data Brings Challenges Beyond the Capabilities of Traditional SIEMs</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: cpomagazine.com</p>



<p>Data growth has taken the tech industry by storm – and there’s no sign of it stopping. The ubiquity of connected devices, applications, and social media platforms has ingrained itself into the lives of billions, accelerating the accumulation of Big Data at a breakneck pace. By 2020, it’s predicted that 1.7 MB of data will be generated every second for every person on the planet. Multiply that by 7.7 billion, and Big Data may now seem like an inadequate description.</p>
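<p>Taking the article’s projection at face value, the scale falls out of simple multiplication. The sketch below is back-of-envelope arithmetic on the article’s own figures (1.7 MB per person per second, 7.7 billion people), not a sourced statistic:</p>

```python
# Back-of-envelope scale of the article's projection.
per_person_mb_per_s = 1.7      # projected MB generated per person per second
population = 7.7e9             # people on the planet

mb_per_second = per_person_mb_per_s * population
pb_per_day = mb_per_second * 86_400 / 1e9   # 1 PB = 1e9 MB
print(f"{mb_per_second:.2e} MB/s worldwide, ~{pb_per_day:.1e} PB/day")
```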



<p>For many cyber experts, the advent of this exponential production of data, and the industry’s quick response to adapt to it, comes as no surprise. Moore’s Law, the observation that the number of transistors on a microchip doubles roughly every two years, has so far proven true, though some claim it is nearing its end. Computing power and the size of microprocessors have a strong, yet inverse relationship — the more powerful the device, the smaller the chip powering it. Spurred on by chip-making giants such as Intel, modern engineering has delivered nanochips with ever greater numbers of transistors in recent years. However, the scalability of these ever-shrinking computer chips is reaching a breaking point — threatening the conventional wisdom that Moore’s Law once represented. Earlier this year, Intel chief engineering officer Murthy Renduchintala expressed concerns over the company’s ability to cope with the practical challenges posed by Big Data.</p>



<p>“It’s no secret that Intel has struggled with 10 nanometers,” Renduchintala said. “And what I have found in discussions with many… is the perception that Intel’s process innovation has slowed down during this time.”</p>



<p>This slowdown in innovation was inevitable: the size of any physical object eventually reaches its limit. In response to this technological hurdle comes the cloud, which has grown in use and popularity over the last decade. The idea of ridding an organization of the need for on-prem servers, which are inherently limited by space, has a natural appeal in the wake of Big Data. As this shift has taken place, older IT security platforms and on-prem servers have been unable to evolve alongside Big Data production or cloud-based storage methods, preventing stakeholders from leveraging data in a way that makes sense for them.</p>



<p>Both legacy SIEM and central log management tools are excellent at gathering data and detecting events, but they fail to comprehensively or intuitively associate pieces of data, making the gobs of information produced by a company effectively useless or too time-consuming to understand. Migrating to the cloud requires more than a transfer of data; it demands a complete re-evaluation of how data-producing software should be tracked and stored.</p>



<p>SOAR, alternatively, has been able to partially address this problem, but is still bound by a more traditional framework that was not originally built for the cloud. SOAR improves upon the shortcomings of older SIEMs by integrating threat-hunting and vulnerability testing, allowing for intelligent predictions or evaluations of suspicious activity and automatic responses. SOAR’s automatic responses can backfire, however, when perceived threats provoke unnecessary shutdowns, making it an inefficient SIEM. Overall, these solutions have worked fairly well, but still lack intuitive programming that is both native to the cloud and designed to skillfully interact with all services in a given network.</p>



<p>The key to harnessing the power of the cloud is taking SOAR’s capabilities to a greater level of network integration, producing what Fluency calls Cloud Orchestration, Automation and Response, or COAR. This type of cloud-based data storage solution allows vast accumulations of Big Data to be tracked across multiple APIs simultaneously. In this capacity, COAR operates as a central log management system engineered for the cloud, rather than a workaround built on an old on-prem server’s network. This next-generation SIEM shift is substantial when considering how Big Data is revolutionizing how and why potential threats are detected. COAR can sustain tracking of 12 million events per second with ease – illustrating the kind of power a cloud-based advanced analytics tool can generate.</p>



<p>Increased power and storage capacity aren’t the only drivers of cloud migration. Data compliance is of major concern as organizations worldwide continue to produce vast amounts of personally identifiable information (PII) at exponential levels. The right of a customer or member of an organization to be “forgotten” has achieved paramount importance in the development of privacy laws in EU member states, with the passing of GDPR being the most prominent example.</p>



<p>Across the pond, California and New York are set to follow suit with CCPA and SHIELD legislation placing limitations on Big Tech’s level of data ownership, in addition to the databases of countless other businesses and institutions. Compliance enforcement comes at a time when data growth is at an all-time high, exerting even greater pressure on businesses large and small to securely and efficiently store PII, let alone healthcare organizations with PHI. Cloud-based log management offers a storage alternative with built-in compliance features and greater storage capacity, providing an easier pathway to compliance rather than overhauling pre-existing systems.</p>



<p>Moore’s Law may now be close to obsolete, and the forces that drove it to extinction are only growing stronger. The question of whether additional transistors can be squeezed onto a microchip may persist, but it is ultimately a conversation devoid of innovative or practical thinking. Cloud migration is fueled by this shift, creating a greater need for sophisticated cloud orchestration tools and threat detection methods that reach far beyond one-off notifications or automatic shutdowns. Ultimately, the cloud offers a significant advantage when done correctly, but requires a depth of expertise that most companies don’t yet have — and will soon need. </p>
<p>The post <a href="https://www.aiuniverse.xyz/big-data-brings-challenges-beyond-the-capabilities-of-traditional-siems/">Big Data Brings Challenges Beyond the Capabilities of Traditional SIEMs</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/big-data-brings-challenges-beyond-the-capabilities-of-traditional-siems/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Data science to dominate the workplace in 2020 — learn Python to get in on the action</title>
		<link>https://www.aiuniverse.xyz/data-science-to-dominate-the-workplace-in-2020-learn-python-to-get-in-on-the-action/</link>
					<comments>https://www.aiuniverse.xyz/data-science-to-dominate-the-workplace-in-2020-learn-python-to-get-in-on-the-action/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 23 Dec 2019 07:37:48 +0000</pubDate>
				<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Clouds]]></category>
		<category><![CDATA[data]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[learn Python]]></category>
		<category><![CDATA[Programming Language]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5767</guid>

					<description><![CDATA[<p>Source: mashable.com If you thought data science was just a trend that would come and go like the Juicy sweatpants or flat-billed cap you practically lived in <a class="read-more-link" href="https://www.aiuniverse.xyz/data-science-to-dominate-the-workplace-in-2020-learn-python-to-get-in-on-the-action/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/data-science-to-dominate-the-workplace-in-2020-learn-python-to-get-in-on-the-action/">Data science to dominate the workplace in 2020 — learn Python to get in on the action</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: mashable.com</p>



<p>If you thought data science was just a trend that would come and go like the Juicy sweatpants or flat-billed cap you practically lived in throughout the 2000s, then you are sorely mistaken.&nbsp;</p>



<p>Data influences nearly every aspect of your life, from playlist recommendations on Spotify, to Alexa&#8217;s answers, to the skincare products you use on a daily basis. And those who can make sense of thousands of cells of raw data and consolidate them into actionable insights will find themselves climbing the corporate ladder pretty swiftly.&nbsp;</p>



<p>If you can’t go back to school to learn all about data science (yay, American tuition fees!), then consider diving into the world of online courses instead. They provide the information you need to know in order to begin a career in the field and are taught by leading experts. The major differentiator? They’ll cost you less than the price of a textbook. </p>



<p>Among the most comprehensive courses you can digitally enroll in is the Python Power Coder Bundle (now on sale for $34). This data science bootcamp comprises eight different topics, including a step-by-step guide to the popular programming language Python (covering automation setup, troubleshooting common coding errors, and deployment), as well as insight into other essential frameworks such as Apache Spark. </p>



<p>The goal is that with these 70+ hours of training under your belt, you’ll be able to navigate the data science field more easily and efficiently than you could before. Plus, you’ll have the foundational knowledge needed to kickstart a coding career. And considering just how much money is on the table with a job in the field, we think 34 bucks is money well spent. </p>
<p>The post <a href="https://www.aiuniverse.xyz/data-science-to-dominate-the-workplace-in-2020-learn-python-to-get-in-on-the-action/">Data science to dominate the workplace in 2020 — learn Python to get in on the action</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/data-science-to-dominate-the-workplace-in-2020-learn-python-to-get-in-on-the-action/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Deep learning with point clouds</title>
		<link>https://www.aiuniverse.xyz/deep-learning-with-point-clouds/</link>
					<comments>https://www.aiuniverse.xyz/deep-learning-with-point-clouds/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 22 Oct 2019 07:59:40 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Clouds]]></category>
		<category><![CDATA[computer science]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[Laboratory]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[Robotics]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4790</guid>

					<description><![CDATA[<p>Source: news.mit.edu If you’ve ever seen a self-driving car in the wild, you might wonder about that spinning cylinder on top of it.&#160; It’s a “lidar sensor,” <a class="read-more-link" href="https://www.aiuniverse.xyz/deep-learning-with-point-clouds/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/deep-learning-with-point-clouds/">Deep learning with point clouds</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: news.mit.edu</p>



<p>If you’ve ever seen a self-driving car in the wild, you might wonder about that spinning cylinder on top of it.&nbsp;</p>



<p>It’s a “lidar sensor,” and it’s what allows the car to navigate the world. By sending out pulses of infrared light and measuring the time it takes for them to bounce off objects, the sensor creates a “point cloud” that builds a 3D snapshot of the car’s surroundings.&nbsp;</p>
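<p>The time-of-flight principle described above reduces to one line of arithmetic: the pulse travels to the object and back, so range is half the round trip at the speed of light. A minimal sketch (the 200 ns example is illustrative, not from the article):</p>

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def pulse_distance_m(round_trip_seconds):
    """Lidar time-of-flight: the pulse travels out and back, so halve the round trip."""
    return C * round_trip_seconds / 2

# a return received 200 ns after emission corresponds to an object ~30 m away
print(round(pulse_distance_m(200e-9), 2))  # 29.98
```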



<p>Making sense of raw point-cloud data is difficult, and before the age of machine learning it traditionally required highly trained engineers to tediously specify by hand which qualities they wanted to capture. But in a new series of papers out of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), researchers show that they can use deep learning to automatically process point clouds for a wide range of 3D-imaging applications.</p>



<p>“In computer vision and machine learning today, 90 percent of the advances deal only with two-dimensional images,” says MIT Professor Justin Solomon, who was senior author of the new series of papers spearheaded by PhD student Yue Wang. “Our work aims to address a fundamental need to better represent the 3D world, with application not just in autonomous driving, but any field that requires understanding 3D shapes.”&nbsp;</p>



<p>Most previous approaches haven’t been especially successful at capturing the patterns from data that are needed to get meaningful information out of a bunch of 3D points in space. But in one of the team’s papers, they showed that their “EdgeConv” method of analyzing point clouds using a type of neural network called a dynamic graph convolutional neural network allowed them to classify and segment individual objects.&nbsp;</p>
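<p>The core of EdgeConv is a graph over each point’s nearest neighbors, with edge features built from each point and its offsets to those neighbors. The NumPy sketch below (my own illustration, not the authors’ code) shows only that feature construction, omitting the learned network that consumes it:</p>

```python
import numpy as np

def knn_edge_features(points, k=4):
    """EdgeConv-style inputs: pair each point x_i with (x_j - x_i) for its
    k nearest neighbors x_j, giving an (N, k, 6) tensor for 3D points."""
    diff = points[:, None, :] - points[None, :, :]
    d2 = (diff ** 2).sum(-1)                  # pairwise squared distances
    np.fill_diagonal(d2, np.inf)              # a point is not its own neighbor
    idx = np.argsort(d2, axis=1)[:, :k]       # k nearest neighbors per point
    neighbors = points[idx]                   # (N, k, 3)
    center = np.repeat(points[:, None, :], k, axis=1)
    return np.concatenate([center, neighbors - center], axis=-1)

pts = np.random.default_rng(1).normal(size=(32, 3))
print(knn_edge_features(pts, k=4).shape)  # (32, 4, 6)
```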



<p>“By building ‘graphs’ of neighboring points, the algorithm can capture hierarchical patterns and therefore infer multiple types of generic information that can be used by a myriad of downstream tasks,” says Wadim Kehl, a machine learning scientist at Toyota Research Institute who was not involved in the work.&nbsp;</p>



<p>In addition to developing EdgeConv, the team also explored other specific aspects of point-cloud processing. For example, one challenge is that most sensors change perspectives as they move around the 3D world; every time we take a new scan of the same object, its position may be different from the last time we saw it. To merge multiple point clouds into a single detailed view of the world, you need to align multiple 3D points in a process called “registration.”&nbsp;</p>



<p>Registration is vital for many forms of imaging, from satellite data to medical procedures. For example, when a doctor has to take multiple magnetic resonance imaging scans of a patient over time, registration is what makes it possible to align the scans to see what’s changed.&nbsp;</p>



<p>“Registration is what allows us to integrate 3D data from different sources into a common coordinate system,” says Wang. “Without it, we wouldn’t actually be able to get as meaningful information from all these methods that have been developed.”</p>



<p>Solomon and Wang’s second paper demonstrates a new registration algorithm called “Deep Closest Point” (DCP) that was shown to better find a point cloud’s distinguishing patterns, points, and edges (known as “local features”) in order to align it with other point clouds. This is especially important for such tasks as enabling self-driving cars to situate themselves in a scene (“localization”), as well as for robotic hands to locate and grasp individual objects.</p>
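<p>DCP itself is a learned pipeline, but the closed-form core of rigid registration, once matching points are known, is the classical SVD-based (Kabsch) alignment. This sketch assumes correspondences are already given, which is exactly the part DCP learns to find:</p>

```python
import numpy as np

def rigid_align(src, dst):
    """Find rotation R and translation t minimizing ||R @ src_i + t - dst_i||
    over corresponding point pairs (Kabsch algorithm)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, mu_d - R @ mu_s

# recover a known rotation about z plus a translation
rng = np.random.default_rng(0)
src = rng.normal(size=(100, 3))
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
R, t = rigid_align(src, src @ R_true.T + t_true)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```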



<p>One limitation of DCP is that it assumes we can see an entire shape instead of just one side. This means it can’t handle the more difficult task of aligning partial views of shapes (known as “partial-to-partial registration”). As a result, in a third paper the researchers presented an improved algorithm for this task that they call the Partial Registration Network (PRNet).&nbsp;</p>



<p>Solomon says that existing 3D data tends to be “quite messy and unstructured compared to 2D images and photographs.” His team sought to figure out how to get meaningful information out of all that disorganized 3D data without the controlled environment that a lot of machine learning technologies now require.</p>



<p>A key observation behind the success of DCP and PRNet is the idea that a critical aspect of point-cloud processing is context. The geometric features on point cloud A that suggest the best ways to align it to point cloud B may be different from the features needed to align it to point cloud C. For example, in partial registration, an interesting part of a shape in one point cloud may not be visible in the other — making it useless for registration.</p>



<p>Wang says that the team’s tools have already been deployed by many researchers in the computer vision community and beyond. Even physicists are using them for an application the CSAIL team had never considered: particle physics. </p>



<p>Moving forward, the researchers hope to use the algorithms on real-world data, including data gathered from self-driving cars. Wang says they also plan to explore the potential of training their systems using self-supervised learning, to minimize the amount of human annotation needed.</p>



<p>Solomon and Wang were the two sole authors of the DCP and PRNet papers. Their co-authors on the EdgeConv paper were research assistant Yongbin Sun and Professor Sanjay Sarma of MIT, alongside postdoc Ziwei Liu of University of California at Berkeley and Professor Michael M. Bronstein of Imperial College London.&nbsp;</p>



<p>The projects were supported, in part, by the U.S. Air Force, the U.S. Army Research Office, Amazon, Google Research, IBM, the National Science Foundation, the Skoltech-MIT Next Generation Program, and the Toyota Research Institute.</p>
<p>The post <a href="https://www.aiuniverse.xyz/deep-learning-with-point-clouds/">Deep learning with point clouds</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/deep-learning-with-point-clouds/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Robots with Their Heads in the Clouds</title>
		<link>https://www.aiuniverse.xyz/robots-with-their-heads-in-the-clouds/</link>
					<comments>https://www.aiuniverse.xyz/robots-with-their-heads-in-the-clouds/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 04 Aug 2017 09:01:15 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Human Intelligence]]></category>
		<category><![CDATA[brain circuit]]></category>
		<category><![CDATA[cloud-robots]]></category>
		<category><![CDATA[Clouds]]></category>
		<category><![CDATA[human brain operates]]></category>
		<category><![CDATA[Robots]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=463</guid>

					<description><![CDATA[<p>Source &#8211; blogs.scientificamerican.com Despite the rapid advancement and heavy investment in artificial intelligence (AI) and robotics, we still cannot put a human-like brain in a robot—not now, nor in the <a class="read-more-link" href="https://www.aiuniverse.xyz/robots-with-their-heads-in-the-clouds/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/robots-with-their-heads-in-the-clouds/">Robots with Their Heads in the Clouds</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; <strong>blogs.scientificamerican.com</strong></p>
<p>Despite the rapid advancement and heavy investment in artificial intelligence (AI) and robotics, we still cannot put a human-like brain<em> in</em> a robot—not now, nor in the foreseeable future. It is a matter of physics.</p>
<p>The human brain is made of bio-circuits with a massive number of interconnecting neurons—about 100 billion of them. Our brain weighs about 1.5 kg (~3.3 lb) and consumes roughly 40 watts of power. Building an equivalent “brain” with today’s most advanced silicon-based digital technology would result in a computer 1 million times heavier that consumes a million times more power! That is equivalent to the power consumption of a medium-sized city of 100,000 homes.</p>
<p>Why such a difference? Our brain circuit conducts current using sodium ions, with currents measured in nano-amps (10<sup>-9</sup> A). A computer circuit, on the other hand, conducts current using electrons, with currents measured in micro-amps (10<sup>-6</sup> A). Since power is proportional to the square of the current, i.e., P = I<sup>2</sup>R, its consumption is a million times greater.</p>
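<p>The square law is easy to verify numerically: a thousand-fold jump in current at the same resistance gives a million-fold jump in dissipated power. A quick check (the 1 Ω resistance is an arbitrary placeholder, since only the ratio matters):</p>

```python
def power_watts(current_amps, resistance_ohms):
    """Ohmic dissipation: P = I^2 * R."""
    return current_amps ** 2 * resistance_ohms

R = 1.0  # arbitrary fixed resistance; it cancels in the ratio
ratio = power_watts(1e-6, R) / power_watts(1e-9, R)  # micro-amps vs nano-amps
print(f"{ratio:.0e}")  # 1e+06
```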
<p>Interestingly though, there is a trade-off, because electrons are much smaller and lighter than sodium ions. An electron can travel 1 million times <em>faster</em>. So, if we could build an electronic brain, it would be potentially 1 million times faster, allowing for some algorithmic overhead. It would be possible for a million robots to share it, thanks to its incredible speed.</p>
<p>Since intelligence can’t be built <em>inside the robot</em>, is there a way to accomplish this outside? Fortunately, we are now at the time in history when this is not only possible, but inevitable. The impetus is cloud computing and mobile communication. The former provides intelligence and the latter provides connectivity.</p>
<p>We can cluster thousands of computer servers together to form a gigantic cloud computer that can compare to the complexity of a human brain. With recent developments in AI, intelligence capabilities that are comparable to, or in many cases beyond, human capacity are being developed. Without the limitation of increased weight and power consumption, it is just a matter of time before we can build that cloud brain for intelligent robots. Millions of robots can share this brain and train and learn together.</p>
<p>With the intelligence in the cloud, mobile networks can provide the connectivity to many robots. The human brain is connected to the rest of the body by another neural circuit made of ions, the nervous system. It is quite slow compared to electronics, taking between 50 and 300 milliseconds for a brain signal to travel to the various muscles and organs inside our body. Vision is the fastest operation of the brain, with delays of about 30 ms for signals leaving the eye to reach the brain. By comparison, in the amount of time it takes the brain to signal the body, the modern communication network, using light (fiber optics) and electromagnetic waves, can send messages tens of thousands of kilometers. Today’s 4G LTE mobile networks can send multiple megabits of signals per second over a distance of thousands of kilometers with less than 100 ms delay. That is sufficient for human-level perception and mechanical control. Furthermore, with the coming 5G mobile standard we will soon be able to improve performance and scale 100-fold! Thus, mobile communication networks can serve as the nervous system connecting the cloud brain with the robotic bodies.</p>
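<p>The distance claim above can be sanity-checked. Assuming light in silica fiber travels at roughly two-thirds of c (an approximate figure, not from the article), in the ~100 ms a nerve signal needs to reach a muscle, a fiber link spans on the order of 20,000 km:</p>

```python
C = 299_792_458.0       # speed of light in vacuum, m/s
V_FIBER = 2 * C / 3     # rough group velocity in silica fiber (assumed)

def reach_km(delay_seconds):
    """Distance a signal covers in fiber within the given delay."""
    return V_FIBER * delay_seconds / 1000

print(round(reach_km(0.100)))  # on the order of 20,000 km per nerve impulse
```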
<p>We see the same relationship in computing speed. The human brain operates in the range of 500 to 1,000 Hz, whereas today’s computers operate at several GHz, and are thus millions of times faster! Thus, while the brain is incredibly small, light and energy efficient, it operates slower than computers and transmits slower than existing mobile networks. Because future-generation networks will be operating at speeds beyond what humans can handle, one could think of these future network build-outs as serving not humans, but rather the machines in our midst. Robot vision, speech, analysis and other capabilities will harness the known intelligence of the world in the cloud.</p>
<p>While the big challenges of human-like intelligence and connectivity to it can be achieved, there are other priorities that must be addressed. Chief among these is security. It is clear that new, ultra-high reliability is vital to the success of cloud-connected robots. Users must have assurances that their robots are under complete authorized control—24/7/365. If this control is ever compromised, then the compromise would have to be instantly detected and the robot disabled.</p>
<p>Very smart cloud-robots will be an integral part of our world in the not-too-distant future. Mobile networks will enable performance similar to human interactions. Scientists and engineers have made many advances to this point in time, but it is hard to imagine anything more exciting than what we are about to create, as we approach the human intelligence benchmark.</p>
<p>The post <a href="https://www.aiuniverse.xyz/robots-with-their-heads-in-the-clouds/">Robots with Their Heads in the Clouds</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/robots-with-their-heads-in-the-clouds/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		
			</item>
	</channel>
</rss>
