<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>supercomputer Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/supercomputer/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/supercomputer/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Fri, 31 Jul 2020 06:32:02 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.1</generator>
	<item>
		<title>Google builds world’s fastest machine learning training supercomputer that breaks AI performance records</title>
		<link>https://www.aiuniverse.xyz/google-builds-worlds-fastest-machine-learning-training-supercomputer-that-breaks-ai-performance-records/</link>
					<comments>https://www.aiuniverse.xyz/google-builds-worlds-fastest-machine-learning-training-supercomputer-that-breaks-ai-performance-records/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 31 Jul 2020 06:31:52 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Google]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[supercomputer]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=10619</guid>

					<description><![CDATA[<p>Source: zeenews.india.com New Delhi: Google said it has built the world’s fastest machine learning (ML) training supercomputer that broke AI performance records in six out of eight industry-leading MLPerf benchmarks. “The latest results from the industry-standard MLPerf benchmark competition demonstrate that Google has built the world’s fastest ML training supercomputer. Using this supercomputer, as well as <a class="read-more-link" href="https://www.aiuniverse.xyz/google-builds-worlds-fastest-machine-learning-training-supercomputer-that-breaks-ai-performance-records/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/google-builds-worlds-fastest-machine-learning-training-supercomputer-that-breaks-ai-performance-records/">Google builds world’s fastest machine learning training supercomputer that breaks AI performance records</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: zeenews.india.com</p>



<p>New Delhi: Google said it has built the world’s fastest machine learning (ML) training supercomputer, which broke AI performance records in six out of eight industry-leading MLPerf benchmarks.</p>



<p>“The latest results from the industry-standard MLPerf benchmark competition demonstrate that Google has built the world’s fastest ML training supercomputer. Using this supercomputer, as well as our latest Tensor Processing Unit (TPU) chip, Google set performance records in six out of eight MLPerf benchmarks,” a Google blog said.</p>



<p>Google said that it achieved these results with ML model implementations in TensorFlow, JAX, and Lingvo. Four of the eight models were trained from scratch in under 30 seconds.</p>



<p>The Google blog explains, “…Consider that in 2015, it took more than three weeks to train one of these models on the most advanced hardware accelerator available. Google’s latest TPU supercomputer can train the same model almost five orders of magnitude faster just five years later.”</p>
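<p>The “almost five orders of magnitude” claim can be sanity-checked with simple arithmetic. In the sketch below, the 28-second figure is an assumed illustrative value chosen to be consistent with the under-30-seconds training results reported above, not a number Google published for this comparison:</p>

```python
import math

# Hedged sanity check of the "almost five orders of magnitude" speedup claim.
# baseline: the three-week 2015 training run mentioned in the blog.
# today_s: an ASSUMED modern training time, consistent with the
# under-30-seconds results reported above.
baseline_s = 3 * 7 * 24 * 3600   # 3 weeks in seconds = 1,814,400
today_s = 28                     # assumed, illustrative only

speedup = baseline_s / today_s
orders_of_magnitude = math.log10(speedup)

print(f"speedup: {speedup:,.0f}x ({orders_of_magnitude:.1f} orders of magnitude)")
```

<p>That works out to roughly 4.8 orders of magnitude, matching the blog’s “almost five” phrasing.</p>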



<p>MLPerf models are chosen to be representative of cutting-edge machine learning workloads that are common throughout industry and academia. The supercomputer Google used for the MLPerf training round is four times larger than the &#8220;Cloud TPU v3 Pod&#8221; that set three records in the previous competition.</p>



<p>The system includes 4096 TPU v3 chips and hundreds of CPU host machines, all connected via an ultra-fast, ultra-large-scale custom interconnect. In total, this system delivers over 430 PFLOPs of peak performance.</p>



<p>Google said its MLPerf Training v0.7 submissions demonstrate its commitment to advancing machine learning research and engineering at scale and to delivering those advances to users through open-source software, Google’s products, and Google Cloud.</p>
<p>The post <a href="https://www.aiuniverse.xyz/google-builds-worlds-fastest-machine-learning-training-supercomputer-that-breaks-ai-performance-records/">Google builds world’s fastest machine learning training supercomputer that breaks AI performance records</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/google-builds-worlds-fastest-machine-learning-training-supercomputer-that-breaks-ai-performance-records/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Microsoft Just Built a World-Class Supercomputer Exclusively for OpenAI</title>
		<link>https://www.aiuniverse.xyz/microsoft-just-built-a-world-class-supercomputer-exclusively-for-openai/</link>
					<comments>https://www.aiuniverse.xyz/microsoft-just-built-a-world-class-supercomputer-exclusively-for-openai/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 01 Jun 2020 07:02:45 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Microsoft]]></category>
		<category><![CDATA[OpenAI]]></category>
		<category><![CDATA[supercomputer]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=9176</guid>

					<description><![CDATA[<p>Source: singularityhub.com Last year, Microsoft announced a billion-dollar investment in OpenAI, an organization whose mission is to create artificial general intelligence and make it safe for humanity. No Terminator-like dystopias here. No deranged machines making humans into paperclips. Just computers with general intelligence helping us solve our biggest problems. A year on, we have the <a class="read-more-link" href="https://www.aiuniverse.xyz/microsoft-just-built-a-world-class-supercomputer-exclusively-for-openai/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/microsoft-just-built-a-world-class-supercomputer-exclusively-for-openai/">Microsoft Just Built a World-Class Supercomputer Exclusively for OpenAI</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: singularityhub.com</p>



<p>Last year, Microsoft announced a billion-dollar investment in OpenAI, an organization whose mission is to create artificial general intelligence and make it safe for humanity. No Terminator-like dystopias here. No deranged machines making humans into paperclips. Just computers with general intelligence helping us solve our biggest problems.</p>



<p>A year on, we have the first results of that partnership. At this year’s Microsoft Build 2020, a developer conference showcasing Microsoft’s latest and greatest, the company said it had completed a supercomputer exclusively for OpenAI’s machine learning research. But this is no run-of-the-mill supercomputer. It’s a beast of a machine. The company said it has 285,000 CPU cores, 10,000 GPUs, and 400 gigabits per second of network connectivity for each GPU server.</p>
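<p>A rough back-of-envelope on those specs can be sketched as follows. Note that the GPUs-per-server count is an assumption of this sketch (eight is typical for dense GPU servers), not something Microsoft disclosed, so the aggregate figure is illustrative only:</p>

```python
# Back-of-envelope on the reported specs: 285,000 CPU cores, 10,000 GPUs,
# and 400 Gb/s of network connectivity per GPU server.
cpu_cores = 285_000
gpus = 10_000
link_gbps_per_server = 400          # reported: 400 Gb/s per GPU server
assumed_gpus_per_server = 8         # ASSUMPTION, not from the announcement

servers = gpus // assumed_gpus_per_server
aggregate_tbps = servers * link_gbps_per_server / 1_000

print(f"{cpu_cores:,} CPU cores, ~{servers} GPU servers, "
      f"~{aggregate_tbps:.0f} Tb/s aggregate network bandwidth")
```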



<p>Stacked against the fastest supercomputers on the planet, the machine would rank fifth, Microsoft says.</p>



<p>The company didn’t release performance data, and the computer hasn’t been publicly benchmarked and included on the widely followed Top500 list of supercomputers. But even absent official rankings, it’s likely safe to say it’s a world-class machine.</p>



<p>“As we’ve learned more and more about what we need and the different limits of all the components that make up a supercomputer, we were really able to say, ‘If we could design our dream system, what would it look like?’” said OpenAI CEO Sam Altman. “And then Microsoft was able to build it.”</p>



<p>What will OpenAI do with this dream machine? The company is building ever bigger narrow AI algorithms—we’re nowhere near AGI yet—and training them takes a lot of computing power.</p>



<h3 class="wp-block-heading"><strong>The Pursuit of Very Large AI Models</strong></h3>



<p>The size of the most advanced AI models—that is, the neural networks in machine learning algorithms—has been growing fast. At the same time, according to OpenAI, the computing power needed to train these models has been doubling every 3.4 months.</p>
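<p>A doubling every 3.4 months implies a steep growth factor. A minimal sketch, assuming OpenAI’s figure holds exactly over the chosen spans (the one-year and five-year horizons are illustrative):</p>

```python
# Implied growth if training compute doubles every 3.4 months
# (OpenAI's figure, as quoted above).
doubling_months = 3.4

def growth(months: float) -> float:
    """Compute multiplier after `months`, doubling every 3.4 months."""
    return 2 ** (months / doubling_months)

per_year = growth(12)
per_five_years = growth(60)
print(f"~{per_year:.0f}x per year, ~{per_five_years:,.0f}x over five years")
```

<p>That is roughly an 11–12x increase per year, compounding to several hundred thousand times over five years.</p>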



<p>The bigger the model, the bigger the computer you need to train it.</p>



<p>This growth is in part due to the number of parameters used in each model. Simplistically, these are the values “neurons” operating on data in a neural net assume through training. OpenAI’s GPT-2 algorithm, which generated convincing text from prompts, consisted of nearly 1.5 billion parameters. Microsoft’s natural language generating AI model, Turing NLG, was over 10 times bigger, weighing in at 17 billion parameters. Now, OpenAI’s GPT-3, just announced Thursday, is reportedly made up of a staggering 175 billion parameters.</p>
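<p>Those parameter counts translate into substantial raw storage for the weights alone. A hedged estimate, assuming 2-byte (16-bit) weights, which is a common but here assumed storage format:</p>

```python
# Weight-storage estimate for the models named above. Parameter counts are
# as reported in the article; bytes-per-parameter ASSUMES 16-bit weights.
models = {
    "GPT-2": 1.5e9,
    "Turing NLG": 17e9,
    "GPT-3": 175e9,
}
bytes_per_param = 2  # assumed fp16 weights

for name, params in models.items():
    gb = params * bytes_per_param / 1e9
    print(f"{name}: {params / 1e9:.1f}B params, about {gb:,.0f} GB of weights")
```

<p>Under that assumption, GPT-3’s weights alone would occupy around 350 GB, more than a hundred times GPT-2’s footprint.</p>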



<p>There’s another trend at play too.</p>



<p>Whereas many machine learning algorithms are trained on human-labeled data sets, Microsoft, OpenAI, and others are also pursuing “unsupervised” machine learning. This means that with enough raw, unlabeled data the algorithms teach&nbsp;<em>themselves</em>&nbsp;by identifying patterns in that data.</p>



<p>Some of the latest systems can also perform more than one task in a given domain. An algorithm trained on the raw text of billions of internet pages—from Wikipedia entries to self-published books—can infer relationships between words, concepts, and context. Instead of being able to do only one thing, like generate text, it can transfer its learning to multiple related tasks in the same domain, like also reading documents and answering questions.</p>



<p>The Turing NLG and GPT-3 algorithms fall into this category.</p>



<p>“The exciting thing about these models is the breadth of things they’re going to enable,” said Microsoft Chief Technical Officer Kevin Scott. “This is about being able to do a hundred exciting things in natural language processing at once and a hundred exciting things in computer vision, and when you start to see combinations of these perceptual domains, you’re going to have new applications that are hard to even imagine right now.”</p>



<h3 class="wp-block-heading"><strong>If Only We Had a Bigger Computer…</strong></h3>



<p>To be clear, this isn’t AGI, and there’s no certain path to AGI yet. But algorithms beginning to modestly generalize within domains is progress.</p>



<p>A looming question is whether the approach will continue progressing as long as researchers can throw more computing power at it, or if today’s machine learning needs to be augmented with other techniques. Also, if the most advanced AI research requires such prodigious resources, then increasingly, only the most well-heeled, well-connected private organizations will be able to play.</p>



<p>Some good news is that even as AI model size is growing, the efficiency of those models is improving too. Each new breakthrough requires a big jump in computing power, but later models are tweaked and tuned, such that successor algorithms can do as well or better with less computing power.</p>



<p>Microsoft also announced an update to its open source deep learning toolset, DeepSpeed, first released in February. The company says DeepSpeed can help developers train models 15 times larger and 10 times faster using the same computing resources. It also plans to open-source its Turing models so the broader community can build on them.</p>



<p>The general idea is that once one of these very large AI models has been trained, it can actually be customized and employed by other researchers or companies with far fewer resources.</p>



<p>In any case, Microsoft and OpenAI are committed to very large AI, and their new machine may be followed by even bigger systems in the years ahead.</p>



<p>“We’re testing a hypothesis that has been there since the beginning of the field: that a neural network close to the size of the human brain can be trained to be an AGI,” Greg Brockman, OpenAI’s co-founder, chairman, and CTO, told the Financial Times when Microsoft’s investment was first made public. “If the hypothesis is true, the upside for humanity will be remarkable.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/microsoft-just-built-a-world-class-supercomputer-exclusively-for-openai/">Microsoft Just Built a World-Class Supercomputer Exclusively for OpenAI</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/microsoft-just-built-a-world-class-supercomputer-exclusively-for-openai/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Supermicro announces integrated A100 GPU-powered systems</title>
		<link>https://www.aiuniverse.xyz/supermicro-announces-integrated-a100-gpu-powered-systems/</link>
					<comments>https://www.aiuniverse.xyz/supermicro-announces-integrated-a100-gpu-powered-systems/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 18 May 2020 06:06:51 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[HPC]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[supercomputer]]></category>
		<category><![CDATA[Supermicro]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=8831</guid>

					<description><![CDATA[<p>Source: datacentrenews.eu Super Micro Computer has announced two new systems designed for artificial intelligence (AI) deep learning applications that leverage the third-generation NVIDIA HGX technology with the new NVIDIA A100 Tensor Core GPUs as well as full support for the new NVIDIA A100 GPUs across the company’s broad portfolio of 1U, 2U, 4U and 10U <a class="read-more-link" href="https://www.aiuniverse.xyz/supermicro-announces-integrated-a100-gpu-powered-systems/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/supermicro-announces-integrated-a100-gpu-powered-systems/">Supermicro announces integrated A100 GPU-powered systems</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: datacentrenews.eu</p>



<p>Super Micro Computer has announced two new systems designed for artificial intelligence (AI) deep learning applications that leverage the third-generation NVIDIA HGX technology with the new NVIDIA A100 Tensor Core GPUs as well as full support for the new NVIDIA A100 GPUs across the company’s broad portfolio of 1U, 2U, 4U and 10U GPU servers.&nbsp;</p>



<p>NVIDIA A100 is the first elastic, multi-instance GPU that unifies training, inference, HPC, and analytics.</p>



<p>“Expanding upon our portfolio of GPU systems and NVIDIA HGX-2 system technology, Supermicro is introducing a new 2U system implementing the new NVIDIA HGX A100 4 GPU board (formerly codenamed Redstone) and a new 4U system based on the new NVIDIA HGX A100 8 GPU board (formerly codenamed Delta) delivering 5 PetaFLOPS of AI performance,” says Supermicro CEO and president Charles Liang.&nbsp;</p>



<p>“As GPU accelerated computing evolves and continues to transform data centers, Supermicro will provide customers the very latest system advancements to help them achieve maximum acceleration at every scale while optimising GPU utilisation. These new systems will significantly boost performance on all accelerated workloads for HPC, data analytics, deep learning training and deep learning inference.”</p>



<p>As a balanced data centre platform for HPC and AI applications, Supermicro’s new 2U system leverages the NVIDIA HGX A100 4 GPU board with four direct-attached NVIDIA A100 Tensor Core GPUs using PCI-E 4.0 for maximum performance and NVIDIA NVLink for high-speed GPU-to-GPU interconnects.&nbsp;</p>



<p>This GPU system accelerates compute, networking and storage performance with support for one PCI-E 4.0 x8 and up to four PCI-E 4.0 x16 expansion slots for GPUDirect RDMA high-speed network cards and storage such as InfiniBand HDR, which supports up to 200Gb per second bandwidth.&nbsp;</p>



<p>“AI models are exploding in complexity as they take on next-level challenges such as accurate conversational AI, deep recommender systems and personalised medicine,” says NVIDIA accelerated computing general manager and vice president Ian Buck.</p>



<p>“By implementing the NVIDIA HGX A100 platform into their new servers, Supermicro provides customers the powerful performance and massive scalability that enable researchers to train the most complex AI networks at unprecedented speed.”</p>



<p>Optimised for AI and machine learning, Supermicro’s new 4U system supports eight A100 Tensor Core GPUs.&nbsp;</p>



<p>The 4U form factor with eight GPUs is ideal for customers that want to scale their deployment as their processing requirements expand.&nbsp;</p>



<p>The new 4U system will have one NVIDIA HGX A100 8 GPU board with eight A100 GPUs all-to-all connected with NVIDIA NVSwitch for up to 600GB per second GPU-to-GPU bandwidth and eight expansion slots for GPUDirect RDMA high-speed network cards.&nbsp;</p>
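<p>To make the two bandwidth figures in this announcement concrete, a small sketch comparing transfer times over the NVSwitch fabric (600 GB/s GPU-to-GPU) and a 200 Gb/s InfiniBand HDR network card. The 10 GB payload is an assumed, purely illustrative size:</p>

```python
# Illustrative transfer-time comparison of the two links mentioned above.
payload_gb = 10.0      # ASSUMED payload size, e.g. a large gradient shard
nvswitch_gbs = 600.0   # GB/s GPU-to-GPU over NVSwitch, as stated
hdr_gbs = 200 / 8      # 200 Gb/s InfiniBand HDR link = 25 GB/s

t_nvswitch = payload_gb / nvswitch_gbs
t_hdr = payload_gb / hdr_gbs
print(f"NVSwitch: {t_nvswitch * 1000:.1f} ms, HDR: {t_hdr * 1000:.0f} ms "
      f"({t_hdr / t_nvswitch:.0f}x slower)")
```

<p>The point of the comparison is simply that intra-node GPU-to-GPU traffic is roughly 24 times faster than going over even a very fast network link, which is why all-to-all NVSwitch connectivity matters for training.</p>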



<p>Ideal for deep learning training, this scale-up platform lets data centres create next-gen AI and maximise data scientists’ productivity with support for ten x16 expansion slots.</p>



<p>Customers can expect a significant performance boost across Supermicro’s extensive portfolio of 1U, 2U, 4U and 10U multi-GPU servers when they are equipped with the new NVIDIA A100 GPUs.&nbsp;&nbsp;</p>



<p>For maximum acceleration, Supermicro’s new A+ GPU system supports up to eight full-height double-wide (or single-wide) GPUs via direct-attach PCI-E 4.0 x16 CPU-to-GPU lanes without any PCI-E switch for the lowest latency and highest bandwidth.&nbsp;</p>



<p>The system also supports up to three additional high-performance PCI-E 4.0 expansion slots for a variety of uses, including high-performance networking connectivity up to 100G. An additional AIOM slot supports a Supermicro AIOM card or an OCP 3.0 mezzanine card.</p>



<p>With 1U, 2U, 4U, and 10U rackmount GPU systems; Ultra, BigTwin, and embedded systems supporting GPUs; as well as GPU blade modules for our 8U SuperBlade, Supermicro offers the industry’s widest and deepest selection of GPU systems to power applications from Edge to Cloud.</p>



<p>To deliver enhanced security and unprecedented performance at the edge, Supermicro plans to add the new NVIDIA EGX A100 configuration to its edge server portfolio.&nbsp;</p>



<p>The EGX A100 converged accelerator combines a Mellanox SmartNIC with GPUs powered by the new NVIDIA Ampere architecture, so enterprises can run AI at the edge more securely.</p>
<p>The post <a href="https://www.aiuniverse.xyz/supermicro-announces-integrated-a100-gpu-powered-systems/">Supermicro announces integrated A100 GPU-powered systems</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/supermicro-announces-integrated-a100-gpu-powered-systems/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>AI Chip Startup Cerebras Reveals &#8216;World&#8217;s Fastest AI Supercomputer&#8217;</title>
		<link>https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/</link>
					<comments>https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 28 Nov 2019 09:54:38 +0000</pubDate>
				<category><![CDATA[AI-ONE]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[AI software]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[computer chips]]></category>
		<category><![CDATA[supercomputer]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5451</guid>

					<description><![CDATA[<p>Source: crn.com Artificial intelligence chip startup Cerebras Systems claims it has the &#8220;world&#8217;s fastest AI supercomputer,&#8221; thanks to its large Wafer Scale Engine processor that comes with 400,000 compute cores. The Los Altos, Calif.-based startup introduced its CS-1 system at the Supercomputing conference in Denver last week after raising more than $200 million in funding from investors, <a class="read-more-link" href="https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/">AI Chip Startup Cerebras Reveals &#8216;World&#8217;s Fastest AI Supercomputer&#8217;</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: crn.com</p>



<p>Artificial intelligence chip startup Cerebras Systems claims it has the &#8220;world&#8217;s fastest AI supercomputer,&#8221; thanks to its large Wafer Scale Engine processor that comes with 400,000 compute cores.</p>



<p>The Los Altos, Calif.-based startup introduced its CS-1 system at the Supercomputing conference in Denver last week after raising more than $200 million in funding from investors, most recently with an $88 million Series D round that was raised in November 2018, according to Andrew Feldman, the founder and CEO of Cerebras who was previously an executive at AMD.</p>



<p>The CS-1 system is based on the startup&#8217;s Wafer Scale Engine chip, which it says is &#8220;the only trillion transistor wafer scale processor in existence,&#8221; measuring 56.7 times larger and containing 78 times more compute cores than the largest GPU.</p>



<p>With Wafer Scale Engine&#8217;s 400,000 compute cores and 18 GB of on-chip memory, the startup said the CS-1 can deliver compute performance with less space and power than any other system, representing one-third of a standard data center rack while replacing the need for hundreds of thousands of GPUs.</p>
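<p>A quick average over the figures above, 400,000 cores and 18 GB of on-chip memory. Whether the memory is actually partitioned evenly per core is an assumption of this sketch; it only shows what the average works out to:</p>

```python
# Average on-chip memory per core, from the figures reported above.
cores = 400_000
on_chip_bytes = 18e9   # 18 GB, treating GB as 10^9 bytes (an assumption)

avg_kb_per_core = on_chip_bytes / cores / 1e3
print(f"~{avg_kb_per_core:.0f} KB of on-chip memory per core on average")
```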



<p>&#8220;The CS-1 is the industry’s fastest AI computer, and because it is easy to install, quick to bring up and integrates with existing AI models in TensorFlow and PyTorch, it delivers value the day it is deployed,&#8221; Feldman said in a recent statement. &#8220;Depending on workload, the CS-1 delivers hundreds or thousands of times the performance of legacy alternatives at one-tenth the power draw and one-tenth the space per unit compute.&#8221;</p>



<p>Cerebras is targeting deep learning workloads, both for training and inference. The startup said the large size of its Wafer Scale Engine allows it to &#8220;process information more quickly&#8221; than other AI accelerators like GPUs, reducing training work from months to minutes. Inference, in the meantime, &#8220;is thousands of times faster,&#8221; with the Wafer Scale Engine capable of making a single image classification in microseconds, which is equal to one one-thousandth of a millisecond.</p>



<p>Among Cerebras&#8217; dozens of customers are the U.S. Department of Energy&#8217;s Argonne National Laboratory and Lawrence Livermore National Laboratory. Argonne, in particular, is using the CS-1 to accelerate neural networks for cancer studies, study the properties of black holes and treat traumatic brain injury.</p>



<p>The startup doesn&#8217;t have plans to sell its chips or systems through channel partners for now, according to Cerebras&#8217; spokesperson.</p>



<p>Marc Fertik, vice president of technology solutions at Elk Grove Village, Ill.-based Ace Computers, No. 261 on CRN&#8217;s 2019 Solution Provider 500 list, said with the CS-1 likely costing a &#8220;fortune,&#8221; it wouldn&#8217;t make sense for Cerebras to work with resellers until it starts selling less expensive systems at higher volumes.</p>



<p>&#8220;As soon as you move down in price point to increase your volume, you need to build your channel, because that&#8217;s when your business development and sales force can&#8217;t do it alone,&#8221; he said.</p>



<p>Even then, he said, prebuilt systems aren&#8217;t for everyone in the channel, citing Nvidia&#8217;s GPU-accelerated DGX deep learning system as an example. While some partners have built practices around the platform, others, like Ace Computers, would rather focus on building their own GPU-accelerated systems using parts from server vendors such as Supermicro, according to Fertik.</p>



<p>&#8220;We have military guys that don’t care about the extra software support, but they care about the hardware cost,&#8221; he said. &#8220;We have successfully convinced them and sold them multiple times something that is 30 percent cheaper.&#8221;</p>
<p>The post <a href="https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/">AI Chip Startup Cerebras Reveals &#8216;World&#8217;s Fastest AI Supercomputer&#8217;</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/ai-chip-startup-cerebras-reveals-worlds-fastest-ai-supercomputer/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>ARTIFICIAL INTELLIGENCE COULD HELP US SEE FARTHER INTO SPACE THAN EVER BEFORE</title>
		<link>https://www.aiuniverse.xyz/artificial-intelligence-could-help-us-see-farther-into-space-than-ever-before/</link>
					<comments>https://www.aiuniverse.xyz/artificial-intelligence-could-help-us-see-farther-into-space-than-ever-before/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 06 Sep 2017 09:18:34 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Human Intelligence]]></category>
		<category><![CDATA[gravitational lenses]]></category>
		<category><![CDATA[human experts]]></category>
		<category><![CDATA[supercomputer]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=978</guid>

					<description><![CDATA[<p>Source &#8211; digitaltrends.com Distortions in space-time sound like they’d be more of a concern on an episode of Star Trek than they would in the real world. However, that’s not necessarily true: analyzing images of gravitational lenses could help enormously extend both the range and resolution of telescopes like Hubble, and allow us to see farther into the universe <a class="read-more-link" href="https://www.aiuniverse.xyz/artificial-intelligence-could-help-us-see-farther-into-space-than-ever-before/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-could-help-us-see-farther-into-space-than-ever-before/">ARTIFICIAL INTELLIGENCE COULD HELP US SEE FARTHER INTO SPACE THAN EVER BEFORE</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; <strong>digitaltrends.com</strong></p>
<p>Distortions in space-time sound like they’d be more of a concern on an episode of <em>Star Trek</em> than they would in the real world. However, that’s not necessarily true: analyzing images of gravitational lenses could help enormously extend both the range and resolution of telescopes like Hubble, and allow us to see farther into the universe than has been possible before.</p>
<p>The good news? Applying an artificial intelligence neural network to this problem turns out to accelerate its solution well beyond previous methods — like 10 million times faster. That means that analysis which could take human experts weeks or even months to complete can now be carried out by neural nets in a fraction of a single second.</p>
<p>Developed by researchers at Stanford University and the SLAC National Accelerator Laboratory, the new neural network is able to analyze images of so-called “gravitational lensing.” This is an effect first hypothesized by Albert Einstein, who suggested that giant masses such as stars have the effect of curving light around them. This effect is similar to a telescope in that it allows us to examine distant objects with more clarity. However, unlike a telescope, gravitational lenses distort objects into smeared rings and arcs — so making sense of them requires the calculating abilities of a computer.</p>
<p>To train their network, researchers on the project showed it around half a million simulated images of gravitational lenses. After this was done, the neural net was able to spot new lenses and determine their properties — down to how their mass was distributed, and how great the magnification levels of the background galaxy were.</p>
<p>Given that projects like the Large Synoptic Survey Telescope (LSST), a 3.2-gigapixel camera currently under construction at SLAC, are expected to increase the number of known strong gravitational lenses from a few hundred to tens of thousands, this work comes at the perfect time.</p>
<p>“We won’t have enough people to analyze all these data in a timely manner with the traditional methods,” said postdoctoral fellow Laurence Perreault Levasseur, a co-author on the associated <em>Nature</em> research paper. “Neural networks will help us identify interesting objects and analyze them quickly. This will give us more time to ask the right questions about the universe.”</p>
<p>Impressively, the neural network doesn’t even need a supercomputer to run on: one of the tested neural nets was designed to work on an iPhone. Studying the universe in greater detail than ever? Turns out there’s an app for that!</p>
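<p>The train-on-simulations-then-classify workflow described above can be caricatured in a few lines. This toy sketch substitutes a tiny logistic classifier and synthetic 8×8 “ring” images for the deep convolutional networks and half a million realistic lens simulations the researchers actually used; every detail here is an illustrative stand-in:</p>

```python
import numpy as np

# Toy stand-in for the workflow above: simulate "lens" vs "non-lens" images,
# then fit a classifier. The real work used deep CNNs on ~500,000 realistic
# simulated lens images; this is only a caricature of the idea.
rng = np.random.default_rng(0)

def simulate(n, lens):
    """Toy 8x8 images: 'lenses' get a faint ring of extra flux over noise."""
    imgs = rng.normal(0.0, 0.2, size=(n, 8, 8))
    if lens:
        yy, xx = np.mgrid[0:8, 0:8]
        ring = np.abs(np.hypot(yy - 3.5, xx - 3.5) - 2.5) < 0.8
        imgs[:, ring] += 1.0
    return imgs.reshape(n, -1)

X = np.vstack([simulate(200, True), simulate(200, False)])
y = np.array([1.0] * 200 + [0.0] * 200)

# Logistic regression by gradient descent (a stand-in for the CNN).
w = np.zeros(64)
b = 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted lens probability
    grad = p - y                             # gradient of the log-loss
    w -= 0.01 * X.T @ grad / len(y)
    b -= 0.01 * grad.mean()

pred = 1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5
acc = (pred == (y > 0.5)).mean()
print(f"training accuracy on toy data: {acc:.2f}")
```

<p>Because the synthetic ring signal is strong relative to the noise, even this tiny model separates the two classes; the real difficulty, and the reason the researchers needed deep networks, is that genuine lens images are far subtler and the network must also regress lens properties, not just detect them.</p>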
<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-could-help-us-see-farther-into-space-than-ever-before/">ARTIFICIAL INTELLIGENCE COULD HELP US SEE FARTHER INTO SPACE THAN EVER BEFORE</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/artificial-intelligence-could-help-us-see-farther-into-space-than-ever-before/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		
			</item>
		<item>
		<title>5 artificial intelligence tools defining the future of P&#038;C insurance</title>
		<link>https://www.aiuniverse.xyz/5-artificial-intelligence-tools-defining-the-future-of-pc-insurance/</link>
					<comments>https://www.aiuniverse.xyz/5-artificial-intelligence-tools-defining-the-future-of-pc-insurance/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 26 Aug 2017 07:33:07 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[AI tools]]></category>
		<category><![CDATA[data analytics]]></category>
		<category><![CDATA[digital technology]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[supercomputer]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=777</guid>

					<description><![CDATA[<p>Source &#8211; propertycasualty360.com While it&#8217;ll be awhile before we all have an IBM Watson Supercomputer on our desks, there are a number of artificial intelligence business tools that property and casualty insurers and insurance professionals can use right now to run smarter, faster — and ahead of the competition. &#8220;The emergence of AI coupled with big data and massive <a class="read-more-link" href="https://www.aiuniverse.xyz/5-artificial-intelligence-tools-defining-the-future-of-pc-insurance/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/5-artificial-intelligence-tools-defining-the-future-of-pc-insurance/">5 artificial intelligence tools defining the future of P&#038;C insurance</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; <strong>propertycasualty360.com</strong></p>
<p>While it&#8217;ll be awhile before we all have an IBM Watson Supercomputer on our desks, there are a number of artificial intelligence business tools that property and casualty insurers and insurance professionals can use right now to run smarter, faster — and ahead of the competition.</p>
<p>&#8220;The emergence of AI coupled with big data and massive computing power has enabled us to take a different view of our processes,&#8221; says Justin Tomczak, CFA, media relations, State Farm. Consider, he added, the “thousands of photos of damage that we receive every day become an amazing set of data for AI to learn from and to serve our customers more effectively and efficiently.&#8221;</p>
<p>Jeremiah Bentley, vice president, marketing and customer engagement, Texas Mutual agrees that AI has the power to transform insurance. &#8220;We think the greatest potential in the use of artificial intelligence tools in the industry is the opportunity to improve the customer experience, and by extension, the overall impression that the consumer has about the insurance industry,&#8221; he says.</p>
<p>Great customer engagement, adds Kevin Kelley, divisional senior vice president, Great American Insurance Group, is what separates today&#8217;s disruptors from the disrupted in insurance.</p>
<p>&#8220;The innovators behind disruptors in the insurance industry are using digital technology, data analytics, artificial intelligence and machine learning to produce a better, faster, and more efficient customer experience,&#8221; Kelley says.</p>
<p>Essentially, these next-generation AI wonders tap into the technology&#8217;s ability to do a lot of the thinking and strategizing for insurers and agents.</p>
<p>Keep reading for a sampling of what the future of insurance business software may look like for insurers.</p>
<h2>No. 5: AI app makers</h2>
<p>Insurance professionals who want to start dabbling in artificial intelligence right now — for free — might look to such open-source software products as Datumbox. Targeted to businesses with one or more programmers on staff — or an extremely brave PC power-user — Datumbox is an AI platform that enables users to design and build their own AI apps from scratch.</p>
<p>Some of the specific tools you can create with Datumbox include:</p>
<p><strong>AI Sentiment Analyzers:</strong> These tools enable you to unleash an app on the Web, social media and similar digital locations that will see what people are saying about your company and/or products and services — and also determine if the sentiments behind those posts are positive, negative or neutral.</p>
<p><strong>AI Text Readability Analysis:</strong> This tool can be used to ensure the marketing copy for your insurance business is extremely accessible — or conversely, appeals to a more discriminating audience.</p>
<p><strong>AI Gender Analysis:</strong> Whether it&#8217;s soaring praise or withering criticism, this tool will enable you to determine who&#8217;s behind posts about your company — a man or a woman.</p>
<h2>No. 4: AI dashboard maker</h2>
<p>Qlik enables your insurance business to develop AI dashboards that can monitor dozens, hundreds — or even thousands — of web sites and/or web properties across cyberspace, and then bring back all that data for instant analysis.</p>
<p>With Qlik, you&#8217;ll be able to compare and contrast the performance of all your web sites in terms of clicks, visits, purchases, successful calls-to-action, and more. Plus, the software promises to bring back associations and insights you may not have thought to consider. Similar products include Metric Insights and Tableau.</p>
<h2>No. 3: AI self-designing websites</h2>
<p>Grim fact: Not all of us are da Vincis in the making.</p>
<p>Fortunately, with Grid — an online service that will auto-design a web site for your insurance business — that doesn&#8217;t matter anymore.</p>
<p>This tool lets you simply upload the content you want on your web site — text, images and video — and the service does the rest, placing everything just where it&#8217;s supposed to go. Once all your components are in place, you can also tweak the resulting design. You can get an in-depth look at how Grid works with its introductory video (56 minutes) on YouTube. Wix offers a similar online service.</p>
<h2>No. 2: AI call center matchmaker</h2>
<p>Any insurance business exec who has winced listening to a call center rep clashing with a customer will want to look into Affiniti.</p>
<p>Designed to find &#8216;birds-of-a-feather&#8217; personality matches between your call center reps and your customers, Affiniti processes more than one billion calculations a second in its never-ending quest to sniff out the personality of anyone who happens to be calling your business.</p>
<p>Essentially, the AI software works by retrieving, storing and analyzing psychographic and demographic data on customers across the U.S., which it sources from the world&#8217;s identity data brokers, including Allant, Axciom, Experian, Facebook, LinkedIn and Targus.</p>
<p>Specific data Affiniti is incessantly gobbling up includes income level, credit card usage, profession, gender, telecommunication usage patterns, responsiveness to marketing, political persuasion and travel habits.</p>
<p>Most likely, it also knows if your toenails need trimming.</p>
<p>Meanwhile, Affiniti analyzes the other side of the equation — the personalities of the call center reps at your insurance business — by studying how your reps interact with customers over a 60-90 day period, and by crunching data from a 20-minute survey that you can administer to your call center reps when they&#8217;re first hired.</p>
<p>The result: In a perfect world, you get a match made in bits-and-bytes heaven that hopefully will result in a better customer service experience and perhaps heavier sales.</p>
<h2>No. 1: AI early warning lawsuit alerts</h2>
<p>When it comes to lawsuits, the only thing better than an attorney who strikes sheer terror in the opposition is one who can scope out potential lawsuits before they happen — and steer you clear of any trouble.</p>
<p>That&#8217;s the premise behind Intraspexion, ingenious lawsuit-prevention software developed by seasoned attorney Nick Brestoff.</p>
<p>Intraspexion works by relentlessly analyzing every single email your employees send or receive from the outside world, and then studying those emails for telltale signs of trouble ahead.</p>
<p>As soon as it finds an email it believes could be the start of an impending lawsuit, it instantly alerts your attorney or in-house counsel, requesting human intervention.</p>
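<p>Intraspexion&#8217;s actual model and training data are proprietary; the general idea it describes — scoring each email against language learned from past lawsuit documents and flagging the suspicious ones — can be sketched in a few lines. Everything below, including the tiny &#8220;training&#8221; corpora, is invented for illustration.</p>

```python
# Hypothetical sketch of lawsuit-signal email screening -- NOT Intraspexion's
# real model. Score each email by how much its vocabulary overlaps the
# "risky" corpus versus the "normal" corpus.
import re
from collections import Counter

# Toy training corpora: invented examples, not real case data.
risky_docs = ["they fired me because of my age", "hostile work environment complaint"]
normal_docs = ["quarterly sales report attached", "meeting moved to tuesday"]

def term_counts(docs):
    """Word frequencies across a list of documents."""
    return Counter(w for d in docs for w in re.findall(r"[a-z']+", d.lower()))

risky, normal = term_counts(risky_docs), term_counts(normal_docs)

def risk_score(email):
    """Crude log-odds-style score: positive means closer to the risky corpus."""
    words = re.findall(r"[a-z']+", email.lower())
    return sum(risky[w] - normal[w] for w in words)

alert = risk_score("another hostile environment complaint from the team") > 0
print("flag for counsel" if alert else "no action")
```

<p>A production system would use a trained deep network rather than raw word counts, but the pipeline shape — learn from labeled past documents, score incoming mail, alert a human on high scores — matches the description above.</p>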
<p>According to Brestoff, Intraspexion&#8217;s accuracy has been verified at 99% by a third-party source.</p>
<p>Interestingly, Intraspexion is built on Google TensorFlow — free, open-source deep learning software developed by researchers and engineers on the Google Brain Team.</p>
<p>&#8220;TensorFlow is quickly becoming a viable option for companies interested in deploying deep learning,&#8221; says Rajat Monga, engineering leader, TensorFlow at Google.</p>
<p>Currently, Brestoff&#8217;s software — which is being pilot-tested by a New York Stock Exchange level company — is only programmed to analyze employee emails for potential employee discrimination suits, simply because those suits are among the most common.</p>
<p>But Brestoff says he can easily rework his code for insurers to do the same kind of monitoring for breach-of-contract suits, fraud suits and more than 150 other categories of lawsuits that businesses must dodge every day.</p>
<h2>Other AI tools on the horizon</h2>
<p>&#8220;There are a variety of potential use cases that span improved customer service; risk, price, fraud, demand modeling; improved underwriting practices; identifying behavioral insights and driving optimized segmentation,&#8221; says Tim Cunningham, CIO, Grange Insurance.  &#8220;The capabilities exist today and the barriers to experiment and learn continue to diminish.&#8221;</p>
<p>John Tramonti, AVP, product implementation, MetLife Auto &amp; Home, agrees:  &#8220;Artificial intelligence — specifically cognitive technologies — has the potential to redefine both how we interact with consumers and support our workforce.&#8221;</p>
<p>Concludes Bill Bloom, executive vice president, Operations, Technology &amp; Data, The Hartford on the coming Age of AI:  &#8220;In the near-term, robotics and natural language processing offer the greatest opportunities within service operations and claims, but it’s clear that the value will soon be felt in areas such as underwriting, actuarial and finance.</p>
<p>&#8220;The vendor landscape for these tools is broad and evolving rapidly. Therefore, we’ve architected our solutions to allow us to swap software providers over time as the market changes.&#8221;</p>
<p>Adds Luyang Fu, vice president, predictive analytics, The Cincinnati Insurance Company: &#8220;My impression is that property casualty insurers are working to integrate these AI tools into smaller projects first, and building up to full integration. Artificial intelligence tools will be an essential part of many operations, but it will take time.&#8221;</p>
<p>The post <a href="https://www.aiuniverse.xyz/5-artificial-intelligence-tools-defining-the-future-of-pc-insurance/">5 artificial intelligence tools defining the future of P&#038;C insurance</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/5-artificial-intelligence-tools-defining-the-future-of-pc-insurance/feed/</wfw:commentRss>
			<slash:comments>22</slash:comments>
		
		
			</item>
		<item>
		<title>How artificial intelligence can help the hunt for new materials</title>
		<link>https://www.aiuniverse.xyz/how-artificial-intelligence-can-help-the-hunt-for-new-materials/</link>
					<comments>https://www.aiuniverse.xyz/how-artificial-intelligence-can-help-the-hunt-for-new-materials/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 12 Aug 2017 05:47:05 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[cloud]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[machine learning techniques]]></category>
		<category><![CDATA[medical devices]]></category>
		<category><![CDATA[new materials]]></category>
		<category><![CDATA[supercomputer]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=601</guid>

					<description><![CDATA[<p>Source &#8211; imeche.org The smart materials of the future are likely to be discovered not in the lab, but on a supercomputer. Materials science has exploded in recent years – offering tantalising potential solutions to engineering challenges ranging from higher-capacity batteries to safer medical devices. There are materials that work at the molecular level to fight bacteria, <a class="read-more-link" href="https://www.aiuniverse.xyz/how-artificial-intelligence-can-help-the-hunt-for-new-materials/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-artificial-intelligence-can-help-the-hunt-for-new-materials/">How artificial intelligence can help the hunt for new materials</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; <strong>imeche.org</strong></p>
<div class="long-form">
<p class="page-intro">The smart materials of the future are likely to be discovered not in the lab, but on a supercomputer.</p>
</div>
<div class="long-form">
<p>Materials science has exploded in recent years – offering tantalising potential solutions to engineering challenges ranging from higher-capacity batteries to safer medical devices. There are materials that work at the molecular level to fight bacteria, and others that can change shape when introduced to an electric current. Some transform in the heat or cold, while others are soft and spongy when poked, but form a rigid barrier when hit at speed.</p>
<p>Throughout history, the hunt for new substances has been conducted by tinkerers and scientists in labs, or pioneering craftsmen in workshops. Most were stumbled across by luck and then tested to see if they would be useful. Graphene, for instance, was discovered by two researchers at Manchester University who were speculatively playing around with Scotch tape and graphite on a Friday afternoon.</p>
<p>But now, new materials are more likely to be discovered by a supercomputer. Researchers in Europe and the United States are using computer modelling, artificial intelligence and machine learning techniques to predict new materials from ones that are known to exist.</p>
<p>Some are purely hypothetical, but others are being synthesised and tested for potentially useful properties such as magnetism, conductivity or the amount of external force they can undergo without breaking.</p>
<p>Researchers at Basel University, for example, were recently able to predict 90 different forms of a crystal called elpasolite, which could be used as a semiconductor or insulator, or emit light when exposed to radiation.</p>
<h2>Global effort</h2>
<p>There are a number of large projects around the world, including Materials Cloud in Lausanne, and the Center for Material Genomics at Duke University in North Carolina. But the first was the Materials Genome Project at MIT, which was founded by Gerbrand Ceder in 2006.</p>
<p>He took inspiration from the Human Genome Project, an ambitious attempt to create a map of our DNA. “By itself, the human genome was not a recipe for new treatments,” he told <em>Nature</em> last year, “but it gave medicine amazing amounts of basic, quantitative information to start from.”</p>
<p>Now, the same thing is happening with new materials. By creating databases of the properties of various compounds, researchers can speed up the search for potentially useful combinations.</p>
<p>It’s catching on, with a host of start-ups launching in the space including Nutonian, QuesTek Innovations, and Alphastar. In 2011, the US government launched the Materials Genome Initiative, a $500m investment in the field. That helped create a publicly available database of all the new and predicted materials. According to a five-year progress report, the database now includes “more than 66,000 crystalline compounds, 500,000 nano-porous materials, 70,000 electrochemical phase diagrams, 43,000 electronic band structures, and 2,900 full elastic tensors (important for understanding mechanical behaviour)”.</p>
<p>Artificial intelligence isn’t just about increasing the speed of progress. With machine learning, scientists can identify things that would never be spotted in the normal course of research. “Machine learning does not depend on equations that are based on the laws of physics to find patterns and model the data,” explained Dayton Horvath, a research associate at Lux Research and lead author of the report <em>Materials and Informatics: The Next Research Revolution?</em> in an email to <em>Professional Engineering.</em></p>
<p>“Any data type, even if there is no fundamental physical equation that can describe the data (such as color, or chemical resistance), can be used to help discover new materials, and predict the properties of existing and new materials.”</p>
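<p>That property-prediction workflow — fit a model to descriptors of known materials, then score hypothetical candidates — can be illustrated with a deliberately simple linear fit. The descriptors and &#8220;band gap&#8221; values below are synthetic, invented for the sketch; real materials-informatics work uses curated databases and far richer models.</p>

```python
# Illustrative sketch only: fit a property model from composition descriptors
# with least squares, mimicking how ML screens candidate materials.
# The features and "band gap" values are synthetic, not a real dataset.
import numpy as np

rng = np.random.default_rng(1)

# Each row: hypothetical descriptors (e.g. mean electronegativity,
# atomic radius, valence electron count), rescaled to [0, 1].
X = rng.uniform(0, 1, (60, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(0, 0.05, 60)     # synthetic "band gap" in eV

# Add a bias column and solve the least-squares fit.
A = np.hstack([X, np.ones((60, 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

# Screen a new hypothetical compound by its descriptors (plus bias term).
candidate = np.array([0.8, 0.2, 0.9, 1.0])
pred = candidate @ w
print(f"predicted band gap: {pred:.2f} eV")
```

<p>The point Horvath makes holds even in this toy form: the model never sees a physical equation, only data, yet it can rank unseen candidates by predicted property — which is what lets researchers triage thousands of hypothetical compounds before synthesizing any.</p>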
<h2>Accelerating innovation</h2>
<p>Horvath&#8217;s report argues that artificial intelligence will accelerate the pace of innovation, with a knock-on effect on every industry that uses materials. It&#8217;s an opportunity for engineering companies, but it all relies on good data.</p>
<p>“[They need to] make institutional data accessible so that machine learning algorithms can properly leverage what is arguably an R&amp;D organization’s most valuable asset: decades of amassed data,” Horvath told <em>PE</em>.</p>
<p>There are also publicly available data sets that engineers can use to search for potential materials – either existing ones or predicted ones – that could meet the needs of a particular project they might be working on. “Engineers should be aware of the publicly available materials property, composition, and structure datasets that provide a good starting point for building initial training data sets and qualifying off-the-shelf machine learning algorithms for specific applications,” advises Horvath. He recommends Citrine Informatics, a start-up that provides tutorials on materials informatics, and access to their public database.</p>
<p>Companies are also getting involved in the application of machine learning to materials science. IBM are working with an unnamed company to develop an algorithm that can scan hundreds of thousands of scientific papers and patents for potentially useful discoveries – more than anyone would ever be able to read.</p>
<p>That&#8217;s been used to create a database of about 250,000 molecules that can be searched using artificial intelligence to identify ones that might be of interest to a particular researcher&#8217;s project. “You may say, ‘I want materials that are soluble,’ or ‘I want materials that can be exposed to light,’” explained Dario Gil, vice president of science and solutions at IBM Research, at the EmTech Digital conference in San Francisco in March.</p>
<p>The scientists still have some input in training the algorithm and setting the parameters of the kind of molecule or material characteristics that they’re looking for. Artificial intelligence isn’t replacing them, but it is speeding up the search for new materials, and could help smooth the way for all manner of engineering advances. “What we’re doing is greatly accelerating the rate of progress and the productivity of the scientists,” says Gil.</p>
</div>
<p>The post <a href="https://www.aiuniverse.xyz/how-artificial-intelligence-can-help-the-hunt-for-new-materials/">How artificial intelligence can help the hunt for new materials</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-artificial-intelligence-can-help-the-hunt-for-new-materials/feed/</wfw:commentRss>
			<slash:comments>6</slash:comments>
		
		
			</item>
	</channel>
</rss>
