<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Hardware Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/hardware/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/hardware/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Sat, 05 Jun 2021 05:31:11 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.1</generator>
	<item>
		<title>IBM USES ARTIFICIAL INTELLIGENCE IN HARDWARE: AN UPDATE</title>
		<link>https://www.aiuniverse.xyz/ibm-uses-artificial-intelligence-in-hardware-an-update/</link>
					<comments>https://www.aiuniverse.xyz/ibm-uses-artificial-intelligence-in-hardware-an-update/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 05 Jun 2021 05:31:09 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[IBM]]></category>
		<category><![CDATA[Update]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14040</guid>

					<description><![CDATA[<p>Source &#8211; https://www.analyticsinsight.net/ IBM Deploys AI to Induce Positive Disruptions The colossal field of artificial intelligence has witnessed exponential growth in recent years. With the advent of deep neural networks (DNNs) that rival human performance on a growing range of tasks, the modern world is thriving on disruptive technologies and avant-garde innovations. Riding this ebb and <a class="read-more-link" href="https://www.aiuniverse.xyz/ibm-uses-artificial-intelligence-in-hardware-an-update/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/ibm-uses-artificial-intelligence-in-hardware-an-update/">IBM USES ARTIFICIAL INTELLIGENCE IN HARDWARE: AN UPDATE</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.analyticsinsight.net/</p>



<h2 class="wp-block-heading"><strong>IBM Deploys AI to Induce Positive Disruptions</strong></h2>



<p>The colossal field of artificial intelligence has witnessed exponential growth in recent years. With the advent of deep neural networks (DNNs) that rival human performance on a growing range of tasks, the modern world is thriving on disruptive technologies and avant-garde innovations. Riding this ebb and flow of innovation and technological transformation, IBM is constantly experimenting with AI to explore terra incognita and add new dimensions to disruptive technology.</p>



<p>More recently, IBM has been striving to combine AI with hardware devices. Such a move will not only catalyze digital transformation across businesses and IT hubs but will also emerge as a disruptive technology in its own right.</p>



<h4 class="wp-block-heading"><strong>The Exceptional Convergence of AI and Hardware: Aims and Objectives</strong></h4>



<p>Pushing the limits of AI, IBM is on a mission to instill AI in hardware. The IBM Research AI Hardware Center is a hub where academics and industry leaders are experimenting to bring about the next wave of AI technologies. Its mission is to deliver a 2.5x annual improvement in hardware compute efficiency, a development deemed one of the key components of IBM’s “Fluid Intelligence”.</p>



<p>With this development, IBM is looking forward to a 16-fold improvement in processing efficiency. IBM is also delving into the world of analog computation, which can deliver high performance at low power.</p>



<p>Additionally, IBM has announced its third-generation digital AI core. IBM asserts that the new four-core design accelerates training performance and efficiency, and will significantly outpace the goal of 2.5x annual improvement.</p>



<p>Another notable AI-in-hardware development is analog computing. IBM has devoted itself to experimenting with analog computers for decades, and it recently developed a chip that uses phase-change memory (PCM) to encode neural networks directly on a memory device.</p>
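


<p>To make the idea concrete, here is a minimal sketch (an illustration of the general analog in-memory computing principle, not IBM’s actual design): with a layer’s weights stored as PCM conductances, a matrix-vector product happens physically in one step via Ohm’s and Kirchhoff’s laws, at the cost of some device noise.</p>



<pre class="wp-block-code"><code># Illustrative sketch, not IBM's implementation: model a PCM crossbar
# that stores a layer's weights as conductances, so a matrix-vector
# product is computed "in memory" in a single analog step.
import numpy as np

rng = np.random.default_rng(0)

weights = rng.normal(size=(4, 8))    # trained layer weights
conductances = weights.copy()        # encoded as device conductances

def analog_matvec(voltages, noise_std=0.02):
    """One crossbar read: output currents sum along each row (I = G @ V),
    with Gaussian noise standing in for PCM device variability."""
    ideal = conductances @ voltages
    return ideal + rng.normal(scale=noise_std, size=ideal.shape)

x = rng.normal(size=8)               # layer input, applied as voltages
print("digital:", weights @ x)
print("analog :", analog_matvec(x))  # approximate, but one physical step
</code></pre>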



<p>It is said that “time is money,” and businesses prize timeliness. That is a primary reason they adopt high-end technologies: to keep pace with the cut-throat race in the market.</p>
<p>The post <a href="https://www.aiuniverse.xyz/ibm-uses-artificial-intelligence-in-hardware-an-update/">IBM USES ARTIFICIAL INTELLIGENCE IN HARDWARE: AN UPDATE</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/ibm-uses-artificial-intelligence-in-hardware-an-update/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>New defense method enables telecoms, ISPs to protect consumer IoT devices</title>
		<link>https://www.aiuniverse.xyz/new-defense-method-enables-telecoms-isps-to-protect-consumer-iot-devices/</link>
					<comments>https://www.aiuniverse.xyz/new-defense-method-enables-telecoms-isps-to-protect-consumer-iot-devices/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 04 Aug 2020 12:44:34 +0000</pubDate>
				<category><![CDATA[Internet of things]]></category>
		<category><![CDATA[BGU]]></category>
		<category><![CDATA[cybersecurity]]></category>
		<category><![CDATA[data security]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Internet of Things]]></category>
		<category><![CDATA[Research]]></category>
		<category><![CDATA[telecommunications]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=10688</guid>

					<description><![CDATA[<p>Source: helpnetsecurity.com Instead of relying on customers to protect their vulnerable smart home devices from being used in cyberattacks, Ben-Gurion University of the Negev (BGU) and National University of Singapore (NUS) researchers have developed a new method that enables telecommunications and internet service providers to monitor these devices. According to their new study, the ability <a class="read-more-link" href="https://www.aiuniverse.xyz/new-defense-method-enables-telecoms-isps-to-protect-consumer-iot-devices/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/new-defense-method-enables-telecoms-isps-to-protect-consumer-iot-devices/">New defense method enables telecoms, ISPs to protect consumer IoT devices</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: helpnetsecurity.com</p>



<p>Instead of relying on customers to protect their vulnerable smart home devices from being used in cyberattacks, Ben-Gurion University of the Negev (BGU) and National University of Singapore (NUS) researchers have developed a new method that enables telecommunications and internet service providers to monitor these devices.</p>



<p>According to their new study, the ability to launch massive DDoS attacks via a botnet of compromised devices is an exponentially growing risk in the Internet of Things (IoT). Such attacks, possibly emerging from IoT devices in home networks, impact the attack target, as well as the infrastructure of telcos.</p>



<p>“Most home users don’t have the awareness, knowledge, or means to prevent or handle ongoing attacks,” says Yair Meidan, a Ph.D. candidate at BGU. “As a result, the burden falls on the telcos to handle. Our method addresses a challenging real-world problem that has already caused challenging attacks in Germany and Singapore, and poses a risk to telco infrastructure and their customers worldwide.”</p>



<p>Each connected device has a unique IP address. However, home networks typically use gateway routers with NAT functionality, which replaces the local source IP address of each outbound data packet with the household router’s public IP address. Consequently, detecting connected IoT devices from outside the home network is a challenging task.</p>



<p>The researchers developed a method to detect connected, vulnerable IoT models before they are compromised by monitoring the data traffic from each smart home device. This enables telcos to verify whether specific IoT models, known to be vulnerable to exploitation by malware for cyberattacks, are connected to the home network, helping them identify potential threats and take preventive action quickly.</p>
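


<p>A rough sketch of how such detection could look (the traffic features and device labels below are invented for illustration; the BGU/NUS pipeline itself is not detailed here): a supervised classifier maps per-flow statistics observed upstream of the NAT to a device model, and predicted models can then be checked against a known-vulnerable list.</p>



<pre class="wp-block-code"><code># Minimal sketch with invented features and labels, not the BGU/NUS
# pipeline: classify the IoT device model behind a NAT from per-flow
# traffic statistics, then flag models on a known-vulnerable list.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical per-flow features: packet rate, mean size, remote port, TLS flag
X = rng.random((600, 4))
y = rng.integers(0, 3, size=600)     # stand-in labels for 3 device models

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
clf = RandomForestClassifier(n_estimators=100, random_state=1)
clf.fit(X_train, y_train)

# A telco could act on households where the predicted model is known to be
# vulnerable, before the device is conscripted into a botnet.
print("held-out accuracy:", clf.score(X_test, y_test))
</code></pre>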



<p>By using the proposed method, a telco can detect vulnerable IoT devices connected behind a NAT, and use this information to take action. In the case of a potential DDoS attack, this method would enable the telco to take steps to spare the company and its customers harm in advance, such as offloading the large volume of traffic generated by an abundance of infected domestic IoT devices. In turn, this could prevent the combined traffic surge from hitting the telco’s infrastructure, reduce the likelihood of service disruption, and ensure continued service availability.</p>



<p>“Unlike some past studies that evaluated their methods using partial, questionable, or completely unlabeled datasets, or just one type of device, our data is versatile and explicitly labeled with the device model,” Meidan says. “We are sharing our experimental data with the scientific community as a novel benchmark to promote future reproducible research in this domain.” This dataset is publicly available.</p>



<p>This research is a first step toward dramatically mitigating the risk posed to telcos’ infrastructure by domestic NAT IoT devices. In the future, the researchers seek to further validate the scalability of the method, using additional IoT devices that represent an even broader range of IoT models, types and manufacturers.</p>



<p>“Although our method is designed to detect vulnerable IoT devices before they are exploited, we plan to evaluate the resilience of our method to adversarial attacks in future research,” Meidan says. “Similarly, a spoofing attack, in which an infected device performs many dummy requests to IP addresses and ports that are different from the default ones, could result in missed detection.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/new-defense-method-enables-telecoms-isps-to-protect-consumer-iot-devices/">New defense method enables telecoms, ISPs to protect consumer IoT devices</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/new-defense-method-enables-telecoms-isps-to-protect-consumer-iot-devices/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Robotics in research – freeing up time for socially-distanced innovation?</title>
		<link>https://www.aiuniverse.xyz/robotics-in-research-freeing-up-time-for-socially-distanced-innovation/</link>
					<comments>https://www.aiuniverse.xyz/robotics-in-research-freeing-up-time-for-socially-distanced-innovation/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 08 Jul 2020 06:39:20 +0000</pubDate>
				<category><![CDATA[Robotics]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Automated]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[intelligent machine]]></category>
		<category><![CDATA[Research]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=10052</guid>

					<description><![CDATA[<p>Source: techhq.com The disruption caused by the coronavirus has highlighted, time and again, how our pre-pandemic obsession with ‘going automated’ across industries held water. But away from production lines and other blue-collar applications, the University of Liverpool (UoL), UK has shown how robots can carry huge benefits by working non-stop within the bounds of academia <a class="read-more-link" href="https://www.aiuniverse.xyz/robotics-in-research-freeing-up-time-for-socially-distanced-innovation/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/robotics-in-research-freeing-up-time-for-socially-distanced-innovation/">Robotics in research – freeing up time for socially-distanced innovation?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: techhq.com</p>



<p>The disruption caused by the coronavirus has highlighted, time and again, how our pre-pandemic obsession with ‘going automated’ across industries held water. But away from production lines and other blue-collar applications, the University of Liverpool (UoL) in the UK has shown how robots can deliver huge benefits by working non-stop within the bounds of academia too.</p>



<p>Branded an “intelligent machine with social distancing skills,” the £100,000 (US$126,000) robotic researcher – essentially a remotely-operated ‘robo-chemist’ – is equipped with a level of artificial intelligence (AI) that goes beyond typical robotic machinery, allowing it to utilize previous results and, ultimately, ‘decide what to do next’.</p>



<p>The robotic arm has been lifting test tubes and siphoning serums day and night in the hunt for solar cell reaction catalysts, but researchers believe the robot – or at least the technology underpinning it – could also be repurposed for the fight against Covid-19 in vaccine development.</p>



<p>TechHQ previously highlighted the power of AI in making scientific in-roads faster, cheaper, and more effective by rapidly crunching through vast banks of data that would take human researchers months, if not years. This sentiment is echoed by the robot’s developer, Benjamin Burger: “[The robot] frees my time to focus on innovation and new solutions, rather than doing the same action.”</p>



<p>And that neatly sums up the core concept of Robotic Process Automation (RPA): software-based scripts that are being employed across multiple business applications to expedite repetitive tasks and minimize human error. But when it comes to physical robots, we’re more used to seeing them on production lines, or in places such as mines, quarries, or offshore wind farms, where it can simply be too dangerous for humans to venture.</p>



<p>A new report by the Royal Society of Chemistry, however, emphasizes the importance of “urgently embracing” robotics, AI, and advanced computing as part of a post-Covid national research strategy. Like every other workplace, laboratories have seen their scientists rendered ‘remote’ by enforced lockdowns and safety concerns, but robo-scientists can continue to carry out ground-breaking research in situ.</p>



<p>This isn’t a replacement strategy. Instead, robotic systems like UoL’s could do the heavy lifting in laboratories – with oversight from a potentially global team – while researchers shift their focus to other parts of the project or to more valuable tasks.</p>



<p>Speaking to the BBC, Deirdre Black, head of research and innovation at the Royal Society of Chemistry, said by leveraging robotics and automation, scientists can “explore bigger and more complex problems, like decarbonization, preventing and treating disease, and making our air cleaner.”</p>



<p>With every industry grappling with the implications of the pandemic for months, if not years, to come, we can expect robotics and automation technologies to see further rapid adoption not just in heavy industries, but also in the worlds of science and academia, which will be equally important to the recovery.</p>



<p>We have already seen examples of this shift with Pepper (Softbank Robotics’ programmable humanoid) lending a personable touch to Covid-19 patients in intensive care.</p>
<p>The post <a href="https://www.aiuniverse.xyz/robotics-in-research-freeing-up-time-for-socially-distanced-innovation/">Robotics in research – freeing up time for socially-distanced innovation?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/robotics-in-research-freeing-up-time-for-socially-distanced-innovation/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>3 Top Artificial Intelligence Stocks to Buy in July</title>
		<link>https://www.aiuniverse.xyz/3-top-artificial-intelligence-stocks-to-buy-in-july/</link>
					<comments>https://www.aiuniverse.xyz/3-top-artificial-intelligence-stocks-to-buy-in-july/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 06 Jul 2020 07:20:04 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Development]]></category>
		<category><![CDATA[Future]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[software]]></category>
		<category><![CDATA[Technologies]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=10024</guid>

					<description><![CDATA[<p>Source: fool.com Over a decade ago, a nebulous idea called &#8220;the cloud&#8221; started to gain momentum. Using the internet to deliver a service to a remotely located user was a novel concept, but today, it&#8217;s an essential piece of the economy.&#160; Artificial intelligence (AI) is likewise an important but oft-misunderstood technology. It&#8217;s still developing, but <a class="read-more-link" href="https://www.aiuniverse.xyz/3-top-artificial-intelligence-stocks-to-buy-in-july/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/3-top-artificial-intelligence-stocks-to-buy-in-july/">3 Top Artificial Intelligence Stocks to Buy in July</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: fool.com</p>



<p>Over a decade ago, a nebulous idea called &#8220;the cloud&#8221; started to gain momentum. Using the internet to deliver a service to a remotely located user was a novel concept, but today, it&#8217;s an essential piece of the economy.&nbsp;</p>



<p>Artificial intelligence (AI) is likewise an important but oft-misunderstood technology. It&#8217;s still developing, but it promises to create a new segment of the economy based on the automation of simple tasks and raw data crunching.&nbsp;</p>



<p>Researcher IDC estimates that some $37.5 billion was spent globally on AI systems in 2019. That&#8217;s not a particularly large sum, but IDC thinks that figure could roughly triple by 2023. Just as the cloud is now responsible for delivering all sorts of tools and services, AI systems are expected to have a wide range of uses in a very short period of time. &nbsp;</p>



<p>Last month, I talked up <strong>Alphabet</strong>, <strong>salesforce.com</strong>, and <strong>NVIDIA </strong>(NASDAQ:NVDA). For July, I&#8217;m revisiting NVIDIA and going with <strong>Micron Technology </strong>(NASDAQ:MU) and <strong>Appian </strong>(NASDAQ:APPN) as my AI picks of the moment.</p>



<h3 class="wp-block-heading">Before there was software, there was hardware</h3>



<p>As a final product, AI is software: an algorithm that dictates the function of a system or device. But hardware must be built to train, deploy, and operate that software. As AI is still in its infancy, the hardware used to support it is where I will gravitate for the time being.</p>



<p>When talking about AI hardware, it&#8217;s easy to default to NVIDIA. The folks at NVIDIA see a world where AI is ubiquitous, assisting us with tasks and making recommendations. That future is nigh. The company&#8217;s wares are already a largely unseen part of everyday life. Whether it&#8217;s an advanced driver assist feature in a new car, a recommendation for a movie or song, high-end graphics on video games, or the packages delivered to your home, there&#8217;s a good chance NVIDIA was involved. </p>



<p>The bear argument these days is that NVIDIA is too expensive. Looking back over the last year of results, it most certainly is. Shares trade for 20 times revenue and 54 times free cash flow (revenue less cash operating and capital expenses). Yikes. &nbsp;</p>



<p>But I&#8217;ll reiterate what I&#8217;ve said in previous articles on NVIDIA: The past is less important than the future for a high-growth company. Between its internal development and its recent acquisition of Mellanox (which I believe NVIDIA got for a song), revenue for the second quarter was forecast to be some 42% higher than a year ago. With a whole year left to lap its pre-Mellanox results and the current state of world affairs creating insatiable demand for new semiconductors and devices, double-digit percentage growth could continue for a while longer.  </p>



<p>Here&#8217;s my full disclosure: I continue to add shares of NVIDIA not because I think it&#8217;s a fair value now, but because I see at least a decade of rapid AI industry development ahead, with the company delivering some of the primary components necessary to make it all possible. That kind of time horizon may not gel for many, but if you think your money will still be invested in 10 years, I don&#8217;t see why this shouldn&#8217;t be a core set-it-and-forget-it holding in any portfolio.</p>



<h3 class="wp-block-heading">Remember one of intelligence&#8217;s key ingredients</h3>



<p>Memory is crucial to human and artificial intelligence. A machine&#8217;s ability to make predictions and perform automated work isn&#8217;t simply dictated by how quickly it can crunch information. It also needs stored data from which it can generate such predictions. That&#8217;s where memory semiconductors come in.</p>



<p>Digital memory chips are necessary for all sorts of electronic systems. Many types are highly commoditized and sensitive to changes in supply and demand. That can wreak havoc on pricing and lead to wild swings from sky-high profitability to heavy losses. Micron has historically been at the mercy of this cycle.</p>



<p>This memory chip leader will probably continue to be highly cyclical. However, the company changed its approach a few years ago. It invested in new chip technology and architecture to differentiate its portfolio from the rest of the pack. It&#8217;s also increasingly focused on higher-order computing needs and has walked away from deals that don&#8217;t meet its investment return criteria. Even during the lows of a year-plus semiconductor slump, Micron has thus remained profitable.</p>



<p>Surging orders amid the COVID-19 pandemic have pulled Micron out of its trough and pushed it back into growth mode. Revenue was up 14% in the last quarter. AI systems, data centers, and connected devices operating at the &#8220;network edge&#8221; have needed upgrades during the lockdown.</p>



<p>Lower sales of consumer-facing devices like smartphones and cars partially offset results. But upgrade cycles for new video game consoles, PCs, and advanced driver assistance systems are expected in the years ahead. Micron&#8217;s advanced memory chips play an integral role in these smart devices. With a new upcycle possibly beginning, I think the stock is a buy now. </p>



<h3 class="wp-block-heading">Training bots to handle soul-crushing work</h3>



<p>People worry that AI will compete with humans for jobs. It&#8217;s not an ungrounded concern. Tech&#8217;s increased productivity and cost-savings benefits are very real. Many workers may need to future-proof their careers &#8212; or change careers altogether &#8212; because of the disruptive nature of tech. But right or wrong, it&#8217;s happening. Humans have always had to compete with the technology they create.</p>



<p>AI and related technologies like low-code software development are proving useful to organizations trying to adjust to shelter-in-place orders and the &#8220;new normal&#8221; of the pandemic. Low-code isn&#8217;t AI, per se. It&#8217;s a visual toolkit that builds applications much faster than prior technologies.</p>



<p>There are a number of low-code providers out there, but Appian made an interesting recent move. Early in 2020, the company made its first-ever acquisition by buying robotic process automation (RPA) firm Novayre Systems. What is RPA? Think of it as a virtual robot that can be programmed to do tasks within software, like populating form fields. </p>



<p>Both low-code software and RPA can help companies resume operations. But won&#8217;t that steal jobs? In the short term, it might. But if it&#8217;s going to happen, investors might as well prepare, and I see owning Appian as one way to do so.</p>



<p>Granted, Appian expects its recurring software revenue to slow to a 25% to 26% year-over-year pace in the second quarter (down from 34% in 2019 and 46% in the first quarter of 2020) as many customers are putting new projects on temporary hold. Appian, a small company, still operates at a loss on top of that.</p>



<p>However, the company had no debt and $149 million in cash and equivalents at the end of March 2020. This doesn&#8217;t include its recently announced sale of 1.93 million new shares of its common stock, which would raise about $100 million in fresh cash at current share prices. That news has shares down over 15% from all-time highs. Appian trades for 13 times forward revenue expectations, so it isn&#8217;t particularly cheap. But I think this is an early AI and automation vendor worth taking seriously.</p>



<h3 class="wp-block-heading">10 stocks that could be the biggest winners of the stock market crash</h3>



<p>When investing geniuses David and Tom Gardner have a stock tip, it can pay to listen. After all, the newsletter they have run for over a decade,&nbsp;<em>Motley Fool Stock Advisor</em>, has tripled the market.*</p>



<p>David and Tom just revealed what they believe are the <strong>ten best stocks</strong> for investors to buy right now… and NVIDIA Corporation wasn&#8217;t one of them! That&#8217;s right &#8212; they think these 10 stocks are even better buys.</p>
<p>The post <a href="https://www.aiuniverse.xyz/3-top-artificial-intelligence-stocks-to-buy-in-july/">3 Top Artificial Intelligence Stocks to Buy in July</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/3-top-artificial-intelligence-stocks-to-buy-in-july/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How IT Supports the Data Science Operation</title>
		<link>https://www.aiuniverse.xyz/how-it-supports-the-data-science-operation/</link>
					<comments>https://www.aiuniverse.xyz/how-it-supports-the-data-science-operation/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 05 Nov 2019 09:40:02 +0000</pubDate>
				<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Big data]]></category>
		<category><![CDATA[cloud systems]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[IT]]></category>
		<category><![CDATA[software]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4998</guid>

					<description><![CDATA[<p>Source: informationweek.com The data science world in its purest state is populated by parallel processing servers that primarily run Hadoop and execute in batch mode, large troves of data that these processors operate on, and statistically and scientifically trained data scientists who know nothing about IT or about the requirements of maintaining an IT <a class="read-more-link" href="https://www.aiuniverse.xyz/how-it-supports-the-data-science-operation/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-it-supports-the-data-science-operation/">How IT Supports the Data Science Operation</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: informationweek.com</p>



<p>The data science world in its purest state is populated by parallel processing servers that primarily run Hadoop and execute in batch mode, large troves of data that these processors operate on, and statistically and scientifically trained data scientists who know nothing about IT or about the requirements of maintaining an IT operation.</p>



<p>While there are organizations that include data science specialties within IT, and therefore have IT management and support expertise nearby, an equal number of companies run their data science departments independently of IT. These departments often have little grasp of the IT disciplines needed to maintain and support the health of a big data ecosystem.</p>



<p>This is also why many organizations are discovering how critical it is to have data science and IT work hand in hand.</p>



<p>For CIOs and data center leaders, who by necessity should be heavily involved in an IT-data science partnership, what are the important bases that need to be covered to assure IT support of a data science operation?</p>



<p><strong>Hardware</strong></p>



<p>Two or three years ago, it was a basic rule of thumb that Hadoop, the most dominant big data/data science platform in companies, ran in batch mode. This made it easy for organizations to run big data applications on commodity computing hardware. Now, with the move to more real-time processing of big data, commodity hardware is giving way to in-memory processing, SSD storage and the Apache Spark cluster computing framework. This requires robust processing that can’t necessarily be performed by commodity servers. It also requires IT know-how for configuring hardware components for optimal processing. Accustomed to a fixed-record, transactional computing environment, not all IT departments have the resident skills for working with or fine-tuning in-memory parallel processing. This is a technical area that IT may need to cross-train or recruit for.</p>



<p><strong>Software</strong></p>



<p>In the Hadoop world, MapReduce is the dominant programming model for processing and generating big data sets with a parallel, distributed algorithm on a cluster. Apache Spark processes in-memory, enabling real-time big data processing. Organizations are moving to more real-time processing, but they also understand the value that Hadoop delivers in a batch environment. From a software standpoint, IT must be able to support both platforms.</p>
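


<p>As a minimal illustration of the two programming models (a sketch assuming a local PySpark installation, not a production cluster), the same word count can be expressed MapReduce-style, then cached in memory for the kind of repeated queries that real-time work demands:</p>



<pre class="wp-block-code"><code># Sketch of the two models contrasted above (assumes a local PySpark
# install: pip install pyspark). The same computation, written once in a
# MapReduce style and once with the data cached in memory for reuse.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("batch-vs-inmemory").getOrCreate()
sc = spark.sparkContext

lines = sc.parallelize(["big data", "big models", "big data pipelines"])

# MapReduce-style: map each word to a (key, 1) pair, then reduce by key.
counts = (lines.flatMap(lambda l: l.split())
               .map(lambda w: (w, 1))
               .reduceByKey(lambda a, b: a + b))
print(counts.collect())

# Spark's edge for iterative/real-time work: cache the dataset in memory,
# so repeated queries skip recomputation from the source.
words = lines.flatMap(lambda l: l.split()).cache()
print(words.count(), words.distinct().count())

spark.stop()
</code></pre>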



<p><strong>Infrastructure</strong></p>



<p>Most IT departments function with a hybrid computing infrastructure that consists of in-house systems and applications in the data center, coupled with private and public cloud systems. This has required IT to think outside of the data center, and to implement management policies, procedures and operations for systems, applications and data that may be in-house, in-cloud or both. Operationally, this has meant that IT must continue to manage its internal technology assets in-house, but also work with the cloud vendors to which technology asset management is outsourced, or work in the cloud themselves when assets are merely hosted and the enterprise continues to manage them.</p>



<p>Support for data science and big data in this more complicated infrastructure takes the IT technology management responsibility one step further, because the management goals for big data differ from those of traditional, fixed data.</p>



<p>Among the support issues for big data that IT must decide on are:</p>



<ul class="wp-block-list"><li>How much big data, which is voluminous and constantly building, should be archived, and which data should be discarded?</li><li>What are the storage and processing price points of cloud vendors, and at what point do cloud storage and processing become more expensive than their in-house equivalents?</li><li>What is the disaster recovery plan for big data and its applications, which are becoming mission critical for organizations?</li><li>Who is responsible for SLAs, especially in the cloud world, when a big data production problem occurs?</li><li>How is data shuttled safely and securely between the cloud and the data center?</li></ul>



<p><strong>Insights</strong></p>



<p>Data scientists have expertise in statistical analysis and algorithm development, but they don&#8217;t necessarily know how much or which data is available for them to operate on. This is an area where IT excels, because its organizational charter is to track all of the data in enterprise storage, as well as data that is incoming and outgoing.</p>



<p>If a marketing manager wants to develop customer analytics that take into account certain facts that are stored internally on customer records, and also in customers’ purchasing and service histories with the company &#8212; and the manager also wants to know what customers are interested in by tracking customer activity on websites and social media &#8212; IT is the most knowledgeable when it comes to determining all paths to achieving a total picture of customer information. And it’s the database group, working in tandem with other IT departments, that develops the JOINs of data sets that aggregate all of the data, so that the algorithms data scientists develop can operate on it and produce the truest results.</p>



<p>Without IT’s expertise in knowing where the data is and how to access and aggregate it, analytics and data science engineers would be challenged to arrive at accurate insights that can benefit the business.</p>



<p>IT support of the data science operation is a key pillar of corporate analytics success.</p>



<p>IT enables data scientists to do what they do best &#8212; design algorithms to mine the best information from data. At the same time, IT is engaged in its own best-of-class “wheelhouse” &#8212; knowing where to find the data and how to aggregate it.</p>
<p>Mary E. Shacklett is an internationally recognized technology commentator and President of Transworld Data, a marketing and technology services firm. Prior to founding her own company, she was Vice President of Product Research and Software Development for Summit Information</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-it-supports-the-data-science-operation/">How IT Supports the Data Science Operation</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-it-supports-the-data-science-operation/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Microsoft&#8217;s Azure Quantum employs Honeywell quantum hardware</title>
		<link>https://www.aiuniverse.xyz/microsofts-azure-quantum-employs-honeywell-quantum-hardware/</link>
					<comments>https://www.aiuniverse.xyz/microsofts-azure-quantum-employs-honeywell-quantum-hardware/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 05 Nov 2019 09:23:00 +0000</pubDate>
				<category><![CDATA[Microsoft Azure Machine Learning]]></category>
		<category><![CDATA[Azure]]></category>
		<category><![CDATA[Azure cloud]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Honeywell quantum]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Microsoft]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4995</guid>

					<description><![CDATA[<p>Source: zdnet.com Microsoft this morning announced a partnership with multiple companies for quantum computing capabilities running in its Azure cloud computing service during its &#8220;Ignite&#8221; developer conference, with one of the partners being industrial giant Honeywell.  Honeywell&#8217;s head of its quantum effort, Tony Uttley, spoke with ZDNet about the announcement, explaining how quantum computing is on an <a class="read-more-link" href="https://www.aiuniverse.xyz/microsofts-azure-quantum-employs-honeywell-quantum-hardware/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/microsofts-azure-quantum-employs-honeywell-quantum-hardware/">Microsoft&#8217;s Azure Quantum employs Honeywell quantum hardware</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: zdnet.com</p>



<p>Microsoft this morning announced a partnership with multiple companies for quantum computing capabilities running in its Azure cloud computing service during its &#8220;Ignite&#8221; developer conference, with one of the partners being industrial giant Honeywell. </p>



<p>Honeywell&#8217;s head of its quantum effort, Tony Uttley, spoke with ZDNet about the announcement, explaining how quantum computing is on an evolutionary path from assisting conventional computing to someday usurping it.&nbsp;</p>



<p>&#8220;For those who don&#8217;t even know Honeywell is in quantum, think of us now as a fully fledged participant and a real contender,&#8221; said Uttley.&nbsp;</p>



<p>The Microsoft offering is dubbed &#8220;Azure Quantum.&#8221;</p>



<p>When asked if quantum computing requires quantum &#8220;supremacy&#8221; to be demonstrated, as Google did two weeks ago in Nature magazine, Uttley responded that it&#8217;s a matter of evolving the science through multiple periods of increasing sophistication.</p>



<p>&#8220;We think of three eras,&#8221; says Uttley. &#8220;First is the era in which quantum computers act as co-processors&#8221; for classical computers, to help accelerate some of the work that the classical computers do. Then comes the era of the &#8220;classically impractical,&#8221; in which some computations that are feasible on a classical computer might nevertheless be better done on a quantum machine for the dramatic speed-up.</p>



<p>And last comes the era of the &#8220;classically impossible&#8221;: perhaps things, such as factoring large numbers into primes, that are simply infeasible on a classical computer.</p>



<p>&#8220;We are on the verge of classically impractical,&#8221; is how Uttley described the present situation.</p>



<p>Perhaps more important at the moment than fancy algorithm demonstrations such as Google&#8217;s is the basic fidelity of any quantum device. Although Microsoft has long been developing its own quantum circuitry, the partnership with Honeywell offers the company access to Honeywell&#8217;s hardware made up of &#8220;trapped ions.&#8221; An ion, of course, is an atom with a net positive or negative electrical charge. The trap in this case is a fabricated device, like a computer chip, that can be used to manipulate those ions, similar to moving electrons through gates made of silicon transistors.</p>



<p>Trapped ions are an area of scientific exploration going back at least 24 years. Ions are perceived as having some desirable properties versus other quantum approaches, including the relative stability of the qubits created with them, thanks to those qubits&#8217; relatively long &#8220;coherence times&#8221; (the period during which the all-important quantum entanglement can be maintained).</p>
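


<p>A back-of-the-envelope sketch (with purely illustrative numbers, not Honeywell specifications) shows why coherence time matters: the longer entanglement survives relative to gate duration, the more sequential operations fit inside a given fidelity budget.</p>



<pre class="wp-block-code"><code># Back-of-the-envelope sketch with illustrative numbers (not Honeywell
# specs): longer coherence times allow more gates before entanglement
# degrades, assuming a simple exp(-t/T2) decay and no other error sources.
import math

def ops_before_decay(t2_seconds, gate_seconds, min_fidelity=0.99):
    """Rough count of sequential gates before exp(-t/T2) decay drops
    below a target fidelity."""
    budget = -t2_seconds * math.log(min_fidelity)   # total time allowed
    return int(budget / gate_seconds)

# Ballpark figures for illustration only:
# trapped ions: long T2 (~1 s) but slow gates (~10 microseconds);
# superconducting qubits: short T2 (~100 us) but fast gates (~100 ns).
print("trapped ion    :", ops_before_decay(1.0, 10e-6))
print("superconducting:", ops_before_decay(100e-6, 100e-9))
</code></pre>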



<p>To Uttley, all this adds up to a more reliable quantum device than some other approaches, on average.&nbsp;</p>



<p>&#8220;There&#8217;s a lot of discussion of how many qubits do you have,&#8221; says Uttley. &#8220;But more rarely asked is: What can you do with those? Are they fully connectable?</p>



<p>&#8220;At least as important is the question of what is the fidelity&#8221; of the qubits, says Uttley, &#8220;how accurate are they.&#8221;</p>



<p>Although Honeywell has fabricated ion traps to build its computer, Uttley is short on details of the gate and circuit configurations and of the trap&#8217;s technical properties, promising to reveal more on the technical side at a future point. Some interesting technical material is provided in a just-published paper posted on the arXiv pre-print server, &#8220;Subspace benchmarking high-fidelity entangling operations with trapped ions.&#8221;</p>



<p>For the moment, says Uttley, the focus is going to be on working with customers in a beta release by the end of this year, with a more public unveiling in early 2020.</p>



<p>For Honeywell, as Uttley sees it, quantum is a natural extension of the company&#8217;s decades-long legacy in control systems. And quantum computing offers the potential to speed up machine learning algorithms for various industrial applications that could be of special interest to Honeywell customers, such as optimizing petrochemical processes and air traffic control, and any number of other things that can be defined first and foremost as control problems.</p>



<p>(More information on Honeywell&#8217;s Quantum Solutions unit is available on the company&#8217;s Web site.)</p>



<p>Microsoft states in today&#8217;s announcement that &#8220;we&#8217;ve been working together with a global quantum community to innovate across every layer of the quantum stack — from applications and software down to control and devices.&#8221; Microsoft is also using offerings from startups 1Qbit, IonQ and QCI.</p>



<p>Microsoft announced users of the system, such as Case Western Reserve University.</p>
<p>The post <a href="https://www.aiuniverse.xyz/microsofts-azure-quantum-employs-honeywell-quantum-hardware/">Microsoft&#8217;s Azure Quantum employs Honeywell quantum hardware</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/microsofts-azure-quantum-employs-honeywell-quantum-hardware/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>This prosthetic arm combines manual control with machine learning</title>
		<link>https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/</link>
					<comments>https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 14 Sep 2019 12:08:41 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[EPFL]]></category>
		<category><![CDATA[Gadgets]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Prosthetics]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[Science]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4483</guid>

					<description><![CDATA[<p>Source: techcrunch.com Prosthetic limbs are getting better every year, but the strength and precision they gain doesn’t always translate to easier or more effective use, as amputees have only a basic level of control over them. One promising avenue being investigated by Swiss researchers is having an AI take over where manual control leaves off. To visualize <a class="read-more-link" href="https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/">This prosthetic arm combines manual control with machine learning</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: techcrunch.com</p>



<p>Prosthetic limbs are getting better every year, but the strength and precision they gain doesn’t always translate to easier or more effective use, as amputees have only a basic level of control over them. One promising avenue being investigated by Swiss researchers is having an AI take over where manual control leaves off.</p>



<p>To visualize the problem, imagine a person with their arm amputated above the elbow controlling a smart prosthetic limb. With sensors placed on their remaining muscles and other signals, they may fairly easily be able to lift their arm and direct it to a position where they can grab an object on a table.</p>



<p>But what happens next? The many muscles and tendons that would have controlled the fingers are gone, and with them the ability to sense exactly how the user wants to flex or extend their artificial digits. If all the user can do is signal a generic “grip” or “release,” that loses a huge amount of what a hand is actually good for.</p>



<p>Here’s where researchers from École polytechnique fédérale de Lausanne (EPFL) take over. Being limited to telling the hand to grip or release isn’t a problem if the hand knows what to do next — sort of like how our natural hands “automatically” find the best grip for an object without our needing to think about it. Robotics researchers have been working on automatic detection of grip methods for a long time, and it’s a perfect match for this situation.</p>



<p>Prosthesis users train a machine learning model by having it observe their muscle signals while attempting various motions and grips as best they can without the actual hand to do it with. With that basic information the robotic hand knows what type of grasp it should be attempting, and by monitoring and maximizing the area of contact with the target object, the hand improvises the best grip for it in real time. It also provides drop resistance, being able to adjust its grip in less than half a second should it start to slip.</p>
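


<p>A conceptual sketch of that division of labor might look like the following (the features, labels, and training data below are invented stand-ins; EPFL’s actual signal processing is not described in the source): the user supplies a coarse grip intent, and a model trained on their muscle signals selects the grasp type.</p>



<pre class="wp-block-code"><code># Conceptual sketch of the "shared control" split, with invented features
# and labels (not EPFL's actual pipeline): the user signals a generic
# 'grip', the learned model decides which grasp type to shape.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Stand-in training data: windowed muscle-signal (EMG) features recorded
# while the user attempts each grasp; labels 0=pinch, 1=power, 2=tripod.
emg_features = rng.random((300, 8))
grasp_labels = rng.integers(0, 3, size=300)

model = LogisticRegression(max_iter=1000).fit(emg_features, grasp_labels)

def on_grip_signal(window):
    """When the user signals 'grip', pick the grasp type; low-level finger
    control (contact maximizing, slip recovery) stays with the hand."""
    return int(model.predict(window.reshape(1, -1))[0])

print("chosen grasp:", on_grip_signal(rng.random(8)))
</code></pre>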



<p>The result is that the object is grasped strongly but gently for as long as the user continues gripping it with, essentially, their will. When they’re done with the object, having taken a sip of coffee or moved a piece of fruit from a bowl to a plate, they “release” the object and the system senses this change in their muscles’ signals and does the same.</p>



<p>It’s reminiscent of another approach, by students in Microsoft’s Imagine Cup, in which the arm is equipped with a camera in the palm that gives it feedback on the object and how it ought to grip it.</p>



<p>It’s all still very experimental, and done with a third-party robotic arm and not particularly optimized software. But this “shared control” technique is promising and could very well be foundational to the next generation of smart prostheses. The team’s paper is published in the journal Nature Machine Intelligence.</p>
<p>The post <a href="https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/">This prosthetic arm combines manual control with machine learning</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/this-prosthetic-arm-combines-manual-control-with-machine-learning/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Is Your Data Center Ready for Machine Learning Hardware?</title>
		<link>https://www.aiuniverse.xyz/is-your-data-center-ready-for-machine-learning-hardware/</link>
					<comments>https://www.aiuniverse.xyz/is-your-data-center-ready-for-machine-learning-hardware/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 01 Feb 2019 09:52:11 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[data center]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Hardware]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[Nvidia]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=3301</guid>

					<description><![CDATA[<p>Source- datacenterknowledge.com So, you want to scale your computing muscle to train bigger deep learning models. Can your data center handle it? According to Nvidia, which sells more of the specialized chips used in machine learning than any other company, it most likely cannot. These systems often consume so much power, a conventional data center doesn’t <a class="read-more-link" href="https://www.aiuniverse.xyz/is-your-data-center-ready-for-machine-learning-hardware/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/is-your-data-center-ready-for-machine-learning-hardware/">Is Your Data Center Ready for Machine Learning Hardware?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source- <a href="https://www.datacenterknowledge.com/machine-learning/your-data-center-ready-machine-learning-hardware" target="_blank" rel="noopener">datacenterknowledge.com</a></p>
<p>So, you want to scale your computing muscle to train bigger deep learning models. Can your data center handle it?</p>
<p>According to Nvidia, which sells more of the specialized chips used in machine learning than any other company, it most likely cannot. These systems often consume so much power, a conventional data center doesn’t have the capacity to remove the amount of heat they generate.</p>
<p>It’s easy to see how customers without an infrastructure that can support a piece of Nvidia hardware are a business problem for Nvidia. To widen this bottleneck for at least one of its product lines, the company now has a list of pre-approved colocation providers it will send you to if you need a place that will keep your supercomputers cool and happy.</p>
<p>As more companies’ machine learning initiatives graduate from initial experimentation phases – during which their data scientists may have found cloud GPUs rented from the likes of Google or Microsoft sufficient – they start thinking about larger-scale models and investing in their own hardware their teams can share to train those models.</p>
<p>Among the go-to hardware choices for these purposes have been Nvidia’s DGX-1 and DGX-2 supercomputers, which the company designed specifically with machine learning in mind. When a customer considers buying several of these systems for their data scientists, they often find that their facilities cannot support that level of power density and look to outsource the facilities part.</p>
<p>“This program takes that challenge off their plate,” Tony Paikeday, who’s in charge of marketing for the DGX line at Nvidia, told Data Center Knowledge in an interview about the chipmaker’s new colocation referral program. “There’s definitely a lot of organizations that are starting to think about shared infrastructure” for machine learning. Deploying and managing this infrastructure falls to their IT leadership, he explained, and many of the IT leaders “are trying to proactively get ahead of their companies’ AI agendas.”</p>
<h2>Cool Homes for Hot AI Hardware</h2>
<p>DGX isn’t the only system companies use to train deep learning models. There are numerous choices out there, including servers by all the major hardware vendors, powered by Nvidia’s or AMD’s GPUs. But because they all pack lots of GPUs in a single box – an HPE Apollo server has eight GPUs, for example, as does DGX-1, while DGX-2 has 16 GPUs – high power density is a constant across this category of hardware. This means that <a href="https://www.datacenterknowledge.com/archives/2017/03/27/deep-learning-driving-up-data-center-power-density">along with the rise of machine learning comes growing demand for high-density data centers</a>.</p>
<p>The trend benefits specialist colocation providers like Colovore, Core Scientific, and ScaleMatrix, who designed their facilities for high density from the get-go. But other, more generalist data center providers are also capable of building areas within their facilities that can handle high density. Colovore, Core Scientific, and ScaleMatrix are on the list of colocation partners Nvidia will refer DGX customers to, but so are Aligned Energy, CyrusOne, Digital Realty Trust, EdgeConneX, Flexential, and Switch.</p>
<p>Partially owned by Digital Realty, Colovore built its facility in Santa Clara in 2014 <a href="https://www.datacenterknowledge.com/archives/2017/03/01/this-company-owns-the-high-density-data-center-niche-in-silicon-valley">specifically to take care of Silicon Valley’s high-density data center needs</a>. Today, it supports close to 1,000 DGX-1 and DGX-2 systems, Ben Coughlin, the company’s CFO and co-founder, told us. He wouldn’t say who owned the hardware, saying only that it belonged to fewer than 10 customers who were “mostly tech” companies. (Considering that the facility is only a five-minute drive from Nvidia headquarters, it’s likely that the chipmaker itself is responsible for a big portion of that DGX footprint, but we haven’t been able to confirm this.)</p>
<p>Colovore has already added one new customer because of Nvidia’s referral program. A Bay Area healthcare startup using artificial intelligence is “deploying a number of DGX-1 systems to get up and running,” Coughlin said.</p>
<p>A single DGX-1 draws 3kW in the space of three rack units, while a DGX-2 needs 10kW and takes up 10 rack units – that’s 1kW per rack unit regardless of the model. Customers usually put between nine and 11 DGX-1s in a single rack, or up to three DGX-2s, Coughlin said. Pumping chilled water to the rear-door heat exchangers mounted on the cabinets, Colovore’s passive cooling system (no fans on the doors) can cool up to 40kW, according to him.</p>
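<p>The quoted figures are easy to sanity-check; the short sketch below, using only the numbers cited above, confirms the roughly 1 kW-per-rack-unit density and shows typical rack loads landing under the cited 40 kW cooling ceiling:</p>
<pre><code># Sanity-checking the density arithmetic using only the figures quoted
# in the article: both DGX models work out to about 1 kW per rack unit.
DGX = {"DGX-1": {"kw": 3.0, "ru": 3}, "DGX-2": {"kw": 10.0, "ru": 10}}

for name, spec in DGX.items():
    print(name, "->", spec["kw"] / spec["ru"], "kW per rack unit")

# Typical packing per the article: 9-11 DGX-1s or up to 3 DGX-2s per rack.
for count, model in [(11, "DGX-1"), (3, "DGX-2")]:
    load = count * DGX[model]["kw"]
    print(f"{count} x {model}: {load:.0f} kW (rear-door limit cited: 40 kW)")
</code></pre>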
<p>In a “steady state,” many of the cabinets draw 12kW to 15kW, “but when they go into some sort of workload state, when they’re doing some processing, they’ll spike 25 to 30 kilowatts,” he said. “You can see swings on our UPSs of 400 to 500 kilowatts at that time across our infrastructure. It’s pretty wild.”</p>
<p>Echoing Nvidia’s Paikeday, Chris Orlando, CEO and co-founder of ScaleMatrix, said typical customers that turn to his company’s high-density colocation services in San Diego and Houston are well into their machine learning programs and looking at expanding and scaling the infrastructure that supports those programs.</p>
<p>A <a href="https://www.datacenterknowledge.com/archives/2017/02/06/this-data-center-is-designed-for-deep-learning">high-density specialist</a>, ScaleMatrix’s proprietary cooling design also brings chilled water directly to the IT cabinets. The company has “more than a handful of customers that have DGX boxes colocated today,” Orlando told us.</p>
<h2>High Density Air-Cooled</h2>
<p>Flexential, which is part of Nvidia’s referral program but doesn’t have high-density colocation as its sole focus, uses traditional raised-floor air cooling for high density, adding doors at the ends of the cold aisles to isolate them from the rest of the building and “create a bathtub of cold air for the server intakes,” Jason Carolan, the company’s chief innovation officer, explained in an email.</p>
<p>According to him, this approach works fine for a 35kW rack of DGX systems. “We have next-generation cooling technologies that will take us beyond air, but to date, we haven’t had a sizeable enough customer application that has required … it on a large scale,” he said. Five of Flexential’s 41 data centers can cool high-density cabinets today.</p>
<p>As more and more companies use machine learning, it is becoming an important workload for data center providers to be able to support. Adoption of these computing techniques is only in its early phases, and they are likely to become an important growth driver for colocation companies going forward. Not many enterprises are set up to host supercomputers on-premises, and few are going to spend the money to build this infrastructure, so turning to colocation facilities that are already designed to efficiently cool tens of kilowatts per rack is their logical next step.</p>
<p>The post <a href="https://www.aiuniverse.xyz/is-your-data-center-ready-for-machine-learning-hardware/">Is Your Data Center Ready for Machine Learning Hardware?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/is-your-data-center-ready-for-machine-learning-hardware/feed/</wfw:commentRss>
			<slash:comments>2</slash:comments>
		
		
			</item>
	</channel>
</rss>
