<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>analytics tools Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/analytics-tools/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/analytics-tools/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Thu, 26 Dec 2019 07:43:50 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>The future is DARQ but bright</title>
		<link>https://www.aiuniverse.xyz/the-future-is-darq-but-bright/</link>
					<comments>https://www.aiuniverse.xyz/the-future-is-darq-but-bright/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 26 Dec 2019 07:43:49 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[analytics tools]]></category>
		<category><![CDATA[applications]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[DARQ]]></category>
		<category><![CDATA[Future]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5817</guid>

					<description><![CDATA[<p>Source: cio.economictimes.indiatimes.com In the digital age, disruption is not an exception, but a norm. Everything already is, or is becoming digital. Data is everything and everywhere – <a class="read-more-link" href="https://www.aiuniverse.xyz/the-future-is-darq-but-bright/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/the-future-is-darq-but-bright/">The future is DARQ but bright</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: cio.economictimes.indiatimes.com</p>



<p>In the digital age, disruption is not an exception, but a norm. Everything already is, or is becoming digital. Data is everything and everywhere – how we communicate, learn, spend, work, and even elect governments.</p>



<p>The digital playing field will eventually even out as more businesses unlock newer benefits of digital business models and processes. As we enter the post-digital age, the game changer will be a new technology stack: DARQ.</p>



<p><strong>The DARQ Future<br></strong><br>While the SMAC (social, mobile, analytics and cloud) convergence was the foundation of the digital age, the post-digital age belongs to DARQ. The next wave of massive disruption comprises Distributed ledger technology (DLT), Artificial intelligence (AI), Extended reality (XR), and Quantum computing (QC).</p>



<p>Together, DARQ promises incredible opportunities to reimagine the human future. The way we play, interact and work could (and in all probability will) be massively different within the next decade. “Scotty, beam me up” could be a (virtual) reality very soon.</p>



<p><strong>DLT for trust<br></strong><br>DLT is the backbone of technologies like cryptocurrency and blockchain. Its utility lies in making third-party data validation redundant: because information is locked in an implicit chain of trust, tampering is evident and transactions cannot be repudiated, which gives DLT significant applications. In a hyper-connected world (think smart cities, self-driving cars, smart power grids and buildings), trust in the virtual world is key. Authentication, authorization, and non-repudiation take centre stage as people, devices, and machines interact with each other in a digital socio-economic tango.</p>
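<p>To make the &#8220;implicit chain of trust&#8221; concrete, here is a minimal, hypothetical sketch of hash-linking (not any production ledger; block fields and sample transactions are invented for illustration):</p>

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's canonical JSON form so the digest is deterministic.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, data):
    # Each new block stores the hash of its predecessor.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def is_valid(chain):
    # The ledger is intact only if every stored link still matches.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append_block(ledger, {"from": "alice", "to": "bob", "amount": 5})
append_block(ledger, {"from": "bob", "to": "carol", "amount": 2})
print(is_valid(ledger))            # True

ledger[0]["data"]["amount"] = 500  # tamper with history
print(is_valid(ledger))            # False
```

<p>Rewriting any earlier block changes its hash and breaks every later link, which is the property that makes after-the-fact validation by a trusted third party redundant.</p>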



<p>DLT offers greater agency in information verification and control, and in data privacy. On June 18, 2019, Facebook announced its foray into DLT with the Libra Blockchain, a cryptographically authenticated database designed to support its cryptocurrency, Libra coin. With it, you can buy or cash out online or at local exchange points like grocery stores, and spend it using interoperable third-party wallet apps or Facebook&#8217;s own wallet, to be built into WhatsApp, Messenger and its own app.</p>



<p>That’s one blip on the radar: today, DLT has embedded itself in sectors like healthcare, supply chain and logistics amongst others and is cementing trust into a variety of transactions.</p>



<p><strong>Intelligence gain<br></strong><br>Humanity has progressed far in its short time on the planet. Driven by a global network of goods and services, fuelled by Absolute Advantage (The Wealth of Nations, Adam Smith), we have specialized into our areas of strength. Strong advances in computing &amp; storage power have enabled machines to learn these “specialized but narrow” skills and start to merge with us in our daily lives.</p>



<p>Artificial intelligence (AI) and its constellation of technologies are front and centre in making technology-driven business and customer interactions more efficient, faster, and cheaper. Across sectors, AI is optimizing processes and improving decision-making: driving smart cars, diagnosing illnesses, participating in debates, writing realistic text, even running presidential campaigns.</p>



<p>AI and the Internet of Medical Things (IoMT) are already showing encouraging consumer health applications in real-time patient health monitoring, diagnostics, and emergency response systems. US care provider Symphony Post-Acute Network integrated machine learning (ML) with a cloud-based AI engine to predict and improve care for 80,000 patients. In retail, consumers could see ML use purchase data to auto-fill shopping lists, and perhaps even place orders based on purchase history and shelf-life.</p>



<p><strong>Multiple realities</strong><br>Faster and more intelligent computers, cheaper storage, and hyper-speed networks are enabling extended reality to move beyond aircraft simulators into everyday life. Be it gaming, tele-medicine, or virtual meetings, XR is becoming omnipresent and could disrupt the way we work and play. Would you still fly to Hawaii if XR replicated the trip down to the last sensory detail in your living room?</p>



<p>At a recent event, Microsoft scanned Julia White, CVP of Azure Marketing, and transformed her into an exact hologram replica. Julia and her virtual clone then appeared together on stage for the keynote, powered by Microsoft&#8217;s Azure AI technologies and neural text-to-speech; the future is rapidly becoming our reality. The challenge is to bring these capabilities to life in useful, relevant and progressive ways: to make life better, and to help us all achieve more. Clearly, extended reality (XR) promises to be a game-changer.</p>



<p>Together with AI, XR makes for serious business. Businesses have already identified potential applications of XR such as virtual training, &#8220;hands-on&#8221; education, and immersive online shopping. Almost 50% of XR use is currently in higher education: medical students rehearse surgeries in VR as part of their training, while several enterprises have made internal training components immersive with XR. Beyond learning and education, XR is progressively being used in design, architecture and engineering; automotive engineers, for instance, use VR suites to sculpt concept cars and bring down real-world prototype costs.</p>



<p>For XR to shape and augment the world as we know it, businesses must invest in network connectivity, application development, and most importantly, processing power.</p>



<p><strong>Quantum leap<br></strong><br>When it comes to processing, Quantum computing (QC) is a holy grail for solving the toughest computational problems. Google says its colossal D-Wave 2X quantum computing machine has been working through algorithms at 100,000,000 times the speed of a traditional computer chip.</p>



<p>Yet quantum computing is about more than processor power, and it is the furthest of the DARQ technologies from maturity. Nevertheless, it is accelerating as companies begin exploring its potential. Several leaders in QC have released platforms that enable connections to their quantum computers via the cloud. Last year, Aliyun, Chinese giant Alibaba&#8217;s cloud service subsidiary, and the Chinese Academy of Sciences launched a joint 11-qubit quantum computing service encouraging researchers to conduct quantum experiments.</p>



<p>QC is necessary to power next-generation technologies, yet it is the costliest piece of the DARQ puzzle. All that computing power is of little use if data back-up isn&#8217;t possible, and quantum bits (qubits) take up vast storage space physically. To solve this, scientists are now looking to the natural world: as a 3D storage system, DNA offers an extra dimension for storing vast quanta of data.</p>



<p>Quantum computers have the potential for pathbreaking applications in material design, logistics, AI and machine learning, encryption, manufacturing, finance and even energy, to name a few. Recently, Google announced &#8216;quantum supremacy&#8217;: Sycamore, Google&#8217;s quantum computer, completed in 3 minutes and 20 seconds a task that Summit, the world&#8217;s best supercomputer, would have taken 10,000 years to compute.</p>



<p>It will still be some time before QC turns into an ROI-focused technology market; getting there will involve industrial-academic collaboration and public-private partnerships to improve both digital and physical infrastructure.</p>



<p><strong>What&#8217;s bright with DARQ<br></strong><br>Enterprises must realise that only a strong digital foundation will help them pilot DARQ technologies to their advantage.</p>



<p>At the core of DARQ is innovation, and that is critical to its foundational role in building the future. Only when all four technologies are viable at scale will their impact grow significantly. Businesses must have the will to seize the opportunity and begin exploring possibilities and investments with a strategic focus, to leap ahead of the competition in a brand-new competitive landscape.</p>



<p><strong>The author is SVP &amp; Head of APAC, GlobalLogic</strong><strong>.</strong></p>



<p>DISCLAIMER: The views expressed are solely those of the author and ETCIO.com does not necessarily subscribe to them. ETCIO.com shall not be responsible for any damage caused to any person/organisation directly or indirectly.</p>
<p>The post <a href="https://www.aiuniverse.xyz/the-future-is-darq-but-bright/">The future is DARQ but bright</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/the-future-is-darq-but-bright/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>8 Top Big Data Analytics Tools</title>
		<link>https://www.aiuniverse.xyz/8-top-big-data-analytics-tools/</link>
					<comments>https://www.aiuniverse.xyz/8-top-big-data-analytics-tools/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 27 Apr 2019 05:11:33 +0000</pubDate>
				<category><![CDATA[Big Data]]></category>
		<category><![CDATA[Analytic Capabilities]]></category>
		<category><![CDATA[analytics tools]]></category>
		<category><![CDATA[Big data]]></category>
		<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[Datamation]]></category>
		<category><![CDATA[Integration]]></category>
		<category><![CDATA[Microsoft Power BI]]></category>
		<category><![CDATA[Oracle Analytics Cloud]]></category>
		<category><![CDATA[Splunk]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=3451</guid>

					<description><![CDATA[<p>Source:- datamation.com. By definition, Big Data is all about collecting large (or &#8220;Big&#8221;) volumes of structured and unstructured data. What makes Big Data useful is analysis of the collected information to <a class="read-more-link" href="https://www.aiuniverse.xyz/8-top-big-data-analytics-tools/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/8-top-big-data-analytics-tools/">8 Top Big Data Analytics Tools</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source:- datamation.com.</p>
<p>By definition, Big Data is all about collecting large (or &#8220;Big&#8221;) volumes of structured and unstructured data. What makes Big Data useful is analysis of the collected information to find patterns and meaning that would otherwise go undiscovered. Making sense of Big Data is the realm of Big Data analytics tools, which provide different capabilities for organizations to derive competitive value.</p>
<p>What should you look for when selecting Big Data Analytics tools for your business?</p>
<ul>
<li><strong>Analytic Capabilities.</strong> There are multiple types of analytics capabilities, with different models for various types of analysis, including: predictive mining, decision trees, time series, neural networks, path analysis, market basket analysis, and link analysis.</li>
</ul>
<ul>
<li><strong>Integration.</strong> Organizations often need additional statistical tools and programming languages (such as R) to conduct other forms of custom analysis.</li>
</ul>
<ul>
<li><strong>Data Import and Export.</strong> Getting data in and out of various tools is a critical feature, and understanding how difficult (or easy) it is to connect the analytics tool to the big data repository is a key consideration.</li>
</ul>
<ul>
<li><strong>Visualization</strong>. Seeing the numbers is one thing, but having the data displayed in a graphical format often makes it more usable.</li>
</ul>
<ul>
<li><strong>Scalability.</strong> Big Data can be big to start with, and generally has a tendency to grow even bigger over time. Organizations need to consider and understand the scalability options for the analytics tools they choose.</li>
</ul>
<ul>
<li><strong>Collaboration.</strong> Analysis can sometimes be a solitary exercise, but more often than not it involves collaboration.</li>
</ul>
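<p>As a concrete illustration of one capability above, market basket analysis reduces to computing support (how often an itemset appears) and confidence (how often a rule holds). A minimal sketch in Python; the baskets are invented toy data, not from any real tool:</p>

```python
# Toy transactions (invented for illustration).
baskets = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
]

def support(itemset):
    # Fraction of baskets containing every item in the set.
    return sum(itemset <= b for b in baskets) / len(baskets)

def confidence(antecedent, consequent):
    # Estimated P(consequent | antecedent) over the baskets.
    return support(antecedent | consequent) / support(antecedent)

print(support({"bread", "butter"}))                 # 0.5
print(round(confidence({"bread"}, {"butter"}), 2))  # 0.67
```

<p>Real analytics platforms run the same computation (typically via the Apriori or FP-Growth algorithms) over millions of transactions rather than four.</p>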
<p>In this <em>Datamation</em> guide, we look at 8 of the top Big Data Analytics Tools that cover multiple aspects of the market.</p>
<ul>
<li>Cloudera</li>
<li>Microsoft Power BI</li>
<li>Oracle Analytics Cloud</li>
<li>Pentaho Big Data Integration and Analytics</li>
<li>SAS Institute</li>
<li>Sisense</li>
<li>Splunk</li>
<li>Tableau</li>
</ul>
<h2>Cloudera</h2>
<p>When it comes to the core of Big Data, few if any companies are as closely tied to the core Hadoop open source platform as Cloudera. After all, the founders of Hadoop itself started the company. Cloudera recently gained an even bigger foothold in the Hadoop ecosystem with its merger with Hortonworks, previously its primary rival.</p>
<p>The key differentiator for Cloudera is the company&#8217;s deep understanding and core competence in Hadoop, which carries through its portfolio including the company&#8217;s Cloudera Enterprise platform. This is built on top of the open source CDH distribution.</p>
<p>Cloudera&#8217;s Big Data tools are a good fit for organizations that need a full stack that includes the core Hadoop technology for collecting and creating Big Data. With Cloudera Enterprise, organizations are able to create and process predictive analytics models, using a variety of integrated tools.</p>
<h2>Microsoft Power BI</h2>
<p>Microsoft&#8217;s Power BI has been a perennial favorite for analyst firms in the business intelligence space, based largely on the platform&#8217;s ease of use and accessibility.</p>
<p>In 2018, Microsoft expanded Power BI, extending the same ease of use to Big Data, enabling data ingest and transformation. The key differentiator for the platform is integration with the Azure Data Lake Storage Gen2 which supports HDFS (Hadoop Distributed File System) for advanced big data analytics.</p>
<p>Power BI is a good choice for organizations looking for an easy on-ramp into Big Data analytics, and a particularly obvious choice for those that have already standardized on a Microsoft stack. Power BI provides cloud-based business analytics and integrates what Microsoft calls &#8220;content packs&#8221;: pre-built dashboards and reports for different types of analysis and data monitoring. The collaboration capabilities in the platform enable users to share data and dashboards, while also providing alerting.</p>
<h2>Oracle Analytics Cloud</h2>
<p>Oracle hasn&#8217;t always been known as a Big Data analytics provider, but it&#8217;s a space the database giant has moved into aggressively in recent years. Self-service Big Data analytics on a consumption usage model is what the Oracle Analytics Cloud is all about.</p>
<p>Among the key differentiators of the Oracle Analytics Cloud that users comment on is the platform&#8217;s automation capabilities for different types of analytics and Big Data analysis use-cases. Organizations that are already used to using Oracle tools, including Oracle&#8217;s namesake database, will likely be the most attracted to the Analytics Cloud offering.</p>
<p>The ability to bring multiple data sources together is a core capability of the Oracle Analytics Cloud, backed by a strong infrastructure that includes the Oracle Event Hub Cloud service to ingest data and the Oracle Big Data Cloud Service to store it.</p>
<h2>Hitachi Vantara Pentaho</h2>
<p>Hitachi is not a name that many would associate with Big Data, but ever since the company acquired Pentaho in 2015, it has been a solid player in the space.</p>
<p>Pentaho&#8217;s roots are with its open source analytics platform upon which the more expansive Enterprise edition is built. It&#8217;s the open source nature of the platform that is a key differentiator and has led to a broad community of users that is also often seen as a key strength by users.</p>
<p>Pentaho is a good choice for organizations with lots of different types of data and big data sources. The ability to rapidly ingest and blend data from different sources is another key benefit that users gain from the Pentaho Big Data Integration and Analytics platform. Pentaho&#8217;s platform enables multiple models including predictive analytics to help organizations guide toward specific outcomes.</p>
<h2>SAS Visual Analytics</h2>
<p>SAS Institute has a long history in the analytics market that predates the use of Big Data as both a term and a technology by decades. The company has deep domain expertise in analytics which is manifest across a number of different offerings that can help with Big Data Analytics, among them is the Visual Analytics solution that runs on the broader SAS platform for analytics.</p>
<p>Visual Analytics is for users and organizations that are looking for deep analytics tools, with drag and drop functionality for building advanced visualizations. Extensibility of the platform for different types of business intelligence and data reporting needs is a key differentiator for the platform.</p>
<p>Collaboration is a core component as well with the ability to share information and comments across multiple options including mobile devices, web browsers and even Microsoft Office applications. SAS Visual Analytics can be deployed on-premises or as a service in the cloud.</p>
<h2>Sisense</h2>
<p>Getting Big Data repositories into a state where they can be rapidly used for analytics is a non-trivial challenge, one that Sisense aims to solve with its platform.</p>
<p>The promise of getting Big Data ready for analysis is an area of strength and a key differentiator for Sisense, whose Big Data preparation capabilities aim to make it easier for users to model data.</p>
<p>Sisense is a good choice for larger organizations looking for fast implementation and solid customer support. Users often find the data visualization via the system&#8217;s dashboards easy to use and a time saver for getting the required results. Accessing the dashboards and sharing data is another core strength of the platform, with mobile and web options as well as the ability to easily generate different types of reports.</p>
<p>Sisense is available both on-premises and as a cloud-based offering.</p>
<h2>Splunk</h2>
<p>Splunk started out as a log analysis platform and has found a loyal base of users and organizations that love the way the platform works and enables data manipulation and visualization. For organizations already using Splunk for log or other types of analysis, embracing Splunk Analytics for Hadoop is an easy step.</p>
<p>Splunk as a platform is known for its user-friendly web based log inspection and analytics capabilities, which can be extended to look at Big Data stores in Hadoop systems. The platform benefits from a proven collaboration component and enables users to create and share graphs and analytics dashboards.</p>
<p>Key differentiators for Splunk include the ability to integrate with other elements of the Splunk platform, including security controls and Splunk&#8217;s own Search Processing Language (SPL), which provides further strong benefits to users.</p>
<h2>Tableau</h2>
<p>The Tableau platform is a recognized leader in the analytics market and is a good option for non-data scientists working in enterprises, across any sector.</p>
<p>The VizQL data visualization technology at the core of Tableau is a key differentiator for the platform overall, creating data visualization without the need to first organize data. Connectivity to different types and backends of Big Data is also a core attribute of the Tableau platform.</p>
<p>A big benefit that users find from Tableau is the ability to reuse existing skills, in the Big Data context. Tableau makes use of a standardized SQL (Structured Query Language) to query and interface with Big Data systems, making it possible for organizations to make use of existing database and analyst skills sets to find the insights they are looking for, from a large data set. Tableau also integrates its own in-memory data engine called &#8220;Hyper&#8221; enabling fast data lookup and analysis.</p>
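<p>The skill-reuse point is easy to demonstrate: the same standard SQL an analyst already writes against a relational database carries over to a Big Data backend. A sketch using Python&#8217;s built-in sqlite3 as a stand-in data source (the table and figures are invented for illustration, not Tableau&#8217;s actual connector API):</p>

```python
import sqlite3

# An in-memory database stands in for a Big Data backend; the point is
# that plain, standardized SQL transfers unchanged.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("APAC", 120.0), ("EMEA", 80.0), ("APAC", 30.0)])

rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('APAC', 150.0), ('EMEA', 80.0)]
```

<p>The aggregation query above is exactly what an existing analyst skill set produces; a tool like Tableau generates comparable SQL behind its drag-and-drop interface and pushes it down to the data store.</p>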
<p>&nbsp;</p>
<p>The post <a href="https://www.aiuniverse.xyz/8-top-big-data-analytics-tools/">8 Top Big Data Analytics Tools</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/8-top-big-data-analytics-tools/feed/</wfw:commentRss>
			<slash:comments>5</slash:comments>
		
		
			</item>
		<item>
		<title>Analytics market to keep growing with digital transformation</title>
		<link>https://www.aiuniverse.xyz/analytics-market-to-keep-growing-with-digital-transformation/</link>
					<comments>https://www.aiuniverse.xyz/analytics-market-to-keep-growing-with-digital-transformation/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 21 Nov 2017 08:53:28 +0000</pubDate>
				<category><![CDATA[Human Intelligence]]></category>
		<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[analytics tools]]></category>
		<category><![CDATA[Digital Transformation]]></category>
		<category><![CDATA[Machine learning]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=1744</guid>

					<description><![CDATA[<p>Source &#8211; cio.in From enterprises to power stations, hospitals and public transportation, the volume of real-time data generated is unprecedented today. Data has become a crucial part of <a class="read-more-link" href="https://www.aiuniverse.xyz/analytics-market-to-keep-growing-with-digital-transformation/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/analytics-market-to-keep-growing-with-digital-transformation/">Analytics market to keep growing with digital transformation</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211;<strong> cio.in</strong></p>
<div>From enterprises to power stations, hospitals and public transportation, the volume of real-time data generated is unprecedented today. Data has become a crucial part of the smooth functioning of business operations and is unleashing new user experiences and an unseen world of business opportunities. With the generation of more and more data, the opportunity of analytics to derive profitable outcomes for businesses is growing rapidly.</div>
<div>In order to adapt to disruption and benefit from the digital transformation, organizations are leveraging analytics tools and taking a focused approach to developing the true value of information. In such a context of rapid transformation, data and analytics cannot be considered separate from each other.</div>
<div>The presence of large, complex data sets is why technologies like deep learning and machine learning have risen to become some of the biggest trends in analytics, with large tech organizations investing heavily in open AI hardware and software.</div>
<div></div>
<div><strong>Gold among heaps of data</strong></div>
<div>The world data sphere will skyrocket to 163 zettabytes, ten times the 16.1ZB of data generated in 2016, says IDC. Corresponding to such humongous growth in data volume, data monetization is becoming a major source of revenue, with the analytics market expected to surge in the coming years. Worldwide revenues for big data and business analytics will grow to more than USD 203 billion in 2020, at a compound annual growth rate (CAGR) of 11.7 percent, says IDC.</div>
<div><strong> </strong></div>
<div><strong>Challenges remain</strong></div>
<div>As we move ahead, big data remains a challenge because analyzing massive, complex sets of constantly changing data is still a difficult task for businesses. The analysis of unstructured data is another open challenge. For data analytics to truly improve business efficiency in the coming years, IT professionals need to develop an end-to-end architecture that achieves the scale and agility necessary to analyze big data.</div>
<div>The rapid scalability of cloud computing makes this task possible. With the increasing migration of operations to the cloud, along with readily available analytics tools, cloud vendors are giving strong competition to traditional analytics organizations. IDC predicts that by 2018, new cloud pricing models for particular analytics workloads will drive up to 5 times higher growth in spending on cloud versus on-premises analytics solutions.</div>
<p>The post <a href="https://www.aiuniverse.xyz/analytics-market-to-keep-growing-with-digital-transformation/">Analytics market to keep growing with digital transformation</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/analytics-market-to-keep-growing-with-digital-transformation/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		
			</item>
		<item>
		<title>8 speed bumps that may slow down the microservices and container express</title>
		<link>https://www.aiuniverse.xyz/8-speed-bumps-that-may-slow-down-the-microservices-and-container-express/</link>
					<comments>https://www.aiuniverse.xyz/8-speed-bumps-that-may-slow-down-the-microservices-and-container-express/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 05 Sep 2017 10:22:26 +0000</pubDate>
				<category><![CDATA[Microservices]]></category>
		<category><![CDATA[analytics tools]]></category>
		<category><![CDATA[container express]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[IT]]></category>
		<category><![CDATA[IT monitoring tools]]></category>
		<category><![CDATA[microservice deployment]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=951</guid>

					<description><![CDATA[<p>Source &#8211; zdnet.com At the core of any DevOps initiative is the judicious employment of containers and microservices, which dramatically speed up and simplifying the jobs of developers <a class="read-more-link" href="https://www.aiuniverse.xyz/8-speed-bumps-that-may-slow-down-the-microservices-and-container-express/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/8-speed-bumps-that-may-slow-down-the-microservices-and-container-express/">8 speed bumps that may slow down the microservices and container express</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; <strong>zdnet.com</strong></p>
<p>At the core of any DevOps initiative is the judicious employment of containers and microservices, which dramatically speed up and simplify the jobs of developers and operations teams alike. While many of the tried-and-true rules of IT management apply, containers and microservices also add new considerations, and new ways of doing things.</p>
<p>To explore many of the IT management concerns that accompany successful container and microservice deployments, we turn to the observations of two seasoned experts in the field: Ashesh Badani, VP and general manager of OpenShift at Red Hat, and Marc Wilczek, a highly regarded industry thought leader.</p>
<p>Badani, writing at The Enterprisers Project, observes that the ultimate goal of containers and microservices &#8212; and the DevOps they enable &#8212; is agility. &#8220;Containers corral applications in a neat package, isolated from the host system on which they run. Developers can easily move them around during experimentation, which is a fundamental part of DevOps. Containers also prove helpful as you move quickly from development to production environments.&#8221;</p>
<p>Developers are especially enthusiastic about microservices enabled by containers for a couple of reasons, Wilczek explains in a recent post in CIO. &#8220;They enable developers to isolate functions eas­ily, which saves time and effort, and increases overall productivity. Unlike monoliths, where even the tiniest change involves building and deploying the whole application, each microservice deals with just one concern.&#8221;</p>
<p>As with any promising new technology or methodology, there are barriers to overcome, both organizational and technological. Badani and Wilczek offer sage advice on overcoming the major speed bumps that may flummox the move to a microservices and containerized architecture:</p>
<p><strong>Organizational skills and readiness: </strong>Before any container or microservices effort can get underway, people across the enterprise need to be on board with it, and ready to adapt their own mindsets. &#8220;IT leaders driving cultural change need support from both the C-suite and evangelists in the smaller teams,&#8221; says Badani, warning that all too often, &#8220;the easiest thing to do is just do nothing.&#8221; But today&#8217;s hyper-competitive and hyper-fast economic environment demands the speed and agility containers and microservices make possible. The good news, Badani adds, is &#8220;you don&#8217;t need all the resources or skills of Facebook in order to make significant business change. Start experiments with smaller groups. As you succeed and become more comfortable, expand out in terms of technology and talent. Encourage people on your team to engage with their peers outside the company, to talk about technology and culture challenges.&#8221;</p>
<p><strong>Platform. </strong>Choice of platform is key to a container and microservices effort. &#8220;A platform addresses management, governance, and security concerns,&#8221; says Badani. &#8220;While there are plenty of open source container tools to experiment with, an enterprise-grade container platform typically comprises dozens of open source projects, including Kubernetes orchestration, security, networking, management, build automation and continuous integration and deployment capabilities out of the box.&#8221;</p>
<p><strong>Capacity and lifecycle management:</strong> &#8220;Both containers and microservices can easily be replaced and therefore tend to have a relatively short lifespan&#8221; &#8212; often measured in days, says Wilczek. &#8220;The short lifespan combined with the enormous density leads to an unprecedented number of items that require monitoring.&#8221; In addition, container images consume a lot of storage. The challenge is that &#8220;with their own operating environment attached, images can easily reach a couple of hundred megabytes in size,&#8221; Wilczek says. He recommends ongoing lifecycle management practices &#8212; &#8220;especially retiring old images to free up shared resources and avoid capacity constraints.&#8221; An ability to quickly retire images to free up storage requires a comprehensive lifecycle management effort.</p>
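<p>The retirement policy Wilczek recommends can be sketched in a few lines. The snippet below is a minimal illustration, not a production tool: the image records (<code>name</code>, <code>tag</code>, <code>created</code>, <code>size_mb</code>) are a hypothetical, simplified stand-in for the metadata a real registry or the Docker CLI would report.</p>

```python
from datetime import datetime, timedelta

def images_to_retire(images, max_age_days=14, keep_tags=("latest",), now=None):
    """Select images older than max_age_days, unless their tag is protected.

    `images` is a list of dicts with 'name', 'tag', 'created' (datetime),
    and 'size_mb' -- a toy stand-in for real registry metadata.
    Returns the stale images plus the storage that retiring them reclaims.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=max_age_days)
    stale = [img for img in images
             if img["created"] < cutoff and img["tag"] not in keep_tags]
    reclaimed_mb = sum(img["size_mb"] for img in stale)
    return stale, reclaimed_mb
```

<p>The same age-based idea is what Docker&#8217;s built-in pruning filters express; the value of wiring it into a scheduled lifecycle job is that shared capacity is reclaimed continuously rather than in occasional manual sweeps.</p>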
<p><strong>Network layer: </strong>Wilczek cautions that networks &#8212; or even virtualized network layers &#8212; may prove to be bottlenecks in the performance of microservices and containerized architectures, and thus require &#8220;close monitoring in terms of performance, load balancing, and seamless interaction.&#8221;</p>
<p><strong>Balancing legacy and cloud-native apps:</strong> Trade-offs between existing infrastructure and new cloud-native applications can be a sticking point, but they are normal, Badani explains. &#8220;Some CIOs still have COBOL apps to support. Grappling with both old and new technologies, and making tradeoffs, is normal. Some companies seek containers mostly to house cloud-native apps being created by application development teams, including new work and revamps of existing apps. These apps are often microservices-based. The goal is to break up an app into its underlying services, so teams can update the apps independently.&#8221;</p>
<p><strong>Monitoring: </strong>&#8220;Many traditional IT monitoring tools don&#8217;t provide visibility into the containers that make up those microservices, leading to a gap somewhere between hosts and applications that is ultimately off the radar,&#8221; Wilczek warns. &#8220;Organizations need to put one common monitoring in place comprising both worlds and covering the entire IT stack &#8211; from the bottom to the top.&#8221;</p>
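<p>The gap Wilczek describes is easy to express as a set difference: anything in the inventory that no monitoring target covers is off the radar. The sketch below assumes a hypothetical flat inventory of host and container names; a real setup would pull these from an orchestrator and a monitoring system&#8217;s target list.</p>

```python
def monitoring_gaps(hosts, containers, monitored_targets):
    """Return every host or container that no monitoring target covers.

    All three arguments are iterables of names -- a deliberately
    simplified model of an infrastructure inventory and a monitoring
    system's scrape-target list.
    """
    covered = set(monitored_targets)
    everything = set(hosts) | set(containers)
    return sorted(everything - covered)
```

<p>Running this regularly, with the inventory refreshed from the orchestrator, turns &#8220;covering the entire IT stack&#8221; from an aspiration into a checkable invariant.</p>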
<p><strong>Manageability: </strong>Organizations need to ensure that there are enough staff resources dedicated to container and microservice deployment and management. &#8220;All too often, developers are tempted to add new functionality by creating yet another microservice,&#8221; Wilczek says. &#8220;In no time, organizations find themselves attempting to manage an army of containers and countless microservices competing for the same IT infrastructure underneath.&#8221; He recommends employing &#8220;analytics tools that discover duplicative services, and detect patterns in container behavior and consumption to prioritize access to systems resources.&#8221;</p>
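<p>One simple form of the duplicative-service analytics Wilczek mentions is grouping services by the set of endpoints they expose: two services with identical endpoint sets are candidates for consolidation. The inventory format below (service name mapped to <code>&#8220;METHOD /path&#8221;</code> strings) is a hypothetical simplification of what a service catalog or API gateway would provide.</p>

```python
from collections import defaultdict

def find_duplicative_services(services):
    """Flag services whose exposed endpoint sets are identical.

    `services` maps service name -> list of 'METHOD /path' endpoint
    strings (a toy inventory format). Returns groups of two or more
    services sharing the same signature.
    """
    by_signature = defaultdict(list)
    for name, endpoints in services.items():
        signature = frozenset(e.strip() for e in endpoints)
        by_signature[signature].append(name)
    return [sorted(names) for names in by_signature.values() if len(names) > 1]
```

<p>Real tooling would compare schemas and traffic patterns rather than literal endpoint strings, but even this coarse signature check surfaces the &#8220;yet another microservice&#8221; sprawl before it competes for shared infrastructure.</p>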
<p><strong>Security: </strong>&#8220;Because containers contain system specific libraries and dependencies, they&#8217;re more prone to be affected by newly discovered security vulnerabilities,&#8221; says Badani, who recommends the use of &#8220;trusted registries, image scanning, and management tools&#8221; that can help automatically identify and patch container images.</p>
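<p>At its core, the image scanning Badani recommends is a join between each image&#8217;s package inventory and a feed of known advisories. The sketch below uses hypothetical, hand-rolled data structures (an image-to-packages map and <code>(package, version, advisory_id)</code> tuples) purely to illustrate the matching step a real scanner automates.</p>

```python
def vulnerable_images(images, advisories):
    """Match image package inventories against known advisories.

    `images` maps image name -> {package: version}; `advisories` is a
    list of (package, affected_version, advisory_id) tuples -- a toy
    stand-in for the vulnerability feeds a real image scanner consumes.
    Returns a map of image name -> sorted advisory IDs that apply.
    """
    findings = {}
    for name, packages in images.items():
        hits = [advisory_id for pkg, affected, advisory_id in advisories
                if packages.get(pkg) == affected]
        if hits:
            findings[name] = sorted(hits)
    return findings
```

<p>Production scanners also handle version ranges and layer-level inventories, but the workflow is the same: rescan on every new advisory, then rebuild and redeploy the affected images from a trusted registry.</p>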
<p>The post <a href="https://www.aiuniverse.xyz/8-speed-bumps-that-may-slow-down-the-microservices-and-container-express/">8 speed bumps that may slow down the microservices and container express</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/8-speed-bumps-that-may-slow-down-the-microservices-and-container-express/feed/</wfw:commentRss>
			<slash:comments>2</slash:comments>
		
		
			</item>
	</channel>
</rss>
