<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>cloud-native Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/cloud-native/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/cloud-native/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Thu, 28 Jan 2021 06:05:31 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Why Are Java And Python The Most Preferred For Cloud-Native Application Development?</title>
		<link>https://www.aiuniverse.xyz/why-are-java-and-python-the-most-preferred-for-cloud-native-application-development/</link>
					<comments>https://www.aiuniverse.xyz/why-are-java-and-python-the-most-preferred-for-cloud-native-application-development/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 28 Jan 2021 06:05:29 +0000</pubDate>
				<category><![CDATA[Python]]></category>
		<category><![CDATA[application]]></category>
		<category><![CDATA[cloud-native]]></category>
		<category><![CDATA[Development]]></category>
		<category><![CDATA[Java]]></category>
		<category><![CDATA[Preferred]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=12589</guid>

					<description><![CDATA[<p>Source &#8211; https://www.whatech.com/ The world right now runs on a network of trillions of signals sent from billions of computer applications designed and maintained by thousands of <a class="read-more-link" href="https://www.aiuniverse.xyz/why-are-java-and-python-the-most-preferred-for-cloud-native-application-development/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/why-are-java-and-python-the-most-preferred-for-cloud-native-application-development/">Why Are Java And Python The Most Preferred For Cloud-Native Application Development?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.whatech.com/</p>



<p>The world right now runs on a network of trillions of signals sent from billions of computer applications designed and maintained by thousands of people. It is hence safe to assume that life right now runs digitally.</p>



<p>It was not the same a couple of decades ago, though. Back then, people who owned computers were the exception; now it is the opposite.</p>



<p>All of this software, these technologies, and these applications are the result of brainstorming by many technology enthusiasts who constantly work to make human life simpler.</p>



<p>That said, whenever the outcome is a machine that simplifies human life, the technology behind it is correspondingly complex. <strong>Android app development services </strong>are a good example: they run across thousands of different network nodes to create applications for mobile phones.</p>



<p><strong>A Glance At Cloud-Native Application Development</strong></p>



<p>The latest revolution in this field is cloud-native application development. But what is it, and how has it become so important for developers in the technology sector? Cloud-native applications are built as a series of small, autonomous, and loosely coupled services.</p>



<p>They are intended to deliver well-recognized business benefits, such as the ability to quickly incorporate customer feedback into quality improvements.</p>



<p>Cloud-native application development is an approach to designing, running, and enhancing apps based on well-known cloud computing techniques and technologies. An IoT app development company chooses cloud-native apps because they are easy to build, faster to deliver, and highly scalable.&nbsp;</p>
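<p>As a concrete illustration of one such small, self-contained service, here is a minimal sketch in Python using only the standard library (the endpoint, names, and port are invented for illustration; a real cloud-native service would typically use a framework such as Flask or FastAPI and run in a container):</p>

```python
# Minimal sketch of a tiny, stateless HTTP microservice exposing a
# /health endpoint, the kind of probe an orchestrator uses to decide
# whether to restart or replace an instance.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Keep the demo quiet instead of logging every request.
        pass

def serve(port=8080):
    # Blocks forever, serving requests on localhost.
    HTTPServer(("127.0.0.1", port), HealthHandler).serve_forever()
```

<p>Because the handler keeps no state between requests, copies of it can be started and stopped freely, which is what makes such services easy to scale and replace.</p>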



<p>If an app is &#8220;cloud-native,&#8221; it is designed specifically to provide a seamless experience of creation and automated management through private, public, and hybrid clouds.</p>



<p>Hence, if <strong>Android</strong> <strong>app development services </strong>can build new applications faster, optimize existing ones, and connect them all through cloud-native computing, they can deliver applications as rapidly as the business demands in competitive times. But for this formula to work, the applications must be written in the right language, since the language choice largely determines their quality.</p>



<p>While there are so many programming languages out there, Java and Python are the most preferred for cloud-native apps because of the reasons listed below.</p>



<p><strong>Java For Cloud Computing</strong></p>



<p>Java has been around far too long to suddenly be labeled obsolete simply because newer, more fashionable languages exist. Java development services still use it to build and maintain cutting-edge applications thanks to its robustness, security, ease of use, and portability across platforms.</p>



<p>Developers and businesses choose Java-powered cloud-native application development to build custom apps faster without compromising the quality standards required to survive in a competitive market. Java has been used to create Gmail, the Hadoop platform, Confluence, and more.</p>



<p>As a programming language, Java only adds to this goal: it is secure, portable, and stable, and it ensures high-performance execution without consuming unnecessary time.</p>



<p>Java offers the powerful frameworks required to support multi-cloud storage, cloud computing, and reactive programming for updating and improving applications. A Java development company backs Java as the preferred language for the following reasons:</p>



<ul class="wp-block-list"><li>Java supports serverless architectures.</li><li>AOT (ahead-of-time) compilation&nbsp;and microframeworks are possible with Java.</li><li>Large-scale distribution is also possible because of Java’s flexibility.</li><li>A Java development company can also reuse code and take a product-oriented approach to building custom applications.</li></ul>



<p><strong>Python For Cloud-Native Application Development</strong></p>



<p>Python simplifies the production of web applications, APIs, academic programming, and data science. It is regarded as an attractive programming language that opens up opportunities in diverse fields.</p>



<p>Python is one of the few languages highly suitable for manipulating and processing massive data sets. It is an especially good fit for cloud computing workloads such as neural networks, machine learning, and streaming analytics systems.</p>
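<p>A small standard-library sketch of the kind of data processing Python makes convenient (the record format and function names here are invented for illustration):</p>

```python
# Lazily parse "user,action" records with a generator and count the
# actions with collections.Counter; malformed lines are skipped
# rather than failing the whole stream.
from collections import Counter

def parse_events(lines):
    for line in lines:
        parts = line.strip().split(",")
        if len(parts) == 2:
            yield parts[0], parts[1]

def top_actions(lines, n=3):
    counts = Counter(action for _, action in parse_events(lines))
    return counts.most_common(n)

events = ["alice,login", "bob,login", "alice,upload", "bad record", "bob,login"]
print(top_actions(events))  # [('login', 3), ('upload', 1)]
```

<p>Because the parsing is a generator, the same code handles a five-element list or a multi-gigabyte log file without loading everything into memory at once.</p>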



<p>Features like ease of learning, fast and easy-to-use data structures, third-party modules, far-reaching support libraries, community development, and efficient production of applications make Python the first choice of every&nbsp;<strong><a target="_blank" rel="noreferrer noopener" href="http://url.whate.ch/1beqy">IoT development app company</a>.</strong></p>



<p>Python is also preferred because of the successful applications already built with it. Trending apps like Netflix, Pinterest, Reddit, Spotify, and Instagram were all created using Python.</p>



<p>With a portfolio like this to testify to its efficiency, it is safe to say that Python, even after thirty years in existence, has kept up with the changing rules of application development. It has justified its place at the top by powering applications used worldwide by billions of people. A few more reasons to choose Python as a programming language are listed below:</p>



<ul class="wp-block-list"><li>Python can be used to build all kinds of apps, including business applications, image and design applications, GUI-based desktop applications, and scientific and computational applications.</li><li>Python is efficient when cloud computing involves neural networks.</li><li>It is easy to use for streaming analytics systems.</li><li>It integrates easily into hybrid applications running on several operating systems.</li></ul>



<p><strong>Few Final Words</strong></p>



<p>When it comes to cloud programming, it is often better to use data-oriented languages than general-purpose ones. With technological development happening around the clock and competition taking place on a global level, it has become very tough for companies to create applications that are both unique and efficient.</p>



<p>There is also the need to be first in innovation and development to survive in the ever-evolving tech industry. Since the advent of cloud computing, building apps has become somewhat easier because of its speed.</p>



<p>However, building efficient and bug-free apps means using a robust programming language that does not compromise the scalability or innovation of the app. Java and Python have proven to be two of the most preferred languages for cloud-native application development because they are easy to use, highly portable, and efficient.</p>



<p>The post <a href="https://www.aiuniverse.xyz/why-are-java-and-python-the-most-preferred-for-cloud-native-application-development/">Why Are Java And Python The Most Preferred For Cloud-Native Application Development?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/why-are-java-and-python-the-most-preferred-for-cloud-native-application-development/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Analyst Watch: 3 steps to becoming cloud native</title>
		<link>https://www.aiuniverse.xyz/analyst-watch-3-steps-to-becoming-cloud-native/</link>
					<comments>https://www.aiuniverse.xyz/analyst-watch-3-steps-to-becoming-cloud-native/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 10 Apr 2020 07:20:40 +0000</pubDate>
				<category><![CDATA[Microservices]]></category>
		<category><![CDATA[applications]]></category>
		<category><![CDATA[cloud]]></category>
		<category><![CDATA[cloud-native]]></category>
		<category><![CDATA[data analysts]]></category>
		<category><![CDATA[Development]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=8082</guid>

					<description><![CDATA[<p>Source: sdtimes.com What is a cloud-native enterprise and how does an enterprise achieve that designation? A cloud-native enterprise is one that specializes in cloud-native development, or development <a class="read-more-link" href="https://www.aiuniverse.xyz/analyst-watch-3-steps-to-becoming-cloud-native/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/analyst-watch-3-steps-to-becoming-cloud-native/">Analyst Watch: 3 steps to becoming cloud native</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: sdtimes.com</p>



<p>What is a cloud-native enterprise and how does an enterprise achieve that designation? A cloud-native enterprise is one that specializes in cloud-native development, or development that is optimized for distributed infrastructures.&nbsp;</p>



<p>Examples of distributed infrastructures include hybrid clouds — on-premises applications that use products and services from a multitude of sources and applications that leverage a multitude of containers.&nbsp;</p>



<p>Cloud-native development is optimized for distributed infrastructures because of its ability to bring the automation of the cloud directly to the application stack in the form of automated scalability, elasticity and high availability. By automating the operational management of application infrastructures, cloud-native development enables enhanced development velocity and agility in ways that empower enterprises to produce, disseminate and consume software and application-related services on an unprecedented scale.</p>



<p>The automation specific to cloud-native development is important because it enables the development and maintenance of ecosystems of digitized objects such as connected homes, appliances, automobiles, laptops, mobile devices and wearables. Technology suppliers that are seeking to gain market share in the rapidly emerging landscape of digitized ecosystems would do well to embed cloud-native development practices in their development methodologies by taking the following three steps: (1) embracing platform as a service; (2) cultivating developer familiarity with cloud-native technologies; and (3) creating a developer-centric culture in which everyone is a developer.</p>



<p>Platform as a service is a key component of an enterprise’s transition to cloud-native development because it provides developers with self-service access to developer tools as well as the ability to provision infrastructure. This ability to self-serve accelerates development cadences and empowers developers to work independently of a centralized IT authority. With access to an integrated platform of products and services, developers gain agility that fosters responsiveness and participation in collaborative decisions.</p>



<p>Another key step for enterprises in their transition to cloud native involves cultivating developer familiarity with cloud-native technologies such as microservices, containers, container orchestration frameworks and processes such as DevOps. The universe of cloud-native technologies also includes functions as a service, APIs, serverless technologies, service mesh and a multitude of others. That said, cultivating developer familiarity with microservices and containers marks a significant step in an enterprise’s journey to becoming cloud native that is likely to initiate familiarity with adjacent technologies.</p>



<p>To become truly cloud native, enterprises need to create a developer-centric culture in which everyone is a developer. This means that professional resources such as business analysts, project managers, HR, business partners, market intelligence analysts and data scientists all variously participate in application development in one form or another, whether it be through the development of net-new applications by using low-code or no-code development tools, or otherwise through configuring dashboards and widgets in pre-existing application templates. This democratization of development is a key component of an enterprise’s path toward cloud-native development, because it increases the digital literacy of business resources that collaborate with IT resources, which are more directly in charge of developing and maintaining applications.</p>



<p>The increased digital literacy of business stakeholders enables them to more richly inform application developers about the requirements for the digitization of business operations. In addition, the participation of business resources in application development empowers business professionals to contribute to application development and subsequently augment and extend the digitization efforts that are led by professional application developers.</p>



<p>The key takeaway here is that the transition of an enterprise to cloud native transcends the acquisition of developer familiarity with technologies such as microservices, containers, container orchestration frameworks and DevOps. The transition requires the confluence of the adoption of a platform, proficiency with cloud-native technologies and the democratization of development to professional resources who do not have the job title of a developer. This confluence paves the way for an enterprise to perform high velocity, hyperscale development that empowers enterprises to create and maintain ecosystems of digitized objects that serve the intensified needs of the digital economy for increased digitization.</p>
<p>The post <a href="https://www.aiuniverse.xyz/analyst-watch-3-steps-to-becoming-cloud-native/">Analyst Watch: 3 steps to becoming cloud native</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/analyst-watch-3-steps-to-becoming-cloud-native/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How Today’s Microservices are Building Tomorrow’s Cloud</title>
		<link>https://www.aiuniverse.xyz/how-todays-microservices-are-building-tomorrows-cloud/</link>
					<comments>https://www.aiuniverse.xyz/how-todays-microservices-are-building-tomorrows-cloud/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 21 Mar 2020 05:37:30 +0000</pubDate>
				<category><![CDATA[Microservices]]></category>
		<category><![CDATA[cloud-native]]></category>
		<category><![CDATA[containers]]></category>
		<category><![CDATA[Kubernetes]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=7608</guid>

					<description><![CDATA[<p>Source: rtinsights.com Netflix, Amazon, and PayPal – these well-recognized global brands fundamentally changed the way we consume entertainment, shop, and manage finances, respectively. However, without major advancements <a class="read-more-link" href="https://www.aiuniverse.xyz/how-todays-microservices-are-building-tomorrows-cloud/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-todays-microservices-are-building-tomorrows-cloud/">How Today’s Microservices are Building Tomorrow’s Cloud</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: rtinsights.com</p>



<p>Netflix, Amazon, and PayPal – these well-recognized global brands fundamentally changed the way we consume entertainment, shop, and manage finances, respectively. However, without major advancements shaping the database industry over the last 40 years, these brands – synonymous with their cloud-based applications – would not exist. To be specific, today’s instant, on-demand services have been made possible, in part, by the single node Relational Database Management System (RDBMS).</p>



<p>As more services are made available online and digital innovations become mainstream, it’s important to step back and understand how the underlying technology originated, how it evolved to meet both business and consumer demands, and how technology must continue to evolve to adequately support the growing demand for these applications (look no further than the advent of new streaming services in the past year alone – Disney+, AppleTV+, and more).</p>



<h4 class="wp-block-heading"><strong>The Internet’s Infancy&nbsp;</strong></h4>



<p>During the Internet’s infancy, Oracle made RDBMS databases popular – at least among business users relying on simple client-server applications to manage operations such as sales and employee data, customer relationships, supply chains, and more. These applications were monoliths, perfectly suited to the monolithic databases businesses used at the time, and they had no real impact on the consumer world, which remained mostly disconnected from the (at the time) little-known World Wide Web.</p>



<h4 class="wp-block-heading"><strong>The Open Source Database Era</strong></h4>



<p>The Internet’s arrival for mass-consumption in the mid-1990s heralded the explosion of web applications. This meant that developers now needed faster, more efficient, and cost-effective ways to meet the demand of connected applications. The open-source database era was born and filled the void that the stagnation of monolithic databases had created.</p>



<p>The increasing consumption of open source databases helped make a feature-rich Internet possible. The mass adoption of web applications created a proliferation of data, from transactions, searches, and other activities that were introduced online, and often led to fragmented databases to support that data. However, as more and more features were being introduced that leveraged the underlying database, the existing database capabilities were challenged and unable to meet the demands of these new technologies.</p>



<p>Applications had grown so large and complex that by the early 2000s, when developers needed to store the data in an RDBMS, they had to do a sharded deployment. A sharded deployment is an approach where developers take incoming data, manually partition it into subsets and store each partition in a different RDBMS database instance, while still keeping that data available, in order to scale out. However, sharding a SQL (the universal language of relational databases) database required developers to write more code, which often meant they had to compromise on transactions, referential integrity, joins, and other major SQL functions. Clearly, this approach had run its course, and a new era was needed to manage massive amounts of data without losing SQL’s flexibility for ever-changing applications.</p>
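<p>The sharded deployment described above can be sketched in a few lines of Python (the "databases" here are plain dictionaries standing in for separate RDBMS instances, and the function names are invented for illustration):</p>

```python
# Hash-based sharding: each key is deterministically routed to one of
# several shards, so reads and writes for a given key always hit the
# same instance.
import hashlib

NUM_SHARDS = 4
shards = [{} for _ in range(NUM_SHARDS)]  # stand-ins for DB instances

def shard_for(key: str) -> int:
    # A stable cryptographic hash keeps a key on the same shard across
    # processes and restarts (unlike Python's built-in hash()).
    digest = hashlib.sha256(key.encode()).hexdigest()
    return int(digest, 16) % NUM_SHARDS

def put(key, value):
    shards[shard_for(key)][key] = value

def get(key):
    return shards[shard_for(key)].get(key)

put("user:42", {"name": "Ada"})
print(get("user:42"))  # {'name': 'Ada'}
```

<p>The compromise shows up immediately: a query that joins keys living on different shards can no longer be expressed as a single SQL statement and must instead be stitched together in application code.</p>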



<h4 class="wp-block-heading"><strong>More Services</strong></h4>



<p>As the Internet continued to surge, so too did online services – everything from banking, to healthcare, to eCommerce, and more. These services required so much data that manual sharding was nearly impossible to achieve. Because of this, developers began turning to a natively scalable approach called NoSQL – the first class of distributed databases that gave developers the ability to run applications at scale. However, more scalability meant a decrease in consistency, which developers readily accepted because they prioritized keeping pace with the expanding online landscape.</p>



<p>NoSQL today still solves for scale and availability, but it continues to compromise on SQL feature-set and consistency. Additionally, monolithic databases still exist, but the transactional applications originally based on Oracle, SQL Server, or PostgreSQL need to be rewritten to adapt to modern architectures like the public cloud and Kubernetes, an open-source container orchestration system for automating application deployment, scaling, and management (more on this later).</p>



<p>Customer expectations that have come with applications like Netflix, Amazon, and PayPal are now sprawling into business applications – i.e., enterprise applications need to be instantly accessible from anywhere and from any device. These expectations equate to richer experiences that can only be made possible by fully leveraging the cloud, and doing so requires technology to evolve even more.</p>



<h4 class="wp-block-heading"><strong>Microservices</strong></h4>



<p>The latest technology advancement that has significantly helped developers keep pace with user demand is a microservices-based design. Microservices are independent processes that communicate with each other to accomplish a task within a much larger application. More importantly, microservices help developers deliver new features more quickly and effectively than ever before to keep giving users what they crave – any time and on any device of their own choosing.</p>



<p>But digital transformation and the explosion of applications have created a need for more databases, machines, and data centers. While microservices are helping developers keep pace, new challenges are emerging:</p>



<ul class="wp-block-list"><li><strong>Rising Database Costs</strong>&nbsp;– Licensing and support costs of most database providers rise with the amount of data. This creates a disparity in the value-to-cost ratio, which prompted the open-source model of database transparency. But this model, too, can be limiting as many offerings restrict essential features that most businesses require, and only make those features available in their commercial versions.</li><li><strong>Legacy Databases</strong>&nbsp;– On-premises users may have fewer machines to deal with, but legacy databases are expensive, require a lot of human labor, and carry risk. For example, there is a much higher chance of failure if you host an on-premises database in the cloud, a decline in availability and consistency, and a barrier to scale.</li><li><strong>Enterprise Inertia</strong>&nbsp;– Significant investments in time and resources are required to move large amounts of existing data to the cloud. However, companies must choose an alternative to their legacy architecture and adopt multi-cloud and multi-region strategies to compete in the modern landscape.</li></ul>



<h4 class="wp-block-heading"><strong>Tomorrow’s Cloud</strong></h4>



<p>Many people have worked tirelessly over the past two decades – developers, admins, network engineers, IT, and more – to ensure the internet and all its applications run smoothly for both business and consumer needs. But every technological innovation unearths new challenges that must be addressed. That’s why developers are now turning to multi-cloud deployments and need multi-cloud databases.</p>



<p>Currently, Kubernetes is the most effective approach to taking advantage of multi-cloud environments, and it is used to run both stateless applications and databases. Using portable containers means that developers can take advantage of the different strengths of different clouds, managed and operated simultaneously with Kubernetes. This allows users to deploy applications in a cloud-neutral manner, leveraging both public and private clouds as needed. Importantly, multi-cloud environments allow developers to create databases that serve the relational data modeling needs of microservices, provide zero data loss and low latency guarantees, and scale on demand with ease in response to planned and unplanned business events.</p>
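<p>For flavor, here is a minimal Kubernetes Deployment manifest of the declarative, cloud-neutral kind described above (the image and names are hypothetical; the same manifest can be applied to any conformant cluster, public or private):</p>

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-api
spec:
  replicas: 3                 # Kubernetes keeps three copies running
  selector:
    matchLabels:
      app: demo-api
  template:
    metadata:
      labels:
        app: demo-api
    spec:
      containers:
        - name: demo-api
          image: example.com/demo-api:1.0   # hypothetical image
          ports:
            - containerPort: 8080
```

<p>The manifest states the desired end state (three replicas of one container) rather than the steps to get there, which is what makes it portable across clouds.</p>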



<p>The last 40 years of database technology have laid a foundation for the immediate and unlimited access to highly desired business and entertainment applications that we enjoy today. With no end in sight for the Internet and the demand it has created, the next breakthrough in cloud deployment is only a matter of time. To keep pace with the inevitable change, developers need to remember the obstacles they’ve had to overcome in the past to understand how to stay one step ahead of today’s cloud challenges and deliver the next wave of digital innovations that will delight us in years to come.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-todays-microservices-are-building-tomorrows-cloud/">How Today’s Microservices are Building Tomorrow’s Cloud</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-todays-microservices-are-building-tomorrows-cloud/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>KT Selects Amdocs&#8217; New Cloud-native and Microservices-based Solution to Launch New 5G Services</title>
		<link>https://www.aiuniverse.xyz/kt-selects-amdocs-new-cloud-native-and-microservices-based-solution-to-launch-new-5g-services/</link>
					<comments>https://www.aiuniverse.xyz/kt-selects-amdocs-new-cloud-native-and-microservices-based-solution-to-launch-new-5g-services/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 10 Feb 2020 07:26:29 +0000</pubDate>
				<category><![CDATA[Microservices]]></category>
		<category><![CDATA[5G Services]]></category>
		<category><![CDATA[amdocs]]></category>
		<category><![CDATA[business support system]]></category>
		<category><![CDATA[cloud-native]]></category>
		<category><![CDATA[kt corporation]]></category>
		<category><![CDATA[solution]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=6654</guid>

					<description><![CDATA[<p>Source: thefastmode.com Amdocs this week announced that KT Corporation, the largest quad-play service provider in South Korea, has started to upgrade and migrate their existing product catalog to <a class="read-more-link" href="https://www.aiuniverse.xyz/kt-selects-amdocs-new-cloud-native-and-microservices-based-solution-to-launch-new-5g-services/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/kt-selects-amdocs-new-cloud-native-and-microservices-based-solution-to-launch-new-5g-services/">KT Selects Amdocs&#8217; New Cloud-native and Microservices-based Solution to Launch New 5G Services</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: thefastmode.com</p>



<p>Amdocs this week announced that KT Corporation, the largest quad-play service provider in South Korea, has started to upgrade and migrate their existing product catalog to Amdocs CatalogONE. </p>



<p>This solution enables operators to create, deploy, test and launch new services at a much faster pace and quickly take advantage of new 5G use cases and revenue opportunities.</p>



<p>Amdocs CatalogONE is one of the building blocks of CES20, Amdocs’ new cloud-native and microservices-based customer experience suite. It enables end customers to benefit from more frequent service innovation and updated plans and bundles, as well as more market-driven promotions, such as offerings around special events and new 5G features, tailored for specific customer segments and sales channels.</p>



<p>With&nbsp;a centralized view of all products and services, an advanced user interface, a collaboration platform, and approvals and notification management capabilities, the simple-to-use Amdocs CatalogONE allows business and marketing users to manage the offering lifecycle and make frequent configuration updates. Deployed on the cloud, it will enable KT and Amdocs teams to collaborate efficiently and handle multiple business requests in parallel.&nbsp;</p>



<p>Having previously deployed Amdocs’ real-time convergent charging solution, KT will be able to further accelerate its ability to introduce and monetize new 5G consumer and enterprise offerings.&nbsp;</p>
<p>The post <a href="https://www.aiuniverse.xyz/kt-selects-amdocs-new-cloud-native-and-microservices-based-solution-to-launch-new-5g-services/">KT Selects Amdocs&#8217; New Cloud-native and Microservices-based Solution to Launch New 5G Services</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/kt-selects-amdocs-new-cloud-native-and-microservices-based-solution-to-launch-new-5g-services/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Amazon, Deloitte Partner to Address Healthcare Data Challenges</title>
		<link>https://www.aiuniverse.xyz/amazon-deloitte-partner-to-address-healthcare-data-challenges/</link>
					<comments>https://www.aiuniverse.xyz/amazon-deloitte-partner-to-address-healthcare-data-challenges/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 15 Nov 2019 05:32:46 +0000</pubDate>
				<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[AWS]]></category>
		<category><![CDATA[cloud-native]]></category>
		<category><![CDATA[Data challenges]]></category>
		<category><![CDATA[Machine learning]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=5167</guid>

					<description><![CDATA[<p>Source: healthitanalytics.com. Amazon Web Services and Deloitte are working to create an efficient and secure healthcare data ecosystem. Amazon Web Services (AWS) is partnering with Deloitte to help customers <a class="read-more-link" href="https://www.aiuniverse.xyz/amazon-deloitte-partner-to-address-healthcare-data-challenges/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/amazon-deloitte-partner-to-address-healthcare-data-challenges/">Amazon, Deloitte Partner to Address Healthcare Data Challenges</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: healthitanalytics.com<br>Amazon Web Services and Deloitte are working to create an efficient and secure healthcare data ecosystem.</p>



<p>Amazon Web Services (AWS) is partnering  with Deloitte to help customers securely find, subscribe to, and use  third-party data in the cloud using AWS Data Exchange, a new service  that can help address unique healthcare issues.</p>



<p>The partnership will build on the existing relationship between the 
two organizations and transform biomedical research, clinical trials, 
real-world data insights, population health, and reimbursement.</p>



<p>Most digital health data generated by patients, researchers, health 
systems, and payers is inaccessible to other organizations for multiple 
reasons, from security concerns to technology constraints and business 
model challenges. This means healthcare organizations aren’t fully 
leveraging the benefits of this data, slowing the pace of medical 
innovation and limiting the potential to improve care delivery.</p>



<h4 class="wp-block-heading">Dig Deeper</h4>



<ul class="wp-block-list"><li>UTHealth, Amazon Partner to Use Machine Learning for Medical Research</li><li>Amazon Machine Learning, Big Data Tools Have Healthcare Implications</li><li>KLAS: Artificial Intelligence Success Requires Partnership, Training</li></ul>



<p>“The explosion of digital healthcare data and advances in artificial 
intelligence (AI) and machine learning (ML) hold the promise to answer 
some of healthcare&#8217;s most important questions — what&#8217;s working for whom,
 why and at what cost,” said&nbsp;Brett Davis, principal, Deloitte Consulting
 LLP, and general manager, ConvergeHEALTH by Deloitte.</p>



<p>“However, this data is locked in siloes across many organizations 
within the health ecosystem. The industry needs new business models that
 break down these silos to connect healthcare stakeholders, reduce 
inefficiencies and accelerate the translation of research to practice 
and improve patient outcomes. AWS Data Exchange provides the technology 
and infrastructure to support these new business models.”</p>



<p>AWS Data Exchange creates a secure, scalable data exchange 
infrastructure under terms and conditions that data owners can control. 
AWS Data Exchange also allows customers to benefit from third-party 
software, artificial intelligence algorithms, and professional services.</p>



<p>Deloitte’s ConvergeHEALTH Miner integration with AWS Data Exchange  means users of Miner can easily find, access, and analyze aggregated and  de-identified data from collaborators outside their organizations. This  could transform the way organizations conduct research, clinical  trials, population health management, and reimbursement.</p>



<p>The partnership will help life sciences and healthcare organizations 
use more data to perform better research and improve patient outcomes.</p>



<p>“AWS Data Exchange provides a secure and cloud-native channel to 
exchange data at scale, with cloud-based solutions like ConvergeHEALTH 
Miner supporting the analytical needs for data analysts, academic 
researchers and data scientists in the medical community,” said&nbsp;Stephen 
Orban, general manager, AWS Data Exchange, Amazon Web Services, Inc.</p>



<p>“We&#8217;re delighted to be working with Deloitte on AWS Data Exchange to 
help life sciences and healthcare organizations establish their 
strategies around the new data-driven collaborations and business 
models, while also helping data publishers with the engineering tasks 
needed to package, aggregate and anonymize their data for 
collaboration.”</p>



<p>This partnership adds to Amazon’s efforts to enhance healthcare using data. In November 2018, the company announced  a new machine learning service that can extract meaningful information  from unstructured EHR data and free-text clinical notes. Called Amazon  Comprehend Medical, the service allows developers to comb through  unstructured EHR data and pull out key clinical terms related to a  patient’s diagnosis.</p>



<p>Deloitte has also highlighted the importance of data and analytics in  the healthcare industry. In a 2019 survey, the organization emphasized the need for organizations to invest in innovative data technologies.</p>



<p>“Heading into the future, data analytics is one of the backbones for 
health systems seeking to use emerging technologies such as artificial 
intelligence (AI) and robotic process automation (RPA) to transform 
their care delivery or workforce,” the report stated.</p>



<p>“It is also important for strategies and initiatives that depend on 
data mining—such as understanding social determinants of health and the 
importance of customer experience and preferences.”</p>
<p>The post <a href="https://www.aiuniverse.xyz/amazon-deloitte-partner-to-address-healthcare-data-challenges/">Amazon, Deloitte Partner to Address Healthcare Data Challenges</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/amazon-deloitte-partner-to-address-healthcare-data-challenges/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Is Machine Learning the Future of Cloud-Native Security?</title>
		<link>https://www.aiuniverse.xyz/is-machine-learning-the-future-of-cloud-native-security/</link>
					<comments>https://www.aiuniverse.xyz/is-machine-learning-the-future-of-cloud-native-security/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 16 Jul 2019 07:48:15 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[cloud-native]]></category>
		<category><![CDATA[Future]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[machine]]></category>
		<category><![CDATA[Security]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4019</guid>

					<description><![CDATA[<p>Source: darkreading.com Cloud-native architectures help businesses reduce application development time and increase agility, at a lower cost. Although flexibility and portability are key drivers for adoption, a <a class="read-more-link" href="https://www.aiuniverse.xyz/is-machine-learning-the-future-of-cloud-native-security/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/is-machine-learning-the-future-of-cloud-native-security/">Is Machine Learning the Future of Cloud-Native Security?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: darkreading.com</p>



<p>Cloud-native architectures help 
businesses reduce application development time and increase agility, at a
 lower cost. Although flexibility and portability are key drivers for 
adoption, a cloud-native structure brings with it a new challenge: 
managing security and performance at scale.&nbsp;</p>



<p><strong>Challenges in the Cloud<br></strong>The nature of containers and microservices makes them harder to protect, for the following reasons:</p>



<p><strong>1.</strong> They have a dissolved
 perimeter, meaning that once a traditional perimeter is breached, 
lateral movement of attacks (such as malware or ransomware) often goes 
undetected across data centers and/or cloud environments.</p>



<p><strong>2.</strong> With a DevOps 
mindset, developers are continuously building, pushing, and pulling 
images from various registries, leaving the door open for various 
exposures, whether they are operating system vulnerabilities, package 
vulnerabilities, misconfigurations, or exposed secrets.</p>



<p><strong>3.</strong> The ephemeral and 
opaque nature of containers leaves a massive amount of data in its wake,
 making visibility into the risk and security posture of the 
containerized environment extremely complicated. Sorting through 
interconnected data from thousands of services across millions of 
short-lived containers to understand a specific security or compliance 
violation in time is akin to finding a needle in a haystack.</p>



<p><strong>4.</strong> With increased 
development speeds, security is being pushed later in the development 
cycle. Developers are failing to bake security in early, opting instead 
to add it on at the end, and ultimately, they are increasing the chance 
of potential exposures in the infrastructure.</p>
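<p>Several of these exposures, such as secrets baked into images, can be caught early with automated scanning. As a rough illustration of the shift-left idea (the patterns and Dockerfile below are hypothetical, not taken from any particular scanner), a minimal secret scan over an image layer might look like this:</p>

```python
import re

# Hypothetical patterns for the "exposed secrets" class of exposures.
SECRET_PATTERNS = [
    re.compile(r"AWS_SECRET_ACCESS_KEY\s*=\s*\S+"),
    re.compile(r"-----BEGIN (RSA )?PRIVATE KEY-----"),
    re.compile(r"password\s*=\s*['\"]?\S+", re.IGNORECASE),
]

def scan_layer(text):
    """Return the patterns that match in one image layer's contents."""
    return [p.pattern for p in SECRET_PATTERNS if p.search(text)]

dockerfile = """
FROM python:3.12-slim
ENV AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMIexamplekey
COPY app /app
"""
findings = scan_layer(dockerfile)  # the ENV line trips the first pattern
```

<p>Real scanners use entropy analysis and much richer pattern sets, but the principle is the same: run the check in the build pipeline, before the image ever reaches a registry.</p>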



<p>With tight budgets and the pressure to constantly innovate, machine learning (ML) and AIOps — that is, artificial intelligence for IT operations — are increasingly being built into security vendor road maps, because they are the most realistic way to decrease the burden on security professionals in modern architectures, at least at this point.</p>



<p><strong>What Makes ML a Good Fit?<br></strong>As containers are 
constantly being spun up and down on demand, there is no margin of error
 for security. An attacker has to be successful just once, and this is 
much easier in a cloud-native environment that is constantly evolving, 
especially as security struggles to keep up. This means runtime 
environments can now be compromised due to insider hacks, policy 
misconfigurations, zero-day threats, and/or external attacks.</p>



<p>It is hard for a resource-starved security team to manually secure 
against these threats, at scale, in this dynamic environment. It may 
take hours or days before a security profile is adjusted, which is 
plenty of time for a hacker to exploit this window of opportunity.</p>



<p>Over the last few decades, we have witnessed tremendous progress in 
ML algorithms and techniques. It has now become possible for individuals
 who do not necessarily have a statistical background to take models and
 apply them to various problems.</p>



<p>Containers are a good fit for supervised learning models for the following reasons:</p>



<p><strong>1.&nbsp;Containers have minimal surface area:</strong>
 Because containers are fundamentally designed for modular tasks and 
have smaller footprints, it is easier to define baseline activity inside
 and decide what is normal versus abnormal. In a virtual machine, there 
could be hundreds of binaries and processes running, but in a container,
 the number is far less.</p>



<p><strong>2. Containers are declarative:</strong>  Instead of looking at a random manifest, DevOps teams can look at the  daemon and container environment to understand exactly what that  specific container would be allowed to do at runtime.</p>



<p><strong>3.</strong>&nbsp;<strong>Containers are immutable:</strong>
 The immutability factor serves as a theoretical guardrail to prevent 
changes at runtime. For example, if a container starts running netcat 
all of a sudden, that could be an indicator of a potential compromise.</p>



<p>Given these characteristics, ML models can learn from container behavior, making them more accurate when creating runtime profiles that determine what should and should not be allowed. Letting machines define pinpointed profiles and automatically spot indicators of potential threats improves detection. It also alleviates some of the burnout among members of the security operations center team: because they don&#8217;t have to manually create specific rules for their different container environments, they can focus on response and remediation rather than manual detection.</p>
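<p>The profiling approach described above can be sketched in a few lines. This is a simplified illustration (the image name and process lists are invented), not any vendor's actual implementation: a baseline of observed processes is learned per container image, and anything outside it is flagged at runtime.</p>

```python
from collections import defaultdict

class RuntimeProfiler:
    """Learn a per-image baseline of processes, then flag deviations."""

    def __init__(self):
        self.baselines = defaultdict(set)
        self.learning = True

    def observe(self, image, process):
        if self.learning:
            # Learning phase: record normal activity for this image.
            self.baselines[image].add(process)
            return None
        # Enforcement phase: anything outside the baseline is an anomaly.
        if process not in self.baselines[image]:
            return f"ALERT: unexpected process '{process}' in image '{image}'"
        return None

profiler = RuntimeProfiler()
# Baseline period: a web container normally runs only these processes.
for proc in ["nginx", "nginx-worker"]:
    profiler.observe("web:1.0", proc)
profiler.learning = False

# A container suddenly spawning netcat is a likely indicator of compromise.
alert = profiler.observe("web:1.0", "nc")
```

<p>Because a container's surface area is small, the learned set stays small and the enforcement decision stays cheap; a production profiler would also track process arguments, network destinations, and file access.</p>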



<p>In this new world, security has to keep up with the ever-changing technology landscape. Teams must equip themselves with cloud-native security tools that cut through noise and distractions and surface the insight they are looking for and need. Without ML, security teams find themselves stuck on details that don&#8217;t matter and missing what does.</p>



<p><strong>Related Content:</strong></p>



<ul class="wp-block-list"><li><strong>The 2019 State of Cloud Security</strong></li><li><strong>Cloud Security and Risk Mitigation</strong></li><li><strong>Serverless Computing from the Inside Out</strong></li><li><strong>The Life-Changing Magic of Tidying Up the Cloud</strong></li></ul>
<p>The post <a href="https://www.aiuniverse.xyz/is-machine-learning-the-future-of-cloud-native-security/">Is Machine Learning the Future of Cloud-Native Security?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/is-machine-learning-the-future-of-cloud-native-security/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Securing microservice environments in a hostile world</title>
		<link>https://www.aiuniverse.xyz/securing-microservice-environments-in-a-hostile-world/</link>
					<comments>https://www.aiuniverse.xyz/securing-microservice-environments-in-a-hostile-world/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 21 Aug 2018 05:58:51 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Microservices]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[application development]]></category>
		<category><![CDATA[cloud-native]]></category>
		<category><![CDATA[Microservice]]></category>
		<category><![CDATA[microservices deployment]]></category>
		<category><![CDATA[security mechanisms]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=2768</guid>

					<description><![CDATA[<p>Source &#8211; networkworld.com At the present time, there is a remarkable trend for application modularization that splits the large hard-to-change monolith into a focused microservices cloud-native architecture. The <a class="read-more-link" href="https://www.aiuniverse.xyz/securing-microservice-environments-in-a-hostile-world/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/securing-microservice-environments-in-a-hostile-world/">Securing microservice environments in a hostile world</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; networkworld.com</p>
<p>At the present time, there is a remarkable trend for application modularization that splits the large, hard-to-change monolith into a focused, cloud-native microservices architecture. The monolith keeps much of its state in memory and replicates it between instances, which makes it hard to split and scale. Scaling up can be expensive, and scaling out requires replicating the state and the entire application, rather than just the parts that need to be replicated.</p>
<p>Microservices, in contrast, separate the logic from the state. That separation enables the application to be broken apart into a number of smaller, more manageable units, making them easier to scale. A microservices environment therefore consists of multiple services communicating with each other. All communication between services is initiated and carried out with network calls, and services are exposed via application programming interfaces (APIs). Each service has its own purpose and serves a unique business value.</p>
<p>Within a microservices deployment, one must assume that the perimeter is breachable.  Traditional security mechanisms only provide a layer of security for a limited number of threats. Such old-fashioned mechanisms are unable to capture the internal bad actors where most compromises occur. Therefore, it is recommended to deploy multiple security layers and employ zero trust as the framework. This way, the new perimeter and decision point will be at the microservice.</p>
<p>In this day and age, we must somehow enforce separation along with consistent policy between the services, while avoiding the perils of traditional tight coupling, and not jeopardize security. We need to find a solution so that the policy is managed centrally. However, at the same time, the policy should be enforced in a distributed fashion to ensure the workloads perform as designed and do not get compromised.</p>
<h2>The cost of agility</h2>
<p>The two main drivers for change are agility and scale. Within a microservices environment, each unit can scale independently, driving massively scalable application architectures. Yet, this type of scale was impossible when it was necessary to couple heavy data services along with the application.</p>
<p>The ability to scale and react rapidly increases business velocity, which allows organizations to reap the benefit in terms of cost and resilience and also improved ways of managing and building the application. However, the decentralized nature of agile deployments introduces challenges in terms of governance.</p>
<p>We should keep in mind that we now have a distributed organization, with sub-teams responsible for individual microservices. In addition, patching and updates are carried out in real time. This creates a gap that needs to be filled: visibility, and the ability to scale policy in a distributed fashion.</p>
<h2>Complexity is the enemy of security</h2>
<p>The cloud-native approach introduces considerable complexity. Besides complexity, the end user is responsible for securing their own environment. With microservices, there are many more moving pieces and paths of communication, introducing complexity that must be managed. We need to manage this complexity while keeping the holistic view of the application and visibility as to how each component is operating.</p>
<p>The attempt to secure a complex deployment using existing tools does not work and leads to a complicated security solution with complex policies. Complexity is the enemy of security. As security solutions become more complex, they become unmanageable and less secure. There is a requirement for a new unified security framework that can adapt to the different microservice environments while still providing full visibility along with simplified policy management.</p>
<p>You really need to know who is talking at any given point, authenticate the source, and authorize the type of transaction the API communication is trying to perform. You should be aware of the specific communication going on within these channels, and of what should be authenticated and authorized to communicate.</p>
<p>This is impossible to do efficiently unless changes are introduced to the microservices architecture. Microservice deployments are susceptible to an array of security threats. API vulnerabilities, logic attacks, lateral movements, and the inadequacy of traditional security tools bring the systems to a halt like a house of cards tumbling down.</p>
<h2>Diverse traffic patterns</h2>
<p>Today’s traffic patterns are different from those of the past. Nowadays, there are a lot of APIs connecting inbound and outbound along with internal communication. The APIs are all public, open and customer facing. The administrators are permitting this type of communication in and out of the public and private data center.</p>
<p>There is typically considerable asymmetry between the front-end ingress API and the backend APIs. Considering the customer environment, there is an initial consumer API call, but that propagates numerous other backend API calls to carry out, for example, user and route lookups.</p>
<p>As the microservice environment develops more components, it is difficult to monitor and make sure everything is secure. The deployment of web application firewalls (WAFs) to secure the public APIs and the use of next-gen firewalls filtering at strategic network points cover only a part of the attack surface. We must still assume that the perimeter is breachable along with the high potential for internal bad actors.</p>
<h2>Traditional security mechanisms fall short</h2>
<p>The network perimeter was born in a different time, and traditional security mechanisms based on Internet Protocol (IP) and the 5-tuple no longer suffice. The traditional perimeter consists of virtual or physical appliances such as a firewall, IPS/IDS or API gateway located at strategic network points. In reality, the traditional perimeter with its traditional security mechanisms only provides the first layer of security. Even though it is labeled defense in depth, it falls far short of that status in a microservices environment.</p>
<p>For example, API gateways are meant to manage inbound calls. APIs are registered with the API gateway, which changes the workflow. API gateways don&#8217;t scale in a microservices environment, where there could be hundreds of services, each exposing a number of APIs and each containing multiple instances.</p>
<p>The API gateway needs to scale not just with the external traffic, but also with east-west internal traffic, which typically accounts for the significant share of total traffic. Web application firewalls (WAFs) do not change the workflow, but they share some of the challenges of API gateways. It is impossible to create and manage security when policies are not distributed to the workloads. There is a lot of work to be done for even a limited number of APIs, and that work grows exponentially with internal communications. This is clearly not practical for microservices deployments.</p>
<p>Next-gen firewalls are typically the central security resource. They are more suitable for north-south traffic flows than for internal east-west traffic. In a world where everything is HTTP, firewalls do not offer the best visibility and access control. A firewall typically enforces security on source and destination IPs and protocols, but in a microservices environment the standard communication port is 80/443, and it is very common for all services to use the same port and protocol.</p>
<p>For this to work, the firewall would need to follow the identity behind the source and destination IP addresses and port numbers. It would also need to deal with an orchestration system that changes those identities all the time.</p>
<p>Enforcement should be done in a distributed fashion, right down at the workload level. If what you are monitoring and protecting is accessible to the application and application behavior, it matters less where the attacks come from. However, security frameworks based on traditional mechanisms can leave many avenues for bad actors to camouflage their attacks.</p>
<h2>The larger attack surface requires a new perimeter</h2>
<p>If security cannot follow the microservices, you need to bring the security to the microservice and embed it there. An effective perimeter is a decision point at the microservice itself, not one set at strategic points within the network. The new perimeter is at the microservice layer and everywhere that has an API. This is the only way to protect the environment, especially when it comes to logic attacks.</p>
<p>Logic attacks become more prevalent in a microservices environment. This type of threat is carried out by a sophisticated attacker, not a script kiddy using a readily accessible tool. They take their time to penetrate into the perimeter to silently explore the internal environment and to go unnoticed while accessing valuable assets.</p>
<p>Cloud native applications expose their logic in multiple layers, not just one. Each microservice exposes some application logic through an API, and these APIs, if not properly secured, can be manipulated by a bad actor. A practical example would be an API that is meant to return information about a single entry in a database. If the bad actor is able to modify the query slightly, they can pull multiple entries from the database that they do not have authority to access. This can be exploited in every single API, which presents a much larger attack surface than before and creates the opportunity for an advanced persistent threat (APT).</p>
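<p>The single-entry example above can be made concrete with a small sketch (the table and values are hypothetical, using an in-memory SQLite database): splicing caller input into the query lets one API call pull every row, while a parameterized query with an ownership check scopes the result.</p>

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE records (id INTEGER, owner TEXT, data TEXT)")
db.executemany("INSERT INTO records VALUES (?, ?, ?)",
               [(1, "alice", "alice-secret"), (2, "bob", "bob-secret")])

def get_record_vulnerable(record_id):
    # BAD: caller input is spliced into the query, so a crafted value
    # like "1 OR 1=1" returns every row in the table.
    return db.execute(f"SELECT data FROM records WHERE id = {record_id}").fetchall()

def get_record_safe(record_id, caller):
    # GOOD: parameterized query plus an authorization check on ownership.
    return db.execute(
        "SELECT data FROM records WHERE id = ? AND owner = ?",
        (record_id, caller),
    ).fetchall()

leaked = get_record_vulnerable("1 OR 1=1")  # both rows leak
scoped = get_record_safe(1, "alice")        # only alice's row
```

<p>Note that the safe variant enforces authorization per call, at the service itself, which is exactly where the new perimeter sits.</p>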
<p>As stated earlier, the distributed architectures give a much larger surface area. Each one of these small components is exposed to threats. The surface area is the sum of all APIs and the interactions both internal and external of the application. If you examine an API that is exposed to the outside, you would see hundreds of API calls. This offers numerous ways to exploit the vulnerabilities of an externally facing API. Within a kill-chain, the API is not just used to gain access but also as a way to perform lateral movements.</p>
<p>We also have challenges with traffic encryption. A large part of security in the new age of east to west traffic is the ability to have everything encrypted. It is the role of the application to perform the encryption.</p>
<p>In a microservices environment, key management is a difficult task. Besides, IPsec has a very coarse granularity. If you are looking for encryption with finer granularity in this environment then you need a new type of solution.</p>
<h2>Solution components: identity</h2>
<p>Workloads can be encapsulated in a number of ways such as a virtual machine (VM), bare metal or container. As a result, what’s required is a mechanism to provide a provable and secure identity to the application, not just to the server or container but also to the actual workload that is running. Ideally, identity can be a list of attributes. Think of them as the key-value pairs that describe an object to the level of detail that you want. Indeed, the more detail you have, the better.</p>
<p>The process of providing identity to a service is called identity bootstrapping. You have to trust something in order to provide application identity i.e. there needs to be an external source of truth. Companies such as AWS, VMware, and Octarine provide this by integrating with the orchestration system.</p>
<p>The orchestration system could be anything from vCenter to AWS ECS to Kubernetes. The system monitors events of new workloads being spawned. After validating that a newly spawned workload is legitimate, the workload is provided with the credentials it needs to prove its identity. This way, the secrets are never kept in the code, the container image, or Kubernetes.</p>
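<p>Identity bootstrapping can be sketched as follows. This is a toy illustration of the idea, not any vendor's actual protocol: the identity service trusts the orchestrator's record of what it spawned, and only workloads found there receive a signed credential (all names and the key are invented).</p>

```python
import hashlib
import hmac

# Hypothetical source of truth: workloads the orchestrator actually spawned.
ORCHESTRATOR_REGISTRY = {"payments-7f9c": {"service": "payments", "env": "prod"}}
SIGNING_KEY = b"demo-key-held-by-the-identity-service"

def bootstrap_identity(workload_id):
    """Issue credentials only to workloads the orchestrator can vouch for."""
    attrs = ORCHESTRATOR_REGISTRY.get(workload_id)
    if attrs is None:
        return None  # unknown workload: no credentials issued
    claim = f"{workload_id}|{attrs['service']}|{attrs['env']}"
    token = hmac.new(SIGNING_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "token": token}

def verify_identity(credential):
    expected = hmac.new(SIGNING_KEY, credential["claim"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["token"])

cred = bootstrap_identity("payments-7f9c")  # legitimate workload gets a credential
```

<p>The key point is that the secret never sits in the code or the image; the workload receives its credential only after the external source of truth confirms it is legitimate.</p>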
<h2>Solution components: visibility</h2>
<p>Once the identity is taken care of, we must create security based on the secured identity. How do you enforce policy and how does it get represented when you communicate it to something else?</p>
<p>Firstly, you need to rely on the application identity and monitoring traffic at layer 7. This is because on every API call the caller identifies oneself. You can add the identity on the client side and server side, validate the identity and log the API call to a central system.</p>
<p>The central system aggregates all API calls in all deployments for the customer&#8217;s environment and provides extensive visibility. This visibility extends over time to include the history of any changes. Such visibility is useful in agile environments.</p>
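<p>The client-side/server-side flow described above can be illustrated with a minimal sketch (the service names and the in-memory log are invented): the caller attaches its identity to each API call, and the server validates it and records the call for central aggregation.</p>

```python
AUDIT_LOG = []  # stands in for the central aggregation system

def call_api(caller_identity, endpoint, is_known):
    # Client side: every API call carries the caller's identity.
    request = {"identity": caller_identity, "endpoint": endpoint}
    # Server side: validate the identity, then log the call centrally.
    outcome = "allowed" if is_known(caller_identity) else "rejected"
    AUDIT_LOG.append({**request, "outcome": outcome})
    return "200 OK" if outcome == "allowed" else "403 Forbidden"

known_services = {"orders", "billing"}
ok = call_api("orders", "/v1/charge", lambda i: i in known_services)
denied = call_api("unknown-svc", "/v1/charge", lambda i: i in known_services)
```

<p>Because every call, allowed or rejected, lands in the central log, the aggregated history gives the layer-7 visibility the surrounding text describes.</p>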
<h2>Solution components: anomaly detection</h2>
<p>You must try as much as possible to enforce the policy at the endpoint. However, in order to detect sophisticated attacks, you sometimes have to correlate multiple signals, such as time of day, payloads, and geographic access patterns.</p>
<p>What is needed is an anomaly-detection component responsible for looking at all the signals at a given time, one that can recognize small deviations from the baseline that could not be detected by looking at a single endpoint.</p>
<h2>Solution components: policy</h2>
<p>In the past, two themes prevailed in policy &#8211; distributed ACL-based policy solutions and segmentation based on VLANs. You need to start thinking about policy as centrally administered but highly scalable, with enforcement distributed.</p>
<p>The policy should be based on workload identity, not network identity. With cloud-native, there is no alignment between the identity of a workload and its network identity, so you cannot enforce security through pre-defined network policies built on traditional mechanisms.</p>
<p>The policy should also be driven by visibility, enabling feedback about policy and information about violations. This would allow the administrators to update the policy as required.</p>
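<p>Identity-based policy can be sketched with attribute selectors. In this toy example (the labels and rule are hypothetical), a rule matches on workload attributes such as service and environment, and the IP address plays no role in the decision:</p>

```python
def matches(selector, workload):
    # A selector matches when all its key-value pairs appear in the workload.
    return all(workload.get(k) == v for k, v in selector.items())

def is_allowed(policy, source, destination):
    """Centrally administered rules, evaluated per call at the workload."""
    return any(matches(rule["from"], source) and matches(rule["to"], destination)
               for rule in policy)

policy = [
    {"from": {"service": "frontend"}, "to": {"service": "orders", "env": "prod"}},
]

frontend = {"service": "frontend", "env": "prod", "ip": "10.0.3.17"}
orders = {"service": "orders", "env": "prod", "ip": "10.0.9.4"}
# Same IP as frontend (e.g. a reused pod address), but a different identity:
batch = {"service": "batch", "env": "prod", "ip": "10.0.3.17"}
```

<p>Because enforcement keys on the workload's attributes rather than its network address, the rule survives rescheduling and IP reuse by the orchestrator.</p>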
<p>The promise of cloud-native applications and agile environments has many benefits for business. A cloud-native deployment left at its defaults lacks proper security tools and methodologies; with a guarded approach, however, you can achieve a secure, agile cloud-native environment.</p>
<p>The post <a href="https://www.aiuniverse.xyz/securing-microservice-environments-in-a-hostile-world/">Securing microservice environments in a hostile world</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/securing-microservice-environments-in-a-hostile-world/feed/</wfw:commentRss>
			<slash:comments>5</slash:comments>
		
		
			</item>
		<item>
		<title>Microservices and cloud-native development versus traditional development</title>
		<link>https://www.aiuniverse.xyz/microservices-and-cloud-native-development-versus-traditional-development/</link>
					<comments>https://www.aiuniverse.xyz/microservices-and-cloud-native-development-versus-traditional-development/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 27 Sep 2017 07:11:13 +0000</pubDate>
				<category><![CDATA[Microservices]]></category>
		<category><![CDATA[application development]]></category>
		<category><![CDATA[cloud-native]]></category>
		<category><![CDATA[software development]]></category>
		<category><![CDATA[traditional development]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=1288</guid>

					<description><![CDATA[<p>Source &#8211; ibm.com We’ve had a very good run for the last 20 years or so with distributed systems development in the enterprise, but time has started to <a class="read-more-link" href="https://www.aiuniverse.xyz/microservices-and-cloud-native-development-versus-traditional-development/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/microservices-and-cloud-native-development-versus-traditional-development/">Microservices and cloud-native development versus traditional development</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211;<strong> ibm.com</strong></p>
<p>We’ve had a very good run for the last 20 years or so with distributed systems development in the enterprise, but time has started to show some of the downsides of traditional development styles. First of all, distributed systems have grown enormous – it’s very common to see corporate websites for retailing or customer service with hundreds or thousands of discrete functions. Likewise, I’ve seen many Java EAR files at those customers whose sizes are measured in gigabytes. When you have a site that large, and one that may have been originally built 15 or more years ago, there are going to be parts of it that need to be updated to today’s business realities.</p>
<p>The second business challenge is that the pace of business change is much more rapid now than it was in the 90s and 2000s. Now that the cellphone replacement cycle is down to a year or less, and customers are constantly updating the apps on those phones, the idea that a corporate website can remain static for months at a time is simply not in touch with the times.</p>
<p>Taken together, these two trends create a challenge for traditional top-down enterprise development styles. Meeting it requires an approach that is both more customer-centric and able to react more quickly, and it also requires an architecture that is able to adapt to and facilitate these rapid changes.</p>
<p>Also, in the past, when development cycles were longer, waterfall-based methods were appropriate, or at least not as much of a hindrance as they are now. If you have the luxury of time, then the downsides of top-down approaches are less apparent. As a side effect, that led to the predominance of outsourcing: if you were going to define everything up front anyway, you might as well perform the programming work where the labor was cheapest. All of those trends are now being called into question.</p>
<p><strong>What do you see that suggests that microservices and cloud-native development can change this situation or help address these barriers?</strong></p>
<p>Let’s start with a definition of what microservices are. The microservices approach is a way of decomposing an application into modules with well-defined interfaces, each of which performs one and only one business function. These modules (or microservices) are independently deployed and operated by a small team that owns the entire lifecycle of the service.</p>
<p>The reason this is important is that it goes back to a very old principle in computer science identified by Fred Brooks – that adding people to a late project only makes it later by increasing the number of communication paths within the team.</p>
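<p>To make the definition concrete, here is a minimal sketch (not from the interview; the currency-conversion function, route shape and rate are illustrative placeholders) of a single-purpose service with one well-defined HTTP interface, using only the Python standard library:</p>

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

RATE_USD_TO_EUR = 0.9  # illustrative fixed rate, not real data


class ConvertHandler(BaseHTTPRequestHandler):
    """One business function (USD-to-EUR conversion), one HTTP interface."""

    def do_GET(self):
        # Route shape /convert/<amount> is an assumption for this sketch.
        try:
            amount = float(self.path.rsplit("/", 1)[-1])
        except ValueError:
            self.send_response(400)
            self.end_headers()
            return
        body = json.dumps({"eur": round(amount * RATE_USD_TO_EUR, 2)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet


def demo() -> dict:
    # Port 0 asks the OS for any free port, so the sketch is self-contained.
    server = HTTPServer(("127.0.0.1", 0), ConvertHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    try:
        with urlopen(f"http://127.0.0.1:{server.server_port}/convert/100") as resp:
            return json.loads(resp.read().decode())
    finally:
        server.shutdown()


result = demo()
```

<p>Because the service owns exactly one function behind a stable interface, a small team can deploy, scale and replace it without coordinating with the teams behind other services.</p>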
<p>Instead, microservices accelerate delivery by minimizing communication and coordination between people while reducing the scope and risk of change. Why is this important? Because in order to meet the rapidly changing pace of development, we have to limit the scope of what we are doing and increase the speed at which applications can be developed. Microservices help with that.</p>
<p>Another factor you can’t ignore is the importance of the cloud for deployment. The cloud is rapidly becoming the de facto standard for deploying new and modified applications. That has led to the rise of “cloud-native” application development approaches that take advantage of all of the facilities provided by the cloud, like elastic scaling, immutable deployments and disposable instances. When you write an application following the microservices architecture, it is automatically cloud-native. That is another factor accelerating adoption of the architectural approach.</p>
<p>Now, as great as microservices are, there are some downsides, or at least some places where they’re not always appropriate. We’ve found that while the microservice approach is perfect for what we at IBM call Systems of Interaction, it may not be the best approach for Systems of Record – especially those that already exist, change slowly, and offer no maintenance gains from refactoring or rewriting into microservices.</p>
<p><strong>What does IBM provide to help enterprises transform their systems to a microservices and cloud-native approach?</strong></p>
<p>We bring several things to the table that help our customers adopt the cloud-native and microservices approach. First and foremost is our open-standards-based cloud platform, IBM Cloud. You can’t overstate the importance of open standards when choosing a cloud platform, and IBM’s embrace of standards such as Cloud Foundry, Docker and Kubernetes makes it possible for you to develop not only for our cloud, but for on-premises private clouds and other vendors’ clouds as well, giving you unprecedented portability.</p>
<p>Second, we have the comprehensive IBM Cloud Garage Method. You can only be successful with cloud-native and microservices architectures if you build them within a methodological framework that includes the practices that make the approach viable: small, autonomous, co-located teams; test-driven development; and continuous integration and continuous delivery. Finally, we have our people, particularly in the IBM Cloud Garage. The Garage is our secret weapon in helping customers rapidly move to the cloud and microservices by showing them how to apply the method to build systems on Bluemix using all of the latest technologies, practices and approaches. You gain experience with those approaches and technologies at the very same time that you’re building a minimum viable product – the first step toward adopting the approach on all of your systems.</p>
<p><strong>There are still many enterprises that have concerns about cloud migration from either a security or performance point of view. How do you address these concerns?</strong></p>
<p>I’ve heard these concerns many times, and it comes down to the fact that neither security nor performance should be a barrier. It’s possible to build systems on the cloud that are more secure and more performant than current on-premises systems! The key is that you don’t build them the same way you build on-premises systems, and it’s that change that is actually difficult for teams to understand. For instance, with security, you have to implement security at every layer; you can’t be satisfied with securing only the front end of your applications, thinking that anything behind your firewall is safe by default. The extra attention makes the overall system much more secure. Likewise with performance: in the cloud you have to build systems that are horizontally scalable – that means developing algorithms that work with horizontally scalable systems – which end up not only scaling (and thus performing) better, but being more resilient as well.</p>
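<p>One building block of such horizontally scalable designs is routing work to instances deterministically instead of through shared state. A small sketch (illustrative, not from the interview; instance names are placeholders, and plain modulo sharding is used for brevity where real systems often use consistent hashing to limit reshuffling when nodes change):</p>

```python
import hashlib


def owner(key: str, instances: list[str]) -> str:
    """Deterministically map a request key to one stateless instance.

    Every node computes the same answer without consulting shared
    state, so capacity grows by adding instances (scaling out)
    rather than by buying a bigger machine (scaling up).
    """
    digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return instances[digest % len(instances)]


nodes = ["app-1", "app-2", "app-3"]
routes = {user: owner(user, nodes) for user in ("alice", "bob", "carol")}
```

<p>Because the mapping is a pure function of the key and the instance list, any node (or a load balancer) can compute it independently, and the loss of one instance affects only the keys it owned.</p>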
<p><strong>What advice can you give to enterprises who are outsourcing and waterfall-oriented and want to adopt agile processes and cloud-native development?</strong></p>
<p>The important thing is that you have to begin by changing your mindset. In many countries we’re starting to see a backlash against outsourcing, as firms realize that software assets are an important category of intellectual property that a firm should create and maintain on its own, just as it creates intellectual capital of other types as part of its core competency.</p>
<p>Software is everywhere now – the IoT and the Cloud now pervade every part of our lives, and any firm that thinks that writing software is outside of what they should be doing will find themselves quickly replaced in the market by more innovative firms that realize that the software is the critical factor – witness the demise of traditional taxicabs in the face of Uber and Lyft.</p>
<p>Once that first shift is made, then the second shift comes more easily. If you realize that building software is critical to your productivity and growth, then you want to build it as quickly as possible and to be able to try new things without having to wait months for a result. That leads directly to the Agile approach and away from a waterfall-based mindset that views software projects as large, multi-year capital expenditures. If you want to fully embrace Agile methods, then you need a technology base that facilitates that, and cloud-native approaches and microservices architectures give you that platform.</p>
<p>The post <a href="https://www.aiuniverse.xyz/microservices-and-cloud-native-development-versus-traditional-development/">Microservices and cloud-native development versus traditional development</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/microservices-and-cloud-native-development-versus-traditional-development/feed/</wfw:commentRss>
			<slash:comments>4</slash:comments>
		
		
			</item>
		<item>
		<title>Containers and microservices complicate cloud-native security</title>
		<link>https://www.aiuniverse.xyz/containers-and-microservices-complicate-cloud-native-security/</link>
					<comments>https://www.aiuniverse.xyz/containers-and-microservices-complicate-cloud-native-security/#comments</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 14 Sep 2017 07:20:56 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Microservices]]></category>
		<category><![CDATA[application security strategy]]></category>
		<category><![CDATA[cloud-native]]></category>
		<category><![CDATA[cloud-native security]]></category>
		<category><![CDATA[containers]]></category>
		<category><![CDATA[Docker containers]]></category>
		<category><![CDATA[software development]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=1112</guid>

					<description><![CDATA[<p>Source &#8211; theserverside.com There&#8217;s not much new in the world of malicious hackers raiding online software. Most attacks follow the same basic approach, and software developers are leaving <a class="read-more-link" href="https://www.aiuniverse.xyz/containers-and-microservices-complicate-cloud-native-security/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/containers-and-microservices-complicate-cloud-native-security/">Containers and microservices complicate cloud-native security</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source &#8211; theserverside.com</p>
<p>There&#8217;s not much new in the world of malicious hackers raiding online software. Most attacks follow the same basic approach, and software developers are leaving their applications open to being blindsided in the most benign and boring of ways. Developing applications with microservices and containers may be a modern approach to software design, but traditional software flaws still remain a problem when addressing cloud-native security.</p>
<p>Social engineering and phishing scams are perhaps the most common way security systems are breached and private data is pilfered. If a user inadvertently gives away their username and password, the only recourse is to change the password or shut down the user account. From that perspective, there&#8217;s not much the software engineer can do.</p>
<section class="section main-article-chapter" data-menu-title="Prioritizing cloud-native security">
<h3 class="section-title"><i class="icon" data-icon="1"></i>Prioritizing cloud-native security</h3>
<p>But not every data breach can be blamed on an end user, which is why developers must be vigilant when it comes to cloud-native security. According to Matt Rose, global director of application security strategy at Checkmarx, it&#8217;s commonplace for his software company&#8217;s static code analysis tools to identify places where input isn&#8217;t properly validated (making SQL injection a very plausible threat), administrative passwords are exposed in plain text, opportunities exist for buffer overruns, and private user information is inadvertently written to the file system.</p>
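<p>The unvalidated-input flaw Rose mentions is easy to demonstrate. The sketch below (the table, data and attack payload are illustrative, not from the article) contrasts string-spliced SQL with a parameterized query, using Python&#8217;s built-in sqlite3 module:</p>

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")


def find_user_unsafe(name: str):
    # Vulnerable: attacker-controlled input is spliced into the SQL text,
    # so crafted input can rewrite the query's logic.
    return conn.execute(
        f"SELECT role FROM users WHERE name = '{name}'"
    ).fetchall()


def find_user_safe(name: str):
    # Parameterized: the driver binds `name` strictly as a value,
    # never as SQL, so the same payload matches no rows.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()


payload = "nobody' OR '1'='1"
leaked = find_user_unsafe(payload)   # the OR clause matches every row
blocked = find_user_safe(payload)    # no user has this literal name
```

<p>Static analysis tools flag the first pattern precisely because the fix is mechanical: bind values through placeholders instead of string formatting.</p>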
<p>Software development teams are normally pretty good at tackling what they might consider severe threats or critical bugs, but sometimes, it&#8217;s the less severe bugs that can create the biggest problems, especially when an attacker can stack them on top of each other.</p>
<p>The reality is that, in this age of DevOps and cloud-native development, the software stack is more complex than ever, and when code is distributed across a multitude of microservices and layered upon multiple virtual machines (VMs) and Docker containers, security holes can be difficult to identify. &#8220;The complexity of the application is a major challenge to any development staff,&#8221; Rose said. &#8220;Once code is in production, hackers have an unlimited amount of time and resources to think about a way to leverage something a developer only had perhaps a week to program. You can be very versed in security and still miss things.&#8221;</p>
<section class="section main-article-chapter" data-menu-title="Securing containers and microservices">
<h3 class="section-title"><i class="icon" data-icon="1"></i>Securing containers and microservices</h3>
<p>Of course, it&#8217;s not all downside when it comes to securing a microservices-laden application and a Docker-heavy software stack. The reality is that a minimally built container can be far more secure than a full-blown VM, and when issues are identified, container orchestration tools are making it easier than ever to enforce cloud-native security by rolling out updates to each Docker instance.</p>
<p>&#8220;The way that containerization has progressed is it&#8217;s taken the whole cloud templating model and said, &#8216;Let&#8217;s have a golden master for a container, and that container itself should have just enough of an operating environment to actually be useful,'&#8221; said Tim Mackey of Black Duck Software. And since Docker separates the user space upon which installed software runs from the kernel, the attack surface is much smaller when compared to VMs or applications running on bare metal.</p>
<p>And when problems do occur with software hosted by a container &#8212; or even the container itself &#8212; implementing a cloud-native security fix isn&#8217;t as cumbersome as one might think. &#8220;Because these containers can spin up very quickly &#8212; and by extension, spin down very quickly,&#8221; Mackey said, &#8220;if I need to patch them, then I can very easily build a rolling upgrade that is minimally disruptive.&#8221;</p>
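<p>The minimally disruptive rolling upgrade Mackey describes is what orchestrators such as Kubernetes automate. A hedged sketch of a Deployment that replaces patched containers a few at a time (all names and the image tag are placeholders):</p>

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-service        # placeholder name
spec:
  replicas: 4
  selector:
    matchLabels:
      app: example-service
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1        # at most one replica down during the rollout
      maxSurge: 1              # at most one extra replica above the desired count
  template:
    metadata:
      labels:
        app: example-service
    spec:
      containers:
        - name: app
          image: registry.example.com/app:1.0.1   # the patched image
```

<p>Updating the image tag triggers the rollout: the orchestrator spins up patched containers, waits for them to become ready, and only then retires the old ones.</p>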
<p>As containers and microservices come to dominate the world of DevOps, software developers must remain diligent, which means writing robust code that meets basic security standards while also addressing problems when they arise and implementing bug fixes for even the least critical issues. And when problems do occur, rolling out a cloud-native security update across a sea of containers and microservices will be a relatively pain-free process.</p>
</section>
</section>
<p>The post <a href="https://www.aiuniverse.xyz/containers-and-microservices-complicate-cloud-native-security/">Containers and microservices complicate cloud-native security</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/containers-and-microservices-complicate-cloud-native-security/feed/</wfw:commentRss>
			<slash:comments>5</slash:comments>
		
		
			</item>
	</channel>
</rss>
