<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>digital revolution Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/digital-revolution/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/digital-revolution/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Mon, 29 Jun 2020 06:23:39 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>National Insurance Awareness Day 2020: How technology is changing insurance industry</title>
		<link>https://www.aiuniverse.xyz/national-insurance-awareness-day-2020-how-technology-is-changing-insurance-industry/</link>
					<comments>https://www.aiuniverse.xyz/national-insurance-awareness-day-2020-how-technology-is-changing-insurance-industry/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 29 Jun 2020 06:23:36 +0000</pubDate>
				<category><![CDATA[Internet of Things]]></category>
		<category><![CDATA[Big data]]></category>
		<category><![CDATA[digital revolution]]></category>
		<category><![CDATA[software]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=9821</guid>

					<description><![CDATA[<p>Source: financialexpress.com Industries across the globe have undergone a digital revolution, but the way people buy insurance remained much the same for decades, until the recent <a class="read-more-link" href="https://www.aiuniverse.xyz/national-insurance-awareness-day-2020-how-technology-is-changing-insurance-industry/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/national-insurance-awareness-day-2020-how-technology-is-changing-insurance-industry/">National Insurance Awareness Day 2020: How technology is changing insurance industry</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: financialexpress.com</p>



<p>Industries across the globe have undergone a digital revolution, but the way people buy insurance remained much the same for decades, until the recent COVID-19 outbreak. Since the outbreak, people have been browsing health insurance and life insurance policies online to secure their futures and life goals, bringing the insurance sector a level of digital disruption previously seen mostly in areas such as retail and banking.</p>



<p>The pandemic upheaval has changed people’s outlook on insurance. It has made people value its importance and grasp the consequences of going without coverage. At the same time, it has made insurers realize the importance of technological innovation. Increasingly, smartphones and computers are putting power in the hands of buyers, who want access to information as well as the tools to analyze and purchase insurance products through a variety of channels. Price transparency and easy access are also commoditizing what were once complex products sold only through insurance agents.</p>



<p>Simultaneously, developments in technologies such as AI, big data, and blockchain are helping established insurers and insurtechs develop new ways of buying insurance, led by the needs of the consumer rather than by insurance companies’ need to sell policies. Savvy insurance companies are using new technologies to serve customers better. To start, more insurers are bypassing complex core IT systems by investing in software-as-a-service applications, which they use for operations, distribution, HR administration, and commission processing, among other tasks. These tools will profoundly change the way the insurance sector works, including by automating some traditionally manual tasks.</p>



<p>Insurance companies need to automate the majority of their traditional back-office operations. Technologies like digital applications and advanced-analytics engines are further transforming those operations.</p>



<p>Depending on the line of coverage, these capabilities can streamline initial information gathering and document review, letting clients help themselves during the underwriting, servicing, and claims processes.</p>



<p>To modernize their operations, insurers should reorganize by forming interdisciplinary teams, closely integrating the technology and operations groups, and assembling tools that capture consumers’ preferences. In addition, insurers should build the skills to work with external service providers, hire talent versed in using the relevant technology to augment operations, and ensure their organizational cultures encourage and cultivate experimentation. Changing the way operations work is a big undertaking requiring significant effort, but the shift would improve profitability in the long run.</p>



<p>Also, advances in AI (Artificial Intelligence) and ML (Machine Learning) are allowing incumbents to automate increasingly complex tasks, including handling all forms of customer queries. AI and ML are progressively being used to identify fraud and process automatic payouts for small claims, besides offering digital self-service damage assessments.</p>



<p>For insurance companies with more established technology capabilities, a host of Internet of Things (IoT) technologies can help reduce manual intervention in claims and pricing. Additional applications of IoT could avert insurance losses on roads, at work, and in homes and businesses by allowing insurers to perform robust risk calculations using up-to-date data.</p>



<p>Today, insurers need to reimagine how they engage with consumers and how they curate, consume, and integrate an ecosystem of information and services to upsell and cross-sell insurance products more effectively. To satisfy today’s consumers, they must offer straightforward products that are easy to purchase, and they must adapt their business processes accordingly.</p>



<p>Technology is changing the insurance industry inside out. As the trend continues in the coming years, insurers will need to reinvent themselves to capitalize on these tools. Doing so will let them operate more efficiently, with a stronger emphasis on consumer experience, even in grim situations like a pandemic.</p>
<p>The post <a href="https://www.aiuniverse.xyz/national-insurance-awareness-day-2020-how-technology-is-changing-insurance-industry/">National Insurance Awareness Day 2020: How technology is changing insurance industry</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/national-insurance-awareness-day-2020-how-technology-is-changing-insurance-industry/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Microservices, The Machine Tools Of The Digital Revolution</title>
		<link>https://www.aiuniverse.xyz/microservices-the-machine-tools-of-the-digital-revolution/</link>
					<comments>https://www.aiuniverse.xyz/microservices-the-machine-tools-of-the-digital-revolution/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 23 Oct 2019 07:38:17 +0000</pubDate>
				<category><![CDATA[Microservices]]></category>
		<category><![CDATA[digital revolution]]></category>
		<category><![CDATA[machine]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4809</guid>

					<description><![CDATA[<p>Source: forbes.com Fortunately, in reality they’re not all that strange, and certainly not without precedent. It helps to think about their business purpose, and how much they’re <a class="read-more-link" href="https://www.aiuniverse.xyz/microservices-the-machine-tools-of-the-digital-revolution/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/microservices-the-machine-tools-of-the-digital-revolution/">Microservices, The Machine Tools Of The Digital Revolution</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: forbes.com</p>



<p>Fortunately, in reality microservices aren’t all that strange, and they’re certainly not without precedent.</p>



<p>It helps to think about their business purpose, and how much they’re like previous important advances in business technology. That requires a bit of abstraction, a nice word for simplifying, based on historical analogies.</p>



<p>As with many things in business, the secret to understanding these cloud computing technologies and techniques lies in establishing how their rise relates to supply and demand, the most fundamental elements of any market. With business technology, it’s also good to search for ways that an expensive and cumbersome process is being automated to hasten the delivery of value.</p>



<p>Microservices are elements of a larger software application that can be decoupled from the whole application. They can be updated or redeployed without having to take down, change, and then relaunch the whole application. Service meshes control how these parts interact, both with each other and with other services.</p>
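<p>The decoupling described above can be sketched in miniature. The sketch below is purely illustrative (the <code>ServiceMesh</code> class and the “payments” service are hypothetical names invented for this example, not any particular product): a router dispatches calls by service name, so one service can be redeployed without touching its callers.</p>

```python
# Illustrative sketch: mesh-style routing between independently
# replaceable services. All names here are hypothetical.

class ServiceMesh:
    """Routes calls by service name, so a service can be swapped
    out (redeployed) without its callers changing at all."""

    def __init__(self):
        self._services = {}

    def register(self, name, handler):
        # Deploying (or redeploying) a service is just re-registering it.
        self._services[name] = handler

    def call(self, name, payload):
        return self._services[name](payload)


mesh = ServiceMesh()

# "payments" v1: flat $1.00 processing fee
mesh.register("payments", lambda order: order["amount"] + 1.00)
v1_total = mesh.call("payments", {"amount": 10.0})

# Redeploy "payments" as v2 (2% fee) without restarting any caller
mesh.register("payments", lambda order: order["amount"] * 1.02)
v2_total = mesh.call("payments", {"amount": 10.0})
```

<p>A production service mesh adds discovery, retries, and encryption between networked services, but the business point is the same: the “payments” tool can be upgraded on its own schedule.</p>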



<p>Can we get more complex than this? Oh yes. There are containers, sidecars, APIs, monoliths, service calls, SOA, CI/CD, and IaC, just for starters. But don’t overfixate—these are important technology elements, but they all serve the common business purposes of efficiency, speed, agility, and automated management.</p>



<p>Think of each microservice as a tool from a toolbox. At one time, tools were custom made, and were used to custom-make machines. For the most part, these machines were relatively simple, usually designed for a few basic purposes. They were individually constructed, no two exactly alike, and that limited both the building and the fixing of them.</p>



<p>Then with the advent of standardized measurement and industrial expansion, we got precision-made machine tools capable of much more reuse and wider deployment. These tools made it possible to vastly increase the supply of products at a lower cost, exposing them to greater demand. Those standardized machine tools were more complex than their predecessors. And they enabled a boom in standardized re-use, a simpler model overall.</p>



<p>It’s the same with microservices: the elements are often more complex, but the overall process allows for standardized reuse, through the management of service meshes. The “tool” in this case is software that carries out a function, for example handling online payments or creating security verifications.</p>



<p>T-Mobile uses microservices to increase the frequency of software releases from quarterly to as often as daily, making the company more responsive to the market. PwC Australia built applications that can respond to changes in some customer behaviors while keeping core operations running.</p>



<p>Extrapolating from this analogy, does the boom in microservices tell us that the computational equivalent of the Industrial Revolution is underway? Is this an indication of standardization that makes it vastly easier and faster to create objects and experiences, revolutionizes cost models, and shifts industries and fortunes?</p>



<p>Without getting too grandiose about it, yes. As with the Industrial Revolution, there will probably be a period where older artisans (in this case, traditional IT teams) have to get accustomed to the new method, but over time the big change will happen.</p>



<p>You see it around you already: in the creation of companies that come out of nowhere to invent and capture big markets, and in workforce transformations that allow work and product creation to be decoupled. That is, not accidentally, much the way microservices decouple from larger applications in order to tweak particulars without interrupting larger operations. Since change has become easier, you also see it in the importance of data in determining how things are consumed, and in how rapidly businesses reconfigure what they make and what they offer.</p>



<p>And you see it in the way businesses are re-evaluating how they apportion and manage work. Nothing weird about that; we did it in much bigger ways during the Industrial Revolution.</p>



<p>It’s understandable how the complexity of tech generates anxiety among many of its most promising consumers. Typically, a feature of business computing begins as scarce, difficult-to-grasp knowledge. Its power and utility speed its evolution, often faster than software developers can socialize it or the general public can learn it. Not that long ago, spreadsheets and email were considered weird too, for these same reasons.</p>



<p>To move ahead, though, it’s important to recognize big, meaningful changes, and abstract their meaning into something logical and familiar. At a granular level, microservices may be complex, but their function is very straightforward and critical for successful businesses.</p>
<p>The post <a href="https://www.aiuniverse.xyz/microservices-the-machine-tools-of-the-digital-revolution/">Microservices, The Machine Tools Of The Digital Revolution</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/microservices-the-machine-tools-of-the-digital-revolution/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Computer Science Curriculums Must Emphasize Privacy Over Capability</title>
		<link>https://www.aiuniverse.xyz/computer-science-curriculums-must-emphasize-privacy-over-capability/</link>
					<comments>https://www.aiuniverse.xyz/computer-science-curriculums-must-emphasize-privacy-over-capability/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 05 Aug 2019 12:54:32 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[antithetical]]></category>
		<category><![CDATA[computer science]]></category>
		<category><![CDATA[data availability]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[digital revolution]]></category>
		<category><![CDATA[future technology]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=4273</guid>

					<description><![CDATA[<p>Source: forbes.com The idea of privacy is in many ways antithetical to the data-driven mindset promoted by computer science curriculums over the past decade. Massive advances in <a class="read-more-link" href="https://www.aiuniverse.xyz/computer-science-curriculums-must-emphasize-privacy-over-capability/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/computer-science-curriculums-must-emphasize-privacy-over-capability/">Computer Science Curriculums Must Emphasize Privacy Over Capability</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: forbes.com</p>



<p>The idea of privacy is in many ways antithetical to the data-driven mindset promoted by computer science curriculums over the past decade. Massive advances in data availability and analytic capability have led many curriculums to be reshaped around coursework teaching how to manage, explore, assess, understand, and exploit these newfound digital riches. Deep learning and other analytics courses are frequently filled beyond capacity. In contrast, the idea that programmers should give up their data riches in the name of privacy receives little attention in many curriculums.</p>



<p>Today’s computer science curriculums have coalesced around preparing tomorrow’s technology leaders to harness the digital revolution. From managing “big data” to making sense of it through deep learning, coursework has heavily emphasized the positives of today’s digital deluge rather than the considerable harm it can wreak on privacy, safety, and security.</p>



<p>Privacy was historically far too often lumped under the heading of cybersecurity and relegated to an afterthought. Privacy violations were largely viewed in the context of companies losing control of customer data, rather than deliberately harnessing that data in privacy-invading manners.</p>



<p>A company that harvested massive amounts of data from its customers, held onto it without any form of cyber intrusion and lawfully resold that data to others was often viewed as privacy-protecting, since it successfully safeguarded the data in its hands from loss.</p>



<p>An increasing number of programs have begun to integrate some form of ethics training into their curriculums. Yet here the focus has largely been on programmers themselves, emphasizing how they should think about what they do with users’ data and how to address biases in their designs. Such courses may cover topics like AI explainability to mitigate inadvertent demographic bias, and consideration of algorithmic harm, such as why building an AI system that can forcibly uncover members of vulnerable communities could cause grave harm and thus should not be pursued even if it represents a great technical achievement.</p>



<p>Some programs teach compliance with privacy laws like GDPR, but this guidance typically revolves around “minimal minimization,” in which data collection is adjusted to fit the letter of the law, if not its spirit, utilizing the laws’ myriad loopholes and exemptions.</p>



<p>By contrast, library and information science curriculums have historically emphasized privacy and civil liberties, especially the minimization of data collection. Unlike the digital behemoths trailing us and vacuuming up every byte of our online behavior and interests, libraries have historically adopted precisely the opposite stance, keeping only the bare minimum of information they need and deleting data at the first moment they can.</p>



<p>Libraries have historically had enormous insights into our most intimate and unfettered interests, often recording our information consumption from the first children’s books our parents checked out of the library to read to us as infants. If libraries kept this information they could build incredible personalized recommendation systems and generate lucrative revenue streams reselling that data or making it available for advertising.</p>



<p>Instead, most public libraries have practiced absolute minimization in which every data point is discarded the moment it is no longer needed. Rather than keeping a user’s entire checkout history through time, most public libraries have historically kept a list only of the items currently checked out, deleting them as soon as the materials are returned.</p>
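<p>The practice described above can be sketched as a data structure. The <code>CheckoutLedger</code> class below is hypothetical, invented for this example; it shows how keeping only the currently open loans, and deleting each record at check-in, leaves no borrowing history to resell or subpoena.</p>

```python
# Illustrative sketch of absolute data minimization: store only
# what is operationally necessary, delete it the moment it isn't.

class CheckoutLedger:
    """Tracks only items currently checked out, never a history."""

    def __init__(self):
        self._open = {}  # patron id -> set of item ids on loan

    def check_out(self, patron, item):
        self._open.setdefault(patron, set()).add(item)

    def check_in(self, patron, item):
        # The record is destroyed as soon as the item returns.
        self._open[patron].discard(item)
        if not self._open[patron]:
            del self._open[patron]  # no empty stub left behind

    def current(self, patron):
        return self._open.get(patron, set())


ledger = CheckoutLedger()
ledger.check_out("p1", "book-42")
ledger.check_out("p1", "book-7")
during = set(ledger.current("p1"))
ledger.check_in("p1", "book-42")   # this loan is now unrecoverable
after = set(ledger.current("p1"))
```

<p>Nothing in the structure can answer “what has this patron ever read?”, which is precisely the point.</p>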



<p>This minimization was born of necessity: libraries, especially in the pre-Internet era, were of great interest to surveillance-minded authorities.</p>



<p>Computer science curriculums, however, have not historically emphasized this idea of minimization at all costs. Quite the opposite: data hoarding has been embraced as the path to limitless riches. After all, even if you pay for a service today, you are still the product, as data exhaust becomes more valuable than subscription fees.</p>



<p>Privacy naturally conflicts with capability when it comes to data analytics. The more data there is, and the higher its resolution, the more insight algorithms can yield. Thus, the more companies prioritize privacy, actively deleting everything they can and minimizing the resolution of what they must collect, the less capability their analytics can offer.</p>



<p>This represents a philosophical tradeoff. On the one hand, computer science students are taught to collect every data point they can, at the highest resolution they can, and to hoard it indefinitely. This extends all the way to things like diagnostic logging, which often becomes an all-or-nothing affair and has led even major companies to serious security breaches. On the other hand, disciplines like library and information science emphasize privacy over capability, getting rid of data the moment it is safe to do so.</p>



<p>When it comes to government surveillance, data breaches, ethically questionable research, insider threats, and other privacy issues, the less data companies keep about their users, the less information there is that can be misused and the lower their storage and analytic costs. If a company can make do with a terabyte of aggregated data rather than a petabyte of individual-level, high-resolution data, it can achieve considerable cost savings and move many of its batch analyses to real time.</p>
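<p>The terabyte-versus-petabyte point comes down to aggregating before storing. In the toy sketch below (field names are invented for illustration), individual-level records are reduced to per-page counts before anything is persisted, so no user identifier survives:</p>

```python
from collections import Counter

# Hypothetical individual-level event records (illustrative fields)
events = [
    {"user": "u1", "page": "/quote", "ms": 1032},
    {"user": "u2", "page": "/quote", "ms": 980},
    {"user": "u1", "page": "/claim", "ms": 2210},
]

# Aggregate before storing: keep page counts, drop user ids entirely
page_counts = Counter(event["page"] for event in events)
# Only page_counts would be persisted; the raw events are discarded.
```

<p>The aggregate still answers the analytic question (“which pages get traffic?”) while carrying nothing that could identify a user.</p>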



<p>In the end, rather than enshrining mottos like “data is the new oil” in the vocabulary of tomorrow’s technology leaders, perhaps we should emphasize “privacy first” and focus on how companies can truly minimize the data they collect, to ensure a more privacy-protecting and less Orwellian future.</p>
<p>The post <a href="https://www.aiuniverse.xyz/computer-science-curriculums-must-emphasize-privacy-over-capability/">Computer Science Curriculums Must Emphasize Privacy Over Capability</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/computer-science-curriculums-must-emphasize-privacy-over-capability/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
