<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>complexity Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/complexity/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/complexity/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Fri, 19 Mar 2021 06:34:58 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>ARTIFICIAL INTELLIGENCE AND ITS COMPLEXITY: BREAKING THE ICE</title>
		<link>https://www.aiuniverse.xyz/artificial-intelligence-and-its-complexity-breaking-the-ice/</link>
					<comments>https://www.aiuniverse.xyz/artificial-intelligence-and-its-complexity-breaking-the-ice/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 19 Mar 2021 06:34:57 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[BREAKING]]></category>
		<category><![CDATA[complexity]]></category>
		<category><![CDATA[ICE]]></category>
		<category><![CDATA[training]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13615</guid>

					<description><![CDATA[<p>Source &#8211; https://www.analyticsinsight.net/ Training an Artificial Intelligence model is similar to teaching a child Artificial Intelligence has changed our lives for better. Be it in the form <a class="read-more-link" href="https://www.aiuniverse.xyz/artificial-intelligence-and-its-complexity-breaking-the-ice/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-and-its-complexity-breaking-the-ice/">ARTIFICIAL INTELLIGENCE AND ITS COMPLEXITY: BREAKING THE ICE</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.analyticsinsight.net/</p>



<h2 class="wp-block-heading">Training an Artificial Intelligence model is similar to teaching a child</h2>



<p>Artificial Intelligence has changed our lives for the better. Be it robots, self-driving cars, or voice-based assistants like Alexa and Siri, we have seen it all. Without a doubt, AI is the one technology that emulates human intelligence to take on tasks that earlier could only be performed by humans. Machines can now learn and put the knowledge they gain to the best possible use, and many tasks once thought uniquely human are now performed using AI.</p>



<p>There are several aspects to Artificial Intelligence, and just as many fields within this splendid technology. Some that have garnered attention and appreciation from every corner of the world are natural language processing (NLP), computer vision, and deep learning. Deep learning is itself a subfield of machine learning, which mainly revolves around analysing data and making predictions from the analysed data. Needless to say, much of this still relies heavily on human supervision.</p>
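

<p>As a concrete, purely illustrative example of that analyse-and-predict loop, the sketch below trains a simple classifier with scikit-learn. The iris dataset and the model choice are this article’s own illustration, not part of any research discussed here.</p>


<pre class="wp-block-code"><code># A minimal sketch of the analyse-data, make-predictions loop,
# using scikit-learn's toy iris dataset (illustrative only).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)        # learn from the analysed data
print(model.predict(X_test[:5]))   # predict labels for unseen samples
print(model.score(X_test, y_test)) # human supervision: check accuracy</code></pre>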



<p>SMU Assistant Professor of Information Systems Sun Qianru explains how training an Artificial Intelligence model closely resembles the way parents teach their child to identify objects.</p>



<h4 class="wp-block-heading"><strong>AI and its complexity</strong></h4>



<p>Given the complexity associated with Artificial Intelligence, Professor Sun’s research mainly focuses on:</p>



<ul class="wp-block-list">
<li>Meta learning</li>
<li>Semi-supervised learning</li>
<li>Deep convolutional neural networks</li>
<li>Incremental learning</li>
</ul>



<p>Well, not just that: the research also covers applying all of these techniques to recognizing images and videos.</p>



<p>The research, “Fast-Adapted Neural Networks (FANN) for Advanced AI Systems”, is currently in its early stage. It centres on computer vision, using algorithms that rely on convolutional neural networks (CNNs), with image recognition and image processing among the areas under scrutiny. The work is funded by the Agency for Science, Technology and Research (A*STAR).</p>
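

<p>For readers unfamiliar with CNNs, the sketch below shows a minimal convolutional network of the kind such computer vision work builds on. It is purely illustrative: the layer sizes, input shape and class count are assumptions of this article, not details of FANN.</p>


<pre class="wp-block-code"><code># A minimal sketch of a convolutional neural network for image
# recognition; all dimensions here are illustrative.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # learn local filters
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 down to 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 16x16 down to 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

logits = SmallCNN()(torch.randn(4, 3, 32, 32))  # batch of four 32x32 RGB images
print(logits.shape)                             # torch.Size([4, 10])</code></pre>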



<p>FANN’s hypothesis is that a reasoning level of model adaptation can be built on top of statistical-level knowledge learning. Here is what the research covers:</p>



<p><strong>•&nbsp;</strong>Given how complex AI is, Sun’s research stresses how critical it is to train AI models in line with the current trends in the field.</p>



<p><strong>•&nbsp;</strong>Training a model to yield accurate recognition results takes an immense amount of data. Sun cites face recognition as an example: if there is just one face available for the system to learn from, how could it distinguish that face from all others? Only when an adequate amount of data comes into play, with other faces also used in training, can the model succeed at distinguishing them. To learn the differences, the model needs a large body of data to rely on.</p>



<p><strong>•&nbsp;</strong>That said, the fact that machine learning models have the potential to identify global features cannot be overlooked. These models encode the available data in ways that produce the desired identification results, recognizing patterns in images, text or sound, and they rely on deep neural network architectures with many layers.</p>



<p><strong>•&nbsp;</strong>Sun’s research takes two main observations into account: some machine learning models are trained on a labelled data set, and the best-performing AI models are all based on deep learning. The research addresses how such models are built first to represent the data and then to classify it.</p>



<p><strong>•&nbsp;</strong>The professor also describes how some models are updated whenever a prediction turns out to be wrong.</p>



<p><strong>•&nbsp;</strong>There is yet another project Sun is working on: a food-related application for Singapore’s Health Promotion Board. The idea behind the app is to give users a fair knowledge of the nutritional value of the food they consume, information they can use to lead a healthier lifestyle. All users have to do is take pictures of the food they are eating, and the relevant information appears right there on their smartphones.</p>



<p><strong>•&nbsp;</strong>This, however, is where the complexity begins. When training the model, her team had introduced a limited set of categories. But as more and more different photos were taken, the categories needed to expand, and the category list had to be updated and modified in the Application Programming Interface (API) on a regular basis (a naive version of this expansion step is sketched after this list).</p>



<p><strong>•&nbsp;</strong>Singapore’s rich diversity posed a further challenge for the team: a different place brings a different culture. Hence, the team has to pay extra attention to training its models with effective learning algorithms.</p>



<p><strong>•&nbsp;</strong>All this calls not only for diverse data collection but also for developing different adaptation learning algorithms. The complexity is certainly real, and the team plans to deal with it by making the most of small data sets.</p>
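

<p>To make the category-expansion point concrete, below is a naive sketch of growing a classifier head when new food categories appear. It is a toy under stated assumptions, not Professor Sun’s actual incremental learning algorithm.</p>


<pre class="wp-block-code"><code># A naive sketch of expanding a classifier when new categories appear.
# Sizes and the approach itself are illustrative assumptions.
import torch
import torch.nn as nn

def expand_classifier(head: nn.Linear, extra_classes: int) -&gt; nn.Linear:
    """Return a new head covering the old classes plus extra_classes new ones."""
    new_head = nn.Linear(head.in_features, head.out_features + extra_classes)
    with torch.no_grad():
        # keep the already-trained weights for the old categories
        new_head.weight[: head.out_features] = head.weight
        new_head.bias[: head.out_features] = head.bias
    return new_head

head = nn.Linear(128, 20)          # model initially knows 20 food categories
head = expand_classifier(head, 5)  # 5 new dishes show up in user photos
print(head.out_features)           # 25</code></pre>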



<p>With this research, Sun and her team aim for high robustness and computational efficiency, especially in image recognition. The team is confident the outcomes will offer plenty of benefits, the key ones being a great improvement in yield rate and a reduction in manufacturing costs, which would play a pivotal role when fast-adapted inspection devices undergo installation, fabrication, and testing.</p>
<p>The post <a href="https://www.aiuniverse.xyz/artificial-intelligence-and-its-complexity-breaking-the-ice/">ARTIFICIAL INTELLIGENCE AND ITS COMPLEXITY: BREAKING THE ICE</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/artificial-intelligence-and-its-complexity-breaking-the-ice/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>UNRAVELLING A NEW ALGORITHM CAPABLE OF REDUCING THE COMPLEXITY OF DATA</title>
		<link>https://www.aiuniverse.xyz/unravelling-a-new-algorithm-capable-of-reducing-the-complexity-of-data/</link>
					<comments>https://www.aiuniverse.xyz/unravelling-a-new-algorithm-capable-of-reducing-the-complexity-of-data/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 18 Mar 2021 06:23:03 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[algorithm]]></category>
		<category><![CDATA[CAPABLE]]></category>
		<category><![CDATA[complexity]]></category>
		<category><![CDATA[data]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[REDUCING]]></category>
		<category><![CDATA[UNRAVELLING]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13588</guid>

					<description><![CDATA[<p>Source &#8211; https://www.analyticsinsight.net/ The new algorithm is an effective machine learning tool that is capable of extracting the desired information Big data, evidently, is too large to <a class="read-more-link" href="https://www.aiuniverse.xyz/unravelling-a-new-algorithm-capable-of-reducing-the-complexity-of-data/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/unravelling-a-new-algorithm-capable-of-reducing-the-complexity-of-data/">UNRAVELLING A NEW ALGORITHM CAPABLE OF REDUCING THE COMPLEXITY OF DATA</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.analyticsinsight.net/</p>



<h2 class="wp-block-heading">The new algorithm is an effective machine learning tool that is capable of extracting the desired information</h2>



<p>Big data, evidently, is too large to be processed using conventional data processing tools and techniques. The majority of information systems produce data in quantities so huge that they are difficult to measure. The complex big data that organizations have to deal with is characterized by huge volume, high value, high variability, high velocity, wide variety, and low veracity.</p>



<p>Scientific experiments are another area that generates huge amounts of data. Over the years, researchers have developed highly efficient ways to plan, conduct, and assess research, built on a combination of computational, algorithmic, statistical and mathematical techniques. Whenever a scientific experiment is conducted, the results are usually transformed into numbers, and all of this ultimately produces huge datasets.</p>



<p>Such big data is not easy to handle, and extracting meaningful insights from it is trickier still. This is why every possible method of reducing the size of the data is being employed and tested. Today, different types of algorithms are used to shrink the data and to extract its principal features and insights, ultimately throwing light on the most critical part of the data: its statistical properties. On the downside, the fact that certain algorithms cannot be applied directly to such large volumes of big data cannot be overlooked.</p>



<p>With many researchers and programmers devising ways to handle this humongous big data optimally, Reza Oftadeh, a doctoral student in the Department of Computer Science and Engineering at Texas A&amp;M University, has taken a step in that direction too. Reza developed an algorithm that, according to him, is an effective machine learning tool because it is capable of extracting the desired information. Reza and his team, which comprises a couple of other doctoral students and some assistant professors, published their work in the proceedings of the 2020 International Conference on Machine Learning. The research was funded by the National Science Foundation and a U.S. Army Research Office Young Investigator Award.</p>



<p>There is a fair chance that the data set under consideration has high dimensionality, meaning it has a lot of features, and the problem associated with that is the model’s ability to generalize. This is why so much effort goes into reducing the dimensionality of the data. Once the areas needing dimensionality reduction are identified, annotated samples are prepared to make further analysis easier, and tasks such as classification, visualization and modelling enjoy a smoother workflow as well.</p>
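

<p>The classical baseline for this kind of dimensionality reduction is principal component analysis (PCA). The minimal sketch below, using made-up data sizes, shows the basic workflow.</p>


<pre class="wp-block-code"><code># A minimal sketch of classical dimensionality reduction with PCA;
# the data here is random and purely illustrative.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 50))   # 1,000 samples with 50 features

pca = PCA(n_components=5)         # keep the 5 principal directions
X_small = pca.fit_transform(X)    # project the data down to 5 features

print(X_small.shape)                       # (1000, 5)
print(pca.explained_variance_ratio_.sum()) # fraction of variance retained</code></pre>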



<p>This is not the first time such algorithms and methodologies have been put in place; they have been around for quite some time. But with big data growing exponentially, analysing it has become not just time-consuming but complicated, which brought Artificial Neural Networks (ANNs) to the fore as one of the greatest innovations the world has seen on the technical front. Artificial neural networks are made up of many artificial neurons whose task is to extract meaningful information from the dataset provided. In simple terms, ANNs are models with a well-defined architecture of many interconnected artificial neurons, designed to simulate how the human brain analyses and processes data. ANNs have seen numerous applications so far, and the one that sets them apart is their capacity to classify big data into different categories based on its features.</p>



<p>Asked for his views, Reza began by noting how much we rely on ANNs in day-to-day life, quoting the examples of Alexa, Siri and Google Translate, which are trained to understand what a person is saying. However, he also noted that not all the features a model learns are equally significant, supporting the point with a specific type of ANN called an “autoencoder”. An autoencoder cannot tell where the features are located, nor which features are more critical than the rest, he added, and running the model repeatedly does not solve this, since that too is time-consuming.</p>
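

<p>For reference, a minimal autoencoder looks like the sketch below. It learns a compressed code of the data, but nothing in the plain reconstruction loss orders the code’s units by importance, which is exactly the limitation described above. All sizes are illustrative.</p>


<pre class="wp-block-code"><code># A minimal autoencoder sketch: compress, then reconstruct.
# Nothing here ranks the code units by importance.
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, n_features=50, n_code=5):
        super().__init__()
        self.encoder = nn.Linear(n_features, n_code)  # compress
        self.decoder = nn.Linear(n_code, n_features)  # reconstruct

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = AutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(256, 50)  # toy batch of 256 samples

for step in range(100):
    loss = nn.functional.mse_loss(model(x), x)  # plain reconstruction loss
    opt.zero_grad()
    loss.backward()
    opt.step()</code></pre>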



<p>Reza and his team aim to take their algorithm to the next level altogether. They plan to add a new cost function to the network which makes it possible to provide the exact location of the features. To test this, they ran an Optical Character Recognition (OCR) experiment: the team trained their machine learning model to convert images of both typed and handwritten text from digitized physical documents into machine-encoded text. Once trained for OCR, the model can tell which features are the important ones that must be prioritized. They claim their machine learning tool will cater to bigger datasets as well, resulting in improved data analysis.</p>
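

<p>The team’s actual cost function is not reproduced here. The toy sketch below, reusing the autoencoder from the previous sketch, only illustrates the general idea of an extra cost term that imposes an importance ordering on learned features, by also penalising reconstructions built from just the first k code units.</p>


<pre class="wp-block-code"><code># A toy illustration of an ordering-inducing cost term: earlier code
# units must carry more information, because reconstructions from only
# the first k units are also penalised. This is an illustrative
# stand-in, NOT the actual cost function from the team's ICML 2020 paper.
import torch
import torch.nn.functional as F

def ordered_reconstruction_loss(model, x):
    code = model.encoder(x)
    loss = F.mse_loss(model.decoder(code), x)   # full reconstruction
    for k in range(1, code.shape[1]):
        masked = code.clone()
        masked[:, k:] = 0.0                     # keep only the first k units
        loss = loss + F.mse_loss(model.decoder(masked), x)
    return loss</code></pre>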



<p>As of now, the algorithm this group of researchers has come up with can handle one-dimensional data samples only, but the team intends to extend its capabilities to deal with even more complex, unstructured data. The team is ready to face whatever challenges come their way and to push the algorithm as far as possible. They will also work on generalizing their method, the reason being to provide a unified framework from which other machine learning methods can be produced. Ultimately, the objective remains to extract features while dealing with a smaller set of specifications.</p>
<p>The post <a href="https://www.aiuniverse.xyz/unravelling-a-new-algorithm-capable-of-reducing-the-complexity-of-data/">UNRAVELLING A NEW ALGORITHM CAPABLE OF REDUCING THE COMPLEXITY OF DATA</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/unravelling-a-new-algorithm-capable-of-reducing-the-complexity-of-data/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Low-code development combats microservices complexity</title>
		<link>https://www.aiuniverse.xyz/low-code-development-combats-microservices-complexity/</link>
					<comments>https://www.aiuniverse.xyz/low-code-development-combats-microservices-complexity/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 22 Jan 2020 07:56:10 +0000</pubDate>
				<category><![CDATA[Microservices]]></category>
		<category><![CDATA[complexity]]></category>
		<category><![CDATA[Development]]></category>
		<category><![CDATA[Low-Code]]></category>
		<category><![CDATA[software]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=6305</guid>

					<description><![CDATA[<p>Source: searchapparchitecture.techtarget.com Microservices should simplify software development. In theory, we can stitch microservices together with a top layer, assembling applications out of components. The promise isn&#8217;t new, <a class="read-more-link" href="https://www.aiuniverse.xyz/low-code-development-combats-microservices-complexity/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/low-code-development-combats-microservices-complexity/">Low-code development combats microservices complexity</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: searchapparchitecture.techtarget.com</p>



<p>Microservices should simplify software development. In theory, we can stitch microservices together with a top layer, assembling applications out of components. The promise isn&#8217;t new, but it is worth examining.</p>



<p>Ivar Jacobson proposed the idea of software components, like circuit-board components, in 1967. He wanted to make building software more like assembling prefabricated blocks of code and less like creating everything from scratch. Yet, something happens every time we get close to Jacobson&#8217;s component dream. The components require new layers of complexity. In microservices, this means additional programming to ensure uptime, reliability and observability. We aim for easy-to-assemble building blocks, but end up with a great deal of code in reality.</p>



<p>One emerging alternative that knocks out the microservices complexity problem is low-code development, a way to turn actions and data into applications with minimal conventional programming. In some ways, microservices and low-code development solve similar problems. Low-code development platforms emerged as a way to build apps with out-of-the-box, standardized components, using prebuilt templates. Low-code does not provide the development sophistication of microservices. It can, however, work in situations where managing microservices leads to more effort than the benefit promised by that architecture.</p>



<p>Once software developers and architects understand these microservices complexity issues, they can determine how and when low-code development platforms provide a viable workaround.</p>



<h4 class="wp-block-heading">Microservices&#8217; code, resiliency and uptime problems</h4>



<p>Microservices communicate with one another independently over internet standards, which is what makes the architecture powerful. Because they speak TCP/IP and deliver data payloads in JSON, the components can work together without being tightly bound to one another&#8217;s internals. These small services each perform one task well. A company can have a set of services for customer information, another for product lookup, a third for orders and a fourth for delivery.</p>
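

<p>A minimal sketch of one such service, a product-lookup microservice speaking HTTP and JSON, might look like the following Python snippet. The route, data and port are invented for illustration.</p>


<pre class="wp-block-code"><code># A minimal sketch of a product-lookup microservice: HTTP in, JSON out.
# The route, stand-in data and port are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)

PRODUCTS = {"sku-1": {"name": "widget", "price": 9.99}}  # stand-in database

@app.route("/products")
def get_product():
    sku = request.args.get("sku", "")
    product = PRODUCTS.get(sku)
    if product is None:
        return jsonify(error="unknown sku"), 404
    return jsonify(product)  # a JSON payload another service can consume

if __name__ == "__main__":
    app.run(port=5000)</code></pre>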



<p>But breaking things down along business functions means there is a lot of code to manage. When something goes wrong, there may be an entire chain of events to debug. Microservices require logging and monitoring work that sits outside the idea of simple components, and that creates an explosion of code.</p>



<p>When something goes wrong, figuring out which component contributed to the issue can be tricky without the right tools, which, again, means more code. So while each individual service may show high uptime, resilience and reliability at the code level start to crumble.</p>



<h4 class="wp-block-heading">The alternative in low-code</h4>



<p>With low-code development, the platform builds and delivers the building blocks for an application.</p>



<p>The developers, or even business unit representatives, provide the variables, database connection, formatting and styling, and the tool generates the application, sometimes via a drag-and-drop UI. Provide the data, and a low-code development platform can even build a database. This is, in effect, the opposite approach to microservices, where you provide the database, abstract it into a service, and code the service logic.</p>



<p>Low-code development platforms generally reuse a great deal of common code, yielding a massive reduction in lines of code, perhaps even hundreds to one. There are also fewer independent components, and those that remain are tightly coupled to each other. Each line of code becomes more powerful and traceable.</p>



<p>The simplest example of a low-code development platform might be a database with a Create, Read, Update and Delete (CRUD) front end. A programmer can sketch out the table visually and fill in data through a simple spreadsheet. Users can access the data through the CRUD front end, which lives in an application that the platform generates and that can be downloaded from an app store.</p>
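

<p>Behind the scenes, the generated CRUD layer amounts to little more than the following sketch; the table and fields are invented for illustration.</p>


<pre class="wp-block-code"><code># A minimal sketch of the CRUD layer a low-code platform generates;
# the contacts table and its fields are illustrative assumptions.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE contacts (id INTEGER PRIMARY KEY, name TEXT)")

# Create
con.execute("INSERT INTO contacts (name) VALUES (?)", ("Ada",))
# Read
print(con.execute("SELECT id, name FROM contacts").fetchall())
# Update
con.execute("UPDATE contacts SET name = ? WHERE id = ?", ("Ada L.", 1))
# Delete
con.execute("DELETE FROM contacts WHERE id = ?", (1,))
con.commit()</code></pre>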



<p>A low-code platform does nearly everything that conventionally is coded for an application; most of the work for adopters is in tool configuration. As long as the app is simple, clean and doesn&#8217;t require many integration points, a low-code development platform might be the right alternative to a more complex microservices build. Low-code builds are an easy choice for applications that don&#8217;t need to integrate with other databases or that use a series of small tables. Examples include conference apps or one-time marketing promotions that run with user ID information.</p>



<p>Low-code development does not replace microservices. Once you need to share information between applications, in real time, microservices become the right strategy. But the low-code approach helps developers steer clear of over-engineering apps that don&#8217;t need it.</p>
<p>The post <a href="https://www.aiuniverse.xyz/low-code-development-combats-microservices-complexity/">Low-code development combats microservices complexity</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/low-code-development-combats-microservices-complexity/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
