<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>General Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/general/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/general/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Wed, 16 Jun 2021 05:05:33 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>HOW TO CREATE AN ARTIFICIAL INTELLIGENCE GENERAL TECHNOLOGY PLATFORM</title>
		<link>https://www.aiuniverse.xyz/how-to-create-an-artificial-intelligence-general-technology-platform/</link>
					<comments>https://www.aiuniverse.xyz/how-to-create-an-artificial-intelligence-general-technology-platform/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 16 Jun 2021 05:05:31 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Create]]></category>
		<category><![CDATA[General]]></category>
		<category><![CDATA[platform]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14343</guid>

					<description><![CDATA[<p>Source &#8211; https://www.bbntimes.com/ “AI” is becoming a construct that has been the subject of increasing attention in technology, media, business, industry, government and civil life during recent <a class="read-more-link" href="https://www.aiuniverse.xyz/how-to-create-an-artificial-intelligence-general-technology-platform/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-to-create-an-artificial-intelligence-general-technology-platform/">HOW TO CREATE AN ARTIFICIAL INTELLIGENCE GENERAL TECHNOLOGY PLATFORM</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.bbntimes.com/</p>



<p><em>“AI” is becoming a construct that has been the subject of increasing attention in technology, media, business, industry, government and civil life during recent years.</em></p>



<p><em>Today&#8217;s AI is the subject of controversy. You might have heard about narrow/weak, general/strong/human-level, and super artificial intelligence, or about machine learning, deep learning, reinforcement learning, supervised and unsupervised learning, neural networks, Bayesian networks, NLP, and a whole lot of other confusing terms, all dubbed AI techniques.</em></p>



<p><em>Many of the rule- and logic-based systems that were previously considered Artificial Intelligence are no longer called AI. In contrast, systems that analyze and find patterns in data are dubbed machine learning, widely promoted as the dominant form of AI.</em></p>



<h2 class="wp-block-heading">What is Wrong with Today&#8217;s AI, Its Chips and Platforms?</h2>



<p>Much of the confusion comes from anthropomorphic Artificial Intelligence (AAI): the simulation of the human brain using artificial neural networks, as if they were substitutes for the biological neural networks in our brains. A neural network is made up of neural nodes (functional units) that work together and can be called upon to execute a model.</p>
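The node-based picture above can be sketched in a few lines of Python (an illustrative two-layer model; the layer sizes and random weights are hypothetical, not drawn from any real system):

```python
import numpy as np

def relu(x):
    """Simple node activation: pass positive signals, zero out the rest."""
    return np.maximum(0.0, x)

def forward(x, w1, b1, w2, b2):
    """Execute the model: two layers of nodes working together."""
    hidden = relu(x @ w1 + b1)  # first bank of neural nodes
    return hidden @ w2 + b2     # output nodes

rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)  # 4 inputs -> 8 hidden nodes
w2, b2 = rng.normal(size=(8, 2)), np.zeros(2)  # 8 hidden -> 2 outputs
y = forward(rng.normal(size=(1, 4)), w1, b1, w2, b2)
```

Calling the network on an input "executes the model", which is exactly the workload an AAI chip is built to accelerate.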



<p>Thus, the main purpose in 2021 is to provide a conceptual framework that defines Machine Intelligence and Learning (MI). The first step in creating MI is to understand its nature against the main research questions: why, what, who, when, where, and how.</p>



<p>So, today&#8217;s AI is better described to people as AAI, augmented intelligence, or advanced statistics, not as artificial intelligence or machine intelligence.</p>



<p>Now, what about the levels of AAI applications, tools, and platforms?</p>



<p>Let&#8217;s focus only on &#8220;AAI chips&#8221;, which form the brain of an AAI system, replace CPUs and GPUs, and are where most progress still has to be made.</p>



<p>While GPUs are typically better than CPUs for AI processing, they often fall short, because they are specialized for computer graphics and image processing, not neural networks.</p>



<p>The AAI industry needs specialized processors to enable efficient processing of AAI applications, modelling, and inference. As a result, chip designers are now working to create specialized processing units.</p>



<p>These come under many names, such as NPU, TPU, DPU, SPU etc., but a catchall term can be the AAI processing unit (AAI PU), forming the brain of an AAI System on a chip (SoC).</p>



<p>It is complemented by:</p>

<ul class="wp-block-list"><li>The neural processing unit, or matrix multiplication engine, where the core operations of an AAI SoC are carried out.</li><li>Controller processors, based on RISC-V, ARM, or custom-logic instruction set architectures (ISA), to control and communicate with all the other blocks and the external processor.</li><li>SRAM.</li><li>I/O.</li><li>The interconnect fabric between the processors (AAI PU, controllers) and all the other modules on the SoC.</li></ul>



<p>The AAI PU was created to execute ML algorithms, typically by operating on predictive models such as artificial neural networks. AAI PUs are usually classified as either training or inference chips, and the two phases are generally performed independently.</p>
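The two phases can be illustrated with a toy NumPy model (purely a sketch; a real AAI PU runs the same phases on specialized matrix hardware, and every number here is invented):

```python
import numpy as np

# --- Training phase: fit weights to example data with gradient descent ---
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))          # training inputs
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w                         # training targets

w = np.zeros(3)
for _ in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(X)  # gradient of mean squared error
    w -= 0.1 * grad

# --- Inference phase: apply the frozen weights to a new input ---
prediction = np.array([1.0, 1.0, 1.0]) @ w
```

Training is the iterative, compute-heavy loop; inference is a single cheap pass with frozen weights, which is why the two workloads are served by different chips.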



<p>AAI PUs are generally required for the following:</p>



<ul class="wp-block-list"><li>Accelerating the computation of ML tasks severalfold (by some claims, up to 10,000 times) compared to GPUs.</li><li>Consuming less power and improving resource utilization for ML tasks compared to GPUs and CPUs.</li></ul>



<p>Unlike that of CPUs and GPUs, the design of single-purpose AAI SoCs is far from mature.</p>



<p>Specialized AI chips deal with specialized ANNs and are designed to do two things with them: task-specific training and inference, for tasks such as facial recognition, gesture recognition, natural language processing, image searching, and spam filtering.</p>



<p>In all, there are {Cloud, Edge} &#215; {Training, Inference} chips for AAI models of specific tasks. Examples of (Cloud + Training) chips include NVIDIA&#8217;s DGX-2 system, which totals 2 petaFLOPS of processing power across 16 NVIDIA V100 Tensor Core GPUs, and Intel Habana&#8217;s Gaudi chip; this is the class of hardware used to train models such as Facebook&#8217;s photo tagging or Google Translate.</p>



<p>(Cloud + Inference) chips include Qualcomm&#8217;s Cloud AI 100, a family of large chips used for AAI in massive cloud datacentres, Alibaba&#8217;s Hanguang 800, and Graphcore&#8217;s Colossus MK2 GC200 IPU.</p>



<p>These (Cloud + Inference) chips run the models that companies like Facebook (for photo tagging) and Google (for Translate) have already trained, processing the data you input using those models. Other examples include the chips behind AAI chatbots and most AAI-powered services run by large technology companies.</p>



<p>Examples of (Edge + Inference) on-device chips include Kneron&#8217;s own KL520 and recently launched KL720, which are lower-power, cost-efficient chips designed for on-device use, as well as Intel&#8217;s Movidius and Google&#8217;s Coral TPU.</p>



<p>All of these different types of chips, training or inference, and their different implementations, models, and use cases are expected to power the AAI of Things (AAIoT) future.</p>



<h2 class="wp-block-heading">How to Make a True Artificial Intelligence Platform</h2>



<p>In order to create platform-neutral&nbsp;software&nbsp;that operates on the world&#8217;s data/information/content and can run and display properly on any type of computer, cell phone, device, or technology platform, the following are required:</p>



<ul class="wp-block-list"><li>Operating Systems.</li><li>Computing/Hardware/Cloud Platforms.</li><li>Database Platforms.</li><li>Storage Platforms.</li><li>Application Platforms.</li><li>Mobile Platforms.</li><li>Web Platforms.</li><li>Content Management Systems.</li></ul>



<p>The AI programming language should act as both a general programming language and a computing platform. Its applications could be launched on any operating system and hardware, from mobile operating systems such as Linux or Android to hardware platforms ranging from game consoles to supercomputers and quantum machines.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-to-create-an-artificial-intelligence-general-technology-platform/">HOW TO CREATE AN ARTIFICIAL INTELLIGENCE GENERAL TECHNOLOGY PLATFORM</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-to-create-an-artificial-intelligence-general-technology-platform/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Machine learning could help Apple Maps fix bogus GPS coordinates</title>
		<link>https://www.aiuniverse.xyz/machine-learning-could-help-apple-maps-fix-bogus-gps-coordinates/</link>
					<comments>https://www.aiuniverse.xyz/machine-learning-could-help-apple-maps-fix-bogus-gps-coordinates/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 14 Feb 2020 07:24:36 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Apple Maps]]></category>
		<category><![CDATA[General]]></category>
		<category><![CDATA[Google]]></category>
		<category><![CDATA[GPS]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[patent]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=6762</guid>

					<description><![CDATA[<p>Source: appleinsider.com While GPS is a widely-used technology for geolocation, one that is especially useful for navigation while driving, it isn&#8217;t necessarily as accurate as it could <a class="read-more-link" href="https://www.aiuniverse.xyz/machine-learning-could-help-apple-maps-fix-bogus-gps-coordinates/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/machine-learning-could-help-apple-maps-fix-bogus-gps-coordinates/">Machine learning could help Apple Maps fix bogus GPS coordinates</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: appleinsider.com</p>



<p>While GPS is a widely-used technology for geolocation, one that is especially useful for navigation while driving, it isn&#8217;t necessarily as accurate as it could be. Mapping applications like Apple Maps occasionally show the wrong location for the user for a variety of reasons.</p>



<p>These issues can include interference in the GPS signal caused by trees and mountains, going underground or indoors, signals reflecting off buildings in a city, solar storms, and even rare cases of radio interference or jamming.</p>



<p>These problems aren&#8217;t just limited to GPS, as other Global Navigational Satellite Systems (GNSS) such as Glonass, Galileo, Beidou, and others can suffer from the same issues.</p>



<p>In the patent application published on Thursday by the US Patent and Trademark Office, Apple has come up with &#8220;Machine learning-assisted satellite-based positioning.&#8221; In short, it is a way to analyze GPS data by comparing it against data acquired by a machine-learning model.</p>



<p>The idea is that the device receives its estimated position based on a GNSS signal, then acquires a set of parameters associated with that estimated position. A reference position, close to the device&#8217;s estimated position, is then provided to help with correction.</p>



<p>A machine learning model is then generated based on the estimated device position, the reference position, and a set of parameters. This model is then used to estimate the device&#8217;s location for future GPS readings, until a period of time has elapsed or the device has moved to an area where the parameters and the model are inaccurate.</p>



<p>In effect, the device generates the model using the two sets of positioning data to determine how far out from its real position its received GPS coordinates are. For example, in a city with tall buildings, the model could be informed the signal could be reflected, and take that into account along with previous position readings and the general direction of transit to work out a more accurate position based on misguided data.</p>



<p>Apple has included extra claims to take into account the use of a second device, including providing the model to others for use and storage. A Kalman filter, which can estimate a state from a collection of noisy measurements, is also suggested for use, as is accounting for &#8220;an amount of uncertainty&#8221; in measurements and subsequent positions, and alerting the user that the revised position either takes the GPS data into account or disregards it.</p>
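A one-dimensional Kalman filter of the kind the filing suggests can be sketched as follows (the constants and readings are invented for illustration; the patent does not disclose its actual model or parameters):

```python
def kalman_1d(measurements, process_var=1e-3, meas_var=4.0):
    """Fuse a stream of noisy position readings into a smoothed estimate."""
    estimate, error = measurements[0], 1.0
    for z in measurements[1:]:
        error += process_var                 # predict: uncertainty grows over time
        gain = error / (error + meas_var)    # how much to trust the new reading
        estimate += gain * (z - estimate)    # update: blend reading into estimate
        error *= (1.0 - gain)                # uncertainty shrinks after the update
    return estimate

# Readings scattered around a true position of roughly 10.0
smoothed = kalman_1d([9.2, 10.6, 10.1, 9.7, 10.4])
```

The gain term is what "accounting for an amount of uncertainty" looks like in practice: noisy readings move the estimate only a little, while confident ones move it more.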



<p>The filing lists its inventors as Benjamin A. Werner, Brent M. Ledvina, Dennis P. Hilgenberg, and Aarti Sathyanarayana.</p>



<p>Apple files numerous patent applications every week, and though the filings indicate areas of interest for Apple&#8217;s research and development teams, they are no guarantee that the concepts will appear in a future product or service.</p>



<p>Apple has been keen to increase its work on machine learning in recent years, including hiring senior Google AI scientist and noted AI expert Ian Goodfellow in 2019 and acquiring firms like Drive.ai and Laserlike. The majority of its public-facing ML work is with Siri, and that too has seen some location-aware improvements.</p>



<p>In August 2018, Apple detailed its use of geographic language models to improve Siri&#8217;s knowledge of local terminology and locations, helping reduce errors in point of interest-based searches by 18.7 percent.</p>
<p>The post <a href="https://www.aiuniverse.xyz/machine-learning-could-help-apple-maps-fix-bogus-gps-coordinates/">Machine learning could help Apple Maps fix bogus GPS coordinates</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/machine-learning-could-help-apple-maps-fix-bogus-gps-coordinates/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How Far Are We From Achieving Artificial General Intelligence?</title>
		<link>https://www.aiuniverse.xyz/how-far-are-we-from-achieving-artificial-general-intelligence/</link>
					<comments>https://www.aiuniverse.xyz/how-far-are-we-from-achieving-artificial-general-intelligence/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Tue, 11 Jun 2019 10:21:31 +0000</pubDate>
				<category><![CDATA[Human Intelligence]]></category>
		<category><![CDATA[Achieving]]></category>
		<category><![CDATA[Far]]></category>
		<category><![CDATA[General]]></category>
		<category><![CDATA[Intelligence]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=3707</guid>

					<description><![CDATA[<p>Source:- forbes.com These days, when you browse the internet for news on artificial intelligence, you’ll find out about new AI that just managed to do something humans do, yet far <a class="read-more-link" href="https://www.aiuniverse.xyz/how-far-are-we-from-achieving-artificial-general-intelligence/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-far-are-we-from-achieving-artificial-general-intelligence/">How Far Are We From Achieving Artificial General Intelligence?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Source:- forbes.com</p>
<p>These days, when you browse the internet for news on artificial intelligence, you’ll find out about new AI that just managed to do something humans do, yet far better. Present day AI can detect cancers better than human doctors, build better AI algorithms than human developers, and beat the world champions at games like chess and Go. Instances like these may lead us to believe that perhaps there&#8217;s not a whole lot that artificial intelligence cannot do better than us humans. The realization of AI’s superior and ever-improving capabilities in different fields has evoked both hope and caution from the global tech community as well as the general public. While many believe the rise of artificial general intelligence can massively benefit humanity by raising our standard of living and status as a civilization, some believe the development may lead to global doom.</p>
<p>While the debate on whether the development of artificial general intelligence or artificial superintelligence is promising or pernicious rages on, the jury on <em>when</em> such advanced forms of AI will come into existence is also still out. These are important questions that do deserve the coverage and debate they are subjected to. However, before worrying about the future of AI, it is necessary to first know what artificial general intelligence exactly is, what it would take to achieve it, and how far existing AI capabilities are from getting there.</p>
<h2>What is the current state of artificial intelligence?</h2>
<p>The internet abounds with stories of stunning applications that exist today, the culmination of years of artificial intelligence research. Similar to the aforementioned example of AI systems that can diagnose cancers with greater accuracy than human doctors, there are many other fields where specialized artificial intelligence is replicating human-like reasoning and cognition.</p>
<p>For instance, deep learning algorithms used by social media sites are becoming increasingly adept at recognizing objects, people, and even detailed characteristics of these objects and people. Modern computer vision technology driven by deep learning can now identify people in images posted to social media, the position of the person in the image, their expressions, and any accessories they might be wearing. This gives AI systems the ability to perceive images similar to the way humans do. These systems can go beyond simply identifying people from images and even analyze subtle patterns to discern non-obvious attributes. One example is a Stanford University study that shows how deep neural networks can identify people’s sexual orientation just by analyzing their faces &#8212; an ability that is highly unlikely to be present in humans.</p>
<p>Another instance of AI systems performing human-like feats is natural language processing (NLP), where AI can understand speech or text delivered in natural language. AI is becoming proficient in understanding the meaning of text and speech as part of applications such as chatbots and virtual assistants in smartphones (think of Siri, Cortana, etc.). And advancements in natural language generation, the production of information in normal human language, are being used in numerous applications where machines are required to respond to people&#8217;s voice or text input.</p>
<p>With such developments, the gap between human intelligence and artificial intelligence seems to be diminishing at a rapid rate. This might give you the impression that powerful artificial intelligence systems or artificial general intelligence systems may not be too far out in the future. However, it is vital to understand that it takes more than just performing specific tasks better than humans to qualify as artificial general intelligence.</p>
<h2>What exactly is artificial general intelligence?</h2>
<p>Put simply, Artificial General Intelligence (AGI) can be defined as the <em>ability of a machine to perform any task that a human can. </em>Although the aforementioned applications highlight the ability of AI to perform tasks with greater efficacy than humans, they are not generally intelligent, i.e., they are exceedingly good at only a single function while having zero capability to do anything else. Thus, while an AI application may be as effective as a hundred trained humans in performing one task, it can lose to a five-year-old kid at any other task. For instance, computer vision systems, although adept at making sense of visual information, cannot translate and apply that ability to other tasks. On the contrary, a human, although sometimes less proficient at performing these functions, can perform a broader range of functions than any of the existing AI applications of today.</p>
<p>While an AI has to be trained with massive volumes of training data for any function it needs to perform, humans can learn with significantly fewer learning experiences. Additionally, humans &#8212; and (perhaps one day) agents with artificial general intelligence &#8212; can generalize better, applying the learnings from one experience to other similar experiences. An agent having artificial general intelligence will not only learn with relatively less training data but will also apply the knowledge gained from one domain to another. For example, an AGI agent that has been trained to process one language using NLP can potentially learn other languages with shared roots and similar syntax. Such a capability will make the learning process of artificially intelligent systems similar to that of humans, drastically reducing the time for training while enabling the machine to gain multiple areas of competency.</p>
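This kind of transfer can be sketched with a toy linear model (entirely illustrative; the tasks, data, and step counts are invented): starting from weights learned on a related task gets closer to the new target in the same number of steps than starting from scratch.

```python
import numpy as np

def train_linear(X, y, w0, steps=50, lr=0.1):
    """Plain gradient descent on mean squared error."""
    w = w0.copy()
    for _ in range(steps):
        w -= lr * 2 * X.T @ (X @ w - y) / len(X)
    return w

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
w_a = np.array([1.0, 2.0, 0.0, 0.0, -1.0])   # "task A"
w_b = w_a + 0.1 * rng.normal(size=5)         # closely related "task B"

learned_a = train_linear(X, X @ w_a, np.zeros(5))     # learn task A fully

# Transfer: initialize task B from task A's weights instead of from zero,
# and allow only a few training steps for both starting points.
from_scratch = train_linear(X, X @ w_b, np.zeros(5), steps=5)
transferred = train_linear(X, X @ w_b, learned_a, steps=5)

err_scratch = np.linalg.norm(from_scratch - w_b)
err_transfer = np.linalg.norm(transferred - w_b)
```

With the same small training budget, the transferred initialization lands much closer to task B's true weights, which is the "learn with fewer experiences" property the paragraph describes.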
<h2>Can AI ever achieve general intelligence?</h2>
<p>Artificial intelligence systems, especially artificial general intelligence systems, are designed with the human brain as their reference. Since we ourselves don&#8217;t have comprehensive knowledge of our brains and their functioning, it is hard to model and replicate their workings. However, the creation of algorithms that can replicate the complex computational abilities of the human brain is theoretically possible, as suggested by the Church-Turing thesis, which states &#8212; in simple words &#8212; that given infinite time and memory, any kind of problem can be solved algorithmically. This makes sense since deep learning and other subsets of artificial intelligence are basically a function of memory, and having infinite (or a large enough amount of) memory can mean that problems of the highest possible levels of complexity can be solved using algorithms.</p>
<h2>How far are we from artificial general intelligence?</h2>
<p>Although it might be theoretically possible to replicate the functioning of a human brain, it is not practicable as of now. Thus, capability-wise, we are leaps and bounds away from achieving artificial general intelligence. However, time-wise, the rapid rate at which AI is developing new capabilities means that we might get close to the inflection point when the AI research community surprises us with the development of artificial general intelligence. Some experts have predicted that artificial general intelligence could be achieved as early as 2030, while a recent survey of AI experts put the expected emergence of AGI, or the singularity, at the year 2060.</p>
<p>Thus, although in terms of capability, we are far from achieving artificial general intelligence, the exponential advancement of AI research may culminate in the invention of artificial general intelligence within our lifetime or by the end of this century. Whether the development of AGI will be beneficial for humanity or not is still up for debate and speculation. So is the exact estimate on the time it will take for the emergence of the first real-world AGI application. But one thing is for sure &#8212; the development of AGI will trigger a series of events and irreversible changes (good or bad) that will reshape the world and life as we know it, forever.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-far-are-we-from-achieving-artificial-general-intelligence/">How Far Are We From Achieving Artificial General Intelligence?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-far-are-we-from-achieving-artificial-general-intelligence/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
