<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>network Archives - Artificial Intelligence</title>
	<atom:link href="https://www.aiuniverse.xyz/tag/network/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.aiuniverse.xyz/tag/network/</link>
	<description>Exploring the universe of Intelligence</description>
	<lastBuildDate>Mon, 14 Jun 2021 05:08:56 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>NEURAL NETWORK – EVERYTHING FROM THE SCRATCH!</title>
		<link>https://www.aiuniverse.xyz/neural-network-everything-from-the-scratch/</link>
					<comments>https://www.aiuniverse.xyz/neural-network-everything-from-the-scratch/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Mon, 14 Jun 2021 05:08:54 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[everything]]></category>
		<category><![CDATA[network]]></category>
		<category><![CDATA[NEURAL]]></category>
		<category><![CDATA[SCRATCH]]></category>
		<guid isPermaLink="false">https://www.aiuniverse.xyz/?p=14248</guid>

					<description><![CDATA[<p>Source &#8211; https://www.analyticsinsight.net/ With data being the foundation for almost all the objectives to accomplish, no matter what the industry type is – having the right technologies in place <a class="read-more-link" href="https://www.aiuniverse.xyz/neural-network-everything-from-the-scratch/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/neural-network-everything-from-the-scratch/">NEURAL NETWORK – EVERYTHING FROM THE SCRATCH!</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.analyticsinsight.net/</p>



<p>With <strong>data</strong> being the foundation of almost every business objective, regardless of industry, having the right technologies in place to achieve those aims is the need of the hour. Organizations rely on several tools and technologies to make the best possible use of the <strong>data</strong> available to them. Where <strong>data</strong> is concerned, some terms come up on a regular basis – artificial intelligence, big <strong>data</strong>, <strong>data</strong> science, <strong>data</strong> analytics, and so on. Yet another commonly used term is <strong>neural network</strong>(s). Gaining a deep understanding of this concept can be a little challenging. Well, not anymore! Keep reading to learn about <strong>neural network</strong>s in detail – from the basics right through to their applications in the magical world of <strong>data</strong>!</p>



<p>The concept of&nbsp;<strong>neural network</strong>s is not something that emerged just a year or two ago – it has been around for decades. What led to the immense popularity of&nbsp;<strong>neural network</strong>s is the performance they deliver at the end of the day. The moment these networks delivered close to human-like performance on a majority of tasks is when they garnered attention from every corner of the world. From pattern recognition to machine learning – you name it, and&nbsp;<strong>neural network</strong>s have got you covered. These networks form the base of algorithms that help in predicting consumer demand and estimating freight arrival times.</p>



<h3 class="wp-block-heading"><strong>What are neural networks?</strong></h3>



<p>So what exactly do we mean by&nbsp;<strong>neural network</strong>s? A&nbsp;<strong>neural network</strong>&nbsp;is a collection of connected units or nodes called&nbsp;<strong>neurons</strong>. The connections are called edges. These&nbsp;<strong>neurons</strong>&nbsp;loosely model the&nbsp;<strong>neurons</strong>&nbsp;in the brain. Each of these&nbsp;<strong>neurons</strong>&nbsp;can transmit a signal to other&nbsp;<strong>neurons</strong>, and the neuron that receives the signal does the processing. What is to be noted here is that this signal is a real number, and a neuron&#8217;s output is a function of the sum of its inputs. One of the most interesting features of&nbsp;<strong>neural network</strong>s is that the&nbsp;<strong>neurons</strong>&nbsp;and edges typically have a weight that adjusts as learning proceeds. As a weight increases or decreases, the strength of the signal becomes stronger or weaker.</p>
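<p>As an illustration of this description, the behaviour of a single neuron – a weighted sum of real-valued inputs passed through an activation function – can be sketched in a few lines of Python. This is a minimal, library-free sketch for intuition, not the implementation of any particular framework; the input values and weights below are arbitrary:</p>

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: the output is a function (here, a sigmoid)
    of the weighted sum of its real-valued input signals."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Larger weights strengthen the transmitted signal; smaller ones weaken it.
weak_signal = neuron([1.0, 0.5], weights=[0.1, 0.1], bias=0.0)
strong_signal = neuron([1.0, 0.5], weights=[2.0, 2.0], bias=0.0)
```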



<h3 class="wp-block-heading"><strong>Layers of a neural network</strong></h3>



<p>Typically, a&nbsp;<strong>neural network</strong>&nbsp;is constructed from three types of layers. The first is the input layer – the layer that receives the&nbsp;<strong>data</strong>&nbsp;fed into the network. The second stage consists of the hidden layers; it is here that the whole computation is done. The last is the output layer, which produces the output for a given set of inputs. Signals make their way from the first (input) layer to the last (output) layer. In some cases, the signals may traverse these layers multiple times.</p>
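<p>The flow through the three layer types can be sketched as a chain of fully connected layers. This is only an illustration of the idea – the weights and inputs below are arbitrary example values, not taken from any real model:</p>

```python
import math

def layer(inputs, weights, biases):
    """One fully connected layer: each output neuron applies a sigmoid
    to the weighted sum of all inputs from the previous layer."""
    return [1.0 / (1.0 + math.exp(-(sum(x * w for x, w in zip(inputs, row)) + b)))
            for row, b in zip(weights, biases)]

x = [0.2, 0.7]                                        # input layer receives the data
h = layer(x, [[0.5, -0.3], [0.8, 0.1]], [0.0, 0.0])   # hidden layer does the computation
y = layer(h, [[1.2, -0.6]], [0.0])                    # output layer produces the result
```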



<h3 class="wp-block-heading"><strong>How do neural networks work?</strong></h3>



<p><strong>Neural network</strong>s are trained by processing examples. Each example has a known “input” and a known “result”, both of which are stored within the&nbsp;<strong>data</strong>&nbsp;structure of the&nbsp;<strong>neural network</strong>&nbsp;itself. While a&nbsp;<strong>neural network</strong>&nbsp;is being trained, there is a difference between the predicted output and the actual output – this is called the error. The network then adjusts itself on the basis of a programmed rule, so that the result it delivers moves closer to the target output. Once sufficient adjustments have been made, the training is terminated.</p>
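<p>That loop – predict, measure the error, adjust by a programmed rule – can be sketched for the simplest possible model: a single weight updated by gradient descent. The examples and learning rate below are illustrative values only:</p>

```python
# Known examples pair an "input" with a "result"; the target rule is y = 2x.
examples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0      # the adjustable weight, initially untrained
lr = 0.05    # learning rate: how strongly each error adjusts the weight

for _ in range(200):                 # train until adjustments are sufficient
    for x, target in examples:
        error = w * x - target       # predicted output minus actual output
        w -= lr * error * x          # programmed rule: step against the error
```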



<h3 class="wp-block-heading"><strong>Applications of neural networks</strong></h3>



<p><strong>Neural network</strong>s boast applications with clear business use cases, which is why organizations are inclined to invest in them. From logistics to customer support,&nbsp;<strong>neural network</strong>s have made their presence felt across all of these areas.</p>



<p>Out of the wide range of applications, the most prominent ones turn out to be image recognition, pattern recognition, decision making, and sequence recognition among others.</p>



<ul class="wp-block-list"><li>Financial institutions employ&nbsp;<strong>neural network</strong>s for various tasks – fraud detection, loan delinquency prediction, credit evaluation and attrition analysis, to name a few.</li><li>On the medical front,&nbsp;<strong>neural network</strong>s can aid in performing cancer cell analysis, emergency room test advisement, and even prosthesis design.</li><li><strong>Neural network</strong>s hold the potential to study the behavior of customers.</li><li>Transportation also has a lot to do with&nbsp;<strong>neural network</strong>s. Vehicle scheduling and routing systems are just two of the many applications of&nbsp;<strong>neural network</strong>s in this area.</li></ul>



<p>All in all, <strong>neural network</strong>s hold the potential to solve intractable problems that traditional methods have struggled with. <strong>Neural network</strong>s have surpassed all odds to reach a stage where we can reap their benefits both now and in the days that lie ahead.</p>
<p>The post <a href="https://www.aiuniverse.xyz/neural-network-everything-from-the-scratch/">NEURAL NETWORK – EVERYTHING FROM THE SCRATCH!</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/neural-network-everything-from-the-scratch/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Soft-bodied Robots More Efficient With New Deep Learning Neural Network</title>
		<link>https://www.aiuniverse.xyz/soft-bodied-robots-more-efficient-with-new-deep-learning-neural-network/</link>
					<comments>https://www.aiuniverse.xyz/soft-bodied-robots-more-efficient-with-new-deep-learning-neural-network/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 25 Mar 2021 06:35:28 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[Efficient]]></category>
		<category><![CDATA[network]]></category>
		<category><![CDATA[Robots]]></category>
		<category><![CDATA[Soft-bodied]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13788</guid>

					<description><![CDATA[<p>Source &#8211; https://www.unite.ai/ Soft-bodied robots are an extremely important tool in the wider field of robotics, as traditional and rigid-bodied robots are not able to complete the <a class="read-more-link" href="https://www.aiuniverse.xyz/soft-bodied-robots-more-efficient-with-new-deep-learning-neural-network/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/soft-bodied-robots-more-efficient-with-new-deep-learning-neural-network/">Soft-bodied Robots More Efficient With New Deep Learning Neural Network</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.unite.ai/</p>



<p>Soft-bodied robots are an extremely important tool in the wider field of robotics, as traditional, rigid-bodied robots are not able to complete the same types of tasks. Soft robots interact with humans more safely, and they can do things like fit into tight spaces.&nbsp;</p>



<p>One of the major challenges involved with soft robots is that they must know where all of their body parts are to complete programmed tasks, and this becomes more difficult as soft robots are able to deform in almost infinite ways.</p>



<p>Now, researchers at MIT have developed a new&nbsp;deep learning&nbsp;algorithm that helps engineers design soft robots in a way that enables them to collect more data on their surroundings. The algorithm works by suggesting an optimized placement of sensors within the robot’s body. This enables it to complete assigned tasks while interacting with the environment.</p>



<p>Alexander Amini is co-lead author of the research along with Andrew Spielberg, both PhD students in the MIT Computer Science and Artificial Intelligence Laboratory. The research was published in&nbsp;<em>IEEE Robotics and Automation Letters</em>&nbsp;with other co-authors including Lillian Chin, a PhD student, and Wojciech Matusik and Daniela Rus, professors at the university.</p>



<p>“The system not only learns a given task, but also how to best design the robot to solve that task,” Amini says. “Sensor placement is a very difficult problem to solve. So, having the solution is extremely exciting.”</p>



<h3 class="wp-block-heading"><strong>Rigid vs. Soft Robots</strong></h3>



<p>One of the biggest advantages of rigid robots is that they have a limited range of motion. While this seems like a downside, it means the finite number of joints and limbs leads to more manageable calculations.</p>



<p>These calculations are easier to work with when it comes to algorithms controlling mapping and motion planning. Soft robots cannot do the same as they are flexible.</p>



<p>“The main problem with soft robots is that they are infinitely dimensional,” Spielberg says. “Any point on a soft-bodied robot can, in theory, deform in any way possible.”</p>



<p>In the past, researchers have used an external camera to chart the robot’s position, which is then fed back into the robot’s control program. The new team looked for a way to create a soft robot untethered from external aid.</p>



<p>“You can’t put an infinite number of sensors on the robot itself,” Spielberg continues. “So, the question is: How many sensors do you have, and where do you put those sensors in order to get the most bang for your buck?”</p>



<p>The researchers developed a novel neural network architecture that can optimize sensor placements and learn to efficiently complete tasks. They first split the robot’s body into different regions called “particles.”&nbsp;</p>



<p>The neural network used each particle’s rate of strain as an input, and through trial and error, the network can learn the most efficient sequence of movements for a given task. The network also keeps track of which particles are used more than others so that the network’s inputs can be adjusted.</p>



<h3 class="wp-block-heading"><strong>Outperforming Humans in Sensor Placement</strong></h3>



<p>The network suggests the placement of the sensors on the robot by optimizing the most important particles. In tests, the algorithm outperformed humans when it came to locating the most efficient places to put the sensors.</p>



<p>The algorithm was then tested against a series of expert predictions.</p>



<p>“Our model vastly outperformed humans for each task, even though I looked at some of the robot bodies and felt very confident on where the sensors should go,” says Amini. “It turns out there are a lot more subtleties in this problem than we initially expected.”</p>



<p>According to Spielberg, the new development could help automate the robot design process and help come up with new algorithms to control robot movements.&nbsp;</p>



<p>“…we also need to think about how we’re going to sensorize these robots, and how that will interplay with other components of that system,” he says. “That’s something where you need a very robust, well-optimized sense of touch. So, there’s potential for immediate impact.”</p>



<p>“Automating the design of sensorized soft robots is an important step toward rapidly creating intelligent tools that help people with physical tasks,” says Rus. “The sensors are an important aspect of the process, as they enable the soft robot to “see” and understand the world and its relationship with the world.”</p>



<p>The post <a href="https://www.aiuniverse.xyz/soft-bodied-robots-more-efficient-with-new-deep-learning-neural-network/">Soft-bodied Robots More Efficient With New Deep Learning Neural Network</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/soft-bodied-robots-more-efficient-with-new-deep-learning-neural-network/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>IASST deploys deep learning network for breast cancer prognosis</title>
		<link>https://www.aiuniverse.xyz/iasst-deploys-deep-learning-network-for-breast-cancer-prognosis/</link>
					<comments>https://www.aiuniverse.xyz/iasst-deploys-deep-learning-network-for-breast-cancer-prognosis/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 18 Mar 2021 06:34:13 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[breast]]></category>
		<category><![CDATA[Cancer]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[deploys]]></category>
		<category><![CDATA[IASST]]></category>
		<category><![CDATA[network]]></category>
		<category><![CDATA[prognosis]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13600</guid>

					<description><![CDATA[<p>Source &#8211; https://www.biospectrumindia.com/ A team from the&#160;Institute of Advanced Study in Science and Technology (IASST) in Guwahati, an autonomous institute of the Department of Science &#38; Technology, <a class="read-more-link" href="https://www.aiuniverse.xyz/iasst-deploys-deep-learning-network-for-breast-cancer-prognosis/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/iasst-deploys-deep-learning-network-for-breast-cancer-prognosis/">IASST deploys deep learning network for breast cancer prognosis</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.biospectrumindia.com/</p>



<p>A team from the&nbsp;Institute of Advanced Study in Science and Technology (IASST) in Guwahati, an autonomous institute of the Department of Science &amp; Technology, Govt of India, has presented a novel deep learning (DL) based quantitative evaluation of oestrogen and progesterone receptor status from Immunohistochemistry (IHC) specimens, graded to aid the prognosis of breast cancer.</p>



<p>The scientists developed a classification&nbsp;method based on&nbsp;deep learning (DL)&nbsp;network to evaluate hormone status for prognosis of breast cancer.&nbsp;</p>



<p>The study by Dr Lipi B Mahanta&nbsp;and her group&nbsp;was done in collaboration with clinicians of B Borooah Cancer Institute, the premier cancer institute of the region. With enormous prospects of being converted into workable commercial software, this work has been accepted for publication in the journal Applied Soft Computing.</p>



<p>IHC staining is used as a prognostic marker in breast cancer pathology and involves a special kind of colour staining for identifying malignant nuclei. The stain varies in intensity, and intensity categories are graded on a scale of 0 to 3 within the Allred scoring system. Scoring systems called the Allred score and the H-score are used by pathologists to quantify the immunohistochemical reaction of oestrogen receptor (ER) and progesterone receptor (PR) tissue slides. These hormone receptors contribute to predicting cancer progression and the associated risk of late recurrence of the disease.</p>
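<p>For readers unfamiliar with the scoring, the total Allred score combines a proportion score (0 to 5, from the fraction of positively stained nuclei) with an intensity score (0 to 3), giving a total of 0 to 8. The sketch below uses the commonly published Allred proportion cut-offs as an assumption – the exact thresholds used in the IASST study are not given in this article:</p>

```python
def proportion_score(fraction):
    """Allred proportion score from the fraction of positive nuclei,
    using the commonly published cut-offs (an assumption here)."""
    if fraction == 0:
        return 0
    if fraction < 0.01:
        return 1
    if fraction < 0.10:
        return 2
    if fraction < 1 / 3:
        return 3
    if fraction < 2 / 3:
        return 4
    return 5

def allred_total(fraction, intensity):
    """Total Allred score: proportion score (0-5) plus intensity (0-3)."""
    return proportion_score(fraction) + intensity
```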



<p>The team developed an algorithm that indicated whether or not the cancer cells have hormone receptors on their surface. The proposed architecture, namely IHC-Net, can semantically segment the exact positive and negative nuclei from tissue images. Finally, an ensemble method is used, which integrates the decision of three machine learning (ML) models for the final Allred cancer score.</p>
<p>The post <a href="https://www.aiuniverse.xyz/iasst-deploys-deep-learning-network-for-breast-cancer-prognosis/">IASST deploys deep learning network for breast cancer prognosis</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/iasst-deploys-deep-learning-network-for-breast-cancer-prognosis/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How artificial intelligence can fight cyberattacks</title>
		<link>https://www.aiuniverse.xyz/how-artificial-intelligence-can-fight-cyberattacks/</link>
					<comments>https://www.aiuniverse.xyz/how-artificial-intelligence-can-fight-cyberattacks/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 05 Mar 2021 07:23:01 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Cyberattacks]]></category>
		<category><![CDATA[fight]]></category>
		<category><![CDATA[network]]></category>
		<category><![CDATA[Traditional]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=13267</guid>

					<description><![CDATA[<p>Source &#8211; https://www.fortuneindia.com/ Traditional network security tools have become outdated in the face of sophisticated cyberattacks. Our cybersecurity strategies should embrace latest technologies, such as A.I. and <a class="read-more-link" href="https://www.aiuniverse.xyz/how-artificial-intelligence-can-fight-cyberattacks/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-artificial-intelligence-can-fight-cyberattacks/">How artificial intelligence can fight cyberattacks</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source &#8211; https://www.fortuneindia.com/</p>



<p>Traditional network security tools have become outdated in the face of sophisticated cyberattacks. Our cybersecurity strategies should embrace the latest technologies, such as A.I. and machine learning.</p>



<p>For many years, traditional network security tools such as firewalls, anti-virus software and web proxies have been the go-to defences for organisations. While these tools were effective to a certain extent in the past, the dramatic changes brought to the digital world by “Industry 4.0” over the last decade have driven a dynamic shift in the cyber-threat landscape, thereby reducing the effectiveness of these traditional tools.</p>



<p>As we continue to embrace the digital revolution in all aspects of our lives, the threat to the cybersecurity landscape is only increasing with each passing day. Cybercriminals today are using cutting-edge technologies to launch destructive cyberattacks on large corporations, with far-reaching consequences, as was seen in the case of Adobe and Equifax; India, being at the forefront of digitization, has become a prime target for cybercriminals. In fact, as per the Acronis Cyber Readiness Report of 2020, India is reporting more cyberattacks than any other country in the world.</p>



<p>From an organisational perspective, apart from loss of critical information, financial losses, reputational damages and disruption in operations, in most cases, it becomes impossible to identify the intensity of the cyberattack, and the amount of data that was actually compromised often remains unknown. This was witnessed even recently when hackers launched attacks on multiple Indian pharmaceutical companies where, till date, there is no visibility on the degree of attack and the nature of data that was compromised.</p>



<p>Cybersecurity is a critical aspect for all organisations today. Unfortunately, most businesses are not adequately equipped to handle these complex cyber threats simply because they continue to rely on traditional techniques. They do not possess the high-end tools required to quickly identify and recover from threats which, if adopted, can go a long way in ensuring cybersecurity. For instance, a study conducted by Cisco in 2019 revealed that A.I. based tools can identify up to 95% of threats faced by an organisation. That being the case, for a country that thrives on information technology, it is critical that organisations transition from traditional solutions to technologically advanced solutions at the earliest.</p>



<p>While talking about technologically advanced solutions, organisations should start depending more on artificial intelligence (A.I.) based tools. Unlike traditional techniques, which neutralise vulnerabilities only after they have been identified, A.I. and machine learning enabled tools take a very different approach. A.I.-based systems are proactive in detecting vulnerabilities, since they can analyse patterns and discover loose ends beforehand, thereby enabling organisations to take preventive action before they are even affected by a security incident.</p>



<p>For instance, A.I. techniques like “User and Event Behavioural Analytics” can be used to analyse baseline behaviour of accounts and identify anomalous behaviour that might signal a zero-day cyberattack. This can protect organizations even before vulnerabilities are officially reported. An A.I. vendor named ‘Darktrace’ provides a software that utilises A.I. to understand the behaviour of each user, and the software automatically sends out an alert if there is a vital deviation from the normal baseline behaviour. Additionally, apart from using A.I. enabled solutions, organisations should also adopt simple measures like the use of a multi-factor authentication (M.F.A.) process to secure their systems. MFAs can help prevent some of the most common types of cyberattacks, including phishing, brute force and man-in-the-middle attacks.</p>
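<p>The baseline-behaviour idea behind “User and Event Behavioural Analytics” can be illustrated with a deliberately simple statistical sketch – a z-score test against an account’s historical activity. Real UEBA products (including the Darktrace software mentioned above) use far richer models; this is only a toy illustration of the principle:</p>

```python
import statistics

def is_anomalous(history, new_value, threshold=3.0):
    """Flag behaviour that deviates sharply from an account's baseline,
    using a z-score: a crude stand-in for behavioural analytics."""
    mean = statistics.mean(history)
    spread = statistics.pstdev(history) or 1.0   # guard against zero spread
    return abs(new_value - mean) / spread > threshold

logins_per_day = [3, 4, 3, 5, 4, 3, 4]   # baseline account activity
```

<p>A sudden jump to, say, 40 logins in a day would be flagged as a vital deviation from the baseline, while values close to the historical norm would not.</p>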



<p>It is important to remember that hackers are only becoming sophisticated by the day. It is not sufficient to simply introduce tools that ensure cybersecurity. It is equally important that organisations constantly understand the loopholes in their security systems and take measures to fix the same. For this purpose, organisations such as Tesla, Google etc. are increasingly turning to crowdsourced security measures, such as bug bounty programs, to find loopholes in their security systems, by hiring ethical hackers. In fact, many organisations are substituting their traditional penetration testing efforts with crowdsourced security measures since they offer a plethora of benefits including the ability to identify and fix vulnerabilities faster, paying for valid results rather than effort or time and varied expertise of hackers.</p>



<p>However, these techniques, be it A.I.-enabled tools or crowdsourced security measures, can never work in isolation, no matter how advanced they are. The effectiveness of an organisation’s cybersecurity architecture ultimately depends on the over-arching security model. This security model, thus, should not focus on tools that are merely reactive in nature. Instead, the overall security model should comprise tools that prevent, predict, detect, and respond to threats in an efficient manner, and this is where the concept of adaptive security architecture comes into play.</p>



<p>Adaptive security, a buzzword in recent times, is an approach that analyses behaviours to protect against and adapt to threats even before they happen. Adaptive security architecture (ASA) is a concept, and there are no pre-defined techniques for what constitutes ASA. Thus, organisations have the flexibility to introduce curated techniques (such as A.I.-based tools) so long as such techniques are able to predict, prevent, detect and respond to threats (the elements of ASA) in a timely manner. For example, one implementation of ASA is the Emsisoft anti-malware, which monitors the behaviour of all active programmes and sends out an alert if suspicious behaviour is detected. As opposed to focusing on preventive measures, ASA is built on the foundation of a more responsive, receptive and real-time outlook when protecting an organisation’s security systems.</p>



<p>While organisations can enforce technologically advanced protocols for ensuring cybersecurity, the role of personnel can never be ignored. Human error has a well-documented history of causing data breaches. This was seen when Equifax’s system was compromised for two whole months simply because of an oversight by the IT team. According to the UK Information Commissioner’s Office, human error was the cause of approximately 90% of data breaches in 2019. This only implies that cybersecurity is a top-down approach. Every single employee, from the CEO to the supervisor, plays an important role. That being the case, it is important that employees understand what they can do to protect the company’s digital assets, how to avoid falling for cybersecurity attacks, and who they should report potential incidents to.</p>



<p>On a concluding note, as India moves towards a five trillion dollar economy and with the IT sector leapfrogging through multiple stages of development faster than many western economies, there is an imminent need for organisations to invest in advanced technologies and personnel training to ensure a watertight cybersecurity architecture.</p>



<p><em>Views are personal. Bhushan is Partner and Chennai head, Shardul Amarchand Mangaldas &amp; Co and Viswanat is Associate, Shardul Amarchand Mangaldas &amp; Co.</em></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-artificial-intelligence-can-fight-cyberattacks/">How artificial intelligence can fight cyberattacks</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-artificial-intelligence-can-fight-cyberattacks/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Juniper extends AI-driven network insights to WAN and branch locations</title>
		<link>https://www.aiuniverse.xyz/juniper-extends-ai-driven-network-insights-to-wan-and-branch-locations/</link>
					<comments>https://www.aiuniverse.xyz/juniper-extends-ai-driven-network-insights-to-wan-and-branch-locations/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 30 Jul 2020 07:53:30 +0000</pubDate>
				<category><![CDATA[Reinforcement Learning]]></category>
		<category><![CDATA[AI-driven]]></category>
		<category><![CDATA[Juniper]]></category>
		<category><![CDATA[network]]></category>
		<category><![CDATA[WAN]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=10589</guid>

					<description><![CDATA[<p>Source: zdnet.com Juniper Networks on Wednesday announced it&#8217;s extending AI-driven insights to WAN and branch networks with a new cloud-based service called Juniper Mist WAN Assurance. Additionally, <a class="read-more-link" href="https://www.aiuniverse.xyz/juniper-extends-ai-driven-network-insights-to-wan-and-branch-locations/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/juniper-extends-ai-driven-network-insights-to-wan-and-branch-locations/">Juniper extends AI-driven network insights to WAN and branch locations</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: zdnet.com</p>



<p>Juniper Networks on Wednesday announced it&#8217;s extending AI-driven insights to WAN and branch networks with a new cloud-based service called Juniper Mist WAN Assurance. Additionally, the company is introducing a new conversational interface to networking operations, enabling either IT teams or end users to more easily communicate with Marvis, Juniper&#8217;s virtual network assistant. </p>



<p>The new capabilities are a part of Juniper&#8217;s growing focus on AI-driven operations, which it stepped up last year with its acquisition of Mist Systems. Mist and Juniper have already delivered AI-driven networking operations to the enterprise with Wi-Fi, wired and security services. With the addition of WAN, Juniper says it can provide customers with end-to-end AI-enhanced visibility. </p>



<p>Ultimately, the goal is to use AI to shift the focus from network and application behavior to the actual user experience.&nbsp;</p>



<p>The new Juniper Mist WAN Assurance service streams key telemetry data from Juniper SRX devices to the cloud-based Mist AI engine. This enables customizable WAN service levels, and it allows for a proactive response to anomaly detections. The service works with Marvis to correlate events across the LAN, WLAN and WAN for rapid fault isolation and resolution.&nbsp;</p>



<p>&#8220;Today when large enterprises have a problem, they don&#8217;t know where to look,&#8221; Sujai Hajela, Mist co-founder and Juniper SVP, said to ZDNet. Juniper Mist WAN Assurance aims to solve that problem.&nbsp;</p>



<p>Meanwhile, with the new conversational interface for Marvis, customers will be able to learn about their networks with natural language questions such as, &#8220;What was wrong with Bob&#8217;s Zoom call yesterday?&#8221;</p>



<p>With the new interface, Marvis can provide answers to questions based on its access to a large knowledge base, with interactive queries for further help. It leverages reinforcement learning to get better at answering questions over time.&nbsp;</p>



<p>Since the Mist acquisition, Juniper has started rearranging its enterprise business unit around the notion of &#8220;AI-driven enterprise,&#8221; Hajela said. The reformatted business unit, led by Hajela, brings wired access, wireless access and WAN under common leadership with dedicated sales, marketing and engineering.&nbsp;</p>



<p>&#8220;The only way to quantify end user experience is to use AI,&#8221; he said, with a &#8220;cloud stack built from the ground up built to handle AI. We are now extending that paradigm across Juniper.&#8221;</p>
<p>The post <a href="https://www.aiuniverse.xyz/juniper-extends-ai-driven-network-insights-to-wan-and-branch-locations/">Juniper extends AI-driven network insights to WAN and branch locations</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/juniper-extends-ai-driven-network-insights-to-wan-and-branch-locations/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>8 tips before Starting a Career in Data Science</title>
		<link>https://www.aiuniverse.xyz/8-tips-before-starting-a-career-in-data-science/</link>
					<comments>https://www.aiuniverse.xyz/8-tips-before-starting-a-career-in-data-science/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 29 Jul 2020 06:35:56 +0000</pubDate>
				<category><![CDATA[Data Science]]></category>
		<category><![CDATA[applications]]></category>
		<category><![CDATA[Career]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[network]]></category>
		<category><![CDATA[Tools]]></category>
		<category><![CDATA[training program]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=10552</guid>

					<description><![CDATA[<p>Source: newcivilengineer.com Data Science can look like an intimidating field to some, especially if you have just started your journey. What tool to learn? What techniques to <a class="read-more-link" href="https://www.aiuniverse.xyz/8-tips-before-starting-a-career-in-data-science/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/8-tips-before-starting-a-career-in-data-science/">8 tips before Starting a Career in Data Science</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: newcivilengineer.com</p>



<p>Data Science can look like an intimidating field to some, especially if you have just started your journey. What tool should you learn? What techniques should you focus on? Do you need to learn to code? How much statistics is required? These are questions you will find answers to during your journey. Here are some tips that will help you start a career in the field of Data Science:</p>



<h4 class="wp-block-heading">1. Select the right role&nbsp;</h4>



<p>In the data science industry, there are several varied roles. You can be a data engineer, a data scientist, a machine learning expert, a data visualization expert, and so on, and these are just a few of the options. Depending on your work experience and background, getting into certain roles can be easier than others. For example, for a software developer, shifting into data engineering is not that difficult. However, if you are not clear about the path you should take, it is easy to get confused. If you are unclear about the differences between the roles and want to figure out what to do, follow the steps below:</p>



<ul class="wp-block-list"><li>Talk to people already working in data science to understand what each role entails.</li><li>Find a mentor. Even a small amount of their time, spent asking focused questions, goes a long way.</li><li>Figure out the area you are interested in and select the role best suited to your field of study.</li></ul>



<p>An important tip to keep in mind while deciding: do not jump into a role blindly. Understand the requirements of the field first, and then prepare for it.</p>



<h4 class="wp-block-heading">2. Take up a training program and finish it&nbsp;</h4>



<p>Once you have decided on a role, the next step is to understand it in depth, which takes more than just reading through the requirements. There are plenty of courses available for aspiring data scientists, so finding resources to learn from is not difficult. Actually learning from them is a different story. </p>



<p>If you take up a training program, make sure you go through it actively, following the coursework, discussions, and assignments throughout. For example, if you want to be a data scientist, take up a training program for a data science certification; for a machine learning engineering role, enroll in a machine learning training program. Follow the course materials diligently: the assignments are as important as the lectures. Completing a course from one end to the other will give you a much clearer picture of the field.</p>



<h4 class="wp-block-heading">3. Choosing your tools and language&nbsp;</h4>



<p>As mentioned above, it is important to complete a course on your chosen topic from end to end. However, a common question at this stage concerns selecting tools and languages. The straight answer is to pick mainstream tools and languages and start your journey. Remember that tools are just a means of implementation; what matters more is understanding the concepts.</p>



<p>If you are still unsure what to use, start with the simplest language or the one you are most familiar with. If you are not well-versed in coding, begin with GUI tools, and start coding as you cover the concepts.</p>



<h4 class="wp-block-heading">4. Join the peer group&nbsp;</h4>



<p>Now that you are aware of the role that you want and are prepared for it, the next important thing to do is join a peer group. This step is important as it helps in keeping you motivated. When you take on a new field, the whole process can be daunting if you do it alone. But when you have friends or colleagues beside you, the task can get a little easier. You can either join a group or connect with people online. Even if you don’t want to join a group, you can have a meaningful discussion over the internet. There are several online forums that can provide you with this form of environment.&nbsp;<br>&nbsp;</p>



<h4 class="wp-block-heading">5. Focus on practical applications as well as the theory&nbsp;</h4>



<p>While you are undergoing training, it is important that you pay attention to the practical applications of all the concepts. This will help you in understanding the concept better and have a deeper sense of how you can apply these concepts in reality. Here are a few tips that you can follow:&nbsp;</p>



<ul class="wp-block-list"><li>Make sure that you complete all the assignments and exercises for understanding the applications better.&nbsp;</li></ul>



<ul class="wp-block-list"><li>Work on data sets to apply your learning. Even if you don&#8217;t yet understand the math behind a technique, you can understand what it does, its assumptions, and how to interpret its results; the deeper understanding can come later.</li><li>Study the solutions created by people already working in the field.</li></ul>



<h4 class="wp-block-heading">6. Follow the right resources&nbsp;</h4>



<p>If you want to learn Data Science, absorb every source of knowledge you can find. Some of the most influential data scientists run blogs that are rich sources of useful information. These data scientists are active, keep their followers updated on their findings, and post frequently about recent advancements in Data Science.</p>



<p>You&nbsp;have to&nbsp;make a habit of reading about data science every day and stay updated with the recent findings. However, you also&nbsp;have to&nbsp;make sure that you don’t follow incorrect practices.&nbsp;</p>



<h4 class="wp-block-heading">7. Improve your communication skills&nbsp;</h4>



<p>Communication skills are an important part of the data scientist role. Even if you are technically proficient, you may get rejected because of poor communication skills. Ask a friend with good communication skills for honest feedback, and practice. You need this skill to share your ideas and make your case.</p>



<h4 class="wp-block-heading">8. Network&nbsp;&nbsp;</h4>



<p>When you have just started studying, it is better to focus on learning and not take on too much at once. Only after you have gotten the hang of the field should you attend industry conferences and events. You can also participate in hackathons and meetups.</p>



<p>Data Science is in huge demand right now, which is why employers have been investing significant time, effort, and money in it. If you take the right steps, a career here can lead to significant growth.</p>
<p>The post <a href="https://www.aiuniverse.xyz/8-tips-before-starting-a-career-in-data-science/">8 tips before Starting a Career in Data Science</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/8-tips-before-starting-a-career-in-data-science/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How is the Internet of Things (IoT) Vulnerable?</title>
		<link>https://www.aiuniverse.xyz/how-is-the-internet-of-things-iot-vulnerable/</link>
					<comments>https://www.aiuniverse.xyz/how-is-the-internet-of-things-iot-vulnerable/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Wed, 03 Jun 2020 07:33:19 +0000</pubDate>
				<category><![CDATA[Internet of things]]></category>
		<category><![CDATA[Internet of Things]]></category>
		<category><![CDATA[network]]></category>
		<category><![CDATA[Security]]></category>
		<category><![CDATA[software]]></category>
		<category><![CDATA[Vulnerable]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=9245</guid>

					<description><![CDATA[<p>Source: sdxcentral.com Internet of Things (IoT)  vulnerabilities stem from the tendencies of the devices to have low computational power and hardware limitations that don’t allow for built-in security features. <a class="read-more-link" href="https://www.aiuniverse.xyz/how-is-the-internet-of-things-iot-vulnerable/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-is-the-internet-of-things-iot-vulnerable/">How is the Internet of Things (IoT) Vulnerable?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: sdxcentral.com</p>



<p>Internet of Things (IoT) vulnerabilities stem from the tendency of these devices to have low computational power and hardware limitations that leave no room for built-in security features. On top of that, IoT vendors may sacrifice security in order to be first to market. And if the vendor is a startup that fails, the security patches a device needs will stop coming, leaving users with an open attack vector on their networks.</p>



<p>It is a best practice for organizations and individuals to research the vendors of their IoT devices and ensure they are reputable and have a commitment to security with documentation. Furthermore, changing default credentials, using unique passwords, keeping the device’s software up to date, and encrypting stored and transmitted data can help protect IoT devices.</p>



<p>Vendors making IoT devices can reduce vulnerability by using anti-rollback mechanisms that prevent unauthorized entities from reverting software to an older, less secure version. Additionally, vendors should ensure that the operating systems (OSs), code, and any third parties that supply software or hardware are not introducing insecure components.</p>



<h3 class="wp-block-heading">Device Vulnerabilities</h3>



<p>An IoT device can have one or multiple vulnerabilities that make it an easy target for hackers to gain access to a network and move laterally to more critical devices or systems. The following points were informed by the Open Web Application Security Project’s (OWASP’s) Top 10 2018 list of IoT vulnerabilities.</p>



<p>IoT devices often require passwords for users to access services or control the device. The default credentials of these devices can be weak, easy to guess, or hardcoded. A hardcoded password, or embedded credential, is an unencrypted password stored in a device&#8217;s source code. The reasoning behind this is to simplify setting up devices at scale, despite the significant risk to the device&#8217;s security.</p>



<p>An extra step vendors should consider for making credentials harder to crack is two-factor authentication. For example, some intelligent thermostats support two-factor authentication when signing into user accounts. When a user or organization enables two-factor authentication, they create new usernames and passwords and also present an additional credential that an attacker is less likely to possess.</p>



<p>It isn&#8217;t just the credentials used to access the device that are at issue. The interfaces a device uses to connect to a larger network ecosystem, such as a backend API, can also be compromised. When that occurs, it can be because there is no method to authenticate or authorize the entity accessing the device, because encryption is weak or absent, or because there is no filtering of traffic entering or leaving the device.</p>



<p>Even if vulnerabilities like these can be identified, not all IoT devices can be updated securely. This is the case when firmware validation isn’t implemented, the update is delivered in plain-text, there are no anti-rollback mechanisms, or users are not notified of updates. An anti-rollback mechanism would prevent attackers from downgrading a device to an older edition of the software that has known security vulnerabilities the attacker can exploit.</p>
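<p>The anti-rollback check described above reduces, in essence, to refusing any image that is unsigned or not strictly newer than what is installed. A minimal sketch, assuming simple dotted version strings and a signature check performed elsewhere (function names are illustrative, not from any real firmware stack):</p>

```python
def parse_version(v):
    """'1.4.2' -> (1, 4, 2) for component-wise comparison."""
    return tuple(int(part) for part in v.split("."))

def accept_update(installed, offered, signature_valid):
    """Reject unsigned images and any image not newer than what is installed."""
    if not signature_valid:
        return False                      # firmware must be authenticated
    if parse_version(offered) <= parse_version(installed):
        return False                      # anti-rollback: never downgrade
    return True

print(accept_update("2.1.0", "2.2.0", signature_valid=True))   # True: signed upgrade
print(accept_update("2.1.0", "1.9.0", signature_valid=True))   # False: rollback blocked
print(accept_update("2.1.0", "2.2.0", signature_valid=False))  # False: unsigned image
```

<p>Real implementations typically burn a monotonic counter into secure hardware rather than trusting a version string, so that even a reflashed device refuses older images.</p>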



<p>If an IoT vendor develops its device or devices with insecure software libraries or other components that are from an insecure source, then the device will naturally be insecure. The other components include insecure customization of operating system (OS) platforms and the use of third-party software and hardware that come from a compromised supply chain.</p>



<p>Where a user’s or organization’s personal information is stored is also important. If it is stored on an insecure device or insecure environment, then it is vulnerable to being discovered by an attacker. Data encryption is a basic and near-mandatory approach that can secure data in storage, in transit, or during processing.</p>



<h3 class="wp-block-heading">Attackers Take Advantage of IoT Vulnerabilities</h3>



<p>A major player in the malware world that focuses on IoT devices is the Mirai malware, which creates a botnet largely consisting of IoT devices. It infects a device through brute force password attempts where it goes through known default credentials that allow access to the device. Once inside it forces the device to scan the internet for vulnerable devices, which tend to be IoT devices. Once a sufficiently large botnet is made, they are typically used to launch a distributed denial of service (DDoS) attack on an organization. The result is the organization’s network and subsequent services it provides via the internet go down. This reinforces the importance of changing passwords away from their defaults and using two-factor authentication where possible with IoT devices.</p>
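<p>The defensive counterpart to Mirai&#8217;s technique is auditing your own devices against the same kinds of known-default lists before an attacker does. A toy sketch (the credential list here is a tiny illustrative subset, not Mirai&#8217;s actual dictionary):</p>

```python
# A small, illustrative subset of default credential pairs of the kind
# Mirai-style malware tries; real lists are much longer.
KNOWN_DEFAULTS = {
    ("admin", "admin"),
    ("root", "root"),
    ("admin", "1234"),
    ("root", "default"),
}

def audit_device(username, password):
    """Return a list of findings for one device's login credentials."""
    findings = []
    if (username, password) in KNOWN_DEFAULTS:
        findings.append("uses a known default credential pair")
    if len(password) < 8:
        findings.append("password shorter than 8 characters")
    return findings

print(audit_device("admin", "admin"))   # flags both the default pair and the length
print(audit_device("ops", "long-unique-passphrase"))  # clean: []
```

<p>Running such a check across an inventory of devices is a cheap way to find exactly the credentials a botnet would try first.</p>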



<p>A different kind of vulnerability, known but not reported to have been exploited, was found in St. Jude Medical&#8217;s Merlin@home cardiac devices, which include pacemakers and defibrillators. According to the Food and Drug Administration (FDA), a vulnerability existed in the devices&#8217; RF transmitters. Had it been exploited, a device&#8217;s battery could have been drained rapidly, or the device could have delivered shocks at an incorrect pace. The devices transmit over radio frequencies to send data to physicians who assess and monitor device function, which means fewer in-person check-up visits. The data can travel over cellular or wireless internet connections.</p>



<h3 class="wp-block-heading">Securing IoT Devices in the Office and for Remote Workers</h3>



<p>Typical security best practices still apply to IoT devices and can counteract the vulnerabilities described above. Encrypting data is a major part of securing data and transmissions, so that attackers cannot read data they gain access to after compromising a device. Personal, proprietary, or confidential information held by an organization or the device vendor should be encrypted as a best practice; however, unless a vendor discloses its encryption and privacy practices, users and organizations cannot know what type of encryption, if any, is being used.</p>



<p>Using password managers, or simply changing default passwords to something unique to a user or organization’s catalog of passwords, is part of quality password hygiene that can prevent IoT devices from being compromised. Two-factor authentication increases the strength of credentials even further. This is especially helpful when it comes to malware like Mirai that uses known default credentials to quickly gain access to IoT devices.</p>
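<p>The one-time codes behind most two-factor apps follow the TOTP construction of RFC 6238: HMAC a time-step counter with a shared secret and dynamically truncate the digest to a few digits. A minimal sketch using only the standard library:</p>

```python
import hmac
import hashlib
import struct
import time

def totp(secret: bytes, for_time=None, step=30, digits=6):
    """RFC 6238 time-based one-time password using HMAC-SHA1."""
    counter = int((time.time() if for_time is None else for_time) // step)
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890", time 59 s, 8 digits.
print(totp(b"12345678901234567890", for_time=59, digits=8))  # 94287082
```

<p>Because the server and the device derive the code independently from the shared secret and the current time, an attacker who only knows the password still cannot log in.</p>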



<p>As more employees work from home offices, the IoT devices in their homes become an attack vector that could lead back to an organization&#8217;s sensitive information. Organizations that can afford to provide security tools to employees known to have consumer IoT devices should do so to protect their networks. Virtualized security tools are especially helpful here because they can be scaled easily to accommodate a large remote workforce. Virtual private networks (VPNs) and virtualized firewalls can enforce encryption, monitor data entering and exiting the local network, and prevent malware from reaching IoT devices in the home.</p>



<h3 class="wp-block-heading">IoT Vulnerabilities: Key Takeaways</h3>



<ol class="wp-block-list"><li>IoT devices are vulnerable because they do not have the computational power to run security functions and vendors may sacrifice security in the rush to market.</li><li>Organizations should research the vendors they buy from to ensure they are reputable and security-minded.</li><li>A best practice to secure a device is to make new login credentials and use two-factor authentication to access and control IoT devices.</li><li>The Mirai malware easily finds and infects IoT devices and spreads by scanning for devices with default credentials.</li></ol>
<p>The post <a href="https://www.aiuniverse.xyz/how-is-the-internet-of-things-iot-vulnerable/">How is the Internet of Things (IoT) Vulnerable?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-is-the-internet-of-things-iot-vulnerable/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Could Network Data Be a Lifesaver During a Pandemic?</title>
		<link>https://www.aiuniverse.xyz/could-network-data-be-a-lifesaver-during-a-pandemic/</link>
					<comments>https://www.aiuniverse.xyz/could-network-data-be-a-lifesaver-during-a-pandemic/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Fri, 15 May 2020 07:21:15 +0000</pubDate>
				<category><![CDATA[Data Mining]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[could]]></category>
		<category><![CDATA[data mining]]></category>
		<category><![CDATA[network]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=8792</guid>

					<description><![CDATA[<p>Source: thefastmode.com One hundred years ago, a flu pandemic swept across the world, claiming more than 50 million lives between 1918 and 1920. Now we again face <a class="read-more-link" href="https://www.aiuniverse.xyz/could-network-data-be-a-lifesaver-during-a-pandemic/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/could-network-data-be-a-lifesaver-during-a-pandemic/">Could Network Data Be a Lifesaver During a Pandemic?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: thefastmode.com</p>



<p>One hundred years ago, a flu pandemic swept across the world, claiming more than 50 million lives between 1918 and 1920. Now we again face the reality of a viral pandemic, and we find ourselves struggling to track the rapid transmission and replication in hopes of getting ahead of the curve. Are we destined to continually repeat this cycle? Or is there a way to completely eradicate viral pandemics before the next century?</p>



<p>One thing we do know is that in the twenty-first century we have something that previous generations did not &#8211; advanced technology. Technologies such as artificial intelligence (AI), machine learning (ML), location intelligence and data analytics can be harnessed to slow and stop the spread of viral infections, provided we can combine them with the promise of next-generation 5G networks.</p>



<p><strong>Together, from a distance</strong></p>



<p>We’ve come a long way from the wired switchboard telephones of one hundred years ago. Mobile technology is ubiquitous; in fact, more than 80% of the population in China, Europe, Russia and North America subscribes to mobile service, according to the GSMA. And it is this hyper-connectivity that is going to be crucial in turning the tide.</p>



<p>Although the transmission of a virus itself is almost impossible to track accurately, we do have the means to track individuals who have tested positive, or who may have come into contact with the virus. More than 11 countries worldwide are already using their mobile networks to track, surveil and stem the spread of the current pandemic. At the very heart of this tactic is location intelligence technology.</p>



<p>Having individual sets of data is useful, but when massive volumes of data are collected and collated at scale, the possibilities become almost limitless. Thanks to exponential advances in AI, ML and data mining algorithms, what was a simple means of tracking mobile network subscribers’ locations in order to provide service suddenly becomes a powerful analytics engine to identify patterns of movement and subscriber behaviors. As high-speed 5G networks become more widespread, this new level of insight will prove invaluable when it comes to making predictions about population movement, helping healthcare workers and governments halt the spread of infectious viral diseases.</p>



<p><strong>The promise of tomorrow</strong></p>



<p>Of course, there are a number of questions about how to best leverage these innovations, as well as discussions to be had about how to anonymize and aggregate this data so it can be used to inform better decision making while protecting subscribers’ privacy. We have tried to address the most salient points in these discussions in a recent paper entitled Today’s Technologies Can Save Lives.</p>
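<p>One widely discussed pattern for the anonymization and aggregation question is to publish only coarse per-area counts and suppress any area with too few people to hide in (a k-anonymity-style threshold). A minimal sketch with made-up identifiers, cell names, and threshold:</p>

```python
K_THRESHOLD = 5  # suppress any cell with fewer than 5 people (illustrative value)

def aggregate_presence(pings):
    """Count distinct subscribers per coarse grid cell and drop sparse cells.

    `pings` is an iterable of (subscriber_id, cell_id) pairs; only aggregate
    counts, never individual identifiers, leave this function.
    """
    per_cell = {}
    for subscriber, cell in pings:
        per_cell.setdefault(cell, set()).add(subscriber)
    return {cell: len(ids) for cell, ids in per_cell.items()
            if len(ids) >= K_THRESHOLD}

pings = [(f"user{i}", "cell-A") for i in range(8)] + [("user99", "cell-B")]
print(aggregate_presence(pings))  # {'cell-A': 8}; lone user in cell-B suppressed
```

<p>Real deployments layer further protections (coarser time windows, noise injection) on top, but the suppression step alone already prevents a published count from pointing at one identifiable person.</p>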



<p>But what’s clear is that in order to make pandemics a thing of the past, we must fully realize the potential of the technologies we have at our disposal, while also respecting one another. Ultimately, it’s our human ingenuity and technological capability that will win the war against pandemics.</p>



<p>As we all practice ‘social distancing’ for the foreseeable future, it’s true that we are now more dependent than ever before on technology to keep us connected. But if we can capitalize on the unbridled potential of today’s advanced technologies, the type of crisis we’re now facing could be confined to the history books for good.</p>
<p>The post <a href="https://www.aiuniverse.xyz/could-network-data-be-a-lifesaver-during-a-pandemic/">Could Network Data Be a Lifesaver During a Pandemic?</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/could-network-data-be-a-lifesaver-during-a-pandemic/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Rise in valuation of deep learning chipset market</title>
		<link>https://www.aiuniverse.xyz/rise-in-valuation-of-deep-learning-chipset-market/</link>
					<comments>https://www.aiuniverse.xyz/rise-in-valuation-of-deep-learning-chipset-market/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Thu, 30 Jan 2020 07:16:23 +0000</pubDate>
				<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[asia pacific]]></category>
		<category><![CDATA[Automation]]></category>
		<category><![CDATA[chipset market]]></category>
		<category><![CDATA[deep learning]]></category>
		<category><![CDATA[market research]]></category>
		<category><![CDATA[network]]></category>
		<category><![CDATA[Revenue]]></category>
		<category><![CDATA[robot]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=6466</guid>

					<description><![CDATA[<p>Source: eletimes.com Transparency Market Research delivers key insights on the global deep learning chipset market. In terms of revenue, the deep learning chipset market is estimated to <a class="read-more-link" href="https://www.aiuniverse.xyz/rise-in-valuation-of-deep-learning-chipset-market/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/rise-in-valuation-of-deep-learning-chipset-market/">Rise in valuation of deep learning chipset market</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: eletimes.com</p>



<p>Transparency Market Research delivers key insights on the global deep learning chipset market. In terms of revenue, the deep learning chipset market is estimated to expand at a CAGR of ~24% during the forecast period, owing to numerous factors, regarding which TMR offers thorough insights and forecasts in its report on the deep learning chipset market.</p>
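<p>For context on what a ~24% CAGR implies, compounding the rate directly shows the market roughly tripling over a five-year window (the numbers below are generic arithmetic, not TMR&#8217;s figures):</p>

```python
def value_after(start, cagr, years):
    """Compound a starting value at annual rate `cagr` for `years` years."""
    return start * (1 + cagr) ** years

growth = value_after(1.0, 0.24, 5)   # 1.24 ** 5
print(round(growth, 2))  # 2.93 -> nearly 3x in five years
```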



<p>Deep learning technology is driving advancements in artificial intelligence (AI). In the current scenario, the deep learning chipset market is led by graphics processing units and central processing units. However, in the next few years, other chipset types, including application-specific integrated circuits and field-programmable gate arrays, are expected to play an expanded role.</p>



<p><strong>Deep Learning Chipset Market: Dynamics</strong></p>



<p>While a large number of developments are currently being witnessed in the robotics industry, the gap between human skills and robot motor skills remains large. Machines still have a long way to go to match human proficiency even at basic sensorimotor skills such as grasping objects. However, by linking learning with continuous feedback, this gap can be bridged. Doing so would make it easier for robots to understand the complexity of the real world and handle problems intelligently and reliably.</p>



<p>Neural network technology has made great strides in designing computer programs that can process images, text, and speech, and can even draw pictures. However, introducing actions and control mechanisms adds substantial new challenges to the network. Overcoming these challenges would help us understand how machines can act in the current ecosystem. By bringing the power of large-scale deep learning to robotic control, fundamental issues in robotics and <strong>automation</strong> can be resolved. This, in turn, is expected to augment the deep learning chipset market during the forecast period.</p>



<p><strong>Deep Learning Chipset Market: Prominent Regions</strong></p>



<p>North America is the dominant region in the deep learning chipset market. Growth of the market in the region can be attributed to high investments and a large number of <strong>manufacturers</strong> in the region. Moreover, the market in Europe is expected to witness significant growth during the forecast period, due to increasing demand for deep learning chipsets for use in prediction of frauds and failures in the region. Asia Pacific, South America, and Middle East &amp; Africa are emerging markets for deep learning chipsets, offering lucrative opportunities to vendors and system integrators in the long term.</p>



<p><strong>Deep Learning Chipset Market: Key Players</strong></p>



<p>Key players operating in the global<strong> deep learning</strong> chipset market are IBM Corporation, Graphcore Ltd, CEVA, Inc., Advanced Micro Devices, Inc., NVIDIA Corporation, Intel Corporation, Movidius, XILINX INC., TeraDeep Inc., QUALCOMM Incorporated, and Alphabet Inc.</p>
<p>The post <a href="https://www.aiuniverse.xyz/rise-in-valuation-of-deep-learning-chipset-market/">Rise in valuation of deep learning chipset market</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/rise-in-valuation-of-deep-learning-chipset-market/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>How machine learning and automation can modernize the network edge</title>
		<link>https://www.aiuniverse.xyz/how-machine-learning-and-automation-can-modernize-the-network-edge/</link>
					<comments>https://www.aiuniverse.xyz/how-machine-learning-and-automation-can-modernize-the-network-edge/#respond</comments>
		
		<dc:creator><![CDATA[aiuniverse]]></dc:creator>
		<pubDate>Sat, 18 Jan 2020 07:35:20 +0000</pubDate>
				<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[applications]]></category>
		<category><![CDATA[Automation]]></category>
		<category><![CDATA[data centers]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[modernize]]></category>
		<category><![CDATA[network]]></category>
		<guid isPermaLink="false">http://www.aiuniverse.xyz/?p=6234</guid>

					<description><![CDATA[<p>Source: siliconangle.com Applications are expected to move from data centers to edge facilities in record numbers, opening up a huge new market opportunity. The edge computing market <a class="read-more-link" href="https://www.aiuniverse.xyz/how-machine-learning-and-automation-can-modernize-the-network-edge/">Read More</a></p>
<p>The post <a href="https://www.aiuniverse.xyz/how-machine-learning-and-automation-can-modernize-the-network-edge/">How machine learning and automation can modernize the network edge</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Source: siliconangle.com</p>



<p>Applications are expected to move from data centers to edge facilities in record numbers, opening up a huge new market opportunity. The edge computing market is expected to grow at a compound annual growth rate of 36.3 percent between now and 2022, fueled by rapid adoption of the “internet of things,” autonomous vehicles, high-speed trading, content streaming and multiplayer games.</p>



<p>What these applications have in common is a need for near zero-latency data transfer, usually defined as less than five milliseconds, although even that figure is far too high for many emerging technologies.&nbsp;&nbsp;</p>



<p>The specific factors driving the need for low latency vary. In IoT applications, sensors and other devices capture enormous quantities of data, the value of which degrades by the millisecond. Autonomous vehicles require information in real time to navigate effectively and avoid collisions. The best way to support such latency-sensitive applications is to move applications and data as close as possible to the data ingestion point, thereby reducing the overall round-trip time. Financial transactions now occur at sub-millisecond cycle times, leading one brokerage firm to invest more than $100 million to overhaul its stock trading platform in a quest for faster and faster trades.</p>



<h3 class="wp-block-heading">Operational challenges</h3>



<p>As edge computing grows, so do the operational challenges for telecommunications service providers such as Verizon Communications Inc., AT&amp;T Corp. and T-Mobile USA Inc. For one thing, moving to the edge essentially disaggregates the traditional data center. Instead of massive numbers of servers located in a few centralized data centers, the provider edge infrastructure consists of thousands of small sites, most with just a handful of servers. All of those sites require support to ensure peak performance, which strains the resources of the typical information technology group to the breaking point — and sometimes beyond.</p>



<p>Another complicating factor is network functions moving toward cloud-native applications deployed on virtualized, shared and elastic infrastructure, a trend that has been accelerating in recent years. In a virtualized environment, each physical server hosts dozens of virtual machines and/or containers that are constantly being created and destroyed at rates far faster than humans can effectively manage. Orchestration tools automatically manage the dynamic virtual environment in normal operation, but when it comes to troubleshooting, humans are still in the driver’s seat.&nbsp;</p>



<p>And it’s a hot seat to be in. Poor performance and service disruptions hurt the service provider’s business, so the organization puts enormous pressure on the IT staff to resolve problems quickly and effectively. The information needed to identify root causes is usually there. In fact, navigating the sheer volume of telemetry data from hardware and software components is one of the challenges facing network operators today.&nbsp;</p>



<h3 class="wp-block-heading">Machine learning and automation&nbsp;</h3>



<p>A data-rich, highly dynamic, dispersed infrastructure is the perfect environment for artificial intelligence, specifically machine learning. The great strength of machine learning is the ability to find meaningful patterns in massive amounts of data that far outstrip the capabilities of network operators. Machine learning-based tools can self-learn from experience, adapt to new information and perform humanlike analyses with superhuman speed and accuracy.&nbsp;&nbsp;</p>



<p>To realize the full power of machine learning, insights must be translated into action — a significant challenge in the dynamic, disaggregated world of edge computing. That’s where automation comes in.</p>



<p>Using the information gained by machine learning and real-time monitoring, automated tools can provision, instantiate and configure physical and virtual network functions far faster and more accurately than a human operator. The combination of machine learning and automation saves considerable staff time, which can be redirected to more strategic initiatives that create additional operational efficiencies and speed release cycles, ultimately driving additional revenue.&nbsp;</p>



<h3 class="wp-block-heading">Scaling cloud-native applications</h3>



<p>Until recently, the software development process for a typical telco consisted of a lengthy sequence of discrete stages that moved from department to department and took months or even years to complete. Cloud-native development has largely made obsolete this so-called “waterfall” methodology in favor of a high-velocity, integrated approach based on leading-edge technologies such as microservices, containers, agile development, continuous integration/continuous deployment and DevOps. As a result, telecom providers roll out services at unheard-of velocities, often multiple releases per week.&nbsp;</p>



<p>The move to the edge poses challenges for scaling cloud-native applications. When the environment consists of a few centralized data centers, human operators can manually determine the optimum configuration needed to ensure the proper performance for the virtual network functions, or VNFs, that make up the application.</p>



<p>However, as the environment disaggregates into thousands of small sites, each with slightly different operational characteristics, machine learning is required. Unsupervised learning algorithms can run all the individual components through a pre-production cycle to evaluate how they will behave in a production site. Operations staff can use this approach to develop a high level of confidence that the VNF being tested is going to come up in the desired operational state at the edge.&nbsp;</p>
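<p>As a rough illustration of that unsupervised pre-production check (a minimal Python sketch — the site names, metrics and thresholds are invented for illustration, not taken from any vendor's tooling), a robust median-absolute-deviation rule can flag edge sites where a VNF's behavior deviates sharply from the rest of the population, with no labeled training data required:</p>

```python
from statistics import median

def mad_outliers(runs, cutoff=3.5):
    """Unsupervised check over pre-production runs: flag sites whose
    VNF start-up time deviates sharply from the population, using the
    robust median-absolute-deviation (MAD) rule. No labels are needed,
    which matches the unsupervised setting described in the article."""
    times = list(runs.values())
    med = median(times)
    mad = median(abs(t - med) for t in times)
    if mad == 0:
        return []
    # 0.6745 rescales the MAD so the score is comparable to a z-score.
    return [site for site, t in runs.items()
            if abs(0.6745 * (t - med) / mad) > cutoff]

# Hypothetical start-up times (seconds) from a pre-production cycle
# across edge-site profiles; names and numbers are illustrative only.
runs = {"edge-ams": 12.1, "edge-nyc": 11.8, "edge-sfo": 12.4,
        "edge-lhr": 11.9, "edge-sin": 44.0, "edge-fra": 12.2}
print(mad_outliers(runs))  # → ['edge-sin']
```

<p>A site flagged this way would warrant investigation before the VNF is trusted to come up in the desired operational state there.</p>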



<h3 class="wp-block-heading">Troubleshooting at the speed of AI&nbsp;</h3>



<p>AI and automation can also add significant value in troubleshooting within cloud-native environments. Take the case of a service provider running 10 instances of a voice call processing application as a cloud-native application at an edge location. A remote operator notices that one VNF is performing significantly below the other nine.&nbsp;&nbsp;</p>



<p>The first question is, “Do we really have a problem?” Some variation in performance between application instances is not unusual, so answering the question requires a determination of the normal range of VNF performance values in actual operation. A human operator could take readings of a large number of instances of the VNF over a specified time period and then calculate the acceptable key performance indicator values — a time-consuming and error-prone process that must be repeated frequently to account for software upgrades, component replacements, traffic pattern variations and other parameters that affect performance.</p>



<p>In contrast, AI can determine KPIs in a fraction of the time and adjust the KPI values as needed when parameters change, all with no outside intervention. Once AI determines the KPI values, automation takes over. An automated tool can continuously monitor performance, compare the actual value to the AI-determined KPI and identify underperforming VNFs.</p>
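<p>This baseline-and-compare loop can be sketched in a few lines of Python (a minimal illustration with invented instance names, throughput numbers and a simple mean/standard-deviation baseline standing in for the model-derived KPI — not any operator's actual tooling):</p>

```python
import statistics

def learn_kpi_baseline(samples):
    """Derive a KPI baseline (mean, stdev) from historical readings.

    Stands in for the AI-determined KPI described above; a production
    system would learn this per VNF and refresh it automatically as
    software versions and traffic patterns change."""
    return statistics.mean(samples), statistics.stdev(samples)

def underperforming(instances, baseline, k=3.0):
    """Flag VNF instances whose KPI falls more than k stdevs below the mean."""
    mean, stdev = baseline
    return [name for name, value in instances.items()
            if value < mean - k * stdev]

# Ten instances of the call-processing VNF: nine healthy, one lagging.
history = [980, 1005, 990, 1010, 995, 1000, 985, 1015, 992, 1008]  # calls/sec
baseline = learn_kpi_baseline(history)
current = {f"vnf-{i}": v for i, v in enumerate(
    [998, 1002, 991, 1007, 350, 996, 1004, 989, 1011, 999])}
print(underperforming(current, baseline))  # → ['vnf-4']
```

<p>In the scenario above, the flagged instance is the one the remote operator noticed; the same comparison, run continuously by an automated monitor, removes the need for that human observation.</p>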



<p>That information can then be forwarded to the orchestrator for remedial action such as spinning up a new VNF or moving the VNF to a new physical server. The combination of AI and automation helps ensure compliance with service-level agreements and removes the need for human intervention — a welcome change for operators weary of late-night troubleshooting sessions.&nbsp;</p>



<h3 class="wp-block-heading">Harnessing the competitive edge</h3>



<p>As service providers accelerate their adoption of edge-oriented architectures, IT groups must find new ways to optimize network operations, troubleshoot underperforming VNFs and ensure SLA compliance at scale. Artificial intelligence technologies such as machine learning, combined with automation, can help them do that.</p>



<p>In particular, there have been a number of advancements over the last few years to enable this AI-driven future. They include systems and devices to provide high-fidelity, high-frequency telemetry that can be analyzed, highly scalable message buses such as Kafka and Redis that can capture and process that telemetry, and compute capacity and AI frameworks such as TensorFlow and PyTorch to create models from the raw telemetry streams. Taken together, they can determine in real time if operations of production systems are in conformance with standards and find problems when there are disruptions in operations.</p>
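<p>To make the last step concrete (a stdlib-only Python sketch — in a real deployment the records would be consumed from a bus such as Kafka and the bound would come from a trained model, whereas here a plain list of JSON strings and a fixed latency limit stand in for both):</p>

```python
import json

# Simulated telemetry stream: each record is a JSON string, as it might
# arrive from a message bus. VNF names and values are illustrative only.
TELEMETRY = [
    '{"vnf": "vnf-0", "latency_ms": 2.1}',
    '{"vnf": "vnf-1", "latency_ms": 1.8}',
    '{"vnf": "vnf-2", "latency_ms": 9.7}',
]

def nonconforming(stream, max_latency_ms=5.0):
    """Parse each telemetry record and report VNFs outside the bound,
    i.e. not in conformance with the (assumed) latency standard."""
    alerts = []
    for raw in stream:
        record = json.loads(raw)
        if record["latency_ms"] > max_latency_ms:
            alerts.append(record["vnf"])
    return alerts

print(nonconforming(TELEMETRY))  # → ['vnf-2']
```

<p>Each alert produced this way would be handed to the orchestrator for remedial action, closing the loop the article describes.</p>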



<p>All that has the potential to streamline operations and give service providers a competitive edge — at the edge.</p>
<p>The post <a href="https://www.aiuniverse.xyz/how-machine-learning-and-automation-can-modernize-the-network-edge/">How machine learning and automation can modernize the network edge</a> appeared first on <a href="https://www.aiuniverse.xyz">Artificial Intelligence</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.aiuniverse.xyz/how-machine-learning-and-automation-can-modernize-the-network-edge/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
