
Nutanix rolls out new capabilities for big data workloads

Source: itbrief.com.au

Nutanix has extended its platform with new features for big data and analytics, as well as unstructured data storage. The new capabilities are part of Nutanix Objects 2.0.

They include the ability to manage object data across multiple Nutanix clusters to achieve massive scale, increased object storage capacity per node, and formal Splunk SmartStore certification.

The enhancements extend Nutanix's cloud platform, which is optimised for big data applications, to deliver performance and scale and to help control costs by making use of existing, unused resources.

According to the company, there is a demand for such capabilities as more companies look to efficiently manage extremely large volumes of unstructured data as well as analyse the data in real time to extract business insight.

Companies rely on business data to create personalised customer experiences, which leaves IT teams struggling with silos, complexity, and operational inefficiencies.

Options currently available do not offer secure, end-to-end solutions to run big data applications that can easily scale, Nutanix states.

As such, Nutanix's software focuses on scale, performance and simplicity, with built-in automation and simple operations, with the aim of enabling data scientists, security teams and businesses to focus on extracting value from data.

New features for running big data workloads include increased scale-out object storage with multi-cluster support, deeper storage nodes with up to 240TB of storage, and enhanced security.

In addition, Nutanix offers a subscription licensing model to enable greater flexibility.

Nutanix VP of product marketing Greg Smith says, “Every company is striving to become a data-driven company.

“By natively integrating object storage services with Nutanix’s HCI solution, IT teams can now leverage unused resources to reduce costs and streamline storage management and administration.

“Big data applications require incredible scale and performance at competitive cost structures. The Nutanix platform, with the addition of multi-cluster object storage, offers a compelling solution for unstructured object storage that leverages existing storage resources for improved storage economics.”

IDC storage team research director Amita Potnis says, “Digital transformation requires web-scale storage for enterprise workloads.

“Object storage is rapidly becoming the storage of choice for next gen and big data applications. As object storage makes the leap from the cloud to the data centre and mission critical workloads, economics must be balanced with performance.

“Nutanix is known for flexibility and simplicity. Multi-cluster support and Splunk SmartStore certification for Nutanix Objects will allow for the massive scale, at the right price and performance, that these workloads require.”

Furthermore, Nutanix Objects is now certified by Splunk as SmartStore compliant, allowing customers to simply and seamlessly manage Splunk data growth with Nutanix Objects.

Joint customers can now run Splunk workloads on Nutanix software, and leverage Nutanix Objects for built-in object storage to support their Splunk environment. New capabilities included in Nutanix Objects 2.0 are now generally available.
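Because SmartStore targets S3-compatible object stores, connecting Splunk to a store such as Nutanix Objects typically comes down to defining a remote volume in `indexes.conf`. The sketch below is illustrative only: the volume name, bucket, endpoint URL and credential placeholders are assumptions for the example, not values from the article or from Nutanix documentation.

```ini
# indexes.conf -- illustrative SmartStore sketch; the endpoint,
# bucket and credentials below are hypothetical placeholders.
[volume:nutanix_objects]
storageType = remote
path = s3://splunk-smartstore-bucket
remote.s3.endpoint = https://objects.example.internal
remote.s3.access_key = <access-key>
remote.s3.secret_key = <secret-key>

# Route indexes' warm data to the remote volume by default
[default]
remotePath = volume:nutanix_objects/$_index_name
```

In a SmartStore deployment, indexers keep hot buckets and a local cache while warm data lives in the object store, which is why an S3-compatible store can absorb Splunk data growth without expanding indexer-local storage.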
