Updated AtScale platform streamlines analysis of big data

Source: searchbusinessanalytics.techtarget.com

The latest AtScale platform update, released Wednesday, aims to simplify data searches and speed up the process of data analysis.

With the rollout of its 2019.2 platform, AtScale is attempting to make accessing and analyzing big data simpler and faster across both different databases and business intelligence platforms.

AtScale is a data virtualization warehouse vendor that was founded in 2013. The AtScale platform serves as a conduit between data lakes and other data sources such as Teradata, Oracle, Snowflake, Redshift, BigQuery, Greenplum and Postgres, and data analysts doing BI on platforms from Microsoft, Tableau, Qlik and other vendors.

In addition to faster performance, key components of the updated AtScale platform include augmented semantic intelligence in time-series and time-relative analysis, and stronger security, particularly for Tableau Server Impersonation.

Using machine learning and augmented intelligence capabilities, the AtScale platform is now able to query data and receive the relevant information in just seconds, according to the vendor.

“AtScale offers an important technology capability,” said Boris Evelson, an analyst at Forrester. “With information stored in data lakes, and BI platforms existing outside those data lakes, AtScale connects the two. AtScale is bringing the BI to the data.”

He added, “It is definitely a trend to bring BI to where the data is.”

Bringing BI to the data fast is where the new AtScale platform looks to improve over previous iterations.

With the latest AtScale platform, according to Matthew Baird, co-founder and chief technology officer of AtScale, effectively structured queries can cut query times from 30 minutes on traditional cloud data warehouses to less than 10 seconds.

“At AtScale we’re trying to be really smart about how we structure the query,” Baird said.

That, he said, comes down to the algorithms AtScale uses and the quality of its ML and AI capabilities.

Baird, who noted that the need for trained data analysts far exceeds the number available, said AtScale is trying to build the capabilities of a data engineering team into the AtScale platform itself.

“When a query comes in, the team in the server can make decisions, and it can rewrite the query as needed,” he said.
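AtScale has not published the internals of this query rewriting. A minimal sketch of the general aggregate-aware rewriting idea Baird describes might look like the following; the table names, registry structure, and `rewrite` function are all hypothetical illustrations, not AtScale's actual implementation:

```python
# Hypothetical sketch of aggregate-aware query rewriting: if a query's
# grouping columns and measures are all covered by a pre-computed summary
# table, answer it from that small summary instead of scanning raw fact rows.

# Registry of pre-aggregated tables: name -> (group-by columns, measures).
AGGREGATES = {
    "sales_by_region_day": ({"region", "day"}, {"revenue", "units"}),
}

def rewrite(fact_table, group_by, measures):
    """Return the name of the cheapest table that can answer the query."""
    for agg_name, (agg_dims, agg_measures) in AGGREGATES.items():
        if set(group_by) <= agg_dims and set(measures) <= agg_measures:
            return agg_name  # covered: route the query to the summary table
    return fact_table  # not covered: fall back to the raw fact table

# A query on region/revenue is covered by the summary table...
print(rewrite("sales_fact", ["region"], ["revenue"]))
# ...but a query on a dimension the summary lacks falls back to raw data.
print(rewrite("sales_fact", ["customer_id"], ["revenue"]))
```

The speedup in such a scheme comes entirely from the summary table being orders of magnitude smaller than the fact table it replaces for covered queries.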

The newest AtScale platform update comes just six weeks after the company released version 2019.1, which expanded support for cloud data transformation for on-premises, hybrid cloud and multi-cloud users.

Dave Menninger, senior vice president at Ventana Research, noted that while sometimes viewed in the same category as cloud data warehouses such as Snowflake and Google BigQuery, the AtScale platform builds out from what enterprises already have stored in the cloud.

AtScale “focuses on creating and managing the metadata needed to access and optimize queries of large and diverse data sources,” Menninger said.

Meanwhile, Wayne Eckerson, president of The Eckerson Group, said AtScale is finding its place as a platform that supports virtually any database and BI technology while giving users a consistent view of business data.

AtScale’s focus on the universal semantic layer – a means of abstracting and simplifying complex data into common business terms – is a good position for AtScale because all BI vendors can support a semantic layer, Eckerson said.
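The semantic-layer idea can be illustrated with a small sketch: a single model maps business terms to physical columns, and per-backend SQL is generated from it so every BI tool sees the same names. The model, table names, and `to_sql` function below are hypothetical examples of the pattern, not AtScale's API:

```python
# Hypothetical sketch of a universal semantic layer: one business-term
# model, translated into dialect-specific SQL so different BI tools and
# databases all see the same "Revenue" and "Region" definitions.

# Business terms mapped to physical SQL expressions (example schema).
SEMANTIC_MODEL = {
    "Revenue": "SUM(f.amount)",    # a measure
    "Region": "d.region_name",     # a dimension
}

# Identifier quoting differs per backend dialect.
DIALECT_QUOTE = {"postgres": '"', "bigquery": "`"}

def to_sql(measure, dimension, dialect):
    """Generate a grouped query from business terms for one backend."""
    q = DIALECT_QUOTE[dialect]
    return (
        f"SELECT {SEMANTIC_MODEL[dimension]} AS {q}{dimension}{q}, "
        f"{SEMANTIC_MODEL[measure]} AS {q}{measure}{q} "
        f"FROM fact_sales f JOIN dim_geo d ON f.geo_id = d.geo_id "
        f"GROUP BY {SEMANTIC_MODEL[dimension]}"
    )

# The same business question, rendered for two different warehouses.
print(to_sql("Revenue", "Region", "postgres"))
print(to_sql("Revenue", "Region", "bigquery"))
```

Because every tool queries through the same model, a change to a definition such as `Revenue` propagates everywhere at once, which is the "single source of truth" benefit Eckerson and Baird describe.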

That, according to Baird, has been a significant attraction for customers.

“We’re seeing customers say they’re on a journey from a bunch of data sources to a single source of truth, and you guys can do that,” Baird said.
