What Is Data Analytics Used For?
Analytics offers many benefits to organizations as they embark upon digital transformation, including:
- Increasing efficiency and driving cost out of operations
- Maximizing customer satisfaction
- Developing new products and services
- Using streaming data to respond to issues and opportunities in near real time
The number of use cases made possible by data analytics seems limitless. On top of that, we are only now beginning to glimpse the potential of machine learning and other forms of artificial intelligence to open new frontiers of what organizations can achieve with data.
But the more we at Dell Technologies engage with customers on a variety of use cases, the more we learn that many are still struggling with the prerequisite task of getting data into their analytics environments so that they can deploy the use cases they want. This task is called ETL, or “extract, transform, load”, and it can be defined as the process of reading data from one or more data sources, transforming that data into a new format, and then either loading it into another repository (such as a data lake) or passing it to a program[1].
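To make that pattern concrete, here is a minimal sketch of the three ETL stages in Python. It is illustrative only: the sample records, field names, and output path are hypothetical, and the snippet stands in for what a production ETL engine does at far greater scale.

```python
import csv
import io
import json

# Hypothetical raw export from a source system (in practice this would be a
# file, database query, or streaming feed rather than an inline string).
RAW_CSV = """subscriber_id,bytes_down,bytes_up,timestamp
1001,524288,65536,2023-01-15T10:00:00
1002,1048576,131072,2023-01-15T10:00:00
"""

def extract(raw):
    """Extract: read records from the source format (CSV here)."""
    return csv.DictReader(io.StringIO(raw))

def transform(records):
    """Transform: cast types, derive fields, and rename to a target schema."""
    for rec in records:
        yield {
            "subscriber": rec["subscriber_id"],
            "total_mb": (int(rec["bytes_down"]) + int(rec["bytes_up"])) / 1_048_576,
            "observed_at": rec["timestamp"],
        }

def load(records, path):
    """Load: write the transformed records to a repository (JSON Lines here)."""
    with open(path, "w") as out:
        for rec in records:
            out.write(json.dumps(rec) + "\n")

if __name__ == "__main__":
    load(transform(extract(RAW_CSV)), "usage_summary.jsonl")
```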
Dell Technologies and its telecom data analytics ISV partner, Cardinality, have been working together to help customers resolve complex ETL issues so that they can do the kind of analytics they want. What follows are real-world examples that illustrate three key pain points customers tend to experience with ETL, and how we have helped resolve them.
Data Myopia
Telcos sit on a wealth of data, but organizational or technical barriers can often make it difficult for data engineers and data scientists to gain access to the data they need. The data analytics team at one tier-1 telecom operator faced just such a challenge. Able to access data only from the IT environment, the team couldn’t get the data they needed to start answering questions about the factors that influence customer satisfaction. To solve this problem, Cardinality conducted a pilot on a small footprint of Dell PowerEdge servers to demonstrate to the Network Operations team the value that could be unlocked with a simple use case: device analytics. Within days of configuring its ETL Engine to ingest data from the operator’s network probes, Cardinality was able to produce a real-time dashboard of all the mobile phones and other devices on the network, showing vital information such as the types of SIM cards the devices were using and which devices could be upgraded to 4G. The operator then built on this initial use case to create a more complex Network Customer Experience use case that delivers measurable business benefits, using machine learning to analyze more than 350 network KPIs to predict and prevent customer churn.
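The operator’s actual churn model is proprietary, but the following sketch shows the general shape of the approach: train a classifier on a table of per-subscriber network KPIs and score subscribers by their predicted probability of churning. The data here is synthetic, the feature count is far smaller than the 350 KPIs mentioned above, and the threshold is invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_subscribers, n_kpis = 5000, 20  # stand-in for the ~350 KPIs mentioned above

# Synthetic KPI matrix (e.g., dropped-call rate, latency, throughput, ...).
X = rng.normal(size=(n_subscribers, n_kpis))

# Synthetic churn labels loosely driven by the first few KPIs, plus noise.
risk = X[:, 0] + 0.5 * X[:, 1] - 0.8 * X[:, 2] + rng.normal(scale=0.5, size=n_subscribers)
y = (risk > 0.7).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

churn_probability = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, churn_probability), 3))

# Subscribers above a chosen risk threshold would be flagged for proactive
# retention actions before they actually churn.
print("Flagged subscribers in test set:", (churn_probability > 0.5).sum())
```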
Creeping Complexity
New technology spaces typically offer developers a wealth of tools to choose from. Many tools, both open source and proprietary, exist in the world of data analytics (e.g., Informatica, Talend, Kafka, StreamSets, Apache NiFi, Airflow, and many more). While choice can be good, the use of too many tools by too many different people in a single environment can make management a costly ordeal.
One telecom operator that Dell Technologies recently worked with had fallen victim to the creeping complexity that can be introduced when there is too much choice and too little control. Over time, different developers adopted whatever “flavor of the month” tools looked interesting to them, until it became next to impossible to debug existing use cases or build new ones.
Dell Technologies and Cardinality were able to quickly clean things up with the Cardinality ETL Engine, which provides an elegant and easy-to-maintain mechanism for ingesting data. The result is that the operator is now able to build use cases without having to worry about the complexity of ETL.
Data Indigestion
A variation on this theme is the complexity of the data sources themselves.
Dell Technologies helped another customer that was saddled with keeping up with a variety of data formats from different network probes. Managing multiple probes is complicated by the fact that probe vendors occasionally change their data formats, requiring rework and telecom expertise to convert the data into the formats used for analytics. An additional problem is that some older, proprietary data formats can’t always be used with newer ingestion tools, introducing latency and performance limitations. This ingestion “indigestion” can limit the kinds of real-time use cases that can be put into production.
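One common way to tame this kind of variability (not necessarily how the Cardinality ETL Engine implements it internally) is to put a thin normalization layer in front of analytics: one parser per probe vendor or format version, each mapping raw records onto a single common schema. The vendor names, delimiters, and fields below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ProbeRecord:
    """Common schema that downstream analytics consumes, regardless of vendor."""
    imsi: str
    cell_id: str
    latency_ms: float

def parse_vendor_a(line: str) -> ProbeRecord:
    # Hypothetical vendor A: pipe-delimited, latency reported in milliseconds.
    imsi, cell_id, latency = line.split("|")
    return ProbeRecord(imsi=imsi, cell_id=cell_id, latency_ms=float(latency))

def parse_vendor_b(line: str) -> ProbeRecord:
    # Hypothetical vendor B: comma-delimited, latency reported in microseconds.
    fields = line.split(",")
    return ProbeRecord(imsi=fields[0], cell_id=fields[2], latency_ms=float(fields[3]) / 1000)

# Registry of parsers; supporting a new probe or a format revision means adding
# or updating one entry here instead of reworking every downstream use case.
PARSERS = {
    "vendor_a": parse_vendor_a,
    "vendor_b": parse_vendor_b,
}

def normalize(source: str, line: str) -> ProbeRecord:
    return PARSERS[source](line)

if __name__ == "__main__":
    print(normalize("vendor_a", "310150123456789|C17|42.5"))
    print(normalize("vendor_b", "310150123456789,extra,C17,42500"))
```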
By modernizing the customer’s environment with the Cardinality ETL Engine, we were able to relieve the customer of the headache of managing a multitude of data sources and to vastly improve streaming performance. The number of data records ingested and parsed per day increased from 9 billion to 23 billion, and the number of files discarded due to format quality issues dropped to nearly zero.
“Plumbing” Matters
The Dell Technologies Service Provider Analytics with Cardinality solution dramatically reduces customers’ data ingestion pain points with an ETL Engine that allows customers to:
- Get into production fast with “out-of-the-box” ETL functionality that is purpose-built for telecom environments
- Collect streaming and non-streaming data with low latency and high throughput
- Lower OPEX by reducing the resources needed to manage multiple data formats from different sources
- Scale from small to huge on a unified data analytics platform
Dell Technologies and Cardinality offer customers a Kubernetes-based microservices platform that spans the data pipeline from ETL to analytics to prebuilt telecom use cases. The platform is tuned to run on scalable, high-performance Dell PowerEdge clusters and is integrated with Dell Isilon and Pivotal Greenplum. Together, Dell Technologies and Cardinality are committed to ensuring customers can make the most of data analytics.