Published April 25, 2022

Meeting the Moment: How 5G Fuels Hyperscale Data Analytics

5G has accelerated telcos’ need to adopt more performant frameworks and technologies for network security, predictive modeling, IoT and sensor-driven data analytics, location-based monitoring, and more.

By Mara Crisafulli, Enterprise Account Executive at Ocient

Consumers associate 5G cellular services with huge bandwidth, speed, and functionality boosts. But for the global telecom industry, establishing these next-generation services wasn’t solely about investing in gear, such as 5G cell towers. 5G propelled carriers into a realm of data analytics that has now reached hyperscale – the need to analyze trillions of records or more in seconds to minutes for a near real-time picture of activity and network intelligence.

Between 5G’s march to one billion smartphone subscriptions this year and a tsunami of 18 billion IoT endpoints, telcos face an exponential rise in data volume, variety, and velocity. For analytics teams, all that data glitters like gold, but managing and processing it efficiently will require new approaches. (1)

Though 5G is a milestone, carrier data has always been enormous and widely distributed — requiring these global firms to manage and monitor data sprawl, security, and privacy issues. Telecommunications providers harvest data to develop insights to increase sales, optimize products and customer experiences, generate new revenue streams, and enhance network reliability and security. Data warehouses coupled with business intelligence tools have helped these carriers mine mountains of network metadata for valuable insights to sharpen forecasts and troubleshooting efforts.

Yet, for various reasons, including data privacy regulations and legacy storage networks, data silos continue to impede progress, and real-time data analytics on petabyte-scale data sets remains a work in progress for many carriers. Data silos may inhibit analytics teams from continuously correlating data across multiple business units. And though telcos remain robust adopters of AI and machine learning technologies, they will need to scale their operations to leverage hyperscale opportunities. To achieve this transformation and futureproof themselves for continued growth, the industry must embrace a modern approach to data management and analytics that balances cost efficiency with performance at massive scale.

The Lines are Open

Telecom companies have more than a handful of compelling reasons to pursue hyperscale data analytics programs. Here are some of the most attractive, if challenging, hyperscale opportunities for carriers:

  • Customer churn — Telecom companies need to quickly understand when, where, and why they lose customers — and they don’t want to wait until subscribers have already left. McKinsey estimates that with improved, predictive analytics, telcos “can reduce customer churn by as much as 15%.” (2)
  • Robocall fraud — Telcos analyze traffic patterns to understand and block predatory behaviors and protect their customers. This is one of many risk management applications that have become essential to cost containment and subscriber satisfaction.
  • Network performance — Telecom companies have a limited number of skilled technicians. Where should they send them to maximize efficiency and improve network performance? With predictive analytics, carriers can improve capacity models, save money, and prevent outages or service disruptions.
  • Network security — Hyperscale data analytics solutions can help cyber teams prevent network vulnerabilities by identifying suspicious activity and recommending proactive security measures.
  • Lawful intercept — Telecom companies are tasked with mining petabytes of network metadata to find needle-in-the-haystack Internet or call detail records for law enforcement agencies investigating digital crimes.
  • Call center outcomes — Customer contact centers remain major cost centers. According to McKinsey, telcos can use advanced analytics to predict incoming tickets and improve time-to-resolution by 15%. (3)
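At its core, the robocall use case above is traffic-pattern analysis: count calls per source over a time window and flag outliers. The sketch below illustrates the idea in miniature with hypothetical call detail records and a made-up threshold — a real carrier system weighs many more signals (answer rates, call durations, number reputation) over trillions of records, not a Python list.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical call detail records (CDRs): (caller, callee, timestamp).
# Real CDRs carry far more fields; this sketch keeps only what the
# volume-per-window check needs.
cdrs = [
    ("+15550100", f"+1555{i:04d}",
     datetime(2022, 4, 25, 9, 0) + timedelta(seconds=i))
    for i in range(120)          # one number blasting 120 calls in 2 minutes
] + [
    ("+15550199", "+15550042", datetime(2022, 4, 25, 9, 1)),  # ordinary call
]

def flag_robocallers(records, window=timedelta(minutes=5), threshold=100):
    """Flag callers whose call count inside `window` exceeds `threshold`."""
    start = min(ts for _, _, ts in records)
    in_window = [caller for caller, _, ts in records if ts - start <= window]
    counts = Counter(in_window)
    return {caller for caller, n in counts.items() if n > threshold}

suspects = flag_robocallers(cdrs)
print(suspects)  # only the high-volume caller is flagged
```

The same group-count-filter shape applies to churn signals and ticket prediction; what changes at hyperscale is that the aggregation must run continuously, in the database, over streaming data.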

The path to developing long-term hyperscale data analytics solutions for future-forward telecommunications providers begins with several key considerations:

  • Can the solution scale? With billions of IoT devices, new edge networks, and about one billion 5G phones in use, scale is job number one. Scale matters not only for supporting a growing volume of data, but also for supporting multiple workloads and users. Access to data at scale, security at scale, building efficient data pipelines, and integrating data at scale — these are all important challenges to tackle when upgrading to a modern data analytics solution built for hyperscale datasets. No one is starting entirely from scratch, and few would start down this road unescorted by highly experienced partners.
  • Does it excel at geospatial analysis? The amount of new intelligence and opportunity coming from objects in motion continues to grow rapidly. Ensuring your data analytics solution can harness a variety of data types, including temporospatial data, is critical to developing competitive products and services, enhancing customer experience, and protecting the network from future vulnerabilities. Harnessing data from a growing number of disparate sources and locations in a streamlined way is critical to future success.
  • How does it perform? Latency here can make or break a customer experience or a preventative action. At hyperscale, telcos must analyze trillions of records in seconds, join massive tables with billions of rows, and develop insights from data in flight. System availability and uptime are critical to success, and organizations need to achieve high performance within an affordable cost structure.
  • What is the true cost? Beyond the rack rate, what will it cost to run complex, continuous analytics against your hyperscale data set? If you’re using a Database Transaction Unit (DTU) approach, evaluate whether the solution will remain sustainable and affordable as it scales. Is there a long-term data retention charge? Are you charged every time you run a query? Egress charges? Is there ambiguity around the extract, transform, and load (ETL) work needed to get data into the system? Determine whether the solution enables you to track cost anomalies and take prompt corrective action.
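To make the geospatial criterion above concrete: a common temporospatial query is a radius filter — which devices reported within N kilometers of a cell site? The sketch below shows the underlying great-circle math with hypothetical coordinates and device IDs; in production this predicate runs inside the database, indexed, across billions of rows rather than a three-element list.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Hypothetical device location reports: (device_id, lat, lon).
records = [
    ("dev-001", 41.8827, -87.6233),  # near downtown Chicago
    ("dev-002", 41.8800, -87.6300),
    ("dev-003", 40.7128, -74.0060),  # New York — far outside the radius
]

tower = (41.8781, -87.6298)  # hypothetical cell-site coordinates

nearby = [
    dev for dev, lat, lon in records
    if haversine_km(lat, lon, *tower) <= 5.0  # 5 km radius
]
print(nearby)  # the two Chicago devices; the New York one is excluded
```

Joining such filters against subscriber or network tables with billions of rows is exactly where the performance and cost questions in the list above start to bite.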

What’s the first step telcos must take to reach their hyperscale analytics program objectives? Settle on a proven hyperscale solutions architecture. Sometimes that means letting go of solutions that work fine for analyzing small data sets but break down at hyperscale or simply aren’t priced to scale.

Telcos have leveraged multiple tools to tackle hyperscale data analytics challenges in the past, often with underperforming solutions that proved costly when put to the test at hyperscale. Ocient provides a Compute Adjacent Storage Architecture™ (CASA), a hyperscale ETL service, and optimized indexing designed to deliver massively parallelized transformation, loading, storage, and analysis for complex data types, including IPFIX records, 10 to 50 times faster than other solutions.
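The IPFIX analysis mentioned above reduces, at its core, to grouping and aggregating flow records. A minimal sketch of that shape follows, using hypothetical, already-decoded records — actual IPFIX parsing follows the RFC 7011 wire format and is far more involved, and a hyperscale engine performs this aggregation during or immediately after load rather than in application code.

```python
from collections import defaultdict

# Hypothetical, pre-parsed IPFIX-style flow records (addresses and byte
# counts invented for illustration).
flows = [
    {"src": "10.0.0.1", "dst": "198.51.100.7", "bytes": 5_000},
    {"src": "10.0.0.1", "dst": "198.51.100.7", "bytes": 7_000},
    {"src": "10.0.0.2", "dst": "203.0.113.9", "bytes": 1_200},
]

def top_talkers(records, n=10):
    """Aggregate byte counts per (src, dst) pair; return the heaviest pairs."""
    totals = defaultdict(int)
    for r in records:
        totals[(r["src"], r["dst"])] += r["bytes"]
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

print(top_talkers(flows))  # heaviest (src, dst) pair first
```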

Knowing that Telco providers may opt for on-prem as well as cloud-hosted deployments, Ocient offers ultimate flexibility to deploy in the customer’s data center as well as in the cloud, including hosted in OcientCloud. And because engineering any hyperscale data solution is no easy feat, Ocient removes the resource-intensive requirements to stand up a new system by engineering end-to-end solutions tailored to each customer use case.

For hyperscale analytics at an affordable price point, check out Ocient and get started on the path to doing more with your data, trillions of records at a time.

Endnotes: