5G demands a new and unified approach to data management
July 2, 2018 - Aleks
The move to Digital, 5G and IoT has generated significant pressure for operators to overhaul their data strategy, and fast. Ericsson forecasts that there will be in the region of 1.5 billion IoT devices with cellular connections by 2020. What’s more, Silicon Republic projects that, with the onset of 5G from 2019 onwards, mobile data traffic will surge eightfold to reach 110 exabytes per month by 2023. As is already evident, changes in technology, economics, business models and customer engagement practices have profoundly impacted operations, owing to inordinate data volumes, the embrace of Internet economics and the bespoke relationship management models operators need to prosper in the new Digital Economy.
Impending data complexity requires a proactive strategy
Until now, the focus has been on catching the wave of change through accelerated Digital Transformation efforts and 5G rollouts. Notably, the 5G business case has been largely driven at a use case level by IoT, VR/AR services and Super High Definition video services – all of which are presenting heightened interactive dynamics, demanding real-time actionable data. However, the attempt to stay at the forefront of these industry evolutions has meant that operators have failed to move from a reactive to a proactive position when it comes to effectively managing and leveraging the data complexity coming down the track.
The comprehensive management of such complex data environments demands that operators move away from point data solutions and siloed ecosystems towards a consolidated view of data and its consumption channels. In the converged networks of today, data flows originate from a myriad of upstream data sources, often with a myriad of conflicting format types. Ingested raw data in and of itself holds minimal value to the business until it is aggregated, correlated and enriched to form applicable data sets which can be consumed by downstream applications or consumers for either direct value delivery or automation purposes.
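As a minimal illustration, the aggregate-correlate-enrich step described above can be sketched in a few lines of Python. All field names, record formats and the plan lookup here are hypothetical, standing in for whatever upstream sources and reference data an operator actually holds:

```python
from datetime import datetime, timezone

# Hypothetical raw records from two upstream sources with conflicting formats.
cdr_record = {"msisdn": "353851234567", "ts": "2018-07-02T10:15:00Z", "bytes": "10485760"}
probe_record = {"subscriber": "+353851234567", "timestamp": 1530526500, "volume_mb": 10.0}

def normalise_cdr(rec):
    """Map a CDR-style record onto a common schema."""
    return {
        "subscriber_id": "+" + rec["msisdn"],
        "event_time": datetime.strptime(rec["ts"], "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc),
        "volume_bytes": int(rec["bytes"]),
    }

def normalise_probe(rec):
    """Map a probe-style record onto the same common schema."""
    return {
        "subscriber_id": rec["subscriber"],
        "event_time": datetime.fromtimestamp(rec["timestamp"], tz=timezone.utc),
        "volume_bytes": int(rec["volume_mb"] * 1024 * 1024),
    }

def enrich(rec, subscriber_plans):
    """Correlate a normalised record with reference data (here, a plan lookup)."""
    return {**rec, "plan": subscriber_plans.get(rec["subscriber_id"], "unknown")}

plans = {"+353851234567": "unlimited-5g"}
events = [enrich(normalise_cdr(cdr_record), plans),
          enrich(normalise_probe(probe_record), plans)]
# Both sources now share one schema and carry the enrichment attribute,
# ready for consumption by downstream applications.
```

The point of the sketch is the shape of the work, not the specifics: raw records from disparate sources only become an applicable data set once they are mapped onto one schema and joined with reference data.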
Put simply, we are facing a data precipice as operators brace to absorb the impending demands of 5G, the data volumes imminent on the back of IoT and the rapidly changing business models brought about by the move to Digital. Operators are entering a period of greater diversification than ever before – becoming the conduit for lifestyle utilities, international banking and original media, to name but a few – let alone meeting the pressing needs of the traditional consumer business.
5G and IoT driving intelligence to the edge
It is becoming acutely clear that a unified data fabric, which sits across the network, is the best means of resolving the data orchestration complexities experienced today. Forrester rightly foresees that the move to 5G and IoT will increasingly drive intelligence to the edge, compounding the need for a consolidated, 360-degree data integration layer to guarantee data completeness and decisioning integrity. As well as resolving today’s data management constraints, the move to 5G necessitates a data fabric approach to ensure that operators are strategically positioned to truly deliver on network slicing and the secure execution of 5G service orchestration.
So what is a data fabric and why should you care? As described by IDG, the concept of a “data fabric” is an approach to help organisations better deal with fast growing data, ever changing application requirements and distributed processing needs. The term references technology that creates a converged platform that supports the storage, processing, analysis and management of disparate data. Data that is currently maintained in files, database tables, data streams, objects, images, sensor data and even container-based applications can all be accessed using a number of different standard interfaces.
The inefficiency of realizing value from data today
Take, for example, the inefficiency of the current data processing models which exist across today’s networks. The delivery of a reliable and competitive service has become markedly more complex with the dawn of Network Functions Virtualisation (NFV) and the subsequent strides made in the realisation of Software Defined Networks (SDN). When it comes to gaining insights and deriving business value from complex data environments, it is estimated that up to 80 percent of data processing is consumed by the heavy lifting of basic data cleansing, normalisation and preparation tasks in order to ready the data and make it applicable to an operator business – all before the data sets themselves can be put to work to generate value outcomes for the business.
Steve Lohr of The New York Times stated: “Data scientists, according to interviews and expert estimates, spend 50 percent to 80 percent of their time mired in the mundane labour of collecting and preparing unruly digital data, before it can be explored for useful nuggets.” Though that estimate was put forth in 2016, the volume and veracity of the data flows now in play are set only to drive greater complexity and longer lead times in the realisation of operational efficiencies and actionable insights. This processing hindrance becomes more pronounced when we consider that most data analytics solutions are priced on the volume of data ingested, rather than the value outcomes delivered. Operators need to take back control in order to reduce these inordinate outlays and realise greater RoI for the business.
Today, across most markets, this acute problem is even more pronounced due to data privacy regulations. Data scientists are required to be much clearer and more purposeful about how they process data and, where the purpose is unknown at the exploratory phase, must apply robust data anonymisation. What is needed, then, is a data fabric that has already collected and cleansed both personal and anonymised data sets, supporting a functioning business by delivering on known use cases and exploratory data practices alike.
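One common building block for such exploratory-safe data sets is pseudonymisation: replacing direct identifiers with a keyed, irreversible token so records can still be correlated without exposing who they belong to. A minimal sketch in Python follows; the field names, the secret key and the list of identifiers to strip are all assumptions for illustration, and under regulations such as GDPR pseudonymised data generally still counts as personal data:

```python
import hashlib
import hmac

# Hypothetical secret key held inside the data fabric, never exposed to consumers.
PEPPER = b"rotate-me-regularly"

def pseudonymise(subscriber_id: str) -> str:
    """Replace a direct identifier with a keyed, irreversible token (HMAC-SHA256)."""
    return hmac.new(PEPPER, subscriber_id.encode(), hashlib.sha256).hexdigest()

def anonymise(record: dict) -> dict:
    """Produce an exploratory-safe view: direct identifiers dropped, pseudonymous key added."""
    safe = {k: v for k, v in record.items() if k not in ("msisdn", "imsi", "name")}
    safe["subscriber_token"] = pseudonymise(record["msisdn"])
    return safe

event = {"msisdn": "353851234567", "name": "J. Doe", "cell_id": "A1", "volume_mb": 10}
safe_event = anonymise(event)
# The same subscriber always maps to the same token, so usage patterns
# remain correlatable across records without revealing the identifier itself.
```

The design choice worth noting is the keyed hash: a plain unkeyed hash of an MSISDN is trivially reversible by enumerating the number space, whereas the HMAC construction forces an attacker to also hold the key.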
A data-driven operation drives competitive advantage
As of December 2017, Gartner hailed the role of the Chief Data Officer (CDO) as a linchpin in digital business transformation. While these data and analytics leaders were traditionally focussed on data governance, data quality and regulatory compliance, the CDO group has expanded its remit and is fast becoming the driving core in delivering tangible business value and enabling a data-driven culture across the Operator business.
Gartner suggests that by 2021, the office of the CDO will be a mission critical function. However, many believe that this is already the case for those on the front foot of competitive differentiation in the Digital Age. We are in the midst of a vast swathe of mammoth transformation projects across the industry in the move to Digital and 5G. These projects are now at a critical point, where incremental return and operational efficiencies must be realised while the pain of transforming continues to draw out. This requires an overhaul of traditional data strategies and the means by which value is distilled from data management practices, therein legitimising the pronounced role of the CDO.
In short, the move to Digital, 5G and IoT is poised to accelerate the need for a unified approach to data management across the business as operators brace themselves for greater data volumes, greater service diversity and greater network complexity than ever before. Data strategies must therefore be re-evaluated to meet these demands, to truly deliver on the data integration needs of the business and to realise the full potential of data analytics and AI – it’s now or never. Attempting to remodel the strategic approach to data in retrospect will be restrictive and the value limited; a proactive approach is required, and now is the opportune time to futureproof the move to Digital and 5G.
Written by Julia Hogarty, marketing manager of Openet Data and Cloud Group