January 31, 2011 - Openet
Many operators are investigating models that allow them to better manage the data deluge, and many eyes have fallen upon next generation technologies. But with so much legacy infrastructure still in use, congestion will remain a key issue for some time.
It’s well known that mobile broadband has really taken off in those markets where operators have deployed flat rate pricing. But some of these “unfettered” users tend to generate vast amounts of traffic, resulting in a poor end user experience when networks are congested. And because revenues are capped, the pressure piles on operators’ margins and creates uncertainty over business models.
In a sense, the mobile industry is experiencing the same problem which faced the fixed line operators some ten years ago: the dreaded “heavy users,” who take advantage of flat rate pricing and hammer the network for all they can. Figures released by infrastructure vendor Nokia Siemens Networks (NSN) on data gathered from commercial HSDPA networks over 2010 indicate that fewer than five per cent of mobile broadband customers use more than 90 per cent of the available bandwidth.
Jonathan Earle, head of consumer marketing at O2 UK, the first UK carrier to push the iPhone, added the operator’s perspective on the phenomenon recently when he said: “Data consumption has doubled every six months, which is just an unsustainable business model… 20,000 or so heavy data users on our network are consuming about 30 per cent of all our data capacity.”
As a result, many operators have begun investigating models that allow them to better manage this traffic, improving quality of service while at the same time increasing revenues and lowering costs. When looking for ways to deal with the explosion in data usage, many eyes in the industry fall upon the next generation of bigger, faster, better technologies.
But with so much legacy infrastructure still in use, congestion will remain a key issue. So the options boil down to three approaches to alleviating the strain: move to LTE; employ some kind of offload strategy (femtocells) or network sharing; or deploy optimisation technologies. The main consideration here is that the traffic problem for many operators has already arrived, or will do very soon, while LTE is expensive and has a long rollout cycle. An offload strategy may be more affordable and quicker to roll out, but it will still take some time to complete. Which leaves optimisation, an approach that is cheaper still and could potentially be deployed within weeks or months.
Optimisation is not a new technology. Indeed, it has existed for at least as long as the internet has carried traffic. But with the arguments about net neutrality rumbling along in the background, it does remain one of the most controversial elements in the industry.
Compression technologies are widely deployed by operators to squeeze more efficiency out of mobile web browser traffic. In November, Norwegian software firm Opera, which develops the world’s most popular mobile browser, opened a datacentre in Iceland to help it compress and manage all the web traffic for more than 71 million monthly users of Opera Mini. Opera Mini compresses data by up to 90 per cent before sending it to the phone, resulting in more rapid page loading and more web per MB for the end user, the company said.
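Opera Mini’s actual pipeline is proprietary, but the proxy model it relies on can be sketched with a standard codec: the datacentre compresses the page before transmission and the thin client on the handset decompresses it before rendering. A minimal illustration in Python, with zlib standing in for whatever compression Opera really uses:

```python
import zlib

def proxy_compress(payload: bytes, level: int = 9) -> bytes:
    """Server side: compress the page before sending it over the air."""
    return zlib.compress(payload, level)

def handset_decompress(blob: bytes) -> bytes:
    """Client side: decompress on the handset before rendering."""
    return zlib.decompress(blob)

# Repetitive markup compresses extremely well, which is how proxy
# browsers can report savings approaching 90 per cent on typical pages.
page = b"<div class='item'>hello</div>" * 500
wire = proxy_compress(page)
saving = 1 - len(wire) / len(page)
assert handset_decompress(wire) == page
```

The saving depends entirely on the content: markup-heavy pages shrink dramatically, while already-compressed media such as JPEGs or video barely shrink at all, which foreshadows the video problem discussed below.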
Canadian vendor Research In Motion (RIM) on the other hand, elected to build its own network to ensure the traffic to and from its BlackBerry users was not only free flowing but also secure. While the service is a hit with its end users, the model has landed RIM in hot water with certain sensitive governments, which would like access to their citizens’ communications.
The thing is, with the growing popularity of smartphones, less and less of that mobile data traffic is actually coming from web browsers. More and more is in-application traffic, and the vast majority of that is video. “In a typical day, the typical mobile phone user finds that they’ve spent the entire day on the internet but they haven’t spent any time on the web and this is an important paradigm for operators to deal with,” said Chris Hoover, VP of product management at transaction management firm Openet.
This rampant consumption can be explained in part by the growing adoption of mobile video. An oft quoted statistic from Cisco’s Visual Networking Index, which tracks and forecasts bandwidth consumption, predicts that video will account for 66 per cent of global mobile data traffic by 2014, growing at a compound annual growth rate (CAGR) of 131 per cent from 2009 to 2014.
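To put that CAGR in perspective, a 131 per cent compound annual growth rate over the five years from 2009 to 2014 multiplies traffic roughly 66-fold, as a quick calculation shows:

```python
# Sanity-check the Cisco VNI figure: 131 per cent CAGR compounds
# over five annual periods between 2009 and 2014.
cagr = 1.31
years = 2014 - 2009
growth_factor = (1 + cagr) ** years
print(f"{growth_factor:.0f}x growth over {years} years")  # roughly 66x
```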
Derek McManus, CTO of O2 UK, famously said that “Watching a YouTube video on a smartphone can use the same capacity on the network as sending 500,000 text messages simultaneously.” What throws a spanner in the works for operators, and has the potential to rub the net neutrality folks up the wrong way, is that video cannot be compressed in the same way as other traffic, or not without substantial quality loss anyway. So optimisation methods must move with the times.
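The McManus comparison is plausible on the back of an envelope. A single GSM text message carries at most 140 bytes of payload, so 500,000 of them amount to about 70MB, which is in the region of a half-hour mobile video stream at roughly 300kbps (illustrative figures, not O2’s own arithmetic):

```python
# Rough arithmetic behind the 500,000-texts comparison.
sms_payload_bytes = 140        # maximum payload of one GSM SMS
sms_count = 500_000
total_mb = sms_payload_bytes * sms_count / 1_000_000
print(f"{sms_count:,} texts ~= {total_mb:.0f} MB")  # about 70 MB

# For comparison: ~30 minutes of video at 300 kbps.
video_mb = 300_000 / 8 * (30 * 60) / 1_000_000
print(f"30 min at 300 kbps ~= {video_mb:.0f} MB")   # about 68 MB
```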
Georges Antoun, Ericsson’s head of product area IP & broadband networks, warned that while he’s seeing growing usage of broadband network optimisation, the type of traffic the network is carrying has changed and the management method must change with it.
“Video is now the dominant traffic type. If you asked the same question five years ago it would have been peer to peer (P2P) and that’s what used to kill the network. But now 90 per cent of all IP traffic is video and the characteristics of video are forcing us to deal with the network differently,” he said. “As a consumer I can deal with delays and latency in most traffic types, but I can’t deal with a lot of delays in video, so you have to deal with video differently.”
Antoun said that optimisation was a technology that used to fall under the umbrella of convergence or some form of traffic shaping but is now talked about as network intelligence, in terms of distributed architecture and the collapsing of optical and IP technologies and policies. “We need to understand who the users are in the network and what they’re trying to do,” Antoun said. “Intelligence is driving efficiencies in infrastructure but it’s also about driving applications and value added services, and the brokering of these services with the end user.”
But while every vendor and its dog seems to be getting into the traffic management space to help operators deal with the data deluge, Dean Bubley, founder of Disruptive Analysis, is concerned that most of what’s on offer at present represents a siloed approach to management. “What I’m expecting to emerge over the next couple of years is a smarter and more holistic approach, where different elements within the network and billing and charging system co-operate to make a much more user friendly, application friendly and network friendly way of managing traffic,” he said.
Patrick Lopez, chief marketing officer of Vantrix, agrees: “We implemented optimisation technology in silos as and when it was needed. So at first we started with ringtones and wallpapers, then messaging as consumption of multimedia content evolved, then TV on demand and now video streaming,” he says. “This change in content consumption has changed our traction in the market. We’ve always had video optimisation technology but it’s only over the last 12 months that the market has ‘caught up’ with us in needing this. Most new customers are for video optimisation,” Lopez said. End users see more problems with video than with messaging because of its live nature, and part of the problem with optimisation is that many operators don’t have the measurement tools in place to see what the quality of experience is like with and without optimisation. Lopez posits that most users would accept lower video quality over continuous interruptions or buffering.
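Lopez’s point, that viewers prefer a lower bitrate to constant rebuffering, is the rationale behind adaptive bitrate selection. A toy sketch of the trade-off (the rendition ladder, the 20 per cent headroom factor and the function name are all hypothetical, not any vendor’s algorithm):

```python
def pick_rendition(bandwidth_kbps: float, renditions_kbps: list) -> int:
    """Choose the highest bitrate that leaves ~20% headroom below the
    measured bandwidth, trading picture quality for stall-free playback."""
    fitting = [r for r in sorted(renditions_kbps) if r <= bandwidth_kbps * 0.8]
    return fitting[-1] if fitting else min(renditions_kbps)

# On a congested cell measuring 1 Mbps, step down from the 1.2 Mbps
# rendition to 500 kbps rather than let the player stall.
assert pick_rendition(1000, [250, 500, 1200]) == 500
```

When even the lowest rendition exceeds the available bandwidth, the sketch simply picks the minimum; a real player would also drain its buffer and eventually stall, which is exactly the experience Lopez says users reject.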
To come back to intelligent optimisation: such an approach to delivering a quality of service (QoS) strategy would allow the creation of new business models and open new partnership opportunities that create additional revenue or lower costs. For instance, an over the top (OTT) video application provider that relies on the quality of the connection might be willing to pay the operator for a guaranteed, prioritised connection when the network is constrained and the customer is trying to access the application.
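Such a paid-priority arrangement boils down to a simple policy rule: under congestion, flows belonging to a paying partner get a guaranteed bearer while everything else stays best effort. A sketch of that business logic (the names and the partner register are hypothetical, not any operator’s actual policy engine):

```python
from dataclasses import dataclass

@dataclass
class Flow:
    subscriber: str
    app: str

# Hypothetical register of OTT partners that pay for priority.
PRIORITY_PARTNERS = {"ott-video-inc"}

def bearer_class(flow: Flow, cell_congested: bool) -> str:
    """Assign a delivery class to a flow under current cell load."""
    if cell_congested and flow.app in PRIORITY_PARTNERS:
        return "guaranteed"   # paid-for priority only matters under load
    return "best-effort"

assert bearer_class(Flow("alice", "ott-video-inc"), cell_congested=True) == "guaranteed"
assert bearer_class(Flow("bob", "other-app"), cell_congested=True) == "best-effort"
```

Note that when the cell is uncongested everyone gets the same treatment; the commercial (and net neutrality) controversy arises only in the congested case, where the paying partner’s traffic is explicitly favoured.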
Of course this is one of the most controversial developments in QoS policy, and at the moment there is no consensus on it among mobile operators. One large operator confirmed to Informa Telecoms & Media that it always gives a consistent user experience in mobile broadband and would not differentiate. In contrast, another large operator confirmed that QoS-based customer segmentation will be implemented and that it is not necessary to let customers know that some of them will have a better experience than others.
Some of the loudest detractors of this concept, the supporters of net neutrality, are internet powerhouses like Google, Yahoo and Microsoft, which have established a model that some consider a great threat to the mobile operator community in that it establishes a direct consumer relationship and disregards the pipe used to maintain that relationship. So what operators have to capitalise on, as well as using optimisation techniques in a smart fashion, according to Constantine Polychronopoulos, chief technology officer of mobile internet infrastructure specialist Bytemobile, is user data.
“The operators have information about the subscriber that no other entity in the internet environment can have; for example, they know everything the subscriber has done over the lifetime of their subscription and the location of each event. They don’t have to let this data outside of their networks, so they are very well positioned to win the race for the mobile internet,” he says.