Whether you are streaming the latest boxing match or watching the most recent Game of Thrones episode during its first run, the network is going to play a major role in determining your quality of experience like never before. There is unprecedented demand for content, and with the proliferation of smart devices capable of displaying video, every pair of eyeballs on the internet is a potential consumer.
The widespread availability of video combined with streaming technology means our viewing habits now place much greater demands on the infrastructure. We expect content to be there, wherever and whenever we want it. We want to watch it, pause it, play it in slow motion, watch it again, analyze it, save it for later, share it with friends. We want to watch it at home, on the train, or at work – because there’s always the underlying risk of spoilers, or of missing out on the next-day office discussion should we be unable to view it.
There have been some very visible failures when streaming high-demand content. Just last month, Game of Thrones fans expressed their disappointment about the season opener. The viewing experience just wasn’t what many anticipated. Because the seventh-season premiere was made available simultaneously regardless of time zone, many viewers in the Americas decided to stream it. Some ended up suffering through a failed website and a streaming experience full of delays and buffering – an experience that just didn’t meet expectations for a show with such high production values. They took to social media to share their disappointment and thoughts on the matter. Even more recently, the poor streaming experience some viewers had during the pay-per-view Mayweather-McGregor fight has resulted in legal action.
But could these quality-of-experience failures have been avoided? You may look at other “over-the-top” (OTT) services and wonder why these problems haven’t been wrung out of the infrastructure by now. There are clear challenges in live-streaming an event. Live coverage requires constant, uncongested connectivity and widely available capacity. Feeds can be made multiply redundant, and traffic engineering tools can be used to provide the highest probability of success. Even then, demand for any given event is notoriously hard to predict, and if a live event does unexpectedly “go viral,” congestion, buffering delays, and outright overload and system failures can occur. This is particularly true in static legacy networks, where the data pipes are only so big and there is significant oversubscription at the network edges. The underlying technology just wasn’t designed for this level of streaming.
Difficulties streaming Game of Thrones are likely just the tip of an approaching iceberg that will only grow as the trend toward wherever-whenever content consumption continues. According to some estimates, US pay-TV providers lost about 1.9 million subscribers in 2016, while OTT services picked up about 900,000. The growth of streaming shows every sign of continuing. Disney recently announced it is pulling its content from Netflix with the aim of creating its own streaming services. Disney brings with it Pixar and ESPN, the latter of which dominates the live sporting landscape and which Disney has said will be offered as an OTT service from 2018 onwards – therein lies a glimpse of the future.
When the majority of us cut the cord, what will happen when we’re all watching Game 7 of the NBA Finals over broadband? Or the Olympic 100m final? Or the Super Bowl?
The truth is our current networks are destined to struggle whether we anticipate Super Bowl-level demand or not. Today’s networks need a rethink and an upgrade if we are to enable a world where streaming broadcasts become the norm rather than the exception.
On top of this, the coming 5G era creates an opportunity for networks to seamlessly orchestrate wireline and wireless assets to create the desired user experience. Other emerging technologies, such as Network Functions Virtualization (NFV) and the ability to move compute and storage assets to the edge of the network, will bring content closer to viewers and further improve performance.
These types of solutions could have helped during the recent streaming struggles. With virtualized network applications that can be deployed on demand, content delivery networks could be rapidly augmented with “virtual caches” at critical network choke points, providing the power and capacity to keep the stream live.
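To make the idea concrete, here is a purely illustrative sketch of what such an edge cache buys you – a toy LRU cache in Python, not any vendor’s implementation. The class and function names are invented for this example; the point is simply that when thousands of viewers at one choke point request the same live video segment, only the first request needs to traverse the congested path back to the origin.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy LRU cache illustrating what a 'virtual cache' at a network
    choke point does: serve hot segments locally, evict cold ones.
    (Hypothetical example, not a real CDN component.)"""

    def __init__(self, capacity, fetch_from_origin):
        self.capacity = capacity
        self.fetch_from_origin = fetch_from_origin  # callable: segment_id -> bytes
        self.store = OrderedDict()
        self.hits = 0
        self.misses = 0

    def get(self, segment_id):
        if segment_id in self.store:
            self.hits += 1
            self.store.move_to_end(segment_id)  # mark as most recently used
            return self.store[segment_id]
        self.misses += 1
        data = self.fetch_from_origin(segment_id)  # only misses hit the origin
        self.store[segment_id] = data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used segment
        return data

# Simulate 1,000 viewers behind one choke point requesting the same segment:
origin_calls = []
def origin(seg):
    origin_calls.append(seg)
    return b"video-bytes-" + seg.encode()

cache = EdgeCache(capacity=2, fetch_from_origin=origin)
for viewer in range(1000):
    cache.get("ep1-seg42")

print(len(origin_calls))         # 1 — only the first request reached the origin
print(cache.hits, cache.misses)  # 999 1
```

The design choice being illustrated: the cache absorbs the “everyone watching the same thing at once” demand spike locally, which is exactly the traffic pattern a simultaneous premiere or live fight produces.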
To get there, strategic investment in both wireline and wireless infrastructure is crucial if network providers are to succeed in supporting future services. This look toward the future is part of the reason ongoing capital investments in fiber infrastructure are key to success. For example, Verizon has been very public about its belief in the critical role fiber plays in its future services plans. But streaming isn’t the only thing fiber investments will improve – think Internet of Things, smart cities, the Industrial Internet, and more. We need to ensure these new fiber networks are architected the right way so we are ahead of the curve when 5G is ready for deployment.
Network providers want to cater to the consumer – there’s no doubt about it – and the solutions are on the horizon, some nearer than others. It just takes a bit of planning and a reimagining of the shape and operations of our current network architectures to ensure we can watch what we want, whenever and wherever we want.
This article is published as part of the IDG Contributor Network.