- Alexander Schaefer
OTT as the Television of the Future
Over the last couple of years, a dramatic paradigm shift has shaken the TV and broadcasting industry. The term “OTT” (short for Over-The-Top) broadcasting quickly became the leading buzzword in the market, promising to bring the next generation of TV to the world. Instead of using traditional broadcasting methods like cable networks or OTA, modern platforms make use of the incredibly fast-growing internet industry and its networks. However, this new approach comes with some very important requirements. One of them is quality: OTT must work just as smoothly and reliably as regular TV. This article gives you an overview of the most critical aspects of OTT broadcasting and of current approaches to overcome these challenges and make OTT – as promised – the future of television!
VOD is easy - The main challenge will be live content
When asking anyone to name some of the top OTT companies today, the answer will most likely contain names like Netflix, Amazon, Hulu, YouTube, and Roku. While it is true that, for example, Netflix and YouTube combined make up an astonishing 50% of the entire downstream traffic in the US, only 3% of the generated traffic is live. Right now, roughly 95% of all video traffic is Video-On-Demand (VOD). This will, however, change over the coming years: by 2021, live content will make up 13% of all media traffic. That is a growth rate of over 430% and makes live video the fastest-growing segment of internet traffic.
What is interesting to see is that even today, with only a 3% live video traffic share, there are huge problems that broadcasters are struggling with. “Shitstorms” during big live sports events due to broken connections and bad video quality are no longer rarities these days, and almost every one of us knows the infamous “spinning circle” on the screen while the player desperately tries to fetch the next live video segment. Michael Burns, a journalist and editor for the IBC Daily, wrote a very informative article about the 2018 World Cup and why many broadcasters actually failed to deliver it to their subscribers.
Back to the problem. Video-On-Demand services are already starting to offer 4K content – unimaginable for real-time streaming cases. One of the main reasons why VOD works so much better than today’s live video streaming is the huge, well-developed infrastructure of content delivery networks with their globally distributed caching servers. Since a VOD stream is basically nothing more than a large amount of old-fashioned “static content”, traditional CDNs can make perfect use of their existing infrastructure – as long as they can keep up with the scale. This is far from true for live streaming. While approaches like HLS and DASH are still based on HTTP and can technically be delivered over traditional CDNs, they are already exceeding their limits regarding infrastructure, reliability, and quality.
Scalability
When it comes to taking live (or linear) OTT to the next level, there are three main issues that content providers and CDNs will have to tackle. In order to prepare for the coming storm of users who will expect an OTT live video stream to be as brilliant, fast, and stable as typical OTA or cable TV, we will need to improve scalability, reliability, and quality of service. Our CTO Christopher Probst, a long-time expert in the field of network architectures, CDNs, and video streaming, summed up the technical details for these three pillars of live OTT:
Today's CDN infrastructure and the idea of linear OTT broadcasting on a global scale do not get along very well. Having millions of users at the same time turns the worst case in VOD into the standard case in live. Think of every single one of Netflix's 118 million users requesting a stream at exactly the same time. There is no way that a physical server infrastructure can handle such situations efficiently.
Reliability
One very interesting KPI when it comes to live streaming is the "fetch ratio" (time to fetch a segment divided by the length of that segment), which is directly connected to the occurrence of "rebuffering" events. Since CDN servers can be heavily overloaded for short periods during live events, the available bandwidth per user varies drastically, causing moments in which loading a segment takes longer than the actual video content within that segment lasts. This causes the player to stop and wait for the segment to be fully loaded. For the user, this means being interrupted and waiting until the CDN has done its job. As users are connected to a single edge at any given time, the reliability of the video experience depends on luck and the provider's ability to manage multiple CDNs at once.
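The relation between fetch ratio and rebuffering can be sketched in a few lines (a minimal illustration; the function names and numbers are our own, not a standard API):

```python
def fetch_ratio(fetch_time_s: float, segment_duration_s: float) -> float:
    """Fetch ratio = time to fetch a segment / playable length of the segment.
    A ratio above 1.0 means the player downloads slower than it plays back,
    so the buffer drains and a rebuffering event eventually occurs."""
    return fetch_time_s / segment_duration_s

def stalls(buffer_s: float, fetch_time_s: float) -> bool:
    # Playback stalls if the next segment takes longer to download
    # than the already-buffered content lasts.
    return fetch_time_s > buffer_s

# A 6-second segment fetched in 9 seconds gives a fetch ratio of 1.5
assert fetch_ratio(9.0, 6.0) == 1.5
# With only 4 seconds of content buffered, that download stalls playback
assert stalls(4.0, 9.0)
```

As long as the fetch ratio stays below 1.0, the buffer grows; the moment an overloaded edge pushes it above 1.0, the user is on a countdown to the spinning circle.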
Quality of Service
Since modern video players are capable of dynamically switching between multiple streaming qualities (adaptive bitrate, or ABR), rebuffering events are usually announced in advance by a drop in video quality. Although this mechanism prevents the stream from stopping completely, it is arguably even more annoying to watch a football game in 240p, knowing that it will break down anyway within the next few seconds.
First-Generation P2P Video Delivery Networks And What Must Change
As we have seen, there are several problems when it comes to linear OTT broadcasting. To overcome them, people came up with the idea of combining P2P networks with traditional content delivery networks. Such a “hybrid” approach creates an auto-scaling network that dynamically grows with the number of users watching a live stream. The first generation of hybrid P2P approaches still had Video-On-Demand as its main use case. Since VOD is all about static content, first-gen P2P networks were designed on top of the BitTorrent protocol. Although it does not have the best reputation, BitTorrent turned out to work quite well for VOD content. This is no wonder, since BitTorrent was always designed to be a file-sharing protocol. Back in 2001, when Bram Cohen started development, nobody had live OTT broadcasting in mind (actually, not even OTT Video-On-Demand).
Traditional, BitTorrent-like P2P Video Delivery Networks
Yet, there is an old saying in the IT community: “never change a running system”. It is a well-known principle in software development and comes from the fact that introducing a new system in order to solve a problem is very likely to produce new problems at the same time. A hybrid P2P/CDN solution is no different.
Decentralization as a disadvantage
The BitTorrent protocol was specifically designed to be decentralized. A user who wants to join the network eventually has to do a lot of management work in order to communicate with its peers. Also, every user can only “see” his direct neighbors, which causes a lot of trouble if peers suddenly decide to leave the network (also called “churn”). Churn is one of the biggest threats to decentralized P2P networks, and dealing with it on a per-user basis causes a lot of data overhead and network delay.
Missing content security
When it comes to content security, first-gen P2P networks have proven to have significant problems with so-called “pirate peers” who proactively send video ads or manipulated content to other users. A single peer simply cannot verify whether content it received from another peer is valid. Even the ETag checksums that a CDN can provide are not sufficient, since all P2P networks work on a sub-chunk level.
Leaving out the internet service providers
Another big problem of first-gen P2P networks is that they leave internet service providers out of the equation. CDNs (and, ultimately, content providers) heavily rely on good relationships with their partnering ISPs. Even the best hybrid P2P/CDN combination will be useless if an underlying ISP collapses under the sheer load of peers uploading data without being managed according to the current network state. Handling critical network routes and congestion correctly is crucial in order to actually improve your situation rather than break down your live stream completely.
What Do P2P Networks Have To Change?
The answer to all three problems caused by first-gen P2P networks is what is called a “server-side-managed P2P network”. The key idea is to replace the decentralized, peer-only approach with a P2P network that is managed by a centralized backend and connected to your content.
Using a setup like this unlocks some pretty cool opportunities:
Use push instead of pull. First-gen networks use P2P servers that act as an “IP address dispenser” for new peers. A server-side-managed network replaces those dispensers with so-called long-lasting trackers. Such a tracker keeps a connection to every peer and continuously sends instructions on which content to fetch from a CDN or push to another peer. Having both peer and content information at hand, a long-lasting tracker can easily instruct peers to distribute content among each other in the mathematically optimal way. Thus, WebRTC connections are used only for data exchange instead of being overloaded with network management traffic between peers.
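The decision logic of such a tracker can be sketched as follows. This is a deliberately simplified, hypothetical model (the data shapes and the "pick the strongest uploader" policy are our own illustration, not Strive's actual protocol): one peer is told to fetch a segment from the CDN, and everyone else is told to expect a push.

```python
def plan_segment(peers: list[dict], segment_id: str) -> list[dict]:
    """Return per-peer instructions for distributing one live segment.
    The tracker holds state for every connected peer, so it can pick
    the peer with the most spare upload capacity as the CDN fetcher."""
    fetcher = max(peers, key=lambda p: p["upload_kbps"])
    plan = [{"peer": fetcher["id"], "action": "fetch_cdn",
             "segment": segment_id}]
    for p in peers:
        if p["id"] != fetcher["id"]:
            # Everyone else receives the segment via P2P push, so their
            # WebRTC connections carry only data, no discovery traffic.
            plan.append({"peer": p["id"], "action": "recv_push",
                         "from": fetcher["id"], "segment": segment_id})
    return plan

peers = [{"id": "a", "upload_kbps": 800},
         {"id": "b", "upload_kbps": 5000},
         {"id": "c", "upload_kbps": 1200}]
plan = plan_segment(peers, "seg-42")
assert plan[0]["peer"] == "b" and plan[0]["action"] == "fetch_cdn"
```

A real tracker would recompute such a plan continuously as peers join, leave, or change capacity, but the pull-to-push inversion is the essential point: peers never have to discover each other.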
Secure your content. With a backend that is connected to the actual video content (even if it is encrypted), you can provide sub-chunk checksums for your video segments. This completely secures your content against pirate peers and video content manipulation. It even allows you to report and remove pirate peers from your P2P network.
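Sub-chunk verification is conceptually simple. In this sketch (chunk size and hash choice are assumptions for illustration), the backend publishes a SHA-256 digest per sub-chunk, and a peer verifies every chunk it receives over P2P before handing it to the player:

```python
import hashlib

CHUNK = 16 * 1024  # sub-chunk size; an illustrative value

def chunk_digests(segment: bytes) -> list[str]:
    """Backend side: compute one digest per sub-chunk of a segment.
    Works on the stored bytes even if the content is encrypted."""
    return [hashlib.sha256(segment[i:i + CHUNK]).hexdigest()
            for i in range(0, len(segment), CHUNK)]

def verify_chunk(chunk: bytes, index: int, digests: list[str]) -> bool:
    # Peer side: a mismatch exposes manipulated data (e.g. injected ads)
    # immediately, and the sender can be reported to the tracker.
    return hashlib.sha256(chunk).hexdigest() == digests[index]

segment = b"\x00" * (40 * 1024)      # dummy 40 KiB segment
digests = chunk_digests(segment)     # 3 sub-chunks -> 3 digests
assert verify_chunk(segment[:CHUNK], 0, digests)
assert not verify_chunk(b"ad-injected-bytes", 0, digests)
```

Because verification happens per sub-chunk rather than per segment, a pirate peer is caught after at most one bad chunk instead of one bad segment.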
Byte-precise traffic regulation. With a network manager in place, you are in control of every byte going through your P2P network. This allows you to lower, boost, shift, and focus P2P traffic globally in real time. By including metrics like a client’s internet service provider, the network manager can identify areas and networks that are P2P-friendly and focus on them while offloading more congested networks at the same time.
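As a rough sketch of what ISP-aware regulation could look like (the policy table, ASNs, and numbers below are entirely made up for illustration), the network manager assigns each ISP a target P2P share and an upload cap, and splits every peer's traffic budget accordingly:

```python
# asn -> (p2p_share, max_upload_kbps); illustrative policy values
POLICY = {
    "AS1111": (0.9, 4000),   # P2P-friendly network: push most traffic P2P
    "AS2222": (0.2, 500),    # congested network: offload to the CDN
}
DEFAULT = (0.5, 1500)

def peer_budget(asn: str, segment_kbit: float) -> dict:
    """Split one segment's traffic between P2P and CDN for a peer,
    based on how its ISP's network is currently doing."""
    share, cap = POLICY.get(asn, DEFAULT)
    return {"p2p_kbit": segment_kbit * share,
            "cdn_kbit": segment_kbit * (1 - share),
            "upload_cap_kbps": cap}

b = peer_budget("AS2222", 10000)
assert b["cdn_kbit"] == 8000  # 80% of this peer's traffic goes via CDN
```

In production such a policy would be updated continuously from live network metrics rather than hard-coded, but the principle is the same: every byte is steered, per network, in real time.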
The OTT delivery network of the future and what we can do with it today
With a development history of over seven years, StriveCDN has implemented the very first server-side-managed approach to build a second-generation P2P network. We focused strictly on live video delivery and a fully server-side-managed P2P network. In May 2018, we presented a case study at the CDN Summit.
The case study shows the first results with this new approach in the OTT broadcasting market. Here are the most important insights and results:
84% peak savings
With more than 60,000 concurrent viewers, Strive’s server-side-managed P2P video delivery network took care of 84% of the entire video traffic, offloading it from the broadcaster’s CDN.
2% fallbacks to the CDN
Only 2% of all video segments had to be re-fetched from the CDN after not being delivered over P2P. The fallback mechanism prevents additional delays for users; still, 98% of the time a fallback was not necessary.
36% of users with higher bitrate
Tracking the distribution of viewers across three different video quality levels, Strive increased the average bitrate per viewer by about 36%.
Where To Go From Here?
For the rest of this year, we are strongly focused on supporting native streaming environments such as iOS, Android, and Smart TVs. We are also working hard on providing more insights regarding quality metrics and real-time analysis.
I hope that this article gave you a deep insight into our approach to P2P video delivery and the problems it solves. As a young company, we are always interested in feedback, so please feel free to reach out to us via live chat or email.
Thank you very much for reading, have a great day and don’t forget to subscribe to our newsletter to stay updated with new content and information about P2P video delivery and StriveCDN!
About Strive Technologies
Strive is a leading technology provider for OTT broadcasters and live streaming companies. Our technology “Flink” is used by broadcasting companies around the world to improve video QoE and cost efficiency. Based in Germany, Strive developed Flink over a period of seven years, constantly improving and adapting the technology to quickly shifting market requirements. Today, Flink connects over 150,000 users worldwide on a daily basis, saving our customers over 80% of CDN traffic with our unique server-side-managed peer-to-peer network.