Video on Demand (VOD) platforms usually support a wide range of devices. Despite the attempts to unify streaming protocols, DRMs, and other parts of the tech stack, it’s almost impossible to have only one playlist/manifest that fits everyone.
The problem actually goes beyond supporting different streaming protocols or DRM types: the list of variables is long, and also includes codecs, device performance, and levels of content protection. At Showmax, we tackle this challenge with a component called Playback API.
TL;DR: Playback API is a service that provides a content URL based on the information provided by a device or application. Content is then delivered to the device/application in a format that can be played back in the best available quality.
In the early days of Showmax, the main purpose of the Playback API was to select a playlist/manifest from formats like DASH, SmoothStreaming, or HLS, with the right DRM system. Developers (read: the Frontend teams) know which streaming protocol and DRM each device or application supports, so the application simply calls the API with this information and receives the content URL. For example, Android/AndroidTV always get a DASH manifest with Widevine DRM, and iOS/tvOS get an HLS playlist with FairPlay. Web browsers and smart TVs, however, are not so simple, and we need to use some form of detection.
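The early selection logic can be pictured as a small lookup table plus a detection fallback for browsers and smart TVs. This is an illustrative sketch, not Showmax's actual code; the platform names, the mapping, and the user-agent check are all assumptions:

```python
# Hypothetical sketch of the Playback API's protocol/DRM selection.
# The mapping and detection logic are illustrative assumptions.

PLATFORM_DEFAULTS = {
    "android": ("dash", "widevine"),
    "androidtv": ("dash", "widevine"),
    "ios": ("hls", "fairplay"),
    "tvos": ("hls", "fairplay"),
}

def select_stream(platform: str, user_agent: str = "") -> tuple[str, str]:
    """Return (streaming_protocol, drm) for a request.

    Native platforms have fixed defaults; browsers and smart TVs
    need some form of detection (greatly simplified here).
    """
    if platform in PLATFORM_DEFAULTS:
        return PLATFORM_DEFAULTS[platform]
    # Safari is the only major browser with FairPlay support.
    if "Safari" in user_agent and "Chrome" not in user_agent:
        return ("hls", "fairplay")
    return ("dash", "widevine")
```

In a real deployment the detection branch would be far more involved (smart TV model databases, EME probing, and so on), but the shape of the decision stays the same.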
It’s worth noting that all Showmax APIs serving content-related data use the Showmax Content Management System (CMS) as the source of truth. In addition to content management and scheduling by content operators, the CMS is also used to configure content protection and to decide which CDN should be used.
Simple diagram showing the logic of URL selection.
The content URL is chosen based on the information provided by the caller and on platform-related restrictions, both of which impact the URL that is returned. One example is the Streaming Profile Restriction: a configuration matrix in our CMS with dimensions like device, streaming protocol, DRM, country, and the maximum resolution allowed in the playlist. We have several packages (packager configurations that generate playlists) with different sets of resolutions.
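Conceptually, a Streaming Profile Restriction lookup is a match against a small matrix, with a permissive default when no row applies. The field names and values below are made up for illustration:

```python
# Illustrative model of a Streaming Profile Restriction matrix.
# Rows and the default ceiling are assumptions, not real Showmax config.

RESTRICTIONS = [
    # (device, protocol, drm, country, max_resolution)
    ("smarttv", "dash", "playready", "ZA", 1080),
    ("android", "dash", "widevine", "NG", 720),
]

def max_resolution(device: str, protocol: str, drm: str,
                   country: str, default: int = 2160) -> int:
    """Return the maximum resolution allowed in the playlist for this
    combination of dimensions, or `default` when nothing matches."""
    for d, p, m, c, res in RESTRICTIONS:
        if (d, p, m, c) == (device, protocol, drm, country):
            return res
    return default
```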
This solution was introduced when we had different content contracts for the European and African markets. Even though we now provide our services mainly in Africa, the feature still has its uses. We used it to limit quality during lockdowns to relieve overloaded ISPs and their networks, and to drop higher profiles from playlists as part of our bandwidth-capping option, which saves data in applications. Nowadays, this is mostly done directly by the players themselves.
When we added next-gen codecs into the pipeline, we introduced a feature called “capabilities.” During testing, we found that providing a playlist with all codecs and letting players choose what they can play works in theory, but not in practice. Capabilities are extra pieces of information detected by the device and provided to the API; for example, the device can ask for a preferred codec. Selecting a preferred codec has two steps: first, detect the codecs the device supports; then, if the most suitable codec is one of the next-gen codecs, test it in practice, because the device may lie (which is especially true for cheaper, low-end Androids). The most suitable codec should save bandwidth, provide better picture quality, and be efficient with battery usage. Once a preferred codec is chosen and provided, the Playback API adjusts the content URL to reflect the device’s capabilities.
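The two-step selection might look like the following on the device side. Everything here, including the codec preference order and the probe, is a hypothetical sketch of the process described above:

```python
# Hypothetical device-side sketch of two-step preferred-codec selection.
# Codec names and the preference order are illustrative assumptions.

NEXT_GEN = {"av1", "hevc"}
PREFERENCE = ["av1", "hevc", "h264"]  # best bandwidth/quality first

def probe_playback(codec: str) -> bool:
    """Placeholder for a real short test playback. Step 2 exists
    because low-end devices may claim support they don't have."""
    return True  # assume the probe passes in this sketch

def preferred_codec(reported: set[str]) -> str:
    """Step 1: walk the preference list against reported support.
    Step 2: next-gen codecs must also pass a practical test."""
    for codec in PREFERENCE:
        if codec not in reported:
            continue
        if codec in NEXT_GEN and not probe_playback(codec):
            continue
        return codec
    return "h264"  # safe fallback every device can handle
```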
In the future, we may introduce a 4K capability to serve 4K resolution only to devices with enough performance to decode it. It may sound defensive, but it’s really useful for covering the many edge cases out there.
CDN application in CMS
We always try to find the optimal path for a specific customer at the time of the request, so that content is served in the highest possible quality. Doing this right is something of a holy grail for CDN engineers.
Our CMS allows us to configure the preferred third-party CDN (or CDNs) edges for specific countries or ISPs, which helps us pick the best CDN for the customer. It’s usually the closest Point of Presence (PoP), which, in practice, means we use an in-house CDN for South Africa and nearby countries. Countries farther from our PoPs are served by third-party CDNs with a local PoP. Backup CDNs can also be configured, as described in this blog post: Keep the stream live, despite any infrastructure failure.
Another use-case for preferring an ISP-specific CDN is a special arrangement with an ISP that requires a specific network path for delivering content.
A special use-case is the configuration for so-called “hot assets”: a small subset of content in high demand. Increased demand requires us to involve extra CDN edges to serve that content (most recently, we experienced this with Game of Thrones and Chernobyl). Hot assets are either detected automatically, based on thresholds on the amount of data served per time window, or come from a manually curated list of assets.
When entering a new territory, or just adding a new “cold” CDN into the stack, we have the option to prewarm that CDN by slowly increasing the amount of traffic routed there. This mitigates possible buffering issues caused by large numbers of cache MISSes, and protects our origins from being DDoSed by a CDN.
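Prewarming can be thought of as weighted routing where the cold CDN starts with a tiny share of traffic that is ramped up as its caches fill. A minimal sketch, assuming a simple proportional pick; the weights are invented:

```python
# Sketch of weighted CDN routing used to prewarm a "cold" CDN.
# The weight values are illustrative assumptions.

import random

def route(weights: dict[str, float], rng=random) -> str:
    """Pick a CDN with probability proportional to its weight."""
    cdns = list(weights)
    return rng.choices(cdns, weights=[weights[c] for c in cdns])[0]

# Start with a trickle to the new CDN; increase the weight over time
# as MISS rates drop and the origin load stays healthy.
prewarm_weights = {"warm-cdn": 0.99, "cold-cdn": 0.01}
```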
Finally, all of these configuration options can be time-based: traffic is spread across more CDNs during weekends and evenings, and shifted to the in-house CDN during off-peak hours for cost efficiency.
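A time-based schedule can then drive the routing weights. The peak window and the weight values in this sketch are invented for illustration:

```python
# Sketch of time-based CDN weighting: spread traffic across more CDNs
# during evening/weekend peaks, favour the in-house CDN off-peak.
# The peak definition and weights are assumptions.

from datetime import datetime

def cdn_weights(now: datetime) -> dict[str, float]:
    """Return routing weights for the current time."""
    is_peak = now.weekday() >= 5 or 18 <= now.hour < 23  # assumed window
    if is_peak:
        return {"inhouse": 0.4, "thirdparty-a": 0.3, "thirdparty-b": 0.3}
    return {"inhouse": 0.9, "thirdparty-a": 0.1}
```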
The process of selecting the right content URL is controlled on both sides, backend/CMS and frontend/applications, with the final URL selection done in the Playback API. A hybrid approach like this is beneficial because it leverages the advantages of both options while mitigating their respective disadvantages. In terms of flexibility, backend changes can be released faster than changes in frontend applications; on the other hand, curating a list of devices and their capabilities on the backend would be very tedious.
That’s all for today. In future installments, we will walk you through our incorporation of consistent hashing into CDN edge selection, as well as some of the automation related to the CDN selection process, a.k.a. our journey to multi-CDN.