As we incorporate video streaming into our daily routines, whether it's rewatching Netflix favourites or discovering new content on YouTube, this evident convenience comes with a hidden environmental cost. The fast-developing streaming market, characterised by a steady supply of new services, is expected to expand significantly; according to research, the industry will be worth $330 billion by 2030.
The energy usage linked with streaming leaves an alarming carbon footprint. This prompts a closer look at the idea known as "green streaming."
The need to balance our digital pleasures with our environmental obligations is obvious. It calls for a community-wide effort to adopt ecologically beneficial behaviours and encourages platforms to favour renewable energy. This helps us limit our environmental impact while ensuring that the convenience of on-demand TV does not endanger the planet's health.
Latency and Live Streaming
Your favourite sportsperson reaches a new milestone on live television and you watch it in all its glory, celebrating with them, versus learning about that same milestone only 42 seconds later.
The difference between the two is what is termed latency, and the example above shows how latency affects your content viewing experience.
Latency goes hand in hand with live streaming: the higher the latency, the less 'live' the event feels. Latency is always measured in seconds, but there is no universal benchmark for what counts as low or high. The rule of thumb is that anything below the broadcast-industry average is defined as low latency, usually clocked at under five seconds.
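As a rough illustration (the timestamps and five-second threshold here are the rule of thumb from the text, not a standard), end-to-end or "glass-to-glass" latency is simply the gap between when a frame is captured at the source and when it reaches the viewer's screen:

```python
# Sketch: glass-to-glass latency is the gap between capture and display.
# Both timestamps are assumed to come from a single shared clock.
def glass_to_glass_latency(capture_ts: float, display_ts: float) -> float:
    """Return latency in seconds between capture and display."""
    return display_ts - capture_ts

def is_low_latency(latency_s: float, threshold_s: float = 5.0) -> bool:
    """'Low latency' per the rule of thumb: under roughly five seconds."""
    return latency_s < threshold_s

latency = glass_to_glass_latency(capture_ts=100.0, display_ts=103.5)
print(latency)                   # 3.5
print(is_low_latency(latency))   # True
```

In practice the two clocks are rarely the same device, which is why real measurements embed a timecode in the video itself and compare it against wall-clock time at the player.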
But latency comes with a trade-off: time is offset against quality. The closer you get to real time, the lower your chances of experiencing the event in 4K resolution, and audio sync issues often follow, taking even more away from the viewing experience. This trade-off is acceptable in some situations, such as sporting events where most of the story is conveyed visually, but unacceptable for two-way live conversation streams, online gaming and live auctions.
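One way to see where this trade-off comes from is the common rule of thumb for segmented HTTP streaming (an approximation, not a spec guarantee): players typically buffer about three segments before playback starts, so latency is roughly the segment duration multiplied by the buffer depth.

```python
# Sketch of the latency/quality trade-off in segmented HTTP streaming.
# Assumption (rule of thumb, not a specification): players buffer about
# three segments, so latency ~= buffered_segments * segment_duration.
def approx_segment_latency(segment_duration_s: float,
                           buffered_segments: int = 3) -> float:
    """Approximate startup latency in seconds for a segmented stream."""
    return segment_duration_s * buffered_segments

# Shorter segments push latency down, but each segment adds request and
# container overhead and gives the encoder less room to work, which is
# where the quality loss described above creeps in.
print(approx_segment_latency(6.0))   # 18.0 -- classic six-second segments
print(approx_segment_latency(2.0))   # 6.0  -- shorter, lower-latency segments
```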
If latency is such a critical part of live content viewing, why is it not standardised across platforms and geographies? This is because it depends on a number of factors such as the user's bandwidth, internet connection type, video encoding, video format and distance. While latency can't be eliminated, efforts are constantly being made to control it. Setting internet connections and speeds aside, the two factors that most affect latency are encoder settings and streaming protocols. In some cases, choosing the streaming protocol is up to the CDN, but in most cases the platform used for streaming comes with pre-set streaming protocols. The key players in this market include WebRTC, open-sourced by Google (used by Google Hangouts), and HLS and DASH (used by Netflix and Hulu), amongst others.
Thus, when it comes to choosing a streaming protocol, it’s best to select one that does most justice to the type of content. A multitude of factors such as picking a standard partner or a bespoke solution, scalability across geographies, cost, adaptive bitrate streaming, quality limitations, monetisation, etc. play a role in this selection.
If you're a streaming platform trying to beat the five-to-six-second range, a standards-based HTTP technology is likely to be your smartest choice, since it will come closest to supporting the capabilities you're presently using, such as content security, subtitles and adaptive bitrate. If your content requires even lower latency, you'll most likely need a WebRTC-based or WebSockets-based solution, or a custom-built proprietary HTTP technology.
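The selection logic above can be sketched as a simple decision function. The latency thresholds here are illustrative assumptions drawn from the ranges mentioned in the text, not standardised cut-offs:

```python
# Hypothetical decision sketch: pick a protocol family from a target
# latency budget. Thresholds are illustrative, not standardised.
def pick_protocol(target_latency_s: float, needs_two_way: bool = False) -> str:
    if needs_two_way or target_latency_s < 1.0:
        # Conversations, gaming, auctions: sub-second and interactive.
        return "WebRTC / WebSockets"
    if target_latency_s < 6.0:
        # Standards-based HTTP that beats the 5-6 second range.
        return "Low-latency HLS/DASH"
    # Broadest device support; latency is a secondary concern.
    return "Standard HLS/DASH"

print(pick_protocol(0.5))                      # WebRTC / WebSockets
print(pick_protocol(4.0))                      # Low-latency HLS/DASH
print(pick_protocol(20.0))                     # Standard HLS/DASH
print(pick_protocol(10.0, needs_two_way=True)) # WebRTC / WebSockets
```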
Low latency is what puts the live in live streaming, but how live is live really? This is a question that’s best answered by analysing and optimising the variables that affect the CDN, the audience, the content objective and finally, the content itself.