Video: Delivering Incredible End-User Experiences Using Containerized Microservices Closer To Endpoints

At my recent EdgeNext Summit event, one of the key talks looked at how developers can deliver interactive experiences to end users while keeping application response times low, leading to higher engagement and happier users. Haseeb Budhani, CEO of Rafay Systems, gave a presentation on how the company’s platform enables developers to deliver these highly engaging user experiences by running containerized microservices closer to endpoints. Video link: https://www.youtube.com/watch?v=iNyazklVTQw&t=17s

Right Now, It’s Not About VR and Autonomous Cars: See Which Edge Applications and Edge Platforms Are Ready To Go Today

With all the hype around edge computing and the edge cloud, everyone is now claiming to have products focused on the “Edge”. This branding and edge washing has created a lot of confusion around “What is the Edge?”, “Where is the Edge?” and “Who owns the Edge?”

Furthermore, the futuristic view around autonomous cars, virtual reality and remote surgery is overused: is this the best we can do for edge use cases? At my recent EdgeNext Summit event, Yves Boudreau, VP of Partnerships and Ecosystem Strategy for Edge Gravity by Ericsson, presented what they have learned over the past 12 months about real edge computing and which applications are likely to exemplify the near-term use of the edge. Video link: https://www.youtube.com/watch?v=Vxm9mpltXv8

Understanding Packet Loss and Its Impact On Mobile Content Performance

The transient nature and pervasiveness of packet loss, jitter and other performance problems occurring over a wireless “last mile” are often poorly understood, and hardly ever quantified. At my recent EdgeNext Summit event, Subbu Varadarajan from Zycada illustrated the impact of packet loss on performance over a wireless connection. Based on the analysis of 100+ billion transactions, he demonstrated the scope of packet loss, explained best practices to measure its impact, and showed a live demo of Zycada’s packet loss mitigation over the wireless last mile.
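One standard back-of-the-envelope way to see why even small amounts of packet loss matter is the Mathis et al. TCP throughput bound, which says steady-state throughput scales with MSS / (RTT · √p). The sketch below plugs in illustrative, assumed wireless last-mile numbers (these are not Zycada's figures):

```python
import math

def tcp_throughput_mbps(mss_bytes: int, rtt_s: float, loss_rate: float) -> float:
    """Mathis et al. steady-state TCP throughput bound:
    rate ≈ (MSS / RTT) * (C / sqrt(p)), with the constant C taken as 1."""
    return mss_bytes * 8 / (rtt_s * math.sqrt(loss_rate)) / 1e6

# Illustrative wireless last-mile numbers (assumed): 1460-byte segments
# and a 60 ms round-trip time. 100x more loss cuts throughput about 10x.
for p in (0.0001, 0.001, 0.01):
    print(f"loss {p:.2%}: ~{tcp_throughput_mbps(1460, 0.060, p):.1f} Mbps")
```

Even at 1% loss, a transfer that could run near 20 Mbps at 0.01% loss is capped below 2 Mbps, which is why transient wireless loss has such an outsized effect on content performance.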

The “Fortnite Effect” On Networks and How ISPs Are Working To Improve the eSports Experience

Online multiplayer gaming is no longer the preserve of a small number of avid fanatics. It’s gone mainstream and is attracting a mass audience, as manifested by the “Fortnite effect”. At my EdgeNext Summit in October, Haste, a network service that improves network performance and user experience for gaming, discussed how Fortnite and other online multiplayer gaming platforms have become leading edge applications. In this video from the show, Adam Toll, Founder and Board Director at Haste, presents how Haste’s technology, Ericsson’s Edge Gravity platform, and ISPs are working together to deliver a superior gaming experience to players.

Edge Computing Helps Scale Low-Latency Live Streaming, But Challenges Remain

At my recent EdgeNext Summit, there was a lot of discussion about how edge computing can be used to deliver next generation OTT viewing experiences with low latency. Traditional HTTP streaming formats such as HLS and MPEG-DASH have gained wide acceptance, making it relatively easy to reach viewers on almost any viewing device at global scale. However, these formats have significant limitations when used in traditional live streaming workflows, including slow startup times, high stream latency, and latency drift over time. By utilizing edge compute resources, new methods and formats are emerging to address these limitations and improve the experience for viewers and content distributors.
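To see where the latency in traditional segmented HTTP streaming comes from, it helps to simply add up the stages of the workflow. The numbers below are illustrative assumptions based on common defaults (players typically buffer a few segments before starting playback), not measurements from the summit:

```python
# Back-of-the-envelope glass-to-glass latency for segmented HTTP streaming.
# All values are illustrative assumptions, not measurements.
segment_duration_s = 6     # a common HLS segment length
buffered_segments = 3      # typical player startup buffer
encode_package_s = 4       # encoder + packager pipeline (assumed)
origin_to_edge_s = 2       # CDN propagation (assumed)

latency_s = (encode_package_s + origin_to_edge_s
             + segment_duration_s * buffered_segments)
print(f"~{latency_s} s behind live")  # ~24 s: why new low-latency formats emerged
```

With these assumptions a viewer sits roughly 24 seconds behind the live event, and because the buffer is denominated in whole segments, that gap tends to drift rather than stay fixed.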

Live streaming has some inherent challenges that impact infrastructure requirements differently from traditional VOD delivery. Demand for live events can grow quite quickly, requiring instant scalability. For example, the recent online trivia craze has seen demand for online streams grow from zero to over one million viewers in just a matter of minutes, creating massive challenges in ramping to meet this instant demand. Unlike traditional on-demand workflows where popular content can be pre-cached in multiple locations to reduce bottlenecks, live content must be ingested, packaged, and pushed to edge locations as it is created. And with growing viewer frustration around poor user experiences when it comes to streaming live events, caching and buffering of live content to gain scalability and ensure reliable playback becomes a greater challenge.  

Traditional cloud service providers offer hyper-scale data centers with computing resources, but they are in relatively few locations and often not located in densely populated areas where viewers may reside. Sending source video streams from these centralized cloud data centers to viewers in distant locations requires extensive middle-mile capacity and can create peering bottlenecks as viewership grows. To viewers, this often means slow or inconsistent startup of video streams, video quality degradation as more viewers watch the event, or even the inability to join popular live streams. These quality, reliability and scalability challenges are holding consumers back from seeing live streaming as a true replacement for cable TV.
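The physics behind the middle-mile problem is simple: light in fiber travels at roughly two-thirds the speed of light in vacuum, about 200 km per millisecond, so distance sets a hard floor on latency before any queuing or peering congestion is counted. A quick sketch with illustrative, assumed distances:

```python
# Light in fiber covers about 200 km per millisecond, so distance alone
# puts a hard floor on latency, before any queuing or routing delay.
SPEED_IN_FIBER_KM_PER_MS = 200

def one_way_delay_ms(distance_km: float) -> float:
    """Minimum one-way propagation delay, ignoring queuing and routing hops."""
    return distance_km / SPEED_IN_FIBER_KM_PER_MS

# Illustrative (assumed) distances: a far-away cloud region vs a metro edge.
print(f"cloud region 4,000 km away: {one_way_delay_ms(4000):.0f} ms each way")
print(f"metro edge 50 km away: {one_way_delay_ms(50):.2f} ms each way")
```

A 4,000 km path costs at least 20 ms each way on propagation alone (and real paths are longer and multi-hop), while a metro-local edge server is a fraction of a millisecond away.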

Edge computing can help alleviate these bottlenecks by sending source streams to edge servers that perform stream splitting and replication. Locating these edge servers in more locations across the globe makes it possible to scale up to meet the demands of even the largest live events. Scaling at the edge avoids traditional bottlenecks and reduces transit costs by serving streams from locations closer to viewers. And distributing this capacity in the metropolitan areas where viewers are located ensures lower latency and higher QoE by reducing the distance the packaged stream must travel, while helping to eliminate peering bottlenecks. The result is greater scalability to provide higher-quality viewing experiences.
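The stream splitting and replication described above can be sketched as a simple fan-out: one ingest feed is copied to every locally connected viewer, so the origin delivers each segment only once per edge location regardless of audience size. This is a minimal illustrative model, not any vendor's implementation:

```python
import queue
import threading

class EdgeRelay:
    """Illustrative sketch of edge stream splitting: one ingest stream is
    replicated to every connected viewer, so the origin delivers each
    segment only once per edge location regardless of audience size."""

    def __init__(self):
        self.viewers = []
        self.lock = threading.Lock()

    def subscribe(self):
        """A new viewer gets its own queue of segments."""
        q = queue.Queue()
        with self.lock:
            self.viewers.append(q)
        return q

    def ingest(self, segment):
        """Replicate an incoming segment to every local viewer."""
        with self.lock:
            for q in self.viewers:
                q.put(segment)

relay = EdgeRelay()
a, b = relay.subscribe(), relay.subscribe()
relay.ingest("segment-0")
print(a.get(), b.get())  # both viewers receive the same segment
```

The key property is that `ingest` is called once per segment no matter how many viewers have subscribed, which is exactly why the middle mile between origin and edge stops being the scaling bottleneck.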

One of the live streaming technologies that many think is ideally suited for edge computing is WebRTC. Source video feeds can be ingested at local ingest locations using traditional low-latency streaming formats such as RTMP. The source video can then be pushed to edge servers around the globe, where edge compute instances running close to viewers convert the RTMP feed to WebRTC. Unlike traditional HTTP live streaming formats, which run over TCP, WebRTC streams media over UDP. Because UDP does not hold up the stream waiting for retransmission of lost packets the way TCP does, it can deliver consistently low latency along with high sustained bandwidth and picture quality to viewers.
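At the transport layer, the UDP behavior described above comes down to independent datagrams with no retransmission or ordering guarantees (a real WebRTC stack layers ICE, DTLS and SRTP on top of this). A minimal sketch of the underlying datagram semantics using plain sockets:

```python
import socket

# Minimal sketch of UDP datagram semantics (the transport WebRTC builds on).
# Each send is an independent packet: no retransmission, no ordering
# guarantee, so a lost packet never stalls the rest of the stream.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))      # let the OS pick a free port
recv_sock.settimeout(2)               # don't block forever if a packet drops
addr = recv_sock.getsockname()

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for i in range(3):
    # A real WebRTC stack would be sending SRTP-encrypted media frames here.
    send_sock.sendto(f"frame-{i}".encode(), addr)

frames = [recv_sock.recvfrom(1500)[0].decode() for _ in range(3)]
print(frames)  # on loopback these typically all arrive, in order
```

On a lossy wireless path some of those datagrams would simply never arrive, and the receiver would play on without them, trading occasional artifacts for consistently low latency.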

WebRTC also promises to open up new interactive workflows by keeping all viewers of a live stream in sync with one another. With the growing popularity of live online gaming such as Fortnite and the legalization of sports betting in the U.S., the ability for content distributors to deliver synchronized interactive online live streaming will help drive the increased consumption of live content and the need for better edge computing functionality and scale.

One of the first CDN companies to implement this approach of using edge compute for scalable live video streaming is Limelight Networks. Their recently announced partnership with Ericsson, leveraging its UDN Edge Cloud Platform to provide edge computing and delivery within carrier networks, should provide even greater scalability and capacity for live video streaming at the edge. This will become increasingly important as 5G begins to be rolled out, as I noted in my recent blog Here’s Why Today’s Video Infrastructure Is Not Ready For 5G, And How Edge Technologies Can Help.

All CDN vendors are talking about ultra-low latency video delivery, but not a single one will tell you it’s easy to do at scale. It takes a lot of resources to deploy and adds overhead cost to operating their service. Aside from the technical challenges in scaling, CDNs also have to target specific customers who see the value and are willing to pay more for the improved user experience. In a recent survey I did of over 100 broadcast and media customers, 80% of them said they wanted ultra-low latency functionality, but were not willing to pay more for it. Many expect the functionality to be part of a standard CDN delivery service. 

By utilizing edge computing capacity within carrier networks as well as in globally distributed data centers to scale and distribute live video streams, content distributors can reduce costs while ensuring the highest-quality live viewing experiences on both mobile and fixed devices. Edge infrastructure will help make this a reality; it’s going to take time for the business model to be figured out, but it will happen. I’m excited to see what new applications will be enabled in the live video world from the edge and with low latency. They are coming.