StackPath Exits CDN Market, Akamai Acquires Top Customers, No Major Impact on the Industry

StackPath, which has thousands of CDN customers, mostly in the SMB market, has notified customers of their intent to exit the content delivery business and cease CDN operations on November 22, 2023. As part of their exit from the space, Akamai has acquired “approximately” 100 enterprise CDN contracts from StackPath.

Akamai says these select customers each represent at least $50,000 in ARR, with the potential to grow to a minimum of $100,000 in ARR by cross-selling security and compute services. Akamai says most of the customers are net-new for the company, with very little overlap with its current customer base. Akamai anticipates this transaction will add approximately $20 million in revenue for the full year 2024.

Akamai is not acquiring any IP or patents from StackPath and is not taking on any of its adult customers or any services tied to transit, co-location, or managed services, which had been part of StackPath’s offering since its acquisitions of Highwinds, BandCon, and MaxCDN.

It is unknown what will happen to the thousands of other CDN customers StackPath has, the majority of which are very small. Some of these customers pay as little as $10 a month, so CDN services are not crucial to their business. StackPath’s exit from the CDN market will have no real impact on the industry since StackPath had not been a major player in the space for many years. Industry analysts have been overestimating StackPath’s CDN revenue by hundreds of millions of dollars. The company had under $50 million in actual CDN revenue tied to large customers; many analysts included non-CDN services like transit and colocation in their numbers.

StackPath launched in 2016 having raised $150 million and made four acquisitions to start the company, including MaxCDN, a small CDN set up for self-service SMB customers. In 2017, StackPath acquired Highwinds, another CDN with a larger customer base, and used its branding for its CDN services. In 2010, Highwinds had acquired BandCon, which provided IP services, transit, colocation, and other infrastructure services.

With Akamai not taking on any StackPath employees or infrastructure and simply acquiring contracts, it’s an easy and no-risk value proposition for the company. Terms of the deal were not disclosed.


Nielsen’s Data Comparing Cable TV Viewing to Streaming is Flawed and Misleading

It’s sad that most major media outlets run stories with dramatic headlines saying, “In July, linear TV made up less than half of all TV viewing, according to Nielsen,” without questioning the data or how the word “linear” is being defined. Nielsen’s latest data comparing cable TV viewing to streaming viewing is NOT apples-to-apples, and their methodology isn’t clear or transparent. But most in the media don’t care, as it plays into their goal of wanting to call pay TV dead, something they have been saying for more than five years. Spoiler: it’s not dead.

Nielsen’s terms are broad and undefined. They use phrases like “TV usage”, which isn’t clearly defined; TV is a device, not a “type” of viewing. Later they use the term “broadcast viewing”, but you view broadcasts on a TV. They also use the term “cable viewing”, which describes a delivery mechanism, not a type of viewing. Nielsen provides no details on which devices are being measured for streaming usage and includes FAST channels, which, as presented, are linear channels and therefore the same as TV viewing.

Nielsen is lumping the method of delivery (streaming, cable, satellite) in with the business models (pay TV, AVOD, SVOD, FAST), and then ignoring some services completely, like YouTube TV, Hulu + Live TV, and Sling TV, without any explanation as to why. Of their “other” category they say, “Streaming platforms listed as ‘other streaming’ includes any high-bandwidth video streaming on television that is not individually broken out.” What is defined as “high-bandwidth video streaming”?

There is zero transparency in Nielsen’s methodology and definitions and yet the streaming industry and the media continue to use it to help propel the notion that anything that isn’t streaming-related isn’t a viable business anymore. Sadly, the industry doesn’t push back on Nielsen, question the data, and demand transparency because they only care about pushing the agenda of “pay TV is dead”, which Nielsen’s pie charts supposedly prove.

No one seems to care about the methodology of how data is collected or the accuracy of what is being compared and that’s a step backwards for the industry.

Some History of Microsoft’s NetShow Theater Server, from 1994, Originally Code-Named Tiger

You have to be one of the OGs in streaming to know what this is. Microsoft’s video server technology, originally code-named Tiger (because it sliced data into “stripes” for storage), led to the company’s innovations in streaming media and was first demonstrated in 1994. The fundamental problem addressed by the Tiger design was that of efficiently balancing user load against limited disk, network, and I/O bus resources.

Tiger accomplished this balancing by striping file data across all disks and all computers in the distributed system, and then allocating data streams in a schedule that rotates across the disks. In April 1996, Microsoft’s Research Advanced Technology Division presented a paper written by Bill Bolosky, Richard Draves, Robert Fitzgerald (who are still at Microsoft!) and others from their team at the Sixth International Workshop on Network and Operating System Support for Digital Audio and Video (NOSSDAV 96) in Japan.
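The striping and rotating-schedule idea can be sketched in a few lines of Python. This is an illustrative sketch only, assuming a simple round-robin block layout; the function names, block size, and scheduling details are my own simplifications of the design described above, not Microsoft’s actual implementation.

```python
# Tiger-style striping sketch: blocks of a file are laid out round-robin
# across every disk in the cluster, so a stream's reads rotate over all
# disks instead of hammering a single one. (Illustrative assumptions only.)

def stripe(blocks, num_disks):
    """Assign block i of a file to disk i mod num_disks."""
    layout = {d: [] for d in range(num_disks)}
    for i, block in enumerate(blocks):
        layout[i % num_disks].append(block)
    return layout

def read_schedule(num_blocks, num_disks, start_slot=0):
    """Disk visited at each time slot by a stream starting at start_slot;
    the schedule rotates across the disks, spreading the load."""
    return [(start_slot + t) % num_disks for t in range(num_blocks)]

layout = stripe([f"blk{i}" for i in range(6)], 3)
print(layout[0])            # ['blk0', 'blk3']
print(read_schedule(4, 3))  # [0, 1, 2, 0]
```

Because each stream’s schedule is offset by its starting slot, concurrent streams visit different disks at any given time, which is the load-balancing property the paper describes.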

At the time, Microsoft had built many Tiger configurations running video and audio streams at bitrates from 14.4 Kbit/s to 8 Mbit/s, using several output networks ranging from the internet through switched 10 and 100 Mbit/s Ethernet to 100 Mbit/s and OC-3 (155 Mbit/s) ATM. Their paper described an ATM configuration set up for 6 Mbit/s video streams, with the 15-disk Tiger system rated for a capacity of 68 streams.

Microsoft was already working on video technology as early as 1993, when its Video for Windows development kit was sent to developers.

Executive Interview: Dr. Fou Details YouTube’s Ad Scandal and the Complexities of Fraud in the Online Video Advertising Market

For my latest “Executive Interview” podcast, I sit down with ad fraud expert Dr. Augustine Fou, who joins me for a deep dive into the murky waters of ad fraud detailing the allegations against YouTube. Learn how these misrepresentations are impacting advertisers’ trust, why YouTube’s reply to the allegations ignored the issues, and what this could mean for YouTube’s revenue. You can listen to the podcast here.

We also detail challenges in the overall video advertising industry when it comes to detecting ad fraud, why advertisers don’t have much incentive to fix it, the lack of clear measurement, methodology, and transparency in reporting, and the impact of AI on the market.

If you’re interested in understanding the complexities of ad fraud in the online video advertising market and the challenges and trust issues that plague the industry, or if you’re an advertiser wanting to ensure your campaigns are not misrepresented, this is an episode you won’t want to miss.

Research report and data from Adalytics:

Google’s reply to the report:

Adobe, YouTube, and Twitch to Enhance RTMP With Not-For-Profit Veovera Software Leading the Initiative

Veovera Software, a not-for-profit organization, has stepped forward to spearhead the mission of modernizing RTMP, aligning its specification with the latest state-of-the-art technology. With the support of Adobe, a member of the organization, Veovera’s current focus is to steward and modernize the RTMP specification. YouTube and Twitch, also members of the organization, are backing the objective of enhancing RTMP. (Link to specs here)

RTMP was initially developed by Macromedia as a TCP-based protocol for high-speed transmission of audio, video, and data between servers and Flash players. It became the de facto standard for web-based streaming video as a crucial component of Flash Video. While RTMP has died off as a delivery protocol, it continues to be a dominant force on the ingestion side. Many broadcasting platforms incorporate the protocol into their workflows due to its speed and reliability, characteristics that are particularly vital for first-mile delivery.

Slavik Lozben, Chairman of Veovera and one of the original creators of RTMP, told me that, “RTMP remains the most widely used format for live broadcast, with over 75% of broadcasters relying on it for ingestion. (According to Wowza’s Video Streaming Latency Report from 2021) Surprisingly, the next most popular technology accounts for less than a third of this usage. What’s remarkable is that RTMP hasn’t been updated in over 10 years. Given the escalating demands of the streaming community, it’s clear that RTMP needs enhancements to meet present-day challenges.”

RTMP has not been updated in over a decade, creating challenges related to missing modern capabilities (e.g. support for current codecs and HDR). Companies that rely on RTMP now face a pivotal decision: should they invest in a costly overhaul or continue to utilize this increasingly outdated technology? Enhancing the RTMP specification is a far more manageable task than an expensive transition to a different protocol.

Twenty-one years after its original introduction in Flash Video, it’s astonishing to see RTMP retain its relevance. As an initial step, Veovera has incorporated support for AV1, VP9, and HEVC and is now striving to quickly define additional capabilities. The goal is to continuously refine RTMP while ensuring backwards compatibility, without disrupting the internet or current tools. RTMP has remained untouched for quite a while, likely due to the absence of an organization willing to lead the effort to formally align the protocol within the industry. The challenges are not just technical but also logistical. Veovera aims to:

  • Prevent protocol bifurcation and maintain RTMP as a single, unified definition
  • Enhance and modernize the RTMP/FLV specification with new functionality
  • Collaborate with organization members and third-party solution providers to help deliver implementations that support the enhanced RTMP specifications
  • Engage with the RTMP community to promote RTMP enhancements

Veovera says there’s a significant demand for this initiative across the industry and solution providers who rely on RTMP are eager to see these enhancements become a reality. Despite being in existence for many years, RTMP has remained relevant. Any longstanding streaming service has likely utilized RTMP at some point. Today, platforms and services such as Google, YouTube, Twitch, PlayStation, Meta, OBS, FFmpeg, VideoLAN, TikTok, and others continue to incorporate RTMP in their workflow.

Veovera has enhanced the RTMP specification by introducing popular video codecs like VP9, HEVC, and AV1. On its 2023 roadmap, Veovera aims to complement those video codec updates with widely used audio codecs; the top contenders are Opus, FLAC, AC-3, and E-AC-3. Veovera also intends to specify more features for RTMP, such as support for a seamless reconnect command, plus other capabilities. With these enhancements, we can anticipate content with lower latency and better quality, and it’s plausible that RTMP will remain relevant for many years to come.
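To give a sense of how the enhanced specification signals these new codecs: my reading of the published enhanced-RTMP documents is that the FLV video tag header gains an extended form, flagged by the top bit of the first byte, followed by a four-byte FourCC naming the codec (e.g. “av01”, “vp09”, “hvc1”). The sketch below builds that header; treat the exact bit layout as an assumption based on the spec, not an authoritative implementation.

```python
# Sketch of the enhanced-RTMP extended FLV VideoTagHeader.
# Assumed layout (per the published enhanced-rtmp spec): bit 7 of the
# first byte marks the extended header, bits 4-6 carry the frame type,
# bits 0-3 the packet type, followed by a 4-byte codec FourCC.

def ex_video_header(frame_type: int, packet_type: int, fourcc: bytes) -> bytes:
    assert 0 <= frame_type <= 7 and 0 <= packet_type <= 15 and len(fourcc) == 4
    IS_EX_HEADER = 0x80  # bit 7: extended (FourCC-based) header follows
    first = IS_EX_HEADER | (frame_type << 4) | packet_type
    return bytes([first]) + fourcc

# Keyframe (frame type 1), sequence-start packet (type 0), HEVC as "hvc1"
print(ex_video_header(1, 0, b"hvc1").hex())  # 9068766331
```

The key point is backwards compatibility: legacy parsers that never set the top bit keep working unchanged, while the FourCC gives the protocol an open-ended way to name any future codec.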

The Hollywood Strike All Comes Down to Metrics, Methodology and Transparency

The writers and actors strike comes down to a discussion on how best to measure the impact that content has, on a per title basis, with metrics and methodology that both sides agree on, while offering transparency. That’s not promising.

The writers and actors strike is a complex topic due to, amongst other things, a lack of agreed-upon methodology for defining streaming viewership and data transparency. But when SAG-AFTRA is quoted as saying, “It is not okay anymore for companies to just bring in huge amounts of revenue from people’s work and not share it with them,” I think a clear distinction needs to be made between “revenue” and “profits”. Those are very different metrics.

Many are talking about how much money Disney and other streamers are making, but Disney’s DTC business is not profitable. It racked up $10.84 billion in losses from Q1 fiscal year 2020 through Q2 fiscal year 2023. Disney has said it expects Disney+ to “achieve profitability in fiscal 2024, assuming we do not see a meaningful shift in the economic climate”, but the business is not profitable today. Why isn’t SAG-AFTRA suggesting profitability as one of the metrics, so that the more a streaming platform makes, the more writers and actors could make?

Instead, SAG-AFTRA proposed that performers receive a 2% share of the revenue generated from streaming content, using Parrot Analytics’ content valuation platform to determine what revenue was generated by each piece of streaming content. The problem is that Parrot Analytics uses metrics such as Google searches and social media engagement to define which content is considered most valuable to the streaming platforms, based on its “estimates”.

The AMPTP has correctly pointed out that Parrot’s data is not available to anyone who doesn’t subscribe to the service and that it “lacks any demonstrable link to the actual revenue received by the service in the form of new or retained subscribers.” When the core methodology being proposed is not based on actual viewership, that’s a problem. Using “popularity” in Google searches and on social media is too vague a methodology. Popular doesn’t always equate to profitable.

Studios value content differently based on a host of factors, including content costs, type of content, target market, etc., and those values change. Streaming platforms are constantly evolving their strategies on what content to invest in, and that shifts all the time. It’s an extremely fluid business and will continue to be so going forward. Streaming does not and will not look like what linear used to look like. Applying a linear TV model for royalty payments tied to streaming distribution is not going to work.

Both sides are far apart on the foundation of what should even be measured, how to measure it, and what data to share. It’s not looking good for this strike to be resolved any time soon.

Streaming Summit Back In NYC as Part of NAB Show New York, Call For Speakers Open

I’m excited to announce the Streaming Summit is coming back to NYC as part of the NAB Show New York, October 24-25. The call for speakers and sponsorships is now open, and I am looking for speakers, moderators, and presenters. The show will focus on the latest trends around sports streaming, bundling and packaging of content, advertising measurement, content discovery, scaling video workflows, and delivering a great user experience at scale. Both business and technical topics will be covered, and a networking reception will take place October 24th at 5pm. Full details are on the website, and speaking spots will go fast! If you want to be involved in some capacity, now is the time to reach out to me to discuss your ideas.