CEO of BBC Studios’ DTC Business Discusses How North America Became British TV’s Biggest Opportunity

At the NAB Streaming Summit on April 20, 2026, Robert Schildhouse, CEO of BBC Studios’ DTC business, discussed the strategic evolution of streaming and the specific success of BritBox. Schildhouse, an industry veteran who helped launch Hulu in 2008, outlined how his division prioritizes a “durable and profitable” business model over the high-volume, “grow-at-all-costs” strategy popularized by Netflix.

Unlike general entertainment platforms aiming for 200 million global households, BritBox focuses on a disciplined, niche approach, which some might call the “Anti-Netflix” model. Some of Robert’s key takeaways on the business included:

  • Profitability Over Scale: BritBox prioritizes profit margins and lifetime customer value over sheer subscriber counts
  • Distinctive Content Lanes: BritBox leans into specific British genres that resonate with American audiences, particularly mysteries, crime, and period dramas
  • Retention vs. Hits: Instead of relying on a single quarterly “mega-hit” to drive sign-ups, the service focuses on its “deep canon” of thousands of hours of programming to keep users engaged long-term

Schildhouse highlighted the successful launch of BritBox Premier, a premium tier that offers 4K quality, early access to shows, mobile downloads, and documentaries from BBC Select. Though launched with little fanfare, it already represents over 10% of BritBox’s owned-and-operated subscribers.

One interesting stat Robert shared: 50% of BritBox’s direct subscribers are on annual plans. These users are worth more than twice as much as monthly subscribers, given their significantly higher tenure and lower churn. BritBox’s audience leans older and more female, a segment Schildhouse describes as more loyal and less prone to “serial churn” than younger viewers.

Marketing efforts are highly specific: while a general viewer might not see BritBox ads, target demographics are “overwhelmed” by them across TV and social media to keep the service top-of-mind.
Robert also teased the May 6 release of The Other Bennet Sister (a Pride and Prejudice spin-off), which he views as a major “on-ramp” for reaching broader female audiences.

Schildhouse strongly believes the industry’s future lies in aggregation and bundling, not fragmentation. BritBox is actively experimenting with bundles—having already partnered with Starz, MGM+, and Hallmark—to introduce the service to unique, adjacent audience profiles.

My thanks to Robert for speaking at the NAB Show Streaming Summit and giving everyone an update on BritBox’s business and market strategy. My apologies that his fireside chat is not available on demand; we had a technical issue with the recording. The rest of the presentations from the show can be seen here.

Netflix Announces Its Ad Tier Now Has More Than 250 Million Monthly Active Users

At its UpFronts presentation, Netflix announced it has more than 250 million monthly active users (not subscribers) on its Standard With Ads plan, up from 190 million in November of 2025. During Netflix’s Q1 earnings, the company said 60% of new Netflix customers now choose the ad plan, and a new stat from the UpFronts also showed that more than 80% of ad-supported viewers sign in to Netflix weekly.

In its Q1 earnings report, Netflix noted its ad business is expected to double this year, reaching $3 billion in revenue. In a letter to shareholders, the company noted that building the ads business has been a priority and that Netflix now works with over 4,000 advertisers, up 70% year over year.

The 250 million number is not the number of subscribers on the Standard With Ads plan. Netflix uses its own data, not a third party’s, to calculate monthly active users, counting everyone in a household who watches more than one minute of ads on the service each month.

Starting next year, Netflix will also launch the ad-supported plan in 15 more countries: Austria, Belgium, Colombia, Denmark, Indonesia, Ireland, the Netherlands, New Zealand, Norway, Peru, the Philippines, Poland, Sweden, Switzerland and Thailand.

BitMar Is a Scam, Don’t Sign Up for It: All Content Can Be Found for Free Online

A company called BitMar, which promises to “stream everything legally,” is scamming people out of $150 by linking to free YouTube content via Bing search and Pluto TV. The CEO reached out, offering me a “personal lifetime membership” and hoping I would “inform my audience” about the service. I’ll be happy to inform them. Stay away from BitMar!

While the company doesn’t say it streams content from the major OTT platforms, it still uses their name in its marketing, saying: “BitMar provides easy access to more movies, and TV shows, than: Cable, Satellite, Netflix, Disney Plus, Max/HBO Max, Amazon Prime Video, Apple TV+, Peacock, and Hulu combined, and more songs, than: Pandora, Spotify, Amazon Prime Music, and Apple Music—combined.”

As expected, all of its recent reviews on the Google Play Store are 1-star. Google shouldn’t allow this type of app in the store; Apple doesn’t. The fine print on BitMar’s site makes for a good laugh: “Some content may be internationally restricted. Not all content is free and/or accessible. We have a no-refund policy.”

If you are a consumer reading this post, do not pay for this service.

Wall Street Doesn’t Understand the CDN Business, What AI Agentic Traffic Is, and How Bits Are Delivered

Fastly’s stock was down 40% on Thursday, May 7th, one day after its Q1 earnings report, with Piper Sandler saying, “The disappointment was in the core delivery business that saw lower quarter-over-quarter volumes than most were expecting, as pricing remained stable.” Who is “most,” and why were they expecting something different?

If you talk to customers, which Wall Street doesn’t, particularly the large ones, you’ll find none of them are seeing delivery volumes accelerate faster than projected. These large customers matter most to Fastly, as its top 10 customers accounted for 34% of revenue in Q1. Customers have optimized their encoding, overall bitrates have gone down, and 4K traffic accounts for 1-2% of all bits delivered across CDNs and isn’t growing. There is no catalyst in the market for Wall Street to expect volume increases quarter-to-quarter, outside of a CDN winning new contracts.

Another Wall Street firm, KeyBanc Capital Markets, had been bullish on the potential for “agentic use of the internet and web applications to drive CDN traffic higher.” That statement demonstrates no understanding of delivery technology, the scale of traffic requests, or how content is delivered. Agentic traffic refers to traffic generated by autonomous AI agents. AI agents are not watching videos from streaming services or downloading large software files, which account for the vast majority of the bit volume for a CDN vendor. Anything an AI agent does, such as requesting information from a website, involves a very small volume of data per request.

As of late 2024–2025, the average home page size is approximately 2.1 MB, according to HTTP Archive. A 30-minute video streamed at 6 Mbps delivers about 1.35 GB of data, roughly 640 times the bits of an average web page, and a 60-minute video doubles that to 2.7 GB, roughly 1,300 times.
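For anyone who wants to sanity-check those numbers, here is the back-of-the-envelope math as a quick script (round figures only; real-world delivery adds manifests, audio tracks, and protocol overhead):

```python
# Back-of-the-envelope: bits delivered for a web page vs. a video stream.
# Round figures only; real delivery adds manifests, audio, and protocol overhead.

PAGE_SIZE_MB = 2.1   # average home page size, per HTTP Archive
BITRATE_MBPS = 6     # example video bitrate, in megabits per second

def video_gb(minutes: int, mbps: float = BITRATE_MBPS) -> float:
    """Gigabytes delivered for a stream of the given length."""
    megabits = mbps * minutes * 60   # bitrate x duration in seconds
    return megabits / 8 / 1000       # megabits -> megabytes -> gigabytes

for minutes in (30, 60):
    gb = video_gb(minutes)
    ratio = gb * 1000 / PAGE_SIZE_MB
    print(f"{minutes}-minute video: {gb:.2f} GB, ~{ratio:,.0f}x an average page")

# 30-minute video: 1.35 GB, ~643x an average page
# 60-minute video: 2.70 GB, ~1,286x an average page
```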

Year-to-date before earnings, Fastly’s stock was up 210%, including a 19% rally heading into the report. Even with the stock down 40% as of 2:35pm ET, it’s still up 212% in the past year. While Fastly raised its full-year 2026 revenue guidance to $710-$720 million, the midpoint was below Wall Street’s $716 million estimate. That, plus Fastly’s Q2 revenue guidance of $170-$176 million, which would be mostly flat from Q1, is a more realistic explanation for the stock’s decline.

Anyone on Wall Street expecting an acceleration in bit delivery within the CDN product bucket clearly doesn’t understand the market conditions, pricing, contract terms, commitments, or size of files delivered. And yet, so many public data points exist, many from the CDNs themselves, that Wall Street misses.

Disclaimer: I have never bought, sold or traded shares in any public CDN vendor, nor do I own any shares in any private CDN vendor.

A Year Later: Examining Google Media CDN’s Evolution in Scale, Monitoring, and Architecture

A year ago, I wrote about Google’s Media CDN offering and its positioning in the market, which was primarily centered on leveraging Google’s network for large-scale video delivery. As with any service, the initial value proposition is only part of the story. The more telling measure is its subsequent evolution in response to customer usage and industry demands. A year later, Google has made key enhancements to its Media CDN, focusing on adding capacity and operational tooling, as well as onboarding large media and entertainment customers.

The fundamental challenge for CDNs remains handling massive, concurrent traffic spikes associated with live streaming. Events over the past year, such as the Super Bowl, FIFA World Cup, and IPL, have continued to set new streaming benchmarks. One notable change is that since early 2025, Google has tripled Media CDN’s delivery capacity through a combination of dedicated Media CDN and YouTube capacity. Beyond raw capacity, several architectural and commercial updates have been introduced to address common customer pain points around origin performance and budget predictability.

Google has added new caching and routing options, including Flexible Shielding, with shield regions in South Africa, the Middle East, and the U.S. The goal is to improve cache offload rates by keeping traffic within a region, thereby avoiding the latency and data-transit costs associated with the “hairpinning” effect of fetching content from a distant origin. This is an add-on feature that lets customers choose between optimizing for performance and optimizing for offload, in addition to the platform’s existing multi-region caching and shielding architecture, which is offered at no cost.
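To illustrate the shielding concept, here is a heavily simplified sketch (the tier names and cache logic are hypothetical, not Google’s actual implementation): a regional shield sits between the edge and the origin, so only the first miss per region pays the long origin round trip.

```python
# Conceptual two-tier cache lookup with a regional shield.
# Hypothetical names and logic; not Google's actual Media CDN implementation.
from typing import Callable, Optional

class Tier:
    """A cache tier (edge PoP or regional shield)."""
    def __init__(self, name: str):
        self.name = name
        self.cache: dict[str, bytes] = {}

    def get(self, key: str) -> Optional[bytes]:
        return self.cache.get(key)

    def put(self, key: str, value: bytes) -> None:
        self.cache[key] = value

def fetch(key: str, edge: Tier, shield: Tier,
          origin_fetch: Callable[[str], bytes]) -> bytes:
    obj = edge.get(key)
    if obj is not None:
        return obj                   # edge hit: best case
    obj = shield.get(key)
    if obj is None:
        obj = origin_fetch(key)      # shield miss: the costly "hairpin" to origin
        shield.put(key, obj)         # later in-region misses stay local
    edge.put(key, obj)
    return obj

edge, shield = Tier("edge-pop"), Tier("regional-shield")
origin = lambda key: b"segment-bytes"          # stand-in for a distant origin
fetch("/video/seg1.ts", edge, shield, origin)  # pays the origin round trip
fetch("/video/seg1.ts", edge, shield, origin)  # served from the edge cache
```

With the shield in place, every edge miss after the first is absorbed within the region, which is exactly the offload improvement the feature targets.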

Google says a series of updates is aimed at solving common origin integration issues. For instance, the platform now supports HEAD requests, has increased its maximum segment size from 10MiB to 25MiB (likely to better support high-bitrate 4K streams), and has added support for multi-part range requests. These changes suggest a focus on improving interoperability with a wider range of customer storage configurations. On the commercial side, Google has introduced monthly savings plans. This model provides a committed-use discount, giving customers a more predictable TCO, a departure from a pure pay-as-you-go cost structure.
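Since HEAD and multi-part range requests are standard HTTP, a short sketch shows what this origin interoperability means on the wire (using Python’s requests library against a placeholder origin URL):

```python
# Standard HTTP features a CDN expects its origins to support.
# The origin URL is a placeholder; point it at any range-capable server.
import requests

ORIGIN = "https://origin.example.com/video/segment_001.mp4"  # hypothetical

# HEAD: fetch object metadata (size, type) without transferring the body.
head = requests.head(ORIGIN)
print(head.headers.get("Content-Length"), head.headers.get("Accept-Ranges"))

# Multi-part range: ask for two byte ranges in a single round trip.
# A compliant origin replies 206 with a multipart/byteranges body.
resp = requests.get(ORIGIN, headers={"Range": "bytes=0-1023,4096-8191"})
print(resp.status_code)                  # expect 206 Partial Content
print(resp.headers.get("Content-Type"))  # e.g. multipart/byteranges; boundary=...
```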

For live events, real-time performance visibility is critical. Google has introduced a Monitoring as a Service (MaaS) offering, positioned as a “broadcast operating center” equivalent. It is designed to provide a customer’s operations team with intensive, proactive support for events, with a consolidated view of an event’s health, from origin performance to end-user metrics.

The tool’s intended impact is to enable proactive issue detection, allowing teams to identify potential problems before they affect a large number of viewers. For customers that have used it for large-scale events, the tool provides the real-time data needed to manage a professional broadcast.

Google’s initial market entry with its Media CDN offering was focused on its network foundation. The changes over the last year, however, indicate a shift toward addressing more specific, operational, and architectural challenges faced by large-scale broadcasters. The introduction of more flexible caching, origin compatibility fixes, and broadcast-grade monitoring tools suggests a maturing platform. The true measure will be how these enhancements perform under pressure across a broader range of customer workloads, but the direction of development is clear.

New Codec Data: 40% of Respondents Plan to Deploy AV1 in 2026, H.264/AVC Has 84% Adoption

New data from NETINT’s state of video encoding survey shows H.264/AVC sits at 84% production deployment, making the standard effectively universal and plateaued. HEVC is at 65% in production with another 20% planning deployment, putting it on a path toward H.264-level ubiquity. But the breakout number belongs to AV1. At 17% current production deployment, AV1 might not look like much on its own. But 40% of respondents plan to deploy AV1 in 2026, giving it a combined reach of 57% by year-end. That is not early-adopter experimentation. That is mainstream planning.

The barriers to AV1 adoption are worth noting because they are not primarily about the codec itself. Hardware decode support leads at 54%, followed by toolchain limitations at 43% and encoding compute costs at 37%. The licensing story is essentially a non-issue for AV1, with less than 1% citing it as a barrier. This stands in sharp contrast to VVC, where 44% flagged licensing and royalties as a concern. That licensing gap alone may determine which next-gen codec is deployed first.

One additional data point I found interesting: organizations running three or more codecs in production today are 57 times more likely to have AV1 in their stack compared to single-codec operators. Current codec stack complexity is the strongest predictor of future adoption velocity. That pattern makes intuitive sense. Teams with multi-codec experience already have the tooling, testing workflows, and operational maturity to add another format without starting from scratch.

VP9 offers a cautionary data point here. The data shows it sits at 15% production deployment, with only 15% of respondents planning further adoption and 70% reporting no plans at all. VVC generates interest in evaluation, with 29% planning to adopt it, yet the successor to HEVC remains at just 4% in production deployment. It’s clear that VVC is facing both licensing headwinds and toolchain immaturity, unlike AV1. The report describes a clear codec progression ladder: organizations tend to climb from H.264 through HEVC and VP9 to AV1, meaning the current stack is a better predictor of next-gen adoption than company size or industry vertical.

GPUs Dominate, But the Hardware Picture Is More Complicated Than It Appears
GPUs hold 72% of the hardware acceleration market share, driven primarily by NVIDIA’s NVENC ecosystem. That number is not surprising. What is more revealing is what is happening underneath it. Among organizations using hardware acceleration, 41% deploy multiple hardware types simultaneously—GPU plus VPU, GPU plus on-premises appliance, or broader combinations. This is not a one-size-fits-all market. Organizations are matching different hardware to different workload profiles, and the data suggests this trend is accelerating rather than consolidating.

GPU-specific pain points explain the hardware diversification. Power consumption leads at 39%, followed by codec/feature gaps at 37% and insufficient stream density at 35%. For live and high-density workloads, these are meaningful operational constraints. The report notes that VPU evaluation intent stands at 49%, nearly matching the 51% who plan to evaluate GPUs. Live and broadcast operators are 2.2 times more likely to adopt VPUs, driven by density economics and latency requirements that GPUs can struggle to meet at scale.

The primary encoding method split also tells a story of transition. Software/CPU-based encoding remains the single largest encoding architecture at 37%, while hardware-accelerated (30%) and hybrid approaches (32%) collectively account for 62% of the market. The era of pure software encoding is winding down for anyone operating at a meaningful scale.
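To make the software-versus-hardware split concrete, here is a sketch of the two encode paths using ffmpeg’s stock encoders (the flags are representative rather than tuned, and the file names are placeholders):

```python
# Same source, two encode paths: CPU (libx264) vs. NVIDIA GPU (h264_nvenc).
# Representative ffmpeg flags; file names are placeholders.
import subprocess

SOURCE = "mezzanine.mp4"

# Software path: best quality per bit, but consumes CPU cores.
subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-c:v", "libx264", "-preset", "medium", "-crf", "23",
    "cpu_encode.mp4",
], check=True)

# Hardware path: NVENC offloads encoding to the GPU, giving far higher
# stream density per server at some quality-per-bit cost.
subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-c:v", "h264_nvenc", "-preset", "p5", "-b:v", "5M",
    "gpu_encode.mp4",
], check=True)
```

A hybrid deployment routes each workload to whichever path fits it best, for example latency-sensitive live channels to NVENC and quality-critical VOD to x264.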

AI in Encoding Has Crossed the Mainstream Threshold
60% of respondents currently deploy AI/ML in at least one encoding workflow, and 70% plan to expand those capabilities in 2026. That is no longer experimental. The more interesting story is where AI is heading versus where it started. Current deployments concentrate on what I would call adjacent applications: transcription and captioning at 46%, scene classification at 37%, and super-resolution at 26%. These sit next to the encoding pipeline but do not fundamentally change how encoding decisions are made. The highest-growth applications tell a different story. Content-aware ladder generation shows the highest planned growth at 77%, followed by QoE prediction at 50%. These are core encoding intelligence functions—they sit inside the pipeline and directly influence bitrate allocation, resolution selection, and quality optimization.
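To make “content-aware ladder generation” concrete, here is a heavily simplified sketch of the idea (the complexity scores and base ladder are illustrative placeholders, not any vendor’s algorithm): instead of one fixed bitrate ladder for every title, each rung is scaled to the content’s measured complexity.

```python
# Simplified content-aware ladder generation.
# Complexity scores and base ladder are illustrative, not a vendor algorithm.

BASE_LADDER = [    # (height, kbps) tuned for content of average complexity
    (1080, 5000),
    (720, 3000),
    (480, 1500),
    (360, 800),
]

def content_aware_ladder(complexity: float) -> list[tuple[int, int]]:
    """Scale each rung's bitrate by a complexity score in [0.5, 1.5],
    where ~0.5 is static talking heads and ~1.5 is high-motion sports."""
    return [(h, round(kbps * complexity)) for h, kbps in BASE_LADDER]

print(content_aware_ladder(0.6))  # low motion: large bitrate savings
print(content_aware_ladder(1.4))  # high motion: spend more bits on quality
```

Even this toy version shows why the report classifies it as core encoding intelligence: the output directly sets which bitrates get encoded.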

This migration from auxiliary to core defines the AI trend in encoding for 2026. It also explains why AI expansion correlates with operational scale rather than company size. Organizations processing high VOD volumes are 2.3 times more likely to plan AI expansion regardless of employee count. GPU users are 2.4 times more likely, since GPU infrastructure provides the compute foundation for AI workloads. The implication for hardware vendors is clear: offering AI-ready compute alongside encoding capability could become a baseline expectation.

The Quality Measurement Gap Nobody Is Talking About
VMAF has crossed the 50% adoption threshold at 52%, establishing itself as the de facto objective quality metric for the encoding industry. PSNR remains at 32%, mostly due to legacy pipeline inertia rather than active preference. But here is the number I found interesting: 30% of respondents operate without any formal QA metrics.

These organizations are making encoding decisions such as codec selection, bitrate allocation, and hardware investment without systematic quality measurement. The report finds they are 2.7 times more likely to report frequent quality issues and 1.9 times more likely to cite viewer complaints as a business problem. That gap correlates strongly with team size: organizations with one- or two-person video teams are 3.1 times more likely to lack formal QA than those with 10+ person teams.
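For teams in that 30%, the entry cost of formal quality measurement is low. As a minimal sketch, assuming an ffmpeg build that includes libvmaf (the file names are placeholders):

```python
# Compute a VMAF score with ffmpeg's libvmaf filter.
# Requires an ffmpeg build with libvmaf; file names are placeholders.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "encoded.mp4",    # distorted (encoded) input comes first
    "-i", "reference.mp4",  # pristine reference comes second
    "-lavfi", "libvmaf=log_path=vmaf.json:log_fmt=json",
    "-f", "null", "-",      # discard decoded frames; we only want the score
]
subprocess.run(cmd, check=True)
# The aggregate VMAF score is printed to stderr and written to vmaf.json.
```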

This connects to a broader theme in the data. Content Adaptive Encoding (CAE) has nearly identical adoption across live (25%) and VOD (28%) workflows, which the report identifies as the clearest indicator of encoding sophistication, regardless of use case. Organizations using CAE for live workflows are 2.4 times more likely to also use AI/ML in their encoding pipeline. CAE adoption appears to serve as a gateway to broader encoding intelligence. For the 30% without formal QA, they are not just behind on measurement; they are likely missing the optimization wave reshaping streaming economics.

The Biggest Roadmap Blockers Are Not Technical
One of the more revealing sections of the report examines what is actually preventing organizations from executing their 2026 encoding roadmaps. The top constraints are capital and budget limitations (39%) and limited team capacity (38%). Device and technology support comes in at 30%, cost-per-channel pressure at 30%, and integration/API gaps at just 10%.

In other words, 77% of respondents cite at least one organizational barrier, compared to just 19% citing a purely technical one. This shows the bottleneck is not the technology. It is the budget to buy it and the team capacity to implement it. This matters because the data elsewhere in the report shows that 49% of all respondents have video engineering teams of 1 to 5 people. Even among companies with 5,000+ employees, small video teams are not uncommon. When the report says lean teams are the norm across all company size brackets, it has real implications for how vendors should think about integration complexity, time to value, and operational overhead.

Edge Encoding: High Interest, Low Conviction
42% of respondents expressed interest in deploying encoding at the edge, while 30% remain uncertain. This represents the largest undecided cohort on any topic in the survey. Only 29% said they are not interested at all. Among those interested, the top use cases are localized ladder generation (56%) and dynamic ad insertion (48%).

Edge interest follows a U-shaped curve by company size. Small companies (46%) and large enterprises (43%) show the highest interest, while mid-market firms lag at 26–28%. Interest also correlates with operational scale: 50% of organizations running 250+ live channels are interested, compared with 35% at smaller scales. The data suggests edge economics become more compelling as encoding volume increases, regardless of overall company size. But the 30% uncertainty band represents the biggest market education opportunity in the survey. These organizations see the potential but lack the information or confidence to commit.

Live event operators show the strongest interest in edge, at 44%, driven by latency requirements for real-time delivery. This aligns with what I have been hearing from operators at industry events—edge latency is becoming a competitive differentiator for sports and live entertainment. The report suggests that vendors who can reduce ambiguity through reference architectures, concrete use cases, and trial programs will convert undecided prospects faster than those who lead with product specs.

Who Responded and Why It Matters
Before drawing too many conclusions from any survey, it is worth understanding who completed it. Live events and sports broadcasting represent the single largest vertical at 35%, followed by subscription VOD at 18%, AVOD/FAST at 7%, and enterprise communications at 6%. Combined, the streaming verticals (live sports, SVOD, AVOD, and UGC) represent 65% of the sample. EMEA leads geographically at 44%, followed by North America at 40%. The report is transparent about its limitations: survey distribution through industry channels and NETINT’s network may overweight organizations already evaluating hardware encoding solutions, and APAC (19%) and LATAM (16%) are underrepresented relative to their share of global streaming growth. These are fair caveats and worth keeping in mind when interpreting the data.

The combination of executive-level respondents with real budget authority and hands-on engineers with operational experience makes this one of the more credible encoding industry surveys I have reviewed. The fact that nearly half the respondents are VP-level or above means the stated priorities and investment plans are not aspirational wish lists from mid-level managers—they reflect where actual dollars are going.

The report runs more than 50 pages and covers codec adoption, hardware acceleration, AI/ML integration, quality measurement, cost structures, and organizational archetypes. The full report is available from NETINT here. It includes detailed archetype analysis of four distinct organizational segments, build-versus-buy economics, executive priority rankings, and projections. For a vendor-sponsored survey, NETINT deserves credit for producing something with analytical depth.

Akamai Details Rising Supply Chain Costs and Upcoming Price Adjustments

Due to rising costs for servers, RAM, SSDs, and energy, Akamai has notified customers and partners of upcoming interim surcharges and pricing adjustments for contract renewals. Akamai shared candid data with me on the massive economic pressures they are seeing across their supply chain and the specific market drivers behind the updated pricing.

While CDN pricing has declined steadily for years, the current hardware and energy markets are forcing a shift. Akamai provided a detailed explanation of why they are implementing adjustments to account for what they describe as significant market-driven forces. They noted that over the years, Akamai has worked hard to maintain economic stability for its customers and partners amid rising global infrastructure costs. However, they are now introducing what they call a modest adjustment in response to the economic pressures reshaping the industry.

According to the data Akamai provided, the cost of server components has spiked significantly since October 2025. Specifically, their market costs for RAM have more than doubled, and SSD pricing has seen significant increases. Due to ongoing supply shortages, Akamai noted that server costs increased between 75% and 200% compared to the first half of 2025. These hardware costs are not stabilizing. In fact, they are continuing to climb on an almost monthly basis. This aligns with recent industry analysis, with some analyst firms noting that end-user prices for memory and storage could rise by 150% to 300% or more in 2026 and 2027, compared to 2025 levels.

Hardware is not the only factor, with Akamai noting that energy costs have increased by more than 200% in many regions. Personnel costs are also rising significantly, as the market for top engineering and services talent remains highly competitive. Given the growth in server costs over the past few months and a forecast for double-digit increases to continue into 2026, the company explained that it can no longer absorb these costs while maintaining the same performance and security standards.

To address this, Akamai is implementing two specific changes:

  • Interim Surcharge: Effective April 1, 2026, a 3% surcharge will be passed through to customers and partners.
  • Renewal Adjustments: Akamai will incorporate a price adjustment of up to 10% on contract renewals to reflect the current cost environment.

In response to my questions about what they have done internally to offset these costs, Akamai was clear that they are working to mitigate the impact through a variety of internal optimizations. These include org-structure adjustments and migrating their own internal workloads to Akamai Cloud to significantly reduce third-party cloud spend.

Ultimately, Akamai views this decision as an operational necessity to maintain the security and performance levels its customers require. The company noted that these changes are required to fund the ongoing costs of infrastructure, maintenance, and innovation associated with managing increasingly sophisticated cloud and edge services.

While nobody likes cost pressures, this is clearly a systemic issue across the infrastructure market, not isolated to Akamai. We are seeing similar moves from other players: Hetzner recently announced price increases of up to 50% for its services, and OVHcloud has implemented similar measures. This reflects broader market realities, where every provider is facing the same spike in server and power costs, and the current cost environment is likely to continue into 2027. I will continue to track how other CDNs and cloud providers respond to these same supply chain pressures.