Hosting Dinner With Fastly For Wall Street Investors, July 25th in NYC

On Wednesday July 25th at 6pm, in association with JNK Securities, I will be hosting a dinner in NYC for Wall Street investors with Fastly’s President (Joshua Bixby), Head of Strategic Partnerships (Lee Chen), and Fastly’s CFO (Adriel Lares). Hot on the heels of another big round of funding for Fastly, the company has spent the past seven years disrupting the CDN market, focusing on performance, security and media delivery.

We plan to discuss changes in the competitive landscape, driven by the need for platform services at the edge and the advent of serverless and edge compute. Attendees will learn about the innovative approach Fastly has taken with its network architecture and how it is disrupting the market in functionality, performance and underlying cost structure. We will also discuss the factors (e.g., scale, price, features, ease of deployment) that customers use to evaluate their CDN vendor, and cover some of the latest pricing, product and customer use cases for edge compute.

If you are interested in attending the dinner, please reach out to me, but note that it is only for those on Wall Street and space is limited.

Hosting Dinner For Wall Street Investors With StackPath, July 19th in NYC

On Thursday July 19th at 6pm, in association with JNK Securities, I will be hosting a dinner in NYC for Wall Street investors with Lance Crosby, StackPath’s Chairman and CEO. Lance previously founded what is considered to be one of the first cloud platforms, SoftLayer, which he sold to IBM for $2 billion in 2013. Attendees will hear how StackPath delivers enterprise-grade security and performance in a frictionless, on-demand platform with cloud-scale control and flexibility. We will discuss customers’ latest needs to protect, accelerate, and innovate across CDN, WAF, and Managed DNS, along with the latest trends in automation and virtualization capabilities.

If you are interested in attending the dinner, please reach out to me, but note that it is only for those on Wall Street and space is limited.

ISPs In Africa Say Netflix and Other OTT Services Are Taxing Their Networks, Threatening To Derail Broadband Investments

The thorny issue of international content providers like Netflix selling online VOD subscription services in South Africa and the rest of the continent, whilst not paying a cent of their revenue to the South African Revenue Service in the form of company tax, has been reported from a number of different perspectives. In the case of Netflix, with roughly 300,000 paying subscribers in South Africa at about $10 per month, that amounts to a lot of corporate tax avoidance. But to be fair to Netflix, the way the law is currently written, the company isn’t legally required to pay the tax.

Yet a different issue with far more severe impact, conveniently ignored until recently, is whether the OTT content players who generate vast amounts of local traffic contribute to the expansion of local network infrastructure in South Africa and across the continent. ISP networks are expected to carry large volumes of traffic at peak times, without any congestion, latency, or buffering, to end-users binge watching Game of Thrones in HD.

In recent conversations, a few ISPs in Africa told me they are hugely reluctant to keep putting in place the capacity that is needed when they have to bear all of the costs of doing so. Historically, Netflix has argued that it shouldn’t have to pay ISPs to deliver its video content because consumers are already paying for Internet access. In addition, its dominance and huge appeal to VOD customers gives it massive leverage to force ISPs into hosting its hardware, or risk losing disgruntled broadband consumers.

The harsh reality is that, especially in Africa, ISPs are being squeezed even further by consumers who demand lower broadband pricing while expecting their access circuits to go faster and faster, as they consume more and more video at ever-higher quality. ISPs say something has to give, and have commented that it is threatening the growth of broadband investments as a result. It has been well reported that Netflix has grudgingly entered into paid peering agreements with networks like AT&T, Verizon and Comcast in the U.S., but in Africa, it’s a different story.

When Netflix entered the market in 2016, it was not entertaining any paid peering negotiations, but rather focused on settlement-free peering and deploying its Open Connect servers in local networks. Some ISPs were willing to host the Netflix servers just to ensure that they could deliver on their subscribers’ content experience, and they also saved on expensive international bandwidth, since Netflix servers were now in the country serving local demand.

However, fast forward two years to 2018, and this “squatting model”, as some ISPs refer to it, is now completely broken, thanks to the exponential growth in video demand and the arrival of many other content giants who are finding their new homes in locations like Teraco data centres. When ISPs in Africa factor in the likes of Google, Netflix, DSTV Box Office, Showmax, DSTV Now, AWS, and Azure, all of whom expect to put traffic on the networks and have everything keep working without contributing to the economic effect, it’s not a model for success.

Teraco data centres have become the preferred hosting location for many local and international companies, due to their network-neutral position, with all the major ISPs maintaining fiber connections into them. The previous hosting model of placing servers within an ISP’s data centre building made it very difficult, if not impossible, to move away from an ISP that became unreliable or uncompetitive, creating the perfect opportunity for Teraco to prosper locally.

Ironically, most ISPs have built up substantial local hosting portfolios over the past 10 years, and traffic delivery fees have always been a component of the monthly bill. The economics at the time were well understood. However, whether company servers are placed in ISPs’ data centres or Teraco’s, traffic still flows in much the same way to end users, and must be charged for. Most ISPs whose hosting business is migrating to Teraco face catastrophic revenue loss, with no means to fund the exponential traffic growth emanating from these data centres.

Local ISPs in Africa tell me they simply can’t keep the pipes open without these content giants contributing pro-rata to the financial effect they have on upgrading the networks that deliver their content at the quality they expect. ISPs say the free ride has to come to an end, especially in regions like Africa where broadband infrastructure is not mature or pervasive, and enormous build-outs are still required to connect millions of households who want a high-quality VOD experience.

Some ISPs have calculated that the local IP transit cost attributable to Netflix alone, for delivering its content to a subscriber in South Africa, is roughly $0.40 per month, which is less than 5% of Netflix’s monthly $10 fee. ISPs argue that’s not a lot when you consider Netflix is taking roughly $3M per month out of the local South African economy and putting nothing back.
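For what it’s worth, the ISP-quoted numbers are internally consistent. A quick back-of-the-envelope check, using only the figures cited above:

```python
# Sanity check of the ISP-quoted figures. All inputs come from the
# numbers cited in this post, not from my own data.
subscribers = 300_000        # estimated Netflix paying subs in South Africa
monthly_fee = 10.00          # approximate USD fee per subscriber per month
transit_cost_per_sub = 0.40  # ISP-estimated local IP transit cost per sub

monthly_revenue = subscribers * monthly_fee          # money leaving the local economy
transit_share = transit_cost_per_sub / monthly_fee   # transit cost as a share of the fee

print(f"Monthly revenue: ${monthly_revenue:,.0f}")   # ~$3M per month
print(f"Transit share of fee: {transit_share:.0%}")  # 4%, under the 5% cited
```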

If Amazon AWS, Microsoft Azure and Google, with their cloud platforms, are successful in luring enterprise South Africa away from local ISPs and onto their locally hosted cloud platforms, and traffic from the Teraco data centres continues to get a free ride, ISPs in Africa say it’s game over for everyone in the ISP value chain. Of course, this debate is nothing new, and we’ve seen Netflix and ISPs argue for years over who should pay for the network improvements needed to deliver all of the video going over the last mile. But when ISPs say they are getting wary of continuing to invest, in a region like Africa where broadband still needs a lot of development, it’s bad news for all companies involved in the video ecosystem.

The Streaming Media Industry Needs A “Real” Awards Program For Vendors, Based On Actual Methodology

There are a lot of award programs in the streaming media industry and, while many don’t want to admit it, most of them are a joke. Awards are given out to vendors with no real methodology behind them; it’s all based on marketing dollars spent, connections with editorial departments, or stuffed ballots. At the NAB Show in Vegas, nearly a dozen media organizations gave out awards to vendors, and many of them spent no time looking at the products or getting hands-on with them, instead picking winners based on what was said in a press release. I’ve seen products win awards that were still in beta, had no referenceable customers, could not scale, or simply didn’t do what they claimed. (Note: the NAB Show itself does not give out any awards tied to the streaming industry; news/media companies simply use the NAB Show name in their own awards programs.)

Of course, everyone likes to win awards, but I think they should be based on a ranking system of some kind, with some real methodology behind them. Without that, it’s hard for vendors who don’t spend a lot on marketing to win awards and many vendors get passed over. I’m also seeing vendors put a good percentage of their marketing budget towards winning awards, with the idea that it somehow helps them win new business. And yet from what I can tell, winning awards doesn’t drive new revenue for vendors at all. I have yet to talk to a customer who said they picked one vendor over another, because it won an award in the market. They are much more likely to take a recommendation from a current customer, or an expert in the field, over anything else.

If an awards program was backed by a person or an organization that was well-respected in the industry and was based on real methodology of some kind, I do think it would help vendors get new business and help them stand out. But to date, I haven’t seen an awards program of that kind, outside of very specific technical programs by a standards body like SMPTE and others, that are tied to more traditional pay TV products. And while all vendors operate in a competitive environment, I also think peers would be willing to help vote on companies they respect, products they think have merit and live events that provide a good user experience.

With all this in mind, the next logical question is: what would it take to provide a respected awards program for the streaming media industry, one that highlights vendors properly and lets companies know how they are being compared to their peers? What would you want to see included in such a program? I’m interested to hear your feedback, either in the comments section below, or you can drop me an email at any time.

Better Video Compression Can’t Fix The OTT Infrastructure Problem, Hardware Might

Last month I wrote a blog post detailing how the current infrastructure strategy to support OTT services isn’t economically sustainable. I got a lot of replies from people with their thoughts on the topic, and some suggested that the platform performance problem will be fixed by better video compression. While it’s a great debate to have, with lots of opinions on the subject, personally I don’t think further gains in compression are the answer.

While a breakthrough in compression technology would allow the current streaming infrastructure to deliver more video, the unfortunate reality is that significant resources have already been invested in optimizing video compression, and the rate of improvement is far below the video growth rate. Depending on whose numbers you look at, video streaming traffic is currently growing at a CAGR of about 35%. On the other hand, video compression has improved by about 7% CAGR over the past 20 years (halving the video bitrate every 10 years). Unless video compression sees a major breakthrough of 15:1 or more in compression efficiency, better compression alone cannot solve the internet video bottleneck.
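To make that gap concrete, here is a minimal sketch, assuming the cited growth rates (35% traffic CAGR, 7% compression CAGR) hold steady over the next decade:

```python
# Project traffic growth vs. compression gains over one decade,
# using the CAGRs cited above (assumptions, not measured data).
years = 10
traffic_growth = 1.35 ** years    # ~20x more video demanded
compression_gain = 1.07 ** years  # ~2x better compression (bitrate halves every ~10 years)

# Bits that must still cross the network, relative to today:
net_load = traffic_growth / compression_gain

print(f"Traffic grows {traffic_growth:.0f}x, compression improves {compression_gain:.1f}x")
print(f"Net delivered load still rises roughly {net_load:.0f}x")
```

Even granting compression its full historical rate, delivered traffic still grows by roughly an order of magnitude per decade, which is why only a one-off breakthrough on the scale of 15:1 would change the picture.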

Compression is really about making the video itself smaller and therefore more efficient to deliver and not about the performance of the underlying streaming delivery platform. The benefits of any potential breakthroughs in video compression technology will likely benefit all streaming video delivery platforms equally and, as such, are a separate topic from the requirement to increase the performance of the streaming video platforms themselves.

Similarly, approaches to improve performance by moving popular content to edge devices closer to the customer, such as at cell phone towers and cable head end locations, have been suggested as ways to further improve throughput and capacity. These techniques should all be pursued regardless of what underlying technologies are used to actually stream the video but, unfortunately, these techniques, used separately or even together, offer only incremental benefits to performance. Given the massive scale of the projected demand for streaming video, we need an order-of-magnitude improvement over today’s performance in order to address the capacity gap.

To increase the performance of the streaming video infrastructure, there are only two main areas of focus – the software and the hardware. While great strides have been made over the last decade by many in optimizing the streaming video software layers, even the most capable developers are facing diminishing returns on their efforts, as additional software optimizations yield progressively smaller performance gains. Continued software optimization efforts will continue to eke out incremental improvements, but are not likely to be a significant factor in addressing the overwhelming capacity gap faced in the industry. By process of elimination, this leaves the hopes pinned firmly on the hardware side of the equation.

Unfortunately, the latest research shows that the performance curve for increases in CPU performance over time has flattened out. Despite increases in the number of transistors per chip, a number that is now approaching 20 billion on the largest CPUs, and the shrinking of the transistors themselves, to sub-10nm in the latest chips, we have reached diminishing returns. The projected increases in annual CPU performance are so far below the 35% annual growth rate in streaming video traffic demand that I think we can eliminate improved CPU performance as a possible solution for the capacity gap problem.

If the software and CPU platform, which has worked so well since the inception of the streaming video industry, cannot meet the future needs of the market, what can? Since the dawn of computing, the solution to insufficient software and CPU performance has been to offload the workload to a specialized piece of hardware. Examples of specialized hardware being utilized to enable higher performance include RAID cards in storage, TCP/IP offload engines in networking, and the ubiquitous GPUs that have revolutionized the world of computer graphics. Thanks to GPUs, a consumer today can watch 4K video streams on their choice of viewing devices, including televisions, computers, and mobile platforms. Given this track record of success, it seems clear that the innovation required is a piece of specialized hardware that can offload the heavy lifting of streaming video from the CPU and software. This technology exists and is now making its way to market thanks to recent advances in Field Programmable Gate Array (FPGA) technology.

Traditionally, FPGA technologies have not been part of the data center infrastructure, but this has begun to change in recent years. In fact, the current edition of ACM Queue, a publication of the Association for Computing Machinery, carries “FPGAs In Data Centers” as its featured article. Early work on implementing FPGAs in the datacenter was pioneered by Microsoft which, with an effort called Project Catapult, proved that massive increases in the performance of web queries (for the Bing search engine) could be obtained by offloading the workload from CPUs and software to FPGAs. This effort was so successful that Microsoft now deploys FPGAs as part of its standard cloud server architecture. To address the needs of the streaming video industry, an ideal solution would be to offload video streaming to a specialized FPGA, which isn’t a new idea, but something that’s just now being seriously talked about. This would essentially require implementing the functionality of an entire streaming video server on a single chip, which is both possible and highly advantageous.

Some of those in the industry who know far more than me about FPGA technologies have suggested that new offerings in the market, from companies like Hellastorm and others, have the potential to change the way video is distributed. Hellastorm’s solution in particular takes the core functionality of an entire streaming video server and implements it on a chip the company calls the “Stream Processing Unit”, which eliminates software layers from the data path. The result is that it can offload the heavy lifting of the streaming video workload, freeing up valuable CPU resources for other tasks and dramatically increasing streaming video performance.

Additionally, if the use case requires only streaming video functionality, the CPU, OS, RAM, and much of the other circuitry normally required to stream video can be eliminated entirely, saving significant amounts of electricity. Going forward, streaming video functionality can follow the hardware performance curve and benefit from improvements to FPGA technologies, just as the prior generation of streaming video servers benefited from improvements in CPU technologies. The company claims that its Stream Processing Unit technology will keep pace with advances in networking speeds, such that the streaming video being delivered from the platform will fully saturate the network connection, be it 10Gbps, 100Gbps, or even 400Gbps in the future.
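To put those link speeds in perspective, here is a rough sketch of the concurrent stream counts a server could sustain if it truly saturated its network port. The per-stream bitrates are my own illustrative assumptions (5 Mbps for HD, 15 Mbps for 4K), not vendor figures:

```python
# Concurrent streams per fully saturated link, ignoring protocol
# overhead. Bitrates are illustrative assumptions, not vendor data.
link_speeds_gbps = [10, 100, 400]
bitrates_mbps = {"HD": 5, "4K": 15}

for link in link_speeds_gbps:
    for label, mbps in bitrates_mbps.items():
        streams = (link * 1000) // mbps  # Gbps -> Mbps, then divide per stream
        print(f"{link} Gbps, {label}: ~{streams:,} streams")
```

Under these assumptions, a single saturated 100Gbps port works out to roughly 20,000 concurrent HD streams, which is the scale of benefit the hardware-offload argument rests on.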

The dual benefits of greater capacity delivered with dramatically less power consumption make video stream offload attractive, even before one considers the substantial benefits of freeing up CPU resources to provide other network services. Hardware offload of streaming video, with its ability to rapidly scale in performance over the next few years, appears to be the best way to address the looming capacity gap. Some don’t want to invest in hardware because they feel it has too high a CAPEX requirement, but based on what we have seen in the market, and from speaking with those tasked with dealing with the massive influx of video traffic on the networks, video compression alone isn’t going to solve the capacity gap problem. I am interested to hear what others think can be done to solve the problem in the market; feel free to leave your comments below.