Google Cloud Announces CDN Interconnect Program With Level 3, Fastly, Highwinds & CloudFlare

Google Cloud has announced a new collaboration with four CDN providers (Level 3, CloudFlare, Fastly and Highwinds) in a program they are calling CDN Interconnect. The goal is to allow joint customers of these CDN providers and Google's Cloud Platform to pay reduced prices for in-region Cloud Platform egress traffic. I'm also hearing that additional CDNs will join the program before too long, giving content owners even more flexibility. For customers using Google Cloud as their origin, this will lower their delivery costs and should improve delivery performance.

CDN Interconnect allows select CDN providers to establish direct interconnect links with Google's edge network at various locations. Customers egressing network traffic from Google Cloud Platform through one of these links will benefit from the direct connectivity to the CDN providers and will be billed according to Google's reduced pricing. Intra-region carrier interconnect traffic will cost Google Cloud customers $0.04/GB in NA, $0.05/GB in EU and $0.06/GB in APAC.
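To put those rates in context, here's a minimal sketch of the egress math in Python. The per-GB rates are the ones listed above; the region keys and the 50TB example volume are placeholders I picked purely for illustration:

```python
# Sketch of the CDN Interconnect egress pricing described above.
# Rates are the published per-GB figures; region keys are labels I chose.
INTERCONNECT_RATES_PER_GB = {
    "NA": 0.04,    # North America
    "EU": 0.05,    # Europe
    "APAC": 0.06,  # Asia-Pacific
}

def monthly_egress_cost(gb_per_month: float, region: str) -> float:
    """Estimate the monthly bill for intra-region egress to an approved CDN."""
    return gb_per_month * INTERCONNECT_RATES_PER_GB[region]

# Example: 50TB/month of origin traffic egressing from a North American region.
print(f"${monthly_egress_cost(50_000, 'NA'):,.2f}")  # -> $2,000.00
```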

Google notes that this pricing only applies to intra-region egress traffic that is sent to CDN providers approved by Google at specific locations approved by Google for those providers. So not every location a CDN provider has would necessarily be approved for the CDN Interconnect program. You can read more about the announcement on Google’s cloud blog.


Former KIT Digital CEO and CFO Arrested: Charged With Accounting Fraud

It was only a matter of time. KIT Digital's former CEO and Chairman Kaleil Isaza Tuzman and former CFO Robin Smyth have been arrested and charged with accounting fraud. Both are being held pending extradition to the U.S. and face up to 20 years in jail if convicted on the most serious charge. In 2013 I published a long blog post with details on accounting irregularities at KIT Digital, based on info given to me by former employees: stories of missing money, a lack of accounting controls and deliberate intent to change KIT's numbers. At the time I was only able to say there was a "possible" SEC fraud investigation in the works, but I knew it was already well underway internally at the SEC.

The SEC was already investigating Kaleil Isaza Tuzman for insider trades, and he landed in their crosshairs even more when several law firms filed securities class action lawsuits against KIT Digital. At the time I investigated the company, insiders told me that KIT Digital simply made up revenue that did not exist and counted revenue from contracts that had been cancelled or had expired. It sounds like that's exactly what happened, as the SEC is charging Kaleil Isaza Tuzman and Robin Smyth with recognizing improper revenue, overstating company assets, and understating losses between 2010 and 2012. Kaleil Isaza Tuzman is also charged with conspiring with a hedge fund operator from 2008 to 2011 to inflate KIT's share price, which allowed him to conceal purchases of his own company's stock.

The SEC complaint says that the defendants knowingly or recklessly caused KIT Digital to divert nearly $8M into a slush fund, which resulted, among other things, in a phony reduction of receivables, and that they falsely recognized approximately $1.5M in revenue for a product that was never delivered or paid for. They are also charged with hiding the loss of $2M with an offshore money management firm and with concealing the fact that they were aggressively trading KIT Digital stock on the open market. In one instance, they took money they said was being used to buy a competitor and instead gave that money to their customers, who then paid it back to KIT Digital to make it appear as if it were revenue. Reading through the SEC complaint gives you an insight into just how arrogant these guys were and how many schemes they came up with, and it contains copies of emails where they openly discuss what they were doing behind the scenes.

There is so much damning evidence against them that even one of the outside accounting firms they used said there was "material weakness" in the company's internal controls over financial reporting, stating that "KIT digital and Subsidiaries has not maintained effective internal control over financial reporting." Two former employees told me that some of KIT's executives instructed them that they needed to be able to "massage the numbers each quarter" and have "more control over the numbers we show." And between KIT Digital reporting Q4 results and filing their 2011 10-K, $2.14M in cash disappeared that the company could not account for.

KIT Digital was a bomb waiting to go off. The signs were there. The company missed its own revenue guidance, took goodwill write-offs, missed or delayed multiple SEC filings, restructured its board and management multiple times, delayed putting out public news for days, fired two accounting firms, defaulted on a debt covenant, had cash disappear and acquired more than ten companies. This company screamed warning signs for a long time. No one should have ever put any trust in Kaleil Isaza Tuzman. I had a back-and-forth argument with him in the comments section of a blog post dating back to 2008, when I accused him of making false statements and telling lies. He responded by saying the company's "operating/financial results will tell the true story". They sure have.

Kaleil Isaza Tuzman was arrested Monday in Colombia and Robin Smyth was arrested Tuesday in Australia. Bloomberg lists more details on the dockets with the criminal case being U.S. v. Tuzman and the SEC case is Securities and Exchange Commission v. Tuzman, 15-cv-07057, U.S. District Court, Southern District of New York (Manhattan).

Download My Book For Free: The Business of Streaming and Digital Media

Seven years ago, one of my books, "The Business of Streaming and Digital Media," was published in the NAB Executive Technology Briefing Series, and I am now making the 264-page book available as a free PDF download.

While the terms "streaming media" and "digital media" are very generic these days and mean different things to different people, the business behind online video is the key. The kind of technology used to deliver audio and video, be it streaming, download, live or on-demand, really no longer matters. It's about using the right mix of multiple distribution technologies to reach the right audience with the right type of content, and that's what this book is all about: applying the right business models. You may re-purpose content from the book as you like, as long as you don't charge for it and you credit the source back to Dan Rayburn.

Thursday Webinar: Engineering an Enterprise Video Network

Thursday at 2pm ET, I'll be moderating a StreamingMedia.com webinar on the topic of "Engineering an Enterprise Video Network". Whether you're starting from scratch, building on an existing internal platform, or looking to an outside vendor, the challenges facing video engineers and IT professionals in the enterprise are more complicated than ever before in an increasingly bring-your-own-device environment. The roundtable will offer a highly technical deep dive into the nuts and bolts of an effective enterprise video network, from support for legacy formats and protocols to security and SharePoint integration, as well as the pros and cons of on-prem, cloud-based, and hybrid solutions. You can register and join this webinar for free.

OTT Services By Apple & Verizon Won’t Have A Big Impact On Akamai, Here’s The Numbers

Last week I saw two reports published by Wall Street analysts suggesting that Apple's still non-existent live OTT service will mean a lot of revenue for CDN provider Akamai. Of course, neither report gave out any numbers, provided any revenue estimates, or even showed the formula anyone can use to get a rough idea of what the business would be worth. These reports need to stop being so vague and start providing real numbers, considering it's easy to estimate the value of the business using very simple math, which I detail below.

When it comes to the topic of Apple's own CDN, I still see quite a few people who say that Apple is "reportedly building out its own CDN network," and they speak about it as if it's speculation. Apple is not "reportedly" building a CDN; they have already built one and continue to expand it. Don't take my word for it, simply do a traceroute and look at how Apple is delivering their content. Apple has a CDN, with massive scale, and that is not debatable. Suggesting otherwise, or implying that they "might" have one, is simply ignoring the facts.

In one report an analyst said, "we believe that Apple has used Akamai's CDN capabilities in the past." You believe it? Why don't you confirm it? Doing a traceroute shows Apple is still using Akamai for a portion of their content delivery. They also say that Apple "requires significant scale that its own CDN may not yet be able to handle," but don't say what that statement is based on. Stop using generic and vague phrases to make it sound like you know what you are talking about when you have no idea at all as to the size and scale of Apple's CDN. And while we are on the topic of OTT services, the launch of Verizon's new OTT service will not be a "big deal" for Akamai or provide any "upside" to the company as some suggest. Verizon's new service will run over their own CDN, EdgeCast, and not on Akamai.
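If you want to check for yourself, here's a minimal sketch in Python of the kind of lookup I'm talking about; the hostname below is just a placeholder, so substitute whatever host you actually see serving the content in question:

```python
import socket

# Placeholder hostname: substitute a host you actually see serving content,
# taken from your browser's network inspector or a traceroute.
hostname = "downloads.example-content-host.com"

# gethostbyname_ex returns (canonical_name, alias_list, ip_list). The canonical
# name in the CNAME chain often reveals the CDN operator behind the hostname
# (for example *.akamaiedge.net or *.edgecastcdn.net).
canonical, aliases, ips = socket.gethostbyname_ex(hostname)
print("canonical name:", canonical)
print("aliases:", aliases)
print("IP addresses:", ips)
```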

Whenever Apple rolls out their live OTT service, there are many ways they can handle the distribution of the video. They could use multiple third-party CDN providers like Akamai and Level 3 to deliver everything, or they could use them in combination with their own CDN, splitting traffic amongst multiple providers. Apple could also use their own CDN as primary, and then use third-party CDNs for overflow and extra capacity. Apple could also simply not be involved with the delivery at all and let the content owners in the OTT service deliver the streams themselves, via contracts they already have with third-party CDNs. There are a lot of ways the video delivery can be managed and based on Apple’s history, it’s a safe bet that multiple providers would be involved.

But for revenue calculations, let's just assume Apple uses Akamai to deliver 100% of the streams. The average HD stream delivered to a TV is about 3Mbps; to tablets and phones, it's about 700Kbps. We don't know how many users Apple would sign up for their service, so I'll assume 3M subs in the first month. Then we need to estimate how many hours per month each sub would watch, and for this example I'll use 30 hours a month. Then we have to decide what percentage of playback would be on the TV versus a mobile device. If we assume an even split of the traffic, it means one user, in one month, would consume about 24.95GB.

[For the 15 hours watched on a TV: 3Mbps x 3,600 seconds / 8 = 1.35GB per hour, x 15 hours = 20.25GB. For the 15 hours watched on mobile: 700Kbps x 3,600 seconds / 8 = 315MB per hour, x 15 hours = 4.7GB.]

If the traffic were priced on a per-GB-delivered basis, at a penny per GB, one customer would be worth about $0.25 a month to Akamai. Multiply that by 3M users and the contract would be worth roughly $750,000 a month. But Apple doesn't pay the CDNs based on a per-GB-delivered model; they pay per Mb/s (95/5). So if you assume that at any given time only 1M of the 3M users are watching a stream concurrently, with 50% at 3Mbps and 50% at 700Kbps, it means Apple would be pushing roughly 1.85Tb/s at peak. Even at $1 per Mb/s, which is a high price for Apple to pay, the value to Akamai would be under $2M a month. The numbers get bigger if you calculate them with more than 3M users a month and/or more than 30 hours a month consumed per viewer. But Apple won't get 3M subs in the first month and there is almost no chance that Apple would give Akamai 100% of the delivery.
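For anyone who wants to re-run this back-of-the-envelope math with their own assumptions, here's a small sketch of the model in Python. Every input (bitrates, hours, subscriber count, concurrency and prices) is one of this post's assumptions, not a figure disclosed by Apple or Akamai:

```python
# Back-of-the-envelope model for the OTT delivery numbers discussed above.
# All inputs are this post's assumptions, not figures from Apple or Akamai.
TV_BITRATE_MBPS = 3.0        # average HD stream to a TV
MOBILE_BITRATE_MBPS = 0.7    # average stream to tablets and phones
HOURS_PER_MONTH = 30         # viewing hours per subscriber
SUBSCRIBERS = 3_000_000      # assumed subs in month one
TV_SHARE = 0.5               # half the viewing on TVs, half on mobile

def gb_per_hour(mbps: float) -> float:
    """Convert a stream bitrate into GB consumed per viewing hour."""
    return mbps * 3600 / 8 / 1000  # megabits -> megabytes -> gigabytes

# Per-subscriber monthly consumption.
gb_per_sub = HOURS_PER_MONTH * (
    TV_SHARE * gb_per_hour(TV_BITRATE_MBPS)
    + (1 - TV_SHARE) * gb_per_hour(MOBILE_BITRATE_MBPS)
)
print(f"GB per sub per month: {gb_per_sub:.2f}")  # ~25GB, matching the math above

# Model 1: per-GB pricing at a penny per GB delivered.
print(f"Per-GB model: ${gb_per_sub * 0.01 * SUBSCRIBERS:,.0f}/month")  # ~$750,000

# Model 2: 95/5 Mb/s pricing, 1M concurrent viewers, $1 per Mb/s.
concurrent = 1_000_000
peak_mbps = concurrent * (TV_SHARE * TV_BITRATE_MBPS
                          + (1 - TV_SHARE) * MOBILE_BITRATE_MBPS)
print(f"Peak throughput: {peak_mbps / 1_000_000:.2f} Tb/s")  # ~1.85 Tb/s
print(f"95/5 model: ${peak_mbps * 1.0:,.0f}/month")          # ~$1.85M
```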

Part of the speculation about Akamai's involvement with Apple on a new OTT service is due to Akamai's recent earnings call, where the company said that their CAPEX spend would be higher than normal as they added more capacity for OTT video. But Akamai does not break out the percentage of money spent per service, and at no time did they say they were adding capacity for a "new" OTT service. No CDN, including Akamai, keeps a ton of extra capacity just sitting around unused. They operate their networks with a certain level of overflow capacity, but not more than necessary, so as not to incur costs for capacity they aren't using. Akamai could simply be looking at their customers' traffic growth and spending money now to put in place the capacity they need for the overall growth of the OTT business over the next 12 months.

Apple's OTT service will not be a big revenue driver for Akamai or any other CDN right out of the gate. If Apple signs up tens of millions of subscribers and gives a large percentage of the delivery to third-party CDNs like Akamai, then the numbers could start to add up. But Apple would be lucky to get a few million subs within a few quarters, with what is expected to be such a pared-down selection of content available. Based on my calculations, Apple is already spending between $75M-$100M with Akamai this year. So adding a few hundred thousand dollars more a month for a new OTT service, or even $1M more per month, really doesn't have a big impact on Akamai's overall numbers.

New Whitepaper: The Performance Challenges Unique To Mobile

Frost & Sullivan has just released a new white paper entitled "A Billion Reasons for Inconsistent Mobile Performance and How to Solve for Them", which is sponsored by and includes data from Twin Prime. Mobile is now the first screen of choice. It's the most personal device we have ever had and is rarely more than a few feet away from us at any given time. Consequently, the expectation of instant gratification and the strength of user intent are higher on smartphones than on any other computing device. However, the mobile experience is often poor, and can be as bad as content not loading at all.

The key question about mobile performance is why does it continue to suffer? What is it that makes mobile so uniquely different from the desktop/PC world? Mobile networks face many challenges that make it hard to deliver a consistent and rich user experience. These challenges are unique to mobile and include:

  • Last-mile latency inordinately affects mobile performance
  • Volatility in the wireless last mile trips transport protocols
  • Multitude of last-mile networking technologies hinder performance
  • Diversity in last mile creates infinite performance scenarios
  • Mobile content severely constrains the efficacy of current acceleration solutions
  • Current acceleration strategies were built for Web, not mobile apps
  • Server-based solutions are insufficient and suboptimal
  • It’s not about a new protocol

In every benchmarking study, mobile performance is at least three times slower than the desktop Web. For example, Keynote Systems reports that today, on average, an m-commerce app/site loads in about 8 seconds on our mobile devices. This compares poorly with 2-second load times for desktop websites. Read more about mobile performance challenges by downloading the white paper for free from Twin Prime's website.

Cisco Announces New Royalty Free Video Codec Project To Rival HEVC

With two HEVC patent pools now in the market and a third one being formed, the adoption of HEVC as the successor to H.264 is going to start getting very expensive. With that problem in mind, last week Cisco announced a new video codec project they call Thor. Cisco's goal is to work with others to provide an alternative to HEVC, with the same or better quality, with no royalty required. While Thor is in the very early stages and is not an alternative to HEVC today, Cisco's vision is to rally others in the industry to contribute to the project and help make higher-quality video cheaper to deploy, without the need for expensive licensing. As Cisco points out, the total cost to license H.265 from the two pools in the market today (MPEG LA and HEVC Advance) is up to sixteen times higher than for H.264, per unit. In addition, Sony, Panasonic, Qualcomm, Nokia and Broadcom all have extensive patents around HEVC and aren't in either pool. Rumor has it that some of them are working to form what would be a third patent pool in the market to license their HEVC technology.

Cisco has hired patent lawyers and consultants familiar with video codecs and created a development process which they say allows them to work through the long list of patents in the space, and to continually evolve their codec to work around or avoid other HEVC-related patents. Two weeks ago Cisco open-sourced the code and contributed it to the Internet Engineering Task Force (IETF), which already has a standards activity to develop a next-generation royalty-free video codec in its NetVC workgroup. Mozilla has been active in this group and has been working on a technology it calls Daala, which it wants to be the successor to HEVC.

Cisco is no stranger to video codecs. The company faced a unique challenge with H.264, as they and other players in the video conferencing industry wanted H.264 included in the WebRTC standards in order to facilitate interoperability with their installed base of conferencing equipment. However, Mozilla simply couldn't ship H.264, due both to the licensing fees and to incompatibilities between MPEG LA's terms and Firefox's open source nature. To fix this, Cisco offered to help with an interesting solution.

Cisco open sourced its H.264 implementation (www.openh264.org), but more importantly, it also agreed to make a binary build of its implementation available as a module. Mozilla produced a version of Firefox that, upon installation, fetches this module from Cisco and links it in. In this way, Cisco is the distributor of the module and carries responsibility for the license fee. Cisco agreed to foot the bill and paid the full cap to MPEG LA. This eliminated the need to track downloads of the module, since tracking downloads would have violated the privacy considerations that Firefox and its community hold. So in the end, thanks to Cisco, everyone won. Cisco and the videoconferencing community got H.264 into Firefox (and later into the WebRTC standards), and Mozilla got H.264 support without needing to actually ship it in Firefox.

Thor is a project; it's not an actual video codec yet, and Cisco and others have a lot of work to do, probably years, before it's a viable alternative to HEVC. Cisco wants to start getting the word out about project Thor in the hope that others will contribute intellectual property and that technical experts with video codec knowledge will get involved. Hopefully Google takes note of this and contributes VP9 into the NetVC workgroup, which would be great for everyone. The video codec standards development process benefits heavily from having multiple contributions and then evolving them to take the best of each for the best overall result. As Cisco points out, the Opus audio codec got to where it is today in exactly the same way: by combining two quite different codecs, Skype's SILK and Xiph.Org's CELT.

It should be noted that both Thor and Daala are far from being alternatives to HEVC today, but their existence shows that when faced with greedy patent pools, companies like Google, Cisco, Mozilla and others will seek out or create alternatives. HEVC patent pools should take note: if they push too hard and get too greedy, they could be outmaneuvered.