HEVC (H.265) Adoption Is At Least Five Years Away For Consumer Content Services

High Efficiency Video Coding (HEVC), also referred to as H.265, is a video standard being developed jointly by ISO/IEC MPEG and ITU-T VCEG. HEVC planning began back in 2004, shortly after H.264 was finalized, and the topic has been getting a lot of exposure in the industry over the past few months. Many questions remain about HEVC, including how quickly it can be implemented into the current video ecosystem and when content owners will adopt it.

The Digital Media group at Frost & Sullivan, which I am part of as a Principal Analyst, has been covering HEVC extensively as of late. Our lead analyst on the transcoding side, Avni Rambhia, has published three reports that include details on HEVC: the Global Broadcast and DTT Video Encoders Market, Global Pay TV Video Encoders Market and Global Media and Entertainment Video Transcoding Market. Based on what we have seen in the market and the data we have collected from suppliers, here’s our take on why HEVC adoption for consumer services is at least five years away.

The current HEVC draft was put out in July 2012 and the standard is expected to be ratified shortly. MPEG LA, which coordinates patent licensing for many MPEG technologies, put out a call for applicable patents in July of last year. Several companies already have early demos available, including Mitsubishi, NHK, CyberLink, Broadcom, ATEME, Ericsson and Elemental Technologies, among others. Integration with chipsets is unlikely to begin until the standard is finalized, and even after that, production probably won’t be initiated until a critical mass of demand is achieved. However, although decoding will eventually reach the point of chipset integration, software-based playback is not unrealistic given growing smartphone horsepower.

Video is a push industry: the community is constantly pushing the boundaries on resolution, compression efficiency and user experience to drive the media and entertainment ecosystem forward at ever-increasing speed. Two apparently disconnected use cases are the focus of market innovation this year: the explosion of video consumption on portable and personal devices, and rising investment in Ultra HD video at resolutions of 4K and even 8K. Each has its own challenges and technology requirements, but both share one crucial stumbling block – they cannot become global phenomena without a significant leap forward in video compression efficiency. MPEG-4 AVC (H.264) is the de facto video compression standard today, and has played a key role in recent years in enabling Internet video, OTT services, IPTV and HD across all Pay TV services.

However, the technology has matured to the point where any advances are incremental, and prices are correspondingly under tremendous downward pressure. MPEG-4 AVC is also not well positioned to enable Ultra HD transmissions in an economical fashion. Furthermore, with video accounting for as much as 90% of total bandwidth usage in North America during primetime – and that with less than a quarter of the Pay TV subscriber population watching on-demand OTT video – it is clear that bandwidth-hungry video services cannot be served by AVC in the long term. This is especially true because growth in HD-capable devices like tablets, and the rising trend of watching OTT content on HD and Ultra HD connected TVs, are further exacerbating an already challenging bandwidth situation.

HEVC, or H.265, has been heralded as the solution to this problem. Offering up to 50% compression efficiency improvement over state-of-the-art AVC codecs, H.265 is poised to disrupt the video ecosystem – both for M&E and for enterprise applications – yet again. Vendor excitement around the technology is high, with a number of HEVC-enabled products announced at CES 2013 and energetic R&D efforts underway to develop encoder and decoder cores (both software and hardware) now that the standard is essentially finalized and development of a patent licensing program is underway. Amidst the fervent hype, it is easy to believe that HEVC is a here-and-now technology whose adoption curve will soar upwards in 2013 and 2014. But that’s not reality.
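To see what that 50% figure means in practice, here’s a rough back-of-envelope sketch. The 4 Mbps AVC baseline for an HD stream and the one-million-viewer audience are assumed figures for illustration only, not numbers from any specific service:

```python
# Back-of-envelope: aggregate bandwidth for an HD audience under AVC vs. HEVC,
# assuming HEVC delivers the full 50% bitrate reduction at equal quality.
avc_bitrate_mbps = 4.0                        # assumed AVC bitrate for an HD stream
hevc_bitrate_mbps = avc_bitrate_mbps * 0.5    # 50% efficiency improvement

viewers = 1_000_000                           # assumed simultaneous audience
avc_total_gbps = avc_bitrate_mbps * viewers / 1000
hevc_total_gbps = hevc_bitrate_mbps * viewers / 1000

print(f"AVC:  {avc_total_gbps:,.0f} Gbps for {viewers:,} viewers")
print(f"HEVC: {hevc_total_gbps:,.0f} Gbps for {viewers:,} viewers")
```

Halving the per-stream bitrate halves the aggregate delivery cost (or doubles the audience a given pipe can serve), which is exactly why the pressure to adopt is so strong despite the ecosystem hurdles described below.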

One can draw a parallel with the adoption curve of MPEG-4 AVC as it gradually encroached on the supremacy of MPEG-2. We believe that while token adoptions – such as incorporation into DVB standards for terrestrial broadcasting – will occur in the short term, and a few channels may also be launched by 2015, a critical mass of adoption will not begin until at least 2016. History bears this out: even a decade after the launch of AVC, MPEG-2 remains a formidable force in Pay TV (particularly cable), owing to the massive footprint of legacy equipment, such as set-top boxes and transmission infrastructure, designed to work with MPEG-2 video.

Cost also remains an issue – many Pay TV operators in regions like Africa, Asia and Latin America are choosing MPEG-2 rather than AVC because of the significantly lower cost of consumer premises equipment (CPE) and video encoders. Considering the massive wave of investment in AVC equipment that we have seen in the last two years, we expect at least five more years of equipment life before economically stressed broadcasters and service providers will consider systemic upgrades. Any video technology touches many components as it travels from glass to glass, such as cameras, NLE systems, video indexing systems, statistical multiplexers, satellite transponders, head-ends and (perhaps most importantly) CPEs.

Similarly on the OTT side, transcoders, file formats, streaming protocols, streaming servers, content protection systems, network optimization platforms and end devices all need to support HEVC before an end to end solution becomes broadly viable. In their continual endeavor to fight commoditization and drive demand through continued technological disruption, vendors of video technology and consumer electronics devices alike are engaged in fast and furious product development around HEVC, with many announcements made already and several more significant milestones expected throughout 2013.

Silicon vendors are also looking towards the technology, but at a somewhat more sedate pace, both to maintain profitability levels on existing AVC chipsets and given the considerable challenges of achieving real-time, power-efficient encoding and decoding of HEVC content (particularly at higher resolutions). But any large-scale migration from AVC to HEVC will take time, much as the transition from MPEG-2 to AVC is still very much an ongoing process.

One can also draw parallels between 3D and HEVC, as technologies that were aggressively marketed before a robust content pipeline was in place. Video technology without compelling content is like a painting without proper illumination – it’s very hard for a viewer to see the value. 3D has by wide consensus failed to realize its expected potential, partly because too much was expected too soon, but also because the content community was not able to find a viable business model to justify the expense and disruption, and users failed to see value in the technology – even as television set prices dropped dramatically while vendors struggled to establish demand at expected levels.

From a service provider perspective, the industry has only just overcome the alphabet soup of fragmented video formats to converge around AVC. Interoperability standards like UltraViolet and MPEG-DASH, streaming technologies like HLS and Smooth Streaming, and pretty much every digital terrestrial transmission and cable standard today all embrace AVC. AVC has been the technology to break the walled garden mentality that pervaded online video for so long, with vendors like Adobe, Apple, Google, Microsoft, Rovi and Sorenson Media all embracing AVC in the interest of optimizing OTT video, despite the fact that each owns its own proprietary video compression technology.

Many of these vendors – particularly those who provide core codec components that power other vendors – are in the process of developing HEVC software cores to power the inevitable innovation that the new technology will catalyze. At the same time, they acknowledge that their service provider customers are loath to disrupt a delivery ecosystem that has only just settled down around converged solutions focused more on delivery and quality of experience than on just connecting the dots.

In terms of service rollouts, Frost & Sullivan expects closed-loop solutions such as enterprise video conferencing (from vendors like Cisco Systems and Vidyo) and Ultra HD broadcast services in the Far East (powered by vendors like CyberLink, Rovi and Samsung) to be among the earliest HEVC-based services rolled out. Video on demand services for low-bandwidth environments, such as HD video delivery over cellular networks, are also likely to be early adopters of HEVC, due to the significant potential for operating expense savings and growing demand for higher quality of experience over increasingly stressed mobile networks. These are expected to rely on software decoders in the short term, with more power-efficient hardware decoders or video co-processors expected to become available in the 2015 timeframe.

Satellite DTH service providers are also expected to leverage HEVC to roll out Ultra HD channels in the 2014-2015 timeframe, although it is hard to predict the level of uptake they will see in the early years of the technology. Similarly, some pilot DTT channels are expected to roll out in the 2015 timeframe, but the level of uptake remains to be seen. High-end encoding vendors such as ATEME, Ericsson, Fujitsu and NTT, and high-end CPE vendors such as Technicolor, are all beginning to add HEVC to their product portfolios. All things considered, while certain applications will embrace HEVC much sooner than the norm and HEVC encoding and decoding cores should mature by 2014, we expect it will be around 2017 before a comprehensive ecosystem of first-generation HEVC-enabled products comes to market. Furthermore, we expect AVC to remain in widespread use even in 2018, although it will definitely be considered a commodity technology at that point – much as MPEG-2 is today.

The key takeaway from all of this is that HEVC won’t be adopted as quickly as some may think, and those who bet big on HEVC too early are going to lose that bet. We’ve done a lot of work at Frost & Sullivan on the topic of HEVC, and in addition to the three reports we’ve already published, we’ve done a lot of private research on HEVC for clients. If you’re looking to get more details on HEVC technology, get copies of the reports I mentioned, or need any custom research on the HEVC market, please feel free to reach out to me for more details.

Sponsored by

Netflix To Keynote Content Delivery Summit, May 20th NYC

I’m very pleased to announce that Ken Florance, VP of Content Delivery at Netflix, will be the opening keynote speaker at our annual Content Delivery Summit. Taking place Monday May 20th, at the Hilton hotel in NYC, the summit has become the go-to event to learn about the technology behind web, acceleration and media delivery infrastructure. Telcos, carriers, ISPs, MSOs, major content owners and CDNs all gather to present case studies on real-world deployments, see demos of new technology platforms and discuss business model considerations for web acceleration and media delivery.

During Netflix’s keynote, Ken will highlight the company’s Open Connect Content Delivery Network and provide insight into how it works and what the future holds for delivering video via the last mile. In addition, the call for speakers for the Content Delivery Summit is still open for the next three weeks. So get your speaking submission in ASAP if you want to be considered. Some of the topics the event will cover include:

  • Over-The-Top Video Delivery
  • Dynamic Site Acceleration
  • Transparent Caching
  • Application Acceleration
  • CDN Economics & Business Models
  • Managed/Licensed CDN
  • Analytics & Cloud Intelligence
  • Front-End Optimization
  • The Video Ecosystem
  • Telco CDN Deployments
  • Mobile Content Acceleration
  • Optimizing Web Applications
  • The Business of CDN Federation
  • CDN Pricing & Volume Data

Registration for the summit is now open, and if you register early and use the special promo code DR13, you can get a ticket and attend the event for only $395. So don’t lose out on the discount – register early. #cdnsummit

ASUS and Netgear Announce New Streaming Boxes, $130 & $150 Price Points

There are a lot of options in the market for consumers looking to spend $100 or less to get a dedicated streaming box, including devices from Apple, Roku, Boxee, Western Digital, Netgear, Sony, Vizio and D-Link. But that’s not stopping ASUS from jumping into the market with their announcement of a new device called the Qube, which will be more expensive than the other devices, retailing for $150 when it launches in March.

The box comes with the Google TV platform, support for Netflix and Amazon Instant Video at launch, and includes 50GB of storage space via the ASUS cloud platform. It has two USB ports, HDMI in and out, ethernet, IR out and 4GB of flash storage. The Qube’s interface is a rotating on-screen cube, which from the looks of it appears to be a really bad UI. ASUS wanted to be creative by making the interface a cube, just like the shape of the box, but it appears they picked form over function. I haven’t gotten hands-on with the box just yet, so maybe the UI works better than I think, but from what I have seen in this video demo, I don’t think users will find the interface very practical.

In addition to ASUS’s new device, Netgear has also announced a new box called the NeoTV PRIME. The box is similar to their NeoTV line, which includes the standard, MAX and PRO models, but the PRIME comes with the Google TV platform. It’s also DLNA compatible, supports playback from external USB drives and has a two-sided remote control with built-in keyboard. While Netgear said the device is currently available for sale and retails for $130, Amazon shows the device being released on February 22nd.

With ASUS and Netgear’s new boxes and the recently announced box from Hisense, here’s my updated list of streaming boxes:

  • Apple TV
  • ASUS Qube with Google TV (coming March 2013)
  • Boxee TV
  • D-Link MovieNite Plus
  • Hisense Pulse with Google TV
  • Microsoft Xbox 360
  • Netgear NeoTV (3 models)
  • Netgear NeoTV PRIME with Google TV (coming Feb 2013)
  • Nintendo Wii U
  • RCA Streaming Media Player DSB772E
  • Roku (4 models + Roku Streaming Stick)
  • Sony PlayStation 3
  • Sony SMP-N200
  • Sony NSZ-GS7 Internet Player with Google TV
  • Vizio Co-Star with Google TV
  • Western Digital WD TV Play (coming Feb 2013)
  • Western Digital WD TV Live Hub

You can check out my comparison chart of streaming devices at www.streamingmediadevices.com. The chart is currently undergoing an update and I’ll be posting the latest revision soon.

Insiders Detail Accounting Irregularities At KIT Digital, Rumors Of A Possible SEC Fraud Investigation

[Updated Dec. 27 2017: Former KIT Digital CEO found guilty of manipulating shares]

[Updated Sept. 8 2015: Former KIT Digital CEO and CFO Arrested: Charged With Accounting Fraud]

Industry vendor KIT digital has had a lot of problems of late, but it looks as if it is about to get worse. Over the past few weeks, multiple sources have detailed for me the lack of controls KIT had in place to properly account for their financials. Some suggested to me that KIT went so far as to make up revenue that didn’t exist, with one person telling me they thought that up to $80M in reported revenue wasn’t real. Other sources tell me that they believe enough fraud will be uncovered that the Federal government is likely to take up an investigation and, presumably, prosecutions of some of KIT’s executives.

I think it is important for me to point out that I don’t have access to KIT’s books and can’t verify on paper the details I have been given, but some of the information comes from employees who were inside KIT’s finance department at the time and had direct knowledge of what was taking place. In addition, several law firms have filed securities class action lawsuits that appear to coincide with the information I have been given and they also have some of the specific details I have been told, from their sources as well.

While the SEC did not return my request for more details on what they may be looking into with regards to KIT, the company is no stranger to the SEC. Their former CEO is being investigated for insider trades, the company delayed the release of its 10-Q, and in November, KIT announced they would have to restate earnings for the past three years. The company told shareholders to “no longer rely upon the Company’s previously issued financial statements,” stating that the irregularities stemmed from “revenue related to certain perpetual software license agreements entered into by the prior management team in 2010 and 2011.” While that’s a fancy way to say that KIT didn’t account for revenue properly, insiders tell me the simpler explanation is that KIT simply made up revenue that did not exist and counted revenue from contracts that were cancelled or expired.

It’s important to note that on March 30th 2012, in a regulatory filing, KIT disclosed that its accounting firm at the time, Grant Thornton, had noted a “material weakness” in the company’s internal controls over financial reporting, saying that “KIT digital and Subsidiaries has not maintained effective internal control over financial reporting as of December 31, 2011.” And between KIT reporting Q4 results and filing their 2011 10-K, $2.14M in cash disappeared that the company could not account for.

While these accounting problems could simply be attributed to incompetence and negligence, insiders tell me that some of it was deliberate, with the intent to change KIT’s numbers, which would make it fraud. One of the most revealing details is that some of KIT’s senior management purposely kept employees from installing company-wide business accounting software, in particular a solution from NetSuite. Two former employees told me that some of KIT’s executives instructed them that they needed to be able to “massage the numbers each quarter” and have “more control over the numbers we show”. Instead of using a company-wide program that would manage KIT’s financials, each office would deliver Excel documents to KIT’s headquarters, which would then have to manually combine the numbers from at least ten different spreadsheets.

Multiple people also told me of KIT making payments, from what some employees called offshore accounts, to people that KIT management told them to send money to, without any kind of invoicing or tracking of what was being paid. Others told me that some of those who got these transfers were actual KIT employees, which the company described as “commission” checks, even though the revenue they were being paid for came from customers who had not paid their bills or from contracts that had long expired. Whether or not these payments were illegal I don’t know, but they clearly show a pattern of financial abuse at KIT and a lack of control over accounting. And it raises a lot of questions when cash has disappeared and wire transfers are being done, from offshore accounts, with no record keeping.

While KIT recently said that a large part of the confusion around KIT’s financials stems from how they accounted for revenue generated from professional services versus SaaS platform license fees, many say KIT constantly bent the rules to try to get away with as much as possible. Others also told me that one of the reasons KIT fired its original accounting firm was that the firm would not agree with how KIT was recognizing revenue. Some also told me that when they questioned accounting irregularities they believed KIT’s accounting firm would not approve, select KIT executives told them they had a good working relationship with the outside firm and that it would let them do what they wanted.

If you think about all the red flags at KIT, it’s really amazing just how many there have been, yet a lot of investors still wanted to believe the story, which is their own fault. KIT has missed its own revenue guidance, taken goodwill write-offs, missed or delayed multiple SEC filings, restructured its board and management multiple times, delayed putting out public news for days, fired two accounting firms, defaulted on a debt covenant, had cash disappear and acquired more than ten companies. This company has been screaming warnings for a long time.

None of the financial problems with KIT’s business really come as any surprise to me, as every time I would look at KIT’s technology it didn’t work as advertised. I would get conflicting information from the company, and even when they would walk me through demos personally, stuff would not work. Talking to some of KIT’s customers, they would tell me they were never moved to any KIT platform and were still running on legacy systems from companies KIT acquired, like Multicast Media and theFeedroom, even though KIT was telling everyone that all these systems had been integrated into one platform. Remember the VX-one platform? You don’t hear about that anymore.

On one of my calls with various members of KIT’s management and product team, KIT’s CEO Kaleil Isaza Tuzman gave me names of customers who were using their platform, only to be corrected by another employee on the phone who said that wasn’t accurate. This type of behavior was common at KIT. The company excelled in trying to get away with as much as possible, until someone noticed and questioned them about it. KIT used logos of customers who they didn’t have contracts with on the customer page of their website, made up their own terminology to describe things and skated around simple, direct questions.

Whenever I questioned KIT’s CEO about things that didn’t make sense, I would get responses that were full of marketing language, or be told that it was “complicated” and that I would not be able to understand it since I was not an accountant. Once I was also told that since KIT was an international company and I was based in the U.S., I could not understand how things were done overseas. There was nothing transparent about KIT or the way they did business, and the example Kaleil Isaza Tuzman set was something that others followed. They had a pattern of denying, deflecting and trying to confuse the questioner with a drawn-out, generic, high-level marketing response, something that should have raised a lot of red flags.

While many don’t want to talk about it openly, KIT has always been a black eye on this industry and a bad representation of it. The sooner they disappear, the better. With their cash reserves running out and employees being let go – 300 employees four months ago and 90 more last month – KIT’s time is about up. No one will put more money into the company and it will go under; it’s just a matter of when and how. Some have suggested that KIT can sell off their technology in pieces or sell the company outright, but KIT’s management has been saying this would happen for more than two years with no results. Multiple firms I have spoken with who buy distressed companies and then sell off the pieces have told me they have looked at KIT and didn’t find anything worth buying.

With most of this well-known and these details easy to come by, I don’t know why anyone ever wanted to invest in KIT. When you put money into a company, you’re not really putting faith in the company itself, but rather in the executives who are running it. The fact KIT raised so much money shows that there are still plenty of people out there who can be wowed by a presentation and fooled by people who present well and stir people’s emotions, and greed, for quick money. With KIT’s stock now practically worthless, trading at $0.45 when less than a year ago it was at $12.65 a share, a lot of people have lost a lot of money. But in this instance, no one should feel sorry for any investor, as the warning signs have always been there. And as for the Wall Street firms that put big money into KIT and gave them lots of financing, they should not be trusted either, as they didn’t do the type of fact checking and intelligence gathering they should have.

When KIT’s stock was high, I would get lots of emails from investors telling me how stupid I was not to like KIT, why it was worth more than Brightcove’s market cap and many were angry at me for not wanting to believe the hype. Some even started to get hostile and threatening with me when I first started pointing out problems with KIT, back in 2008, in a back-and-forth exchange I had with Kaleil Isaza Tuzman in the comments section of this Gigaom post. While some investors probably think I like what happened to KIT, I don’t. Nothing good comes from having a company in our industry that makes others look bad, that makes people lose their money or that doesn’t deliver what was promised to customers. But the problem is, those investors didn’t do their homework. They didn’t talk to KIT customers, didn’t get demos of KIT’s products, didn’t ask the right questions or check the facts. The warning signs with KIT have been there since 2008 and even before that when it was ROO Media. If anything good comes from this I hope it’s that investors will learn not to be so blindly trusting of people and do better due diligence.

Whether all of these problems result in KIT’s numbers being false and being a fraud case is too early to know. KIT may not announce the re-statement of financials for many more months or even a year from now, so the story on KIT will have a few more chapters. But whatever the results, the company is dead, it’s running out of cash and it won’t be around much longer.

I want to wrap up this piece by making a very important statement. This post is not a reflection of everyone who has ever worked at KIT. The company has had a lot of employees and executives over the years, some of them my friends, and I believe that the vast majority of them had the desire to work for a professional company and were not involved in a lot of the negative actions that have taken place. It’s a shame that some executives, board members and investors ruined their opportunity for long-term job success and made a bad name for the company, but it should not be a reflection of the employees who were not involved. Those are the people who have lost out in what has happened and, hopefully, they get opportunities with professional vendors who won’t hold their time at KIT against them.

If I can assist anyone who is looking for a new job, please reach out to me and I will see if I can help. Also, if any current or former employees want to talk to me about what went on at KIT, I’d like to hear from you. As with this post, none of my sources will be named and I’m also not naming the executives they worked with.

Disclaimer: I have never bought, sold or traded a single share of stock, in any public company. While I am willing to do media interviews about this story, I will not disclose any sources or detail by name which KIT executives were mentioned to me by my sources.

Webinar Jan 24th: Best Practices for Advanced Encoding & Transcoding

Thursday Jan. 24th, at 2pm ET I’ll be moderating another StreamingMedia.com webinar on one of our most popular topics, “Best Practices for Advanced Encoding & Transcoding.” Take your encoding and transcoding knowledge to the next level with this instructional webinar that will highlight the latest advanced encoding techniques. Get details on delivering higher video quality at lower bitrates and how to prepare high-quality video for delivery to any screen using multiple formats.

We’ll have a full Q&A session in which your encoding questions will be answered and as always, all StreamingMedia.com webinars are free. So register here and save the date for this instructional webinar.

Video CDN Market Growth Slowing, But Should Reach $1B By 2015

In addition to my role at StreamingMedia.com, I’ve also been a Principal Analyst at Frost & Sullivan in their digital media group, for the past five years. Part of my job is to track the growth of various products and technologies in the video ecosystem and provide research and analysis on market drivers and restraints. One of the reports I author each year is about the size of the video CDN market. We’ve recently published our latest report and findings and expect the video portion of the CDN market to reach about $1B by the end of 2015. (contact me if you want to learn about how to get a copy of the report)

Some might notice that our 2010 CDN report suggested the video CDN market would grow to $1B by the end of this year, but as we saw and heard from CDN vendors in 2011 and 2012, traffic on their networks didn’t grow as fast as they were expecting. While video traffic is one of the fastest growing segments of the overall CDN industry, it is also the most volatile when it comes to pricing and volume trends, and has very low margins. While we originally thought the video CDN market could grow at a compound annual growth rate (CAGR) of 28% through 2015, the latest data shows the market growing at 15.2%. The good news is that the market is still growing, but the rate of growth has slowed.
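For readers who want to see how a 15.2% CAGR still gets the market to roughly $1B by the end of 2015, here’s a quick compound-growth sketch. The ~$655M 2012 base is an assumed figure chosen for illustration, not a number from the report:

```python
# Project the video CDN market forward at the revised 15.2% CAGR.
# The 2012 base of ~$655M is an assumption for illustration only.
base_2012_musd = 655.0
cagr = 0.152

market = base_2012_musd
for year in range(2013, 2016):
    market *= 1 + cagr  # compound one year of growth
    print(f"{year}: ${market:,.0f}M")
# At 15.2% the market crosses the $1B mark in 2015; at the originally
# forecast 28% CAGR it would have gotten there roughly a year earlier.
```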

Our market sizing numbers do not include telco CDN revenue or revenue from licensed CDN services, which are covered in a separate report we are currently working on. There is still money to be made in the video CDN business, especially at scale, and 2012 saw pricing remain very stable, with my expectation that 2013 will be the same, with no major declines in video CDN prices. Add in the new CDN business models we are seeing from the likes of Netflix, along with what service providers are deploying, and there is plenty of growth still to come from today’s video delivery infrastructure market. (For a complete list of all the CDNs in the market, see www.cdnlist.com)

Note: We are currently working on an update to our Transparent Caching report and will be putting out a new report this year on the size of the Dynamic Site Acceleration (DSA) market. Contact me if you want to be added to the list to be notified when they are available.

Streaming Video Can’t Scale At Cable TV Quality, Will Never Replace Traditional TV Distribution

Almost five years ago to the day, the NYT published an article proclaiming that “TV is becoming obsolete,” and was joined by plenty of other media outlets claiming that within a few short years, streaming video could displace the traditional means of video distribution. While some in the industry still want to set false expectations that streaming media technology is somehow going to replace the primary means of delivering video to the living room, the fact remains that five years later, cable TV is here to stay and is still the primary way to get video into the home.

While reading a post on another site on a similar topic, I saw a reader comment saying, “streaming is still in its relative infancy”, a false statement I hear often. Some seem to want to use the argument that streaming media technology is new and will improve as a reason why it will one day replace cable TV distribution. But in reality, 2013 marks the 19th year since streaming media was first used on the Internet. And while the industry has made advances and will continue to do so, the improvements in the technology over the last few years have been small. Compression technology gets better and protocols start to get standardized, but year-over-year, we aren’t seeing huge leaps and bounds. That’s because today, the primary building blocks of all streaming media services – storage, encoding and delivery – can’t get that much better. Infrastructure vendors are spending most of their time making these services more reliable and scalable, and others are using these fundamental technologies to build out ecosystems, protect content and monetize it. But it can only go so far.

Streaming media has limitations, and that's not a bad thing; you simply have to apply it to the set of applications it is best suited for, as you would any other technology. But many are hell-bent on the idea that one technology has to replace another, when in fact, most times, one complements the other. Streaming media is never going to be as reliable, scalable or high-quality as cable TV, even in the future. Those who suggest otherwise do so because they don't understand how streaming works and the limitations of the technology. Think of all the companies over the past 10 years who said their technology, be it better compression, P2P delivery, etc., could solve the issues with delivering high-quality video to "TV-sized" audiences. They couldn't. And even with companies like Netflix taking the smart approach of placing caches inside ISP networks for free, there is a limit to what can be achieved.

In 2009, Akamai, the largest CDN on the Internet, peaked at 7.7M simultaneous video streams during the Obama inauguration, for all of their customers combined. There was so much traffic on their network that they had to cap customers, and they made it clear that there is no such thing as "unlimited" capacity on the Internet. Last year, YouTube claimed it did the largest live event on the web, with 8M simultaneous streams. So in the three years between those events, the largest live-event numbers didn't grow much. Of course, both Akamai's and YouTube's networks have since grown and their capacity has improved, but the 8M number puts things into perspective.

CBS Sports said the NFL playoff game (Broncos vs. Ravens) from this past weekend had 35.3M viewers and peaked at over 40M. Even if only half of those were simultaneous viewers, that would be nearly 18M streams, for one show. Multiply that by the number of shows airing at the same time and the numbers are huge. The Internet can't, and won't be able to, handle that kind of volume, now or in the future, at cable TV quality. People love to argue the point, but the numbers, facts and data prove otherwise. When you turn on the TV, it works, and you know what type of experience you will get from an HD channel. On the web, you don't even know what HD really means, as some call video HD when, compared to broadcast standards, it isn't.
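To see just how huge, the back-of-envelope math above can be carried one step further into aggregate bandwidth. The per-stream bitrate below is my assumption for "cable TV quality" HD, not a figure from the article; the sketch only illustrates the order of magnitude:

```python
# Back-of-envelope: aggregate bandwidth to stream one playoff-sized
# TV audience at broadcast-like HD quality.
viewers = 35_300_000               # CBS-reported viewers for the game
simultaneous = viewers // 2        # assume only half watch at once (~18M)
hd_bitrate_mbps = 5                # assumed bitrate for HD approaching broadcast quality

total_mbps = simultaneous * hd_bitrate_mbps
total_tbps = total_mbps / 1_000_000

print(f"{simultaneous:,} streams at {hd_bitrate_mbps} Mbps each "
      f"= {total_tbps:.2f} Tbps for a single show")
```

A single show at that audience size works out to roughly 88 Tbps, before multiplying by the number of channels airing concurrently, which is why peak live-event records measured in single-digit millions of streams put the scaling problem into perspective.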

With streaming, there is no guarantee what the experience will be. Content services like Netflix and Amazon, and CDNs like Akamai, are working hard to give consumers the best experience they can, but they are working with technology that has limitations when it comes to scaling. Sometimes you can get good-quality video with no problems. Many times you can't, or you don't have a consistent experience, or there is some other computer/browser/player/app issue. There are a lot of moving pieces in the entire video ecosystem, as opposed to cable TV, which has very few.

It seems many outsiders want to judge the "success" of the streaming media industry and its technology on how quickly it can "displace" cable TV as a broadcast distribution medium. What a false idea. The fact is, the technology is already proven, and for the right applications, we've already seen plenty of success stories. With all streaming services combined, the industry is worth billions of dollars, content owners are starting to get paid and business models evolve each year. That's the definition of success. For those in the industry who understand this: don't let those who set false expectations of streaming replacing cable TV distract you from staying focused and applying the technology where it is best suited.