Report: 29% Of Viewers Abandon Videos When They Encounter Quality Problems

A new consumer survey report from Conviva titled “How Consumers Judge Their Viewing Experience: The Business Implications of Sub-Par OTT Service” highlights the importance of video experience now that OTT has become a part of everyday life. The report, which is based on the responses of 750 viewers ages 26-34, shows that 75% will abandon a sub-par video experience in 4 minutes or less. The worse news for content owners is that one in three viewers is willing to jump between content owners if they’re not getting what they came for. The data also shows that 29% of users will close the video and try a different app, platform or website when they encounter a poor video experience.

Somewhat surprisingly, perhaps, the report shows that these viewers are more sophisticated in their judgments than we might be inclined to give them credit for. They clearly understand that a good experience isn’t just about avoiding buffering: it also includes a quick start to the content (what we call frustration time) and a good-looking picture. For cord-cutters especially – and this particular survey sample looks at a generation that was almost certainly watching on their parents’ cable subscription very recently – this is important, because they’re comparing what they watch both to other Internet services and to TV itself.

The survey says that nearly half of viewers lose brand loyalty as soon as they get a bad experience. Whether that really plays out in practice is debatable, but even if it’s only half as bad as that, it’s a wake-up call for everyone: the quality of your video experience had better compare favorably to the competition, or trouble could quickly ensue.

If there’s one big takeaway, it’s that holding the interest of today’s OTT viewers is hard, and getting harder. The proliferation of devices and platforms to be supported continues to accelerate, and the variation in quality between ISPs and CDNs is undeniable – and may be indecipherable, as it shifts and changes on practically a minute-to-minute basis. The survey shows respondents placing blame across the whole ecosystem when they get a lousy experience, but that’s small comfort to a provider whose expensive content is being abandoned.

Note: Conviva also publishes a very detailed report that measures the technical problems with streaming video – re-buffering, slow video startup and low resolution – and their impact on the viewer experience.


Pandora CIO To Keynote Infrastructure Event: Learn How They Have Scaled 50X In Eight Years

I’m pleased to announce that Steve Ginsberg, CIO at Pandora, has been added to the speaker lineup at our Content Delivery Summit, taking place May 11th in NYC. Partnering with Product Engineering, the Technical Operations team at Pandora has built and extended an efficient, resilient, high-volume architecture to keep the music playing, scaling it 50X in eight years. As our keynote speaker, Steve will share design elements and insights from building the platform that brings music to millions. He’ll join Mike Dunkle from Valve’s network operations and infrastructure group as our two confirmed keynote speakers for the event.

Now in its seventh year, the Content Delivery Summit is the place to meet those who are building out some of the largest public and private CDN deployments to date. The summit also covers other web acceleration technologies including dynamic content delivery, transparent caching, app acceleration, QoS measurement, front-end optimization, mobile content acceleration and more. You can register for the event using discount code DR100 and get an all-access ticket for only $395 if you register before April 11th.

Ustream’s Software-Defined CDN Could Become A Bandwidth Exchange For Live Broadcasters

Ustream has developed a reputation for its ability to host some of the world’s largest streaming events via its HD broadcast video platform. The company hosted more than 1 million concurrent viewers and 8 million total viewers during the launch of Sony’s PlayStation 4, and regularly hosts 70 million viewers per month. And while it is enough of a challenge to scale to accommodate a large audience when the size is predictable, it is even more difficult to scale fast when the audience size is unpredictable.

As a result of these problems, Ustream decided to create a unique solution, the result of seven years of development – a technology Ustream calls Software Defined CDN, or SD-CDN. If Ustream took what it has built and offered it to anyone as a stand-alone service, customers would be able to buy live delivery services from what is essentially a bandwidth exchange. This would be a very interesting model because, to date, there is no easy way for large content owners to buy from multiple CDNs without negotiating a contract with each CDN individually.

One proven aspect of Ustream’s platform is that it can scale for audiences that start quickly, accelerate quickly and then leave once the live event is over. Large audiences like these bring several challenges:

  • Sourcing the sheer amount of bandwidth needed, which often requires more than one CDN provider to cover viewers in all parts of the globe
  • Ensuring a high-quality stream, with as little buffering as possible and HD-quality playback for every viewer, no matter what device they are on or where they are globally
  • Handling the bandwidth and quality needs in the most cost-efficient way possible; while it is always possible to add servers or CDN capacity to cover the peak usage pattern, that extra capacity is often wasted during times of lower usage

Ustream’s SD-CDN includes the following properties:

  • Ability to scale automatically without manual provisioning of resources, dynamically adding and removing edges and providers on the fly as needed.
  • Ability to leverage a combination of edge resources, including CDN providers, transit lines, peering and ad-hoc edges – even those located inside an ISP’s network or inside a private network (for example, an enterprise network). A new edge resource can be registered and serving traffic in less than a minute.
  • Ability to flexibly and instantly route traffic amongst any of the above listed sources based on a combination of rules to maximize resiliency, quality and cost.
  • Built-in monitoring to evaluate the performance and efficacy of the competing sources on a global scale or down to the individual viewer.
  • Ability to tweak business logic in real time if scale, quality or cost are jeopardized by changing conditions.

Ustream’s SD-CDN works by deploying a software layer that transmits, receives and processes metadata between the sources of video content (streaming servers, CDN edges, transit lines, ad-hoc edges) and its destination – end users’ viewing devices. Each deployed Ustream player has a connection called the Ustream Media Server (UMS) connection, which sends back real-time data every second. This creates an enormous amount of data, as each connected player (even if it is not yet playing, or has already stopped playing, video content) sends back real-time data about its status. The connection is two-way, so it is used to deliver messages in both directions. For example, the player reports its IP address back to the SD-CDN, and logic can be triggered based on that, such as whether a viewer is in a restricted country.
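To make the per-second client report concrete, here is a minimal sketch of what one such status message might look like. The field names and the `build_heartbeat` helper are assumptions for illustration only – Ustream has not published its actual wire format:

```python
import json
import time

def build_heartbeat(player_id, ip, state, buffer_seconds):
    """Build one per-second status report to send over a UMS-style
    connection. Field names are illustrative, not Ustream's protocol."""
    return json.dumps({
        "player_id": player_id,
        "ip": ip,                         # lets the backend apply geo-based rules
        "state": state,                   # e.g. "idle", "playing", "buffering"
        "buffer_seconds": buffer_seconds, # how much media is buffered client-side
        "ts": time.time(),                # report timestamp
    })
```

Note that even idle players would emit these messages, which is why the article describes the aggregate data volume as enormous.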

This same connection can be used to carry data about whether or not a player is buffering. All of this data is processed at a few geographically distributed locations, to ensure redundancy, and analyzed in real time using proprietary algorithms. The algorithms on the SD-CDN’s central servers ask questions like: if a player is buffering, is it just a single viewing client, or is there a pattern of buffering in a specific region? Since switching between sources can occur at the level of a single client, an isolated issue can be addressed with a client-side switch. But if a large number of client-side switches are being reported back, the SD-CDN can make a large-scale switch to completely disable a certain CDN provider or ad-hoc edge if necessary.
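The escalation logic – fix an isolated buffering client with a client-side switch, but pull an entire source when buffering clusters around one region and provider – can be sketched roughly like this. The threshold and data shapes are hypothetical; Ustream’s real algorithms are proprietary:

```python
from collections import Counter

# Hypothetical cutoff: how many buffering clients on one (region, source)
# pair before that source is disabled there entirely.
REGION_DISABLE_THRESHOLD = 100

def plan_actions(buffering_reports):
    """buffering_reports: list of (client_id, region, source) tuples for
    clients currently reporting buffering. Returns a per-client action map
    and the set of (region, source) pairs to disable entirely."""
    per_source = Counter((region, source) for _, region, source in buffering_reports)
    disabled = {key for key, count in per_source.items()
                if count >= REGION_DISABLE_THRESHOLD}
    actions = {}
    for client, region, source in buffering_reports:
        if (region, source) in disabled:
            actions[client] = "source_disabled"   # large-scale switch
        else:
            actions[client] = "client_switch"     # isolated issue
    return actions, disabled
```

An isolated report produces only a client-side switch; a flood of reports against one provider in one region trips the large-scale switch.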

Since the SD-CDN includes monitoring capabilities, the automatic logic can always be overridden or augmented by real-time monitoring at Ustream’s NOC. A network operations expert can spot a pattern not picked up automatically by the algorithms and use the SD-CDN’s interface for instant control of the entire system. Over time, as new patterns emerge, they are added to the SD-CDN’s automatic recognition algorithms.

The SD-CDN is particularly interesting and useful when looked at in concert with Ustream’s other streaming technologies. Ustream offers a cloud transcoding service: you send a single high-bitrate HD stream, and Ustream generates the lower-bitrate, lower-resolution versions from the original. When these versions are created, time-synchronization markers are added as metadata in the stream. This matters for switching between bitrates on a single player from a single source, to ensure the content does not jump or skip back in time, but the markers also mean that, via the SD-CDN, the Ustream player can perform a seamless switch between streams coming from two completely different sources.

Ustream’s delivery platform can adapt to provide the best possible quality for each viewer or to make changes on a larger scale to pre-empt an issue before the viewing client reports it. In addition, a major function of the SD-CDN is the ability to manage traffic across a network of CDN providers, transit lines, peering and ad-hoc edges based on the cost considerations of carrying traffic of a given volume at any given time.

For those familiar with CDN pricing and contracts (www.cdnpricing.com), the typical situation is that a content provider is expected to predict in advance the amount of usage they will have on a monthly basis and to sign a long-term contract based on that. It is typically a use-it-or-lose-it proposition, whereby overestimating usage results in sunk cost with no ROI, and underestimating usage can lead to steep overage fees. In addition, the content provider is incentivized by way of scaled discounts to commit to a larger package, because the unit economics become more appealing when buying in bulk.
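The use-it-or-lose-it economics are easy to see with a small worked example. The rates below are made up for illustration and are not actual CDN pricing:

```python
def monthly_cdn_cost(committed_tb, used_tb, commit_rate, overage_rate):
    """Use-it-or-lose-it commit: the committed volume is billed in full
    whether or not it is used, and traffic above it pays the overage rate.
    All rates are hypothetical, in dollars per TB."""
    overage_tb = max(0.0, used_tb - committed_tb)
    return committed_tb * commit_rate + overage_tb * overage_rate

# Commit 500 TB/month at $8/TB, with overage billed at $15/TB:
print(monthly_cdn_cost(500, 300, 8.0, 15.0))  # under-use: still pay $4,000
print(monthly_cdn_cost(500, 700, 8.0, 15.0))  # 200 TB of overage: $7,000
```

Guessing low costs you steep overage rates; guessing high means paying for capacity you never use. Either way, the provider with unpredictable traffic loses.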

The rules of this game make it a tricky proposition for any content provider wanting to maximize the value of their investment in a third-party CDN service. This is one of the key advantages Ustream’s SD-CDN can provide. For several months, Ustream has been able to utilize third-party CDN services at a flat 95/5 usage pattern, resulting in optimal ROI on the contracts Ustream has with third-party CDN and transit providers. These cost savings can then be passed on to customers, which is one of the reasons Ustream can offer the same scale and quality of delivery as many leading CDNs, at lower cost.

While Ustream has used this solution extensively for the delivery of video content, the company says what it has built is content-type agnostic and can be put to work for any kind of HTTP traffic, such as gaming or any Web-based application acceleration. Ustream is currently using its Software-Defined CDN to control all of its video content and is considering offering the solution as a stand-alone service for others with similar needs. If Ustream starts selling the platform as a stand-alone delivery service, it would be an interesting disruption in the market.

Most third-party CDNs aren’t experts when it comes to the entire ecosystem of live events. They can handle the delivery, but a lot more goes into a live event than just delivering the bits. This is why companies like Ustream, Livestream, Twitch and others who specialize in live broadcasting have all built their own live broadcast platforms. If Ustream can take what it has built and offer it to anyone, customers would be able to buy live delivery services from what is essentially a brokerage exchange. This would be a new way of buying live delivery services, from multiple CDNs, based on a specific quality of delivery tied to the time of day. The company hasn’t officially said it will offer this as a product yet, but did tell me it is looking into it. If Ustream launches it as a stand-alone service, it’s a unique proposition and one I think the company could have success with.

Streaming Vendor News Recap For The Week Of March 16th

Here’s a recap of all the announcements from streaming media vendors that I saw for the week of March 16th. I’ll try to do a better job of regularly creating this list at the end of each week:

WSJ Report Inaccurate: Content Owners Not Asking ISPs For “Separate Lanes”

Yesterday, a story in the Wall Street Journal created a lot of stir by implying that HBO, Sony and Showtime were asking ISPs for their content to be given “special treatment” by delivering it via a “separate lane” within the ISP’s network. Multiple ISPs and some of the content owners mentioned in the story tell me the WSJ post is inaccurate and that they don’t expect any ISP would treat their content differently from another’s.

Those I spoke with were confused as to what exactly the WSJ is implying when terms like “special treatment” are used without any definition of what is “special” about the treatment. There is also no agreed-upon definition of what a “managed service” is, and the article doesn’t detail how the WSJ defines it. The article also references a “separate lane” within the ISP’s network, but there is only one lane into your house on the Internet. Again, lots of buzzwords, no definitions.

The article says the reason the content owners would want to do this is to “move them away from the congestion of the Internet.” The problem with this idea is that neither HBO, Sony nor Showtime owns its own CDN. They rely on third-party CDNs like Akamai, Limelight and Level 3 to deliver their content, and these CDNs already have their servers inside ISP networks, or connected directly to them via interconnection deals. Avoiding congestion is the main value of using a service-based CDN, which HBO and others are already doing. In fact, HBO has been doing this with Verizon since 2010, by allowing Verizon to cache HBO’s content inside Verizon’s network. But that content is not “prioritized” or given “special treatment” of any kind inside the last mile.

The article also says that media companies feel the “last mile of public Internet pipe, as it exists today, won’t be able to handle the surge in bandwidth use for all the online-video services.” The problem with that argument is that the congestion we see on the Internet isn’t taking place in the “last mile”; it’s taking place at network access points outside the last mile. To prove that, just look at the latest Measuring Broadband America report from the FCC, which measures ISPs’ advertised speeds versus delivered speeds. The data shows that there is very little congestion in the actual last mile. So the WSJ’s argument as to why HBO and other content owners would want to do this doesn’t make sense and doesn’t take into account the technical details of how it all works.

The WSJ article waits until halfway through the piece to mention that no ISP has actually agreed to whatever it is that the WSJ is suggesting content owners want. The article says that Comcast “wasn’t willing to do anything for any one content provider that it couldn’t offer to every other company.” So the WSJ is saying that content owners asked for something that ISPs said no to. But the piece then goes out of its way to make this sound like a potential problem and ties in the topic of Net Neutrality, yet never defines what exactly is being proposed. What does “special treatment” mean? Are they implying the “prioritization” of packets? We simply don’t know, as they use high-level terms without any definition of how they are applying them.

Another argument the WSJ makes for why content owners would want this is that some don’t want their service to count against the ISP’s bandwidth cap. The problem with that argument is that you don’t need a “managed service” to make that happen. Netflix recently struck deals in Australia where its content does not count against the ISP’s cap, with no “managed services” taking place.

The WSJ also says, “media companies say the costs of guaranteeing problem-free streaming for users are rising.” What it doesn’t say is for whom those costs are rising. The content owners? The ISPs? The consumer? It sounds like the WSJ is saying that the cost to deliver video is increasing for the content owner, but in fact, it’s the opposite: costs to deliver video via third-party CDNs have fallen at least 15% each year since 2008. (Source: one, two) Also, there is no way to “guarantee” problem-free streaming no matter how much money you spend, so that notion is false. CDNs offer SLAs, but they don’t “guarantee” anything outside their network once traffic hits the last mile. And ISPs only guarantee customers’ access out of their last mile, which is done on a “best effort” basis. For the WSJ to imply otherwise is inaccurate.

ISPs I spoke to made it clear that they are not in discussions with OTT providers to manage their traffic differently from that of other content owners or to provide them with special treatment of any kind. What they think the WSJ might be confusing this with is the idea of caching content inside the last mile, but again, that doesn’t come with “special treatment” or prioritization of any kind. The WSJ story uses a lot of generic, undefined words that sound very scary, but when you look at the details rationally, you can see that it simply created controversy where none exists.

Video Platform Provider Voped Looking To Sell Company, OTT Platform Available

There continues to be a shakeout within the tier-2 video platform space (see Volar Video Selling Stream Stitching & Video Platform Assets), with the latest coming from Voped. I recently heard from Voped President and sole investor Mark Serrano, who tells me that he has decided to offer the platform for acquisition.

Mark tells me the company is already in preliminary discussions with a couple of large companies, but he also wanted to put the word out about the platform’s availability, considering what’s happening in the space and the technology jump-start his platform can offer. Voped offers an end-to-end solution to manage, encode, secure, deliver, and monetize video globally on the web, mobile, and other connected devices. So for the right company, acquiring versus building can provide the advantage of time to market, along with the extensive experience of the team that built the platform.

Mark sees an advantage in the small size of his team (four original team members; the parent company provides numerous support services separately), in that it will make for an easy transition to bring the technology under a new banner. He says the company has a very efficient turnkey offering, built at a fraction of the cost of what the large platforms have invested. The team has a lot of experience with custom development, from features to larger integrations – such as Widevine DRM, payment gateways, a turnkey website solution, and custom user interfaces.

For information on Voped’s technology highlights, you can check out this PDF deck, and those interested can contact Mark Serrano directly.

Free Book Download: Hands-On Guide To Webcasting Production

Webcasting guru Steve Mack and I wrote a webcasting production book titled “Hands-On Guide To Webcasting” (Amazon), which we’re now giving away as a free PDF download. You might notice that the book was published in 2005, and since that time webcasting has evolved into the mainstream application it is today. But some of the best practices regarding encoding, connectivity, and audio and video production techniques have never changed. We felt the book could still be a valuable resource to many, and we wanted to make it available to everyone, with webcastingbook.com now re-directing to this post.

This book was one of the eight books in my series, which combined have sold more than 25,000 copies, with the webcasting book being the most popular. So we’re happy to have gotten the rights back to the publication to be able to share it with everyone. The help email included in the book still works, so those with questions can still reach out to us, and we’ll try to answer any follow-up questions. You may re-purpose content from the book as you like, as long as you don’t charge for it and you credit the source and link back to webcastingbook.com. Here’s a quick breakdown of the chapters:

  • Chapter 1 is a Quick Start, which shows you just how simple webcasting can be. If you want to start webcasting immediately, start here.
  • Chapters 2 and 3 provide some background about streaming media and digital audio and video.
  • Chapters 4 and 5 are focused on the business of webcasting. These chapters discuss the legal intricacies of a webcast, along with expected costs and revenues.
  • Chapters 6 through 8 deal with webcast production practice: planning, equipment, crew requirements, connectivity, and audio and video production techniques.
  • Chapters 9 and 10 cover encoding and authoring best practices. This section also covers how to author simple metafiles and HTML pages with embedded players and how to ensure that the method you use scales properly during large events.
  • Chapter 11 is concerned with distribution. This section discusses how to plan and implement a redundant server infrastructure, and how to estimate what your infrastructure needs are.
  • Chapter 12 highlights a number of case studies, both successful and not so successful. These case studies provide you with some real-life examples of how webcasts are planned and executed, how they were justified, what went right, and possibly more important, what went wrong.

I’ll also be giving away my business book in the coming days.