How To Choose The Right CDN For Mid-Sized Customers In The SMB Market

One of the most common questions I get asked by content owners of all sizes is which CDN they should use to deliver their videos. I like to answer by throwing a question back at them: what's the best car to buy? In both cases, it all depends on exactly what your use case is, and without knowing more details, no one can really provide any helpful guidance. This is especially true in the small and medium business (SMB) market, where many companies don't have robust IT departments, aren't up to speed on the technology, don't know what services cost, or simply need help knowing where to start.

With any CDN project, the first few things you have to identify are the types of video you are delivering (live/on-demand, short form vs. long form); how many viewers you expect; the devices you are delivering video to; any custom features you may need; what your monthly budget is; and whether your viewers are regional or global. Having answers to those basic questions gives anyone trying to help you enough of a foundation to point you in the right direction.

The budget number helps a lot as it enables vendors to determine what size company you are. While that may seem irrelevant, it's very important, as some vendors tailor their solutions to specific company sizes and verticals. When it comes to web services like CDN, however, there's little said about where small and mid-sized companies stand. This becomes a problem when customers are reconsidering their current CDN or trying to find their first one. It's also a problem because the general criteria used in many other markets, like annual revenue and employee count, can't be used due to the scalability of the web. For instance, a small five-person team behind a video sharing platform could easily use more CDN services than a fifty-person company that isn't distributing viral videos.

Because of this kind of disparity, I'm going to set some loose standards for what small and mid-sized companies are when it comes to content delivery networks. This way, those in the process of choosing a CDN will know where they stand in the market, and it should help guide them to pick from the right pool of vendors. A small customer would be anyone spending less than $1,000 a month on video delivery who doesn't expect their needs to grow much in the next twelve months; customers at that level are often solo publishers with little traffic. A mid-sized customer would be one that is spending more than $1,000 a month and continues to grow their traffic at a rate that would double the value of their CDN contract within a twenty-four month period. Mid-sized customers also tend to have more specific technical requirements, be it for security, analytics, or player integration. Now that you know whether you're a mid-sized company or not, let's take a look at some of the things you should consider when choosing the right CDN for your business.
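As a quick sanity check on that growth definition, doubling a contract's value within twenty-four months works out to roughly 3% compound growth per month. A minimal sketch, with the $1,500 starting spend as an assumed example rather than a benchmark:

```python
# Rough sketch: what monthly growth rate doubles a CDN contract in 24 months?
# The $1,500/month starting spend below is an assumed example, not a benchmark.
monthly_growth = 2 ** (1 / 24) - 1   # ~0.029, i.e. about 2.9% per month
starting_spend = 1500                # assumed example: dollars per month

spend = starting_spend
for month in range(24):
    spend *= 1 + monthly_growth

print(f"Required monthly growth: {monthly_growth:.1%}")   # ~2.9%
print(f"Spend after 24 months:  ${spend:,.0f}")           # ~$3,000, double the start
```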

Superior Support
A mid-sized company typically has a lean technical team compared to traditional media companies. This is great for profit purposes, but not so great for in-house specialization. For instance, a smaller team behind a video sharing platform may have web developers and market-minded people who know how to create social buzz. But, somewhat ironically, they won't have a person who specializes in delivering video. Because of that lack of in-house specialization and operations staff, you need to rely on your CDN as a partner, not just a vendor. This is one of the places where CDNs that target the SMB market can really add a lot of value for customers.

With this partnership principle in mind, it's the CDN's duty to do the following: hold your hand through initial setup, show you how to interact with their control panel, offer transparency through status reports, and provide near-instant support in your preferred format (phone, web chat, written tutorials). Not all CDNs do this, however, and very few that I have seen take the time to measure their level of support. One exception is MaxCDN. This content delivery network focuses on the mid-market and is working to establish a standard for support and customer satisfaction in the industry. According to their December 2014 blog post, the company has a transactional Net Promoter Score (NPS) of 80-90+. I mention them because I have yet to see another CDN publicly rank their customer satisfaction scores, and CDN customers I speak to who use MaxCDN consistently tell me how good they are. Given that there is not yet an established industry-average NPS in the CDN space, MaxCDN is leading the charge in developing this crucial metric to help customers find the right CDN provider.

On the other hand, some CDNs are infamous for prioritizing their biggest clients and taking hours to respond to requests from other clients simply because their focus isn’t truly on the SMB market. And if they lose you as a customer as a result of poor support, they really aren’t worried. Therefore, when interviewing CDNs and testing out their platform, ask what their average ticket response time is. It should be no more than a few minutes, especially if your company doesn’t have someone fully committed to the intricacies of content delivery.

Easy To Understand Product and Pricing
If what you need is video delivery, go with a content delivery network, be it regional or global. Don't bother with a general web hosting provider that sells CDN "on the side." To find out if a CDN provider has a streamlined product that fits your needs, look in two places: the main feature/product page and the pricing page of the company's website. If you see multiple offerings on the feature/product page that are unrelated to CDN, like web hosting, managed services, etc., continue your search elsewhere. However, if what you see is directly related to CDN, continue to the pricing page.

When comparing pricing pages, you'll notice that there are various add-ons that cost extra. Many of these add-ons are legitimate, as content delivery networks are progressing and offering upgrades like SSL, mid-tier caching, or custom caching rules, to name a few. The CDN you want is the one that isn't charging outrageous fees for these add-ons; the additional charges should be purposeful and make sense. For instance, if you want SSL (because you need SSL), you shouldn't have to pay ridiculous sums of money, as serving traffic over SSL is becoming the standard. This leads me to my next point.

Affordable SSL
Companies that don't serve content over encrypted connections will slide into irrelevance just as quickly as non-encrypted connections themselves. For instance, publishers will soon start demanding that ads be secured against malware. And once publishers and security advocates like Google start enforcing this, you will need SSL. As a mid-sized company, you can probably afford heightened SSL costs, but why pay them when you don't have to?

Some content delivery networks, mainly those that target larger enterprise customers, charge outrageous sums of money for something that will eventually be as common as HTTP. One way CDNs overcharge for SSL is by increasing the cost of bandwidth: instead of charging a flat fee for custom or wildcard SSL, bandwidth served over HTTPS costs slightly more than bandwidth served over HTTP. The increase is usually slim, but when you're a mid-sized company with a growing base of customers, those slim margins add up to thousands of dollars. If security is something you need, it's also important to note that the CDN you choose should support SPDY, the Google-developed protocol that delivers secure traffic with lower latency than standard HTTPS. Most quality CDNs support it, and it's worth asking about when investigating different providers.
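To make the point about slim margins concrete, here is a rough sketch of how a small per-GB HTTPS surcharge compounds over a year. The surcharge rate, traffic volume, and growth rate are assumptions for illustration only, not any specific vendor's pricing:

```python
# Illustration only: how a small per-GB HTTPS surcharge compounds over a year.
# The surcharge and traffic numbers below are assumptions, not real vendor pricing.
https_surcharge_per_gb = 0.005   # assumed extra cost per GB served over HTTPS
monthly_traffic_gb = 50_000      # assumed mid-sized customer starting volume
monthly_growth = 0.05            # assumed 5% traffic growth per month

total_extra = 0.0
traffic = monthly_traffic_gb
for month in range(12):
    total_extra += traffic * https_surcharge_per_gb
    traffic *= 1 + monthly_growth

print(f"Extra HTTPS cost over 12 months: ${total_extra:,.0f}")  # roughly $4,000
```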

Efficient Architecture
Some CDNs will tell you that when it comes to quality of service, it's all about how many points of presence (POPs) they have. The fact is, POPs are only one of the many attributes of a CDN network that determine performance, and quantity does not equal quality. Many companies are still stuck on the notion that the CDN with the most POPs or locations is always the fastest or provides the best service. That isn't always true, and even when it is, the few milliseconds shaved off delivery time usually have no business ROI for video specifically, yet can cost your company thousands. Is it worth paying 30% more per month to one CDN just to have your videos start up two-tenths of a second faster? Probably not. In all seriousness though, there's a POP problem, especially with many of the larger CDNs who are so focused on how many locations they have. That's great for enterprise customers that need global delivery, but not for an SMB customer with smaller traffic, a smaller content library, and less money to spend. If you are a growing mid-sized company dedicated to scale, seek out a CDN dedicated to the same.

Instant Purging
Many customers need to have full control over the content delivered by their CDN. If you're not currently using a CDN, you're accustomed to this level of control: you simply access the CMS or FTP of your origin server and hit delete. Some content may still live on in browser caches, but the main source is wiped clean, and new or updated content is delivered to users rather than the old version. This is how it should be, and this is how it is when you use a content delivery network with instant purging. This feature gives you origin-like control over the content delivered by your CDN, meaning old or unwanted content is purged and inaccessible to users in a matter of seconds.

Using the video streaming company as an example again, say a user uploads a controversial file to your server. Based on your algorithm, this file could be automatically pushed out to your CDN for quick user consumption. To protect your brand image, you'd want to remove all traces of the file as soon as you became aware of it. With a CDN that has one-click instant purging, the unwanted video would be gone in a matter of seconds. That isn't the case with every CDN though; some require minutes, even hours, to purge content from their servers. This is a major problem in an age when the face of your business hinges on the nature of your content.
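For readers who want a feel for what instant purging looks like in practice, here is a minimal sketch of a purge API call. The endpoint, zone ID, credentials, and file path are all hypothetical; every CDN provider exposes its own purge API, so check your vendor's documentation for the real calls:

```python
# Hypothetical sketch of an instant-purge API call; the endpoint, zone ID and
# credentials are made up for illustration and will differ per CDN provider.
import requests

API_KEY = "YOUR_API_KEY"   # hypothetical credential
ZONE_ID = "12345"          # hypothetical pull-zone identifier
PURGE_URL = f"https://api.example-cdn.com/zones/{ZONE_ID}/purge"  # hypothetical endpoint

# Ask the CDN to drop the cached copy of the unwanted video from every edge server.
response = requests.post(
    PURGE_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"files": ["/videos/controversial-upload.mp4"]},
    timeout=10,
)
response.raise_for_status()
print("Purge accepted:", response.json())
```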

Conclusion
If you’re a small customer and only have a few hundred dollars a month to spend, I would argue it really won’t matter who you pick. For mid-sized customers, picking and choosing the right CDN usually means looking past the big CDNs. While they do have some small and mid-sized customers, that’s not really their focus and they are more suited to handle large media, enterprise and broadcast customers. That’s not to say you can’t get good service from them if you are smaller, but you will be paying far more than you need to. Why pay enterprise rates when you’re not an enterprise customer?

For mid-sized customers, align yourself with mid-sized CDNs that specialize in content delivery rather than a slew of unrelated web services like web design or marketing. A service-based company dedicated to offering content delivery will almost always have better support, better prices, and more CDN-specific features than a traditional web hosting company or solutions integrator. You'll find many options in the mid-sized CDN space, but as mentioned earlier, companies like MaxCDN that specifically serve this space are a great starting point for your search. Bottom line: whomever you choose, keep the five points above in mind during your CDN decision-making process. It will save your company money and other resources down the road.

The great news for all content owners is that content delivery services are now within reach of all SMB customers and only continue to improve in ease of use and performance each year. If you still need help picking and choosing the right CDN, feel free to send me an email. I’ll gladly walk you through the process and help guide you, free of charge.

Thursday Webinar – Streaming Major Sporting Events

Thursday at 2pm ET, I’ll be moderating another StreamingMedia.com webinar, this time on the topic of, “Streaming Major Sporting Events.” For broadcasters, content owners, and over-the-top (OTT) video service providers, major challenges include delivering a seamless live viewing experience to a very large audience, as well as implementing profitable monetization strategies. In this roundtable webinar, we will explore techniques successfully used to provide a satisfying live viewing experience coupled with successful monetization models that support various business strategies. Join speakers from Level 3, Vindicia and The Diffusion Group and bring your questions for a discussion on:

  • Live Video Acquisition
  • Media Processing
  • Workflow Automation
  • Flexibility of content and promotion
  • Monetization including Advertising, Pay-per View Processing, Authentication, Subscription & DRM
  • Scaling Architecture
  • Cloud Platforms, APIs and Legacy Systems
  • Security
  • Multiscreen Delivery – Strategies, Workflows, and Quality

Don’t miss this opportunity to hear from industry experts who have conquered these challenges. You’ll learn how to build new revenue streams by delivering live video to many people, in many formats, on many devices, over many networks. You can register for free here.

Mike Dunkle, Network Infrastructure at Valve To Keynote Content Delivery Summit

I'm pleased to announce that Mike Dunkle from Valve's network operations and infrastructure group will be the opening keynote at the Content Delivery Summit, taking place Monday, May 11th, at the Hilton Midtown in NYC. If you've ever wondered how a platform to support 100 million registered users gets built, this is the keynote to attend.

Now in its seventh year, the Content Delivery Summit is the place to meet those who are building out some of the largest public and private CDN deployments to date. Attendees will see case studies on real-world deployments, demos of new technology platforms and discussions on business models for both on-net and off-net delivery. But it’s not just about streaming and last-mile video delivery. The summit also covers other web acceleration technologies including dynamic content delivery, transparent caching, app acceleration, QoS measurement, front-end optimization, mobile content acceleration and more.

The call for speakers for the Content Delivery Summit is still open, so if you have interest in speaking at the event, now is the time to send in a submission or contact me. Online registration for the event will open within the next two weeks. If you would like a discount code for the event, email me and I’ll be happy to send you one.

Streaming Media East Program Now Available, Speaker Placement Starts Today

I've just finished laying out the advance program for the Streaming Media East show, taking place May 12-13 in NYC. We've got a lot of new session topics and presentations this year, and now that all of the topics have been chosen, the moderators and I will start placing speakers. If you are interested in speaking on a session, download the agenda (Word doc) to see which sessions are available and follow the instructions on the document.

All of our sessions and presentations this year are organized into nine different categories, and below is a list of all the session and presentation topics. Note that some of them do not have speaking spots available, so download the agenda for full details:

  • How OTT Is Disrupting The Pay TV Business
  • Benchmarking Your Broadcast Video Workflow
  • Twitter, Facebook & Snapchat: On-demand Social Video & The Real-Time Feed
  • How To: Bringing Media Channels to Amazon Fire TV
  • 4K/UHD Streaming: Definitions, Challenges, and Champions
  • Codecs, Containers, and Protocols: Digital Media Formats For Online Distribution
  • Creating Revenue Streams From New OTT Services
  • How To: Building A Chromecast Application
  • Demo: Smart TV Platforms In Action
  • Enterprise Delivery: Building an Internal Streaming Solution
  • Dissecting Big Data: Trends In Video Consumption and Behavior
  • Implementing New Technologies With MPEG-DASH
  • Developing OTT Apps for Multiple Platforms
  • How To: H.265 vs. H.264, An Under the Hood Assessment
  • Measuring The ROI On A HEVC Deployment
  • Expanding Outside YouTube: Creating A Multi-Channel Network Strategy
  • Monetizing the Multi-Screen Consumer Experience
  • How To: Creating A Streaming Channel On Roku’s Platform
  • Programmatic Video Advertising Goes Premium
  • From the Classroom to the Athletic Fields: Streaming In Educational Institutions
  • Hands-On With Streaming Devices and OTT Platforms
  • How To: Building a Video Player
  • Best Practices For Adding Redundancy To Live Encoding & Delivery
  • Using Media Optimization to Reduce Data Rates and Bandwidth Costs
  • Business Strategies to Break Out of the OTT Crowd
  • Developing Universal Windows Applications for Xbox and Beyond
  • Replacing Flash: Adaptive Streaming and DRM in HTML5
  • How To: Selecting The Right Video Management Technology
  • Video Management Technology’s Role in Delivering Optimal User Experiences
  • Best Practices For Live Streaming Production
  • The Future Of Branded Content
  • How To: Producing and Distributing HEVC
  • The Transformation of News Reporting Using Digital Video
  • Integrating Streaming, Video Conferencing, and Unified Communications Solutions
  • The Future of Video in a Multi-Screen World

Volar Video Selling Stream Stitching & Video Platform Assets

Lexington, Kentucky-based Volar Video is selling its streaming software and platform, which handled over 13,000 live streams and over 2 million viewers over the past six months. The company has asked me to make it public that they are now considering all strategic alternatives, including an asset sale, acqui-hire, or selling a majority stake. You can download an overview of Volar's platform and technology to get more details on what they have to offer.

Volar’s major clients are primarily in the live sports vertical and include the Mountain West Conference and many other NCAA Division I, II, and III colleges and universities. Volar has also recently worked with Root Sports, Fox Sports Midwest, Time Warner Oceania, as well as Silver Chalice and Encompass Media. Volar was one of the first to figure out stream stitching when they demoed their mid-roll ad solution to me years ago. Volar’s platform functionality includes:

  • a cloud-based live streaming and VOD platform
  • VAST-compliant mid-roll ad insertion that works on desktop, mobile SDKs and iOS mobile web
  • advanced multi-party ad inventory management functionality
  • streaming software for Mac and PC
  • real time analytics, SDKs and APIs

Volar has ten engineers who have worked together for two years building the software and platform. Anyone interested in talking to the company can email me and I'll be happy to make an introduction.

FCC's Proposed Internet Rules Change Little, No Real Impact On Interconnection or Choice

FCC Chairman Wheeler released a fact sheet today outlining the new rules he is proposing for the Internet, and they fall far short of solving the main complaints we've heard about in the market for so long. Many think it's a big win for consumers that the proposed rules will prohibit ISPs from blocking, throttling, or prioritizing content on their networks, yet to date, no ISP has been accused of doing this. It's nice that these restrictions might become law going forward, but it doesn't do anything to address the complaints about what takes place outside the last mile, or all the debate around consumers wanting more choices for broadband service.

In fairness, we haven't seen the full proposal or all the details, but the fact is that one of the biggest complaints we read about is that consumers want more choice when it comes to Internet service providers. The proposed rules won't require any last-mile unbundling, so those who think the rules will foster more ISP competition will be sadly mistaken. Think of how many times we read about consumers contending with local monopolies for their broadband Internet service and wanting more choice. Isn't that the number one complaint from consumers? These new rules do nothing to address it. Not that I think they should, but this proposal doesn't unbundle the last mile and doesn't regulate rates. So for those who call this a "win" for consumers, I don't see it. There will be no new competition. The proposed rules also allow ISPs to do "reasonable network management," so those who wanted that off the table won't be happy either.

When it comes to the topic of interconnection taking place outside of the last mile, which so far Netflix has been the only content owner to complain about, the proposed new rules won’t actually govern them. The little bit of language we have on the topic, so far, says that the “Commission would have authority to hear complaints and take appropriate enforcement action if necessary, if it determines the interconnection activities of ISPs are not just and reasonable.” That’s not a law. It’s simply a way for the FCC to hear any gripes and then try to figure out what to do with them. How does the FCC plan to define “just and reasonable”? Traditionally, “just and reasonable” is defined by reference to the “cost” of providing the service. As a practical matter, this has been accomplished through the use of tariffs and investigations into tariffs. I couldn’t find any prior case where the FCC has assessed whether a non-tariffed rate is just and reasonable.

Who or what will be the authority on what “just and reasonable” market rates are? Will these rates be compared to pricing from transit providers, third-party CDN providers or some other form of alternate distribution? And will the decision only be on cost, or on the quality of service? I find it interesting that so far, in this whole net neutrality debate, people are arguing over capacity and speed, but never bring up quality of service. Capacity means nothing without performance and a good user experience. Also, while this may sound silly, the FCC is going to have to define what they classify as an interconnection. The language makes reference to the “interconnection activities of ISPs”, but what about those who aren’t ISPs? If people truly want an “open Internet” and transparency, it’s not fair that Cogent can secretly prioritize packets and impact the consumer experience, but doesn’t fall under the same rules.

One article I read today said, "without specific rules, ISPs would be tempted to ban, slow down or seek payment from content providers." Why would they be tempted to do that? They don't get paid a lot of money from interconnect deals; just look at the revenue numbers Comcast made public ($40M-$60M in 2013). And by law, Comcast already isn't allowed to block or throttle content due to their purchase of NBC. So for all the people acting like we have all kinds of blocking or throttling of content by ISPs, we don't have a single example of it being done.

Again, why not draft a proposal that deals with the actual complaints of consumers, instead of perceived issues that no consumers are actually dealing with? And before anyone says this is what Netflix has been complaining about, it isn't. Netflix has never once accused Comcast or any other ISP of blocking or throttling their content, but rather has complained about wanting, and not getting for free, more capacity at interconnection points. Netflix's CEO was quoted as saying, "it has no evidence or belief that its service is being throttled." We need to stop using the term "throttling" or implying that it's happening to Netflix, or to anyone else, until someone makes that claim and shows evidence of it happening. Implying it is taking place only fuels the fire and makes people debate non-facts, which does not help.

I read one post that said these proposed rules are a “big win for Netflix”, but in reality, that’s not the case. Netflix will have a hard time trying to convince the FCC that they are being mistreated when the interconnection deal they have with Comcast costs them less money than using transit providers and third-party CDNs, improves the video quality for consumers, and comes with an install SLA, packet loss SLA and latency SLA from Comcast. In Q2 of 2014 alone, Netflix paid third-party CDN provider Limelight Networks $5.4M, to deliver a small percentage of their overall traffic. Clearly if the FCC felt interconnect deals were a big enough problem, or that Netflix was truly getting treated unfairly, they would have proposed something much stronger than what is primarily a way to just “hear complaints.”

Another question I have from reading the proposal is how the FCC is going to reclassify mobile broadband when we have clear language protecting mobile broadband from Title II. I also can't tell from the proposal whether the FCC plans to reclassify retail broadband service only, or the services ISPs provide to edge providers as well. The bottom line is that the outline we have seen today doesn't really address the issues and leaves us with a lot of unanswered questions. We need to see the full proposal to know the details and see the language that will be used, but this is just another step along the way of what is going to continue to be a very long debate on the topic of net neutrality. It brings no real clarity to the debate, and it still has to be voted on, pass any legal hurdles, and be put into practice. That's not happening anytime soon.

One final thought: the fact sheet says these new rules are intended to let consumers "access the legal content and applications that they choose online, without interference from their broadband network provider." That's funny, considering my broadband provider is never what prevents me from accessing content. It's always the combination of the device, the OS platform, and the closed, highly controlled ecosystems that run on those devices.

The Super Bowl Stream Wasn’t As Bad As Many In The Media Said It Was

I've read quite a few blog posts about NBC Sports' live stream of the Super Bowl, and it's clear that the vast majority of the media don't understand what the workflow for a live event looks like, the pieces that are involved, or the various factors that determine the quality of the live stream. A post on DSLReports.com says the Super Bowl stream "Struggled Under Load," yet provides no details of any kind to back up that claim. The fact is, capacity wasn't an issue at all. [Update Tuesday 10:58am: DSLReports.com has changed the headline of their post to no longer reference any kind of capacity issue.]

NBC Sports used third-party CDN provider Akamai to deliver the stream and had Level 3's CDN in backup mode in case they got more traffic than expected, but never had to use it. Media members who complained about the stream didn't provide any technical details about how it worked, how it was delivered, or the companies involved, and didn't speak to any of the third-party companies that were monitoring the stream in real time. They made no effort to learn what was really going on with the live stream or to speak to the companies responsible for it, which is just lazy reporting. Cedexis data shows Akamai's availability did drop during the game, to 98.81% in the Northeast, but not significantly.

NBC Sports said that the live stream peaked at 1.3M simultaneous viewers, which isn't a big number for Akamai. Six years ago, Akamai's network peaked at 7.7M streams, 3.8M of which came from the Obama inauguration webcast. Akamai has plenty of capacity to handle the live stream of the Super Bowl and has done live events, including those for Apple, that make the Super Bowl look small in comparison. Slate Magazine's post called the Super Bowl webcast a "disaster," with the biggest complaint being that the live stream had a delay compared to cable TV. Clearly the author doesn't understand how the Super Bowl stream worked, or he would realize that based on the setup, the delay was unavoidable.

The video was encoded in the cloud using Microsoft's Azure platform, which adds a delay. On top of that, using HLS adds an additional delay, and doing HLS over Akamai adds even more. Talk to any of Akamai's largest live customers and they will tell you that the number one complaint about Akamai, when a live stream is using certain parameters, is the delay in delivering the stream. Akamai's network requires a lot of buffering time for both HLS and RTMP; otherwise you can get audio drop-outs on bitrate switches. NBC Sports used both HDS (HTTP Dynamic Streaming) for desktop and HLS (HTTP Live Streaming) for devices. So before members of the media start blaming NBC Sports for the delay, they should learn all the pieces in the live workflow and understand how it all works. Even twenty years later, there are limitations in the technology. I have simplified the workflow here; there were many more pieces involved in making the stream possible, and many of them can add delay to the live stream.
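To give a rough sense of where that delay comes from, here is a back-of-the-envelope sketch based on how segmented HTTP streaming works: a player typically buffers several segments before it starts playback, so the delay is roughly the segment duration times the buffer depth, plus time for encoding, packaging, and edge propagation. The numbers below are illustrative assumptions, not NBC Sports' actual figures:

```python
# Rough sketch of where live-stream delay comes from with segmented HLS.
# The numbers are illustrative assumptions, not NBC Sports' actual figures.
segment_duration_s = 10      # assumed HLS segment length
player_buffer_segments = 3   # players typically buffer a few segments before playing
cloud_encode_delay_s = 10    # assumed delay from cloud encoding/packaging
cdn_propagation_s = 5        # assumed time for segments to reach the edge

glass_to_glass_delay = (
    cloud_encode_delay_s
    + cdn_propagation_s
    + segment_duration_s * player_buffer_segments
)
print(f"Approximate delay behind the broadcast: {glass_to_glass_delay} seconds")  # ~45s
```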

For those who want more technical details on the encoding, the average bitrate for the stream was 2.5 Mbps, with an average viewing duration of 84.2 minutes. Also of note is that NBC Sports optimized their in-browser display for a target video bitrate of 2.2 Mbps, with a max bitrate of 3.5 Mbps in full-screen mode.
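For context, a quick back-of-the-envelope calculation shows what those figures mean in data delivered to the average viewer (decimal gigabytes, ignoring manifest and retransmission overhead):

```python
# Back-of-the-envelope: data delivered to an average Super Bowl stream viewer,
# using the 2.5 Mbps average bitrate and 84.2 minute average view time above.
avg_bitrate_mbps = 2.5
avg_view_minutes = 84.2

megabits = avg_bitrate_mbps * avg_view_minutes * 60   # Mbps * seconds
gigabytes = megabits / 8 / 1000                       # bits -> bytes -> GB (decimal)
print(f"~{gigabytes:.1f} GB per average viewer")       # roughly 1.6 GB
```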

The Slate piece also goes on to say that "NBC was dealing with huge traffic for its Super Bowl stream" and that the "traffic would be tremendous." Again, these are statements that imply there were capacity issues, which simply wasn't the case. A piece by Mashable called the stream "slow," saying it was a "bit disconcerting for anyone who wants to keep up with up-to-the-second plays on social media." Not all forms of content get delivered in real time at the same speed. If you want "up-to-the-second," then the video stream is not for you. But that's not the fault of the live stream, which has to be ingested, encoded, delivered, and then played back in a player or app. Compare that workflow to a tweet; it's not even remotely similar.

As for my experience with the Super Bowl stream, I did experience some problems with the actual video quality. I worked with NBC Sports' tech team, giving them specs on my setup, and they looked up my IP and tracked me throughout the game, having me test various links and setups. While we still don't know what my issue was, it only appeared when I was using Verizon and didn't crop up when I used Optimum. So one thing people have to remember is that it's not always the CDN's fault; many times, the problem is down at the ISP level. As an example, I was having a lot of issues streaming YouTube, and when Google looked into it, they found a specific issue inside Verizon that was causing it.

During the live stream, NBC Sports was using multiple third-party quality measurement platforms, including those from Conviva and Cedexis. Conviva's works in real time and can show everything from buffering times to failed stream requests. The media needs to learn more about these third-party platforms; as you'll notice, they don't know anything about them, nor do they ever seem to look at their data after a large event. Stop coming up with "theories" around capacity and dig into the real data. While NBC Sports isn't going to give out all the data we want, any member of the media with connections could have easily talked to some of these third-party companies and gotten info or guidance on what they saw and any impact it might have had on performance. For the majority of users who tuned into the live stream, it worked and worked well. There were some, like myself and others, who did experience intermittent problems, but we were the minority, and in many cases, problems down at the ISP and WiFi level are what cause quality issues with both live and on-demand video. Media members who considered the live webcast a failure because it wasn't real-time, or because certain ads weren't shown, should be focusing on the business aspects of the stream, not the technical ones.

One last thing. For all the people reporting that the Super Bowl stream was a "record," it wasn't. The raw logs are not verified by any third-party company, there are many different ways to count streams (simultaneous, unique simultaneous, etc.), and if you look at just the sports vertical, there are events by ESPN, eSports and others that did more than 1.3M simultaneous. Quantity is important, but it's not the single biggest piece of methodology that should be used to determine the success or failure of a live webcast. There is no such thing as "the largest" when it comes to live events, as many times the numbers aren't even put out after the event. Just look at Apple's live streams: we don't know their stream count, and on the days they do a live webcast, Akamai takes down the real-time web metrics chart that shows the live stream count on their network, just so no one knows how many streams Apple is doing.

If there is one thing the Super Bowl stream did reinforce, it's that streaming video technology can't replace traditional TV distribution for quality or scale. Yes, I know some will want to argue that point, but if you talk to those who are smarter than me and are building out these networks to deliver content, not only are there many technical limitations, there are just as many business ones as well.