Announcing The “Live Streaming Summit”: New Conference On The Live Video Ecosystem

Live streaming is one of the hottest topics in online video, so we’re announcing the Live Streaming Summit, a brand-new two-day event produced in conjunction with the Streaming Media West show, taking place November 17-18 in Huntington Beach, Calif. Live streaming has always been a big part of our program at each Streaming Media show, but now we’ll have two tracks dedicated to the subject and more speaking and presentation spots than before.

The Live Streaming Summit focuses entirely on the technologies and strategies required to take large-scale live event and live linear video and deliver it to viewers watching on all manner of devices—computers, tablets, mobile phones, set-top boxes, and smart TVs. This isn’t about cameras and video production, but rather all of the backend pieces that make up the live streaming workflow.

No matter what content area you might be working in—entertainment, news, sports, gaming, worship, or live linear channels—we’ve got you covered with two days of panel discussions and presentations focusing on best practices and insights in seven topic areas:

  • Best practices for backhaul, transmission, and ingest—satellite, fiber, cellular, and more
  • Encoding and transcoding—on-prem, cloud, and hybrid
  • Management—metadata, content protection, stream stitching, and preparation for syndication and post-event VOD viewing
  • User experience—video players, APIs, clip sharing, and social media integration
  • Monetization—advertising, subscription, and pay-per-view
  • Distribution—content delivery networks, real-time analytics, QoS, and QoE
  • Post-event evaluation—how to determine if your event was a success

We’ll also feature case studies from leading content publishers and service providers highlighting real-world success stories. If you are a technical or business decision-maker whose job depends on delivering successful large-scale live events, then the Live Streaming Summit is a must-attend conference.

If you’re interested in presenting or speaking on a panel in one of the above topic areas, submit a proposal via our Call for Speakers page before August 31. I’ve worked with StreamingMedia.com editor Eric to create an outline of what the show will cover, and Eric is now organizing all of the speakers and chairing the program. So hit up the Call for Speakers page, or email Eric or me with any ideas or questions.


Why I Bought Stock In Netflix; Company Could Have 100M Subs By 2018

When writing about public companies on my blog, I have said many times that I have never bought, sold, or traded a single share of stock in any public company, ever. While my blog is not a site for financial advice or guidance, I still wanted to disclose that yesterday I purchased stock in Netflix. For me, it’s a long-term play: Netflix really seems to be firing on all cylinders right now, and I like their road map for international expansion.

Their subscriber growth outside of the U.S. is starting to see some nice gains, they continue to add new content to their inventory, and they are doing a good job of licensing content for the geographic regions they are expanding into. With $455M in revenue from international subscribers for Q2, Netflix is well on their way to becoming a company with 100M subs within a few years. Most on Wall Street estimate it will take Netflix until 2020 to reach 100M subscribers, but I think they will do it sooner than that. With just over 65M customers, Netflix only needs to add 3M new subs a quarter, over the next 13 quarters, to reach 100M subs by the end of 2018.
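For anyone who wants to sanity-check that math, here is a quick back-of-the-envelope sketch in Python, using only the figures quoted above (roughly 65M subscribers today, 13 quarters left through the end of 2018, and 3M net adds per quarter):

    # Back-of-the-envelope check of the subscriber math above (illustrative only).
    current_subs = 65_000_000       # "just over 65M customers"
    target_subs = 100_000_000       # the 100M milestone
    quarters_left = 13              # quarters remaining through the end of 2018

    needed_per_quarter = (target_subs - current_subs) / quarters_left
    print(f"Net adds needed per quarter: {needed_per_quarter / 1e6:.2f}M")   # ~2.69M

    projected = current_subs + 3_000_000 * quarters_left
    print(f"Subscribers at the end of 2018 at 3M per quarter: {projected / 1e6:.0f}M")  # 104M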

Some argue that will be hard once Apple comes out with a streaming subscription service of their own, but until Apple actually has something in the market, if it ever happens, and we can judge whether or not it can compete with Netflix, there is no threat. And while Amazon competes with Netflix, to date they have not shown they can keep Netflix from expanding. Netflix likes to talk a lot about how they see HBO as their biggest competitor, but HBO’s catalog of content is very small, and the one thing consumers have shown us is that they want a lot of choice. Netflix’s content catalog offers real depth and breadth of choice, whereas HBO’s is very limited.

What I’d really like to know is what methodology Netflix is using to judge the impact of their original content strategy on their business. They keep saying it helps their business, but we have no details on the direct impact. Does original content produce new subscribers, or is it simply a way to reduce churn? I think the latter, but until Netflix gives out some viewing statistics, which I don’t see them doing any time soon, we have no real way to measure it. But they would not be spending $100M to create one season of House Of Cards if they didn’t see a positive impact on their business, so one simply has to trust that they know how to properly measure the effect of original content, good and bad, on their business.

Netflix has gotten so big now, with a market cap of almost $49B, that it almost guarantees they will remain independent. Many have speculated that Netflix’s long-term strategy was to be acquired, but considering how big they are now, who would acquire them? There is almost no one who could afford them, and the few that could (think Apple, Amazon, Microsoft, Facebook and Alibaba) have already missed the boat. Netflix is almost too big now for an acquisition to make sense, even for any of them. And with Netflix only getting stronger, growing bigger and expanding their business well outside the U.S., Netflix’s value continues to rise with every day that goes by. It really is amazing what Netflix has accomplished in just seven years and how fast they have grown.

Streaming Vendor News Recap For The Week Of July 6th

“Study” of ISP Slowdowns Conveniently Ignored Real Cause, ISPs Not At Fault

Last week, the so-called consumer advocacy group Free Press announced that, according to data collected via their BattlefortheNet website, major ISPs were “slowing” content services at key interconnection points. Free Press pitched their agenda to the media, and some news outlets wrote stories saying major Internet providers were slowing traffic speeds for thousands of consumers across North America. But as it turns out, the Free Press came to the wrong conclusion when they accused the ISPs of being responsible. The main provider having the problem with the ISPs, GTT, confirmed they were given extra capacity by some ISPs, dating back more than six months, and that GTT simply hasn’t turned up that extra capacity yet. Updated Tues June 30: GTT and AT&T have entered into an interconnection agreement.

The data that Free Press is highlighting shows that GTT, a transit provider that connects to many ISPs, was having capacity problems with AT&T, Comcast, and other ISPs in select cities. Naturally, everyone assumed it must be the ISPs’ fault, and interestingly enough, GTT told me that not a single member of the media or the Free Press contacted them for more details. I reached out to GTT and they were happy to set up a call and very easy to talk to. While GTT could not disclose full details on their peering agreements and relationships, I did confirm that multiple ISPs provided GTT with extra capacity, which the company is still in the process of turning up. But it doesn’t stop there.

GTT actually turned down capacity at interconnection points as they shift their flow of traffic, a result of acquisitions they have made in the market and of consolidating how they connect with ISPs. In the last six years, GTT has acquired five companies (WBS Connect, PacketExchange, nLayer Communications, the IP network Tinet from Inteliquent, and UNSi) and, a few months ago, announced an agreement to acquire their sixth, MegaPath.

As a result of the acquisitions, the nLayer side of GTT has been shutting down its connections, specifically AS4436, and moving that traffic over to the Tinet connections, AS3257. To make it simple to understand, GTT is consolidating networks and shifting how they connect to ISPs through different connections, while terminating others. So the capacity issues that the Free Press data shows are a result of GTT essentially shutting down those connections, not of any wrongdoing on the ISPs’ part. The M-Labs data that the Free Press is using is measuring a problem that GTT owns, and as GTT told me, extra capacity was made available to them before M-Labs even started their measurements.

Taking it a step further, public info that details GTT’s network shows that GTT/nLayer (AS4436) now has all traffic behind the SFI relationships of GTT/Tinet (AS3257) and is no longer connecting with other networks. When you look this AS up in the peering database, GTT’s entry says, “We are no longer accepting Peering requests for this ASN”. GTT has a lot of AS numbers, and looking at all of them shows the consolidation taking place and why they are no longer accepting peering requests for AS4436: it is being shut down.

Data also shows that GTT/nLayer (AS4436) was once connected to multiple networks and likely paying for connections to Tier 1 networks. Those paths still exist and the BGP information is still available, but will likely be gone soon. GTT/Tinet (AS3257) is a 1Tbps+ network with “balanced” traffic, and GTT/nLayer (AS4436) is a 1Tbps+ traffic source with “mostly outbound” traffic. Of course, none of this is info the average consumer would know how to look up or even understand, and that’s exactly what the Free Press and others are counting on. It was not hard to find the cause of the performance issues if you simply asked GTT, looked at public network info, and asked the ISPs.
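For anyone who wants to check this themselves, below is a minimal sketch of how those public records can be pulled programmatically. It assumes PeeringDB’s public REST API and its asn filter on the net endpoint; the exact field names could differ, so treat it as a starting point rather than a definitive tool.

    # Minimal sketch: look up GTT's ASNs in PeeringDB (assumes the public REST API
    # at https://www.peeringdb.com/api; field names may vary).
    import requests

    for asn in (4436, 3257):  # GTT/nLayer and GTT/Tinet
        resp = requests.get("https://www.peeringdb.com/api/net",
                            params={"asn": asn}, timeout=10)
        resp.raise_for_status()
        for net in resp.json().get("data", []):
            print(f"AS{asn}: {net.get('name')}")
            print(f"  traffic level:  {net.get('info_traffic')}")    # e.g. 1Tbps+
            print(f"  traffic ratio:  {net.get('info_ratio')}")      # balanced / mostly outbound
            print(f"  peering policy: {net.get('policy_general')}")
            print(f"  notes:          {(net.get('notes') or '')[:120]}")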

The takeaway from all of this is that many are far too quick to judge who is at fault when it comes to network performance topics, without talking to all the parties involved and having all the facts. GTT is making changes to their network, working closely with ISPs, already has the relationships in place, and is working to solve any performance problems. While some like to say that these networks can just “flip the switch” to fix issues, it does not work that way, especially when you are consolidating networks like GTT is. Many are quick to lay blame on ISPs just because it is fashionable to hate service providers, or to push an agenda like the Free Press does.

It’s clear that the Free Press should not be trusted, as they drew the wrong conclusions from the data to push their agenda. Even if they didn’t do it on purpose, it shows the Free Press has a complete lack of understanding of the data being collected. They don’t understand that when a Tier 1 network or CDN makes changes to their infrastructure, it impacts the data that is being collected. Don’t point fingers unless you talk to all the parties involved and review ALL of the data available in the market, not just a slice of it.

It should also be noted that GTT told me no one from the Free Press ever contacted them before the Free Press laid blame on the ISPs. If they had, GTT would have been able to inform the Free Press of some more details, which the Free Press neglected to even look into. I also find it interesting that while the Free Press says they are “fighting” for consumers’ rights to “communicate”, the Free Press doesn’t allow any comments on any of the posts they publish on their website. To try to discredit me, the Free Press has also called me a “paid cable-industry operative”, which isn’t true and is laughable to suggest, considering that I have written two detailed blog posts calling out Verizon for “poor business practices” just within the past three months. Apparently, sticking to the facts is not something the Free Press does very well.

The Free Press has no wiggle room on this and if they don’t edit their blog post to correct their accusations, then it only proves they care about their agenda, not the truth.

Note: For those that say I am standing up for the ISPs, I’m not. They can speak for themselves. This is about getting to the real root cause of the problem, not the “perceived” problem that someone like the Free Press likes to promote. Like many, I am tired of all the vagueness and double-speak around discussions involving interconnects, peering and transit topics. We need more clarity, not more politics.

The Guardian’s Story About ISPs “Slowing Traffic” Is Bogus: Here’s The Truth

On Monday, The Guardian ran a story with a headline stating that major Internet providers are slowing traffic speeds for thousands of consumers in North America. While that’s a title that’s going to get a lot of people’s attention, it’s not accurate. Even worse, other news outlets like Network World picked up on the story and re-hashed everything The Guardian said, but then mentioned they could not find the “study” The Guardian was talking about. The reason they can’t find the report is that it does not exist.

In an email exchange with M-Labs this morning, they confirmed for me that there is no new report since their last one, published on October 28th, 2014. So The Guardian wrote a story about a “study released on Monday”, referencing data from M-Labs, but provided no link to the so-called study. The Guardian does cite some data that appears to have been collected via the BattlefortheNet website, using M-Labs methodology, which relies on tests that end users initiate. Tim Karr of the Free Press, one of the organizations that makes up BattlefortheNet, is quoted in The Guardian post as saying, “Data compiled using the Internet Health Test show us that there is widespread and systemic abuse across the network.”

What The Guardian story neglects to mention is that the measurement methodology the Free Press is highlighting was actually rejected by the FCC in their Measuring Broadband America report. They rejected it because the data wasn’t collected in a real-world fashion, taking into account all of the variables that determine the actual quality consumers receive, as others have shown. (one, two, three) Updated 1:10 pm: M-Labs just put out a blog post about their data saying, “It is important to note that while we are able to observe and record these episodes of performance degradation, nothing in the data allows us to draw conclusions about who is responsible for the performance degradation.” M-Labs did not include a link to any “study” since they didn’t publish one, but you can see a Google Docs file of some of the data here. It’s interesting to note that the document has no name on it, so we don’t know who wrote it or published it to Google Docs. Updated 2:28 pm: Timothy Karr from Free Press has deleted all of the data that was in the original Google Docs file in question and simply added two links. It’s interesting to note that they published it without their name on it and only edited it once it was called into question.

Updated 2:07 pm: M-Labs has confirmed for me that they did not publish the Google Docs file in question. So the data and text that the Free Press was showing the media, to get them to write a story, have now been erased. This is exactly why the media needs to check the facts and sources instead of believing anything they are told.

If the Free Press put out any “study” on Monday using M-Labs methodology, it’s nowhere to be found on their website. So where is this “study”? Why can’t anyone produce a link to it? Mainstream media outlets that picked up on The Guardian story should be ashamed of themselves for not looking at this “study” BEFORE they ran a story. It is sloppy reporting to reference data in a story when you haven’t seen it yourself or even verified that a “study” exists.

Adding even more insult to injury, The Guardian piece shows no basic understanding of how traffic flows on the Internet, or of the difference between companies that offer CDN services and those that offer transit. The Guardian piece calls GTT a “CDN” provider when, in fact, they are nothing of the sort. GTT is an IP network provider; they offer no CDN services of any kind and don’t use the word CDN anywhere on their website. At least one other news site that also incorrectly called them this has since corrected it and gotten the terminology right. But once again, some news outlets simply took what The Guardian wrote without getting the basics right or checking the facts. Others did a good job of looking past the hype.

The Guardian piece also says that “Any site that becomes popular enough has to pay a CDN to carry its content on a network of servers around the country”, but that’s not entirely true. Netflix doesn’t pay a third-party CDN; they built their own. So you don’t “have” to pay to use a third-party CDN; some content distributors choose to build and manage their own instead. The Guardian piece also uses words like “speed” and “download” interchangeably, but these terms have very different meanings. Speed is the rate at which packets get from one location to another, while throughput is the average rate of successful data delivery over a communication channel.
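To make that distinction concrete, here is a rough, illustrative sketch of how the two would be measured differently; the URL is a hypothetical placeholder, not a real test file:

    # Illustrative only: "speed" here is how quickly the first byte arrives (latency),
    # while throughput is how much data is successfully delivered per unit of time.
    import time
    import urllib.request

    URL = "https://example.com/large-file.bin"  # hypothetical placeholder

    start = time.monotonic()
    with urllib.request.urlopen(URL, timeout=30) as resp:
        first_byte = resp.read(1)               # time to first byte roughly tracks latency
        ttfb = time.monotonic() - start
        body = first_byte + resp.read()         # then pull down the rest of the payload
    elapsed = time.monotonic() - start

    print(f"Time to first byte: {ttfb * 1000:.1f} ms")
    print(f"Throughput: {len(body) * 8 / elapsed / 1e6:.2f} Mbps")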

Even if The Guardian article was trying to use data collected via the BattlefortheNet website, they don’t understand what data is actually being collected. That data is specific to problems at interconnection points, not inside the last-mile networks. So if there isn’t enough capacity at an interconnection point, saying ISPs are “slowing traffic speeds” is not accurate. No ISP is slowing down the speed of the consumer’s connection to the Internet, as that all takes place inside the last mile, which is outside of the interconnection points. Even the Free Press isn’t quoted as saying ISPs are “slowing” down access speed, but rather that there isn’t enough capacity at interconnection points.

It should be noted that while M-Labs tells me they had not intended to release an additional report, because of The Guardian post, M-Labs will be putting out a blog post that broadly describes some of the noticeable trends in the M-Labs data and “clarifies a few other matters”. Look for that shortly. Update: M-Labs’ blog post is now live.

Thursday Webinar: Effective Multiplatform Delivery – Formats, Players and Distribution

Thursday at 2pm ET, I’ll be moderating a StreamingMedia.com webinar on the topic of “Effective Multiplatform Delivery”. You can register and join this webinar for free.

Streaming Vendor News Recap For The Week Of June 15th

Here’s a list of all the releases I saw from streaming media vendors for this week.