Streaming Media Magazine Readers’ Choice Awards: Nominations Closing

Over at StreamingMedia.com, we’re hard at work preparing for the next Streaming Media West show, taking place September 23-25 in San Jose. Once again, we’ll be holding a reception for the winners of Streaming Media magazine’s Readers’ Choice Awards. Last year’s awards were a smashing success, with 92 companies nominating more than 120 products and more than 3,000 Streaming Media readers logging their votes. Winners will be featured in a special section of the October/November issue of Streaming Media magazine as well as in a video feature posted on StreamingMedia.com. The deadline for end users or vendors to submit nominations closes today, so get your nomination in!

The editorial staff at StreamingMedia.com will evaluate all submissions to make sure they’re legitimate and proper fits for the categories in which they’re submitted. The final list of nominees will be posted on July 23, when voting will begin; voting remains open until August 25. Winners will be notified directly after voting closes.

Please note: I am not involved in any awards decisions, so all inquiries should go to Eric, the editor of StreamingMedia.com.


Amazon’s Cloud Not Being Used For New Video On Demand Service

Last week, numerous sites reported that Amazon would be storing and delivering the movies for its newly announced video on demand service from the Amazon cloud computing infrastructure. As of now, Amazon is not using its cloud computing infrastructure to store or deliver any of the movies, and Amazon has confirmed that its Web Services group is not involved with supporting the video on demand offering. While this could always change between now and when the product rolls out of beta, today the streaming and storage of content is being done via Limelight Networks.

This really comes as no surprise, since Amazon’s Web Services offering is not set up to do streaming, does not have a global footprint and, as we saw this past weekend, still has problems with major outages. Not that traditional CDNs don’t have downtime, but I can’t remember the last time an Akamai or Limelight was down for four hours at a time.

I’ve also learned that even when the Amazon video on demand service rolls out of beta, HD quality videos will not be available. No one will confirm the time frame for when HD quality will be added, but as of now, it won’t be anytime soon.

Amazon Not Building Out AWS To Compete With CDNs

Over the past few months, there has been a lot of speculation about what Amazon might be working on with its EC2 and S3 services and whether or not its AWS offering is going to compete with content delivery networks. Adding to the confusion, Jeff Bezos made a brief announcement of a new upcoming streaming service but was scant on details.

Today, Amazon’s new video on demand content offering went into beta. The new offering has nothing to do with AWS; it is simply a new content offering that enhances the functionality of Amazon’s legacy Unbox service. The Amazon Web Services group is not supporting the new offering, and the content is being delivered by Limelight Networks.

That being said, Amazon’s EC2 and S3 services continue to be a good option for developers who want to deliver their own video, and they provide a cheap and flexible way to do so. The Amazon service is not going after the same size customers the CDNs are and does not provide many of the elements a CDN does. Amazon is not going to take any major business away from the CDNs, for numerous reasons.

Amazon’s EC2 service is located only in the U.S., and S3 is located only in the U.S. and Europe. There is no global coverage with either service, although in an interview I did this week with Adam Selipsky, VP, Product Management and Developer Relations for Amazon Web Services, he did say that down the road they will expand coverage into the Asia Pacific market. Many customers who use CDNs do so in order to take advantage of global delivery. The average CDN customer is also not a developer, which is exactly who Amazon’s customers are. To use the EC2 and S3 services, you have to be a customer who wants to be very hands-on, do a lot of the work yourself and, in the case of streaming, license the server software from Adobe or Microsoft. Amazon does not provide any video-specific reporting tools, so AWS customers have to parse their raw logs themselves or run them through a third-party system built specifically for video.
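For a sense of what that log-parsing chore involves, here is a minimal sketch that tallies bytes delivered per video from S3 server access logs. The field layout follows Amazon’s published access log format, but the log file name, the video extensions and the aggregation logic are hypothetical choices of mine, not anything AWS provides:

```python
import re
from collections import defaultdict

# Leading fields of an S3 server access log line:
# owner bucket [time] ip requester request_id operation key "request_uri" status error bytes_sent ...
LOG_LINE = re.compile(
    r'^(?P<owner>\S+) (?P<bucket>\S+) \[(?P<time>[^\]]+)\] (?P<ip>\S+) '
    r'(?P<requester>\S+) (?P<request_id>\S+) (?P<operation>\S+) (?P<key>\S+) '
    r'"(?P<request_uri>[^"]*)" (?P<status>\S+) (?P<error>\S+) (?P<bytes_sent>\S+)'
)

def bytes_per_video(log_path):
    """Tally bytes delivered per video object from raw S3 access logs."""
    totals = defaultdict(int)
    with open(log_path) as logs:
        for line in logs:
            match = LOG_LINE.match(line)
            if not match:
                continue
            key, status, sent = match.group("key", "status", "bytes_sent")
            # Count successful fetches of video objects only (hypothetical extensions)
            if status == "200" and sent.isdigit() and key.endswith((".flv", ".mp4")):
                totals[key] += int(sent)
    return dict(totals)

print(bytes_per_video("s3_access.log"))  # e.g. {'videos/demo.flv': 10485760}
```

Even this toy version shows why most AWS customers end up handing the job to a third-party reporting system.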

The Amazon service also does not work well for live streaming, as EC2 and S3 are not set up for edge delivery and S3 was designed for storing objects, something that does not take place when the stream is live. There are also no additional content services in the ecosystem that many customers need, like transcoding, authentication or stream protection. The bottom line is that right now, the EC2 and S3 products are for a very different set of customers than the CDNs are targeting. If you are a developer who needs U.S.-based delivery, Amazon could be a really good fit, and they publish their pricing on their website, something no major CDN does.

What I think Amazon should do is license the Adobe Flash Media Server directly from Adobe and then rent it out on an hourly basis, like Amazon already does with other third-party platforms. This would enable more developers who need FMS to look at Amazon as an option, instead of having to go and buy their own license from Adobe.

While Amazon’s AWS service is not a fit for most of those who use a CDN today, it is interesting to see how some CDNs are using Amazon’s service to their advantage. Digital Fountain is building its streaming-only, U.S.-based CDN on Amazon Web Services, and other CDNs like Voxel.net have direct integration with Amazon’s S3 API.

Note: Werner Vogels, Amazon’s CTO, will be one of our keynote speakers at the Streaming Media West show in September in San Jose. Registration is open and all keynotes are free. Register now for your pass to see Werner talk more about Amazon’s Web Services.

YouTube Coming To TiVo, But For Less Than 750,000 Users

Updated Post: TiVo and YouTube announced a deal today that brings YouTube content to TiVo Series 3 customers. While it’s good to see TiVo add more content to their box, something I think they have been severely dragging their feet on, unfortunately it won’t have any major impact for either company, since it is only available to TiVo customers with a Series 3 box connected via broadband. While TiVo has about 4 million subscribers, only about 750,000 of those are on a Series 3. And of those 750,000, how many are connected via broadband and not just a phone line? Many friends of mine have a Series 3 but only connect via a phone jack, since they don’t have a TiVo WiFi adapter and the phone jack is near the TiVo and easy to plug into.

While TiVo does not give out exact numbers on how many Series 3 boxes they have sold, TiVo did e-mail me on 7/21 to say that they have 750,000 Series 2 AND Series 3 TiVos connected via broadband, excluding DTV and Comcast. Since there are more Series 2s out in the market, the number of Series 3 TiVos within that 750,000 is far lower. So while the new YouTube offering is available to fewer than 750,000 customers, a more realistic number is probably under 300,000, if you figure that half of the broadband-connected TiVos, if not more, are Series 2. (NewTeeVee has a video of the new service here.)

While this will be nice for some customers, the majority of those who view YouTube content will still do so from a computer and not from the TV. Since the Apple TV announcement with YouTube, I have not seen any reports from either company as to what kind of viewing numbers YouTube is getting on Apple’s device, and clearly they are small. This new announcement is a nice-to-have option, but it won’t enable TiVo to sell more units and will make no impact in helping YouTube try to make money from eyeballs.

BT To Launch FiOS-Like Fiber Service In The UK

Yesterday, BT announced that it would spend roughly US$3 billion to roll out a new fiber-based broadband service capable of supporting up to 100Mbps, with the goal of making it available to 10 million homes by 2012. Similar to AT&T’s UVerse service, in most cases the BT fiber will run only to the street cabinet, called FTTC (fiber-to-the-cabinet), rather than all the way to the home, called FTTP (fiber-to-the-premises), as Verizon’s FiOS service does. The Olympic village and other select locations will have fiber connected directly to the premises, but those look to be less than 10% of the installs.

Reports say that BT will provide customers with an initial speed of 40Mbps, increasing to 60Mbps as the technology is enhanced. Today, BT’s standard broadband speed tops out around 8Mbps, with ADSL2+ offering a maximum of 24Mbps. However, a large portion of BT’s UK broadband customers still complain that they see nowhere near the advertised 8Mbps rate and are being misled. Virgin Media is hitting back at BT’s announcement, saying it has already spent $26 billion to make 50Mbps rates available to 12 million customers in the UK by the end of this year.

Reports say that the UK still lags behind economies such as France, South Korea and Japan in terms of the maximum broadband speeds available to consumers. The bottom line for folks in the UK is that more competition is coming, which means speeds should increase and prices should go down over time. More competition for truly fast broadband connections is needed in the UK, and those connections need to be rolled out in outlying areas and not just major cities, something BT says it is committed to doing.

Q&A With John Dillon, CMO of Hybrid CDN Velocix

Following up on my post last week about hybrid-based CDN Velocix, I spent some time chatting with their CMO, John Dillon, about the company’s hybrid offering, what he’s seeing from P2P customers in Europe and how that compares with the U.S. market.

Question: When do you expect the company to be profitable?

John: We don’t make any forward-looking statements about our financial position. Velocix is privately held and backed by two of Europe’s leading venture capital firms – 3i and Amadeus.

(Note from Dan: To date, Velocix has raised just over $40 million and is not yet profitable. I estimate they will do $6-9 million in revenue for 2008.)

Question: How is the European market for P2P services different than the U.S. market?

John: Outside of the UK, the understanding of and level of interest in the use of P2P technology for delivery of legitimate commercial services is fairly consistent. It is widely accepted that this technology will play a fundamental role in shaping the future of online video. The reality is that the majority of the accounts we have today are interested in P2P but are using our traditional http and streaming services. Right now, they just want to get launched so that they can start to build an audience.

The benefits of hybrid P2P are not always obvious at the outset. It is only when online services begin to gain traction with significant audiences that the cost of delivery and scalability become significant factors. This is when P2P begins to look increasingly attractive as an option. In the UK, however, there is a micro-climate around a number of the major broadcasters. The BBC, C4 and Sky have all successfully launched P2P-download-based catch-up TV services.

Question: What percentage of your revenue comes from the U.S. today, and how do you expect that to grow moving forward?

John: Our business splits out at approximately 40% U.S., 50% EMEA (Europe, Middle East, Africa) and 10% Asia Pacific. The U.S. is a key growth market for us. As a European-headquartered company, it is fair to say that we currently have a stronger market presence in our home market than elsewhere. However, we have just secured a number of key strategic wins in the U.S. and will be looking to accelerate our growth plans in this geography off the back of these deals.

(Note from Dan: Some of these U.S. based wins John references are significant and this is not a case of the CMO just giving marketing speak. I’ll detail some of these wins at a later date, when I am allowed to talk about them.)

Question: Why do you think so many broadcasters in the UK have started to use P2P in some form, while no major broadcasters in the U.S. have adopted it as of yet?

John: I referred to this phenomenon earlier as a micro-climate surrounding leading broadcasters in the UK. The reality is that these guys were the pioneers. There were few if any other examples for them to follow at the time; they were blazing a new trail. In late 2005, the BBC began what was, at the time, one of the first commercial P2P trials. The first to launch, however, was UK satellite TV provider Sky, with their Sky by Broadband service, subsequently re-branded as Sky Anytime. Next to launch was Channel 4, with their 4oD catch-up service. Finally, and arguably most significant of all, was the launch of the BBC iPlayer service last year, augmented with streaming services this past Christmas.

I’m guessing that in those early days, ideas and plans were shared and they all ended up taking a very similar approach. What is interesting to note is that, looking forward, they plan to officially collaborate, learning from their collective experiences to date, to unify their approach in a project announced and code-named Kangaroo.

Question: What is the barrier to entry for CDNs to make their stand-alone CDN offering a hybrid one, and what is the cost/development time?

John: A number of traditional CDNs have made noises about hybrid P2P. Some have made technology acquisitions and others have formed strategic partnerships. Few have actually launched commercially available services, however, and little if any focus or marketing is evident. This is most likely due to both economics and technology.

First, economics. Say, for example, a major customer of a traditional CDN provider could achieve 30% peer efficiency (30% of delivery served from peers rather than from CDN caches) on average. That would be a 30% reduction in revenue and a significant reduction in profit contribution for the CDN. Significant market uptake would challenge quarterly-driven, publicly quoted CDN providers, placing intense pressure on existing business models and cost structures.
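(Note from Dan: to make John’s arithmetic concrete, here is a trivial sketch of the revenue effect he describes. The traffic volume and per-GB price below are made-up numbers of mine, not Velocix or CDN figures.)

```python
def cdn_revenue(gb_delivered, price_per_gb, peer_efficiency):
    """Bytes served by peers are bytes the CDN can no longer bill for."""
    billable_gb = gb_delivered * (1 - peer_efficiency)
    return billable_gb * price_per_gb

# Hypothetical: 1 PB per month at $0.30 per GB, without and with 30% peering
print(cdn_revenue(1_000_000, 0.30, 0.0))   # 300000.0
print(cdn_revenue(1_000_000, 0.30, 0.30))  # 210000.0, a 30% revenue cut
```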

From a technological perspective, bolting a P2P client network onto an http caching infrastructure is clunky at best, with caches providing “fill-in” via http byte-range requests. Custom routing and delivery logic and algorithms are required at both the client and server end, to force the network into performing unnatural acts to fulfill the delivery requirement. Hybrid P2P requires a company to have the appropriate business model and a network architected in a fundamentally different way. It took us roughly 18 months to build out our network, so the barriers to entry are significant.
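(Note from Dan: for readers unfamiliar with the “fill-in” mechanism John mentions, a byte-range request simply asks an http server for one slice of a file. The sketch below is illustrative only; the URL, piece size and piece index are all hypothetical.)

```python
import urllib.request

def fetch_piece(url, piece_index, piece_size=256 * 1024):
    """Fetch a single piece of a file via an http byte-range request,
    the way a hybrid client might backfill a piece no peer can supply."""
    start = piece_index * piece_size
    end = start + piece_size - 1
    request = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(request) as response:
        # 206 Partial Content means the server honored the Range header
        if response.status != 206:
            raise RuntimeError("server ignored the Range header")
        return response.read()

piece = fetch_piece("http://cache.example.com/movie.flv", piece_index=42)
```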

Question: Please explain a little bit about how your network was built to support P2P from day one and how that is different from the other providers.

John: From the outset, we wanted to create a CDN optimized for delivery of large assets such as video, software and games. We realized very early on that the http protocol is not great for this. Http is ideal for serving web pages where connections are maintained for a few seconds. Even if a request fails, simply clicking refresh is acceptable and usually fixes the problem. There are two fundamental limitations:

1) For delivery of larger assets, like video for example, connections can last anything from a few minutes to several hours. Clicking refresh after a failure midway through is not an acceptable option. Http is a single-source protocol, which represents a single point of failure.

2) CDNs essentially replicate popular content on cache servers located around their networks. This is relatively straightforward for small files, but becomes problematic for larger, multi-gigabyte files. A significant shift in traffic profile can blow existing CDN routing and caching algorithms out of the water! Distributing and storing multi-gigabyte files on a caching server network is a major operational challenge, particularly for traditional CDN providers who created and optimized their networks for website acceleration.

What is interesting to note is that http’s limitations are essentially where P2P’s strengths lie. P2P is a proven technology optimized for the delivery of massive files over time, rather than web pages in a few seconds. P2P protocols are designed to take content from multiple source locations rather than a single source, eliminating the single point of failure. Also, rather than the entire file being the smallest unit of currency, P2P slices large files into thousands of pieces, making them much easier to propagate across highly distributed networks.
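(Note from Dan: as a rough illustration of that last point, here is a minimal sketch of slicing a file into fixed-size, individually hashed pieces, in the spirit of BitTorrent-style protocols. The piece size is an arbitrary choice of mine.)

```python
import hashlib

def slice_into_pieces(path, piece_size=256 * 1024):
    """Split a file into fixed-size pieces and hash each one, so pieces can
    be fetched from different sources and verified independently."""
    piece_hashes = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(piece_size)
            if not chunk:
                break
            piece_hashes.append(hashlib.sha1(chunk).hexdigest())
    return piece_hashes

# A 2GB video yields roughly 8,000 independently verifiable pieces at this size
print(len(slice_into_pieces("movie.flv")))
```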

These observations served as the design goals and fundamental architectural principles for the build-out of our CDN. Within our network, our cache and storage servers communicate using P2P protocols, and all routing and management intelligence is based on P2P principles. We essentially have a high-performance P2P cloud network, where the peers are high-performance cache servers.

When a file is requested, our network uses sophisticated cache selection algorithms to identify a number of suitable cache servers for the delivery. This selection is made using both performance and economic criteria, to maintain the required delivery speed at the lowest possible cost. The selected cache servers communicate with each other and also “chatter” with other servers on our network to make sure they have the content required to service the delivery. The delivery process dynamically blends content and bandwidth from the selected cache servers, ensuring that the resulting bitrate meets committed service levels. If at any time performance from a delivery cache server degrades, either the others up their output or an alternative cache server is brought on stream.
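(Note from Dan: to give a flavor of that selection step, here is a purely illustrative toy that ranks candidate caches by cost and keeps adding them until their combined capacity covers the target bitrate. Velocix has not disclosed its actual algorithm, and the server data below is made up.)

```python
def select_caches(caches, bitrate_needed_mbps, max_caches=3):
    """Pick cache servers to blend for a delivery, cheapest first, until
    their combined throughput covers the required bitrate."""
    selected, capacity = [], 0.0
    for cache in sorted(caches, key=lambda c: c["cost_per_gb"]):
        if capacity >= bitrate_needed_mbps or len(selected) == max_caches:
            break
        selected.append(cache["name"])
        capacity += cache["available_mbps"]
    return selected

caches = [
    {"name": "cache-a", "cost_per_gb": 0.02, "available_mbps": 2.0},
    {"name": "cache-b", "cost_per_gb": 0.01, "available_mbps": 1.5},
    {"name": "cache-c", "cost_per_gb": 0.03, "available_mbps": 4.0},
]
print(select_caches(caches, bitrate_needed_mbps=3.0))  # ['cache-b', 'cache-a']
```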

Which Newswire Service Is Best For Journalists?

I’m trying to get some feedback on which newswire service is best for journalists who want to be notified via e-mail the moment a release from a particular company, or containing a particular word, hits the wire. While I already get plenty of releases sent directly to me, via RSS feeds and via Google Alerts, I’m interested to hear what others may be using. I am looking at PRNewswire.com’s service for journalists, but so far I have not been impressed with the releases it sends me. I welcome any feedback, either in the comments section or sent to me directly.