
Shifting Video Viewing Behavior Is Forcing Publishers To Revamp Their Cross-Device Programming Strategy

Data from Adobe shows that tablet and smartphone viewing accounted for nearly 40 minutes of daily video viewing in 2015. This growth has not come at the expense of desktop or connected devices, and mobile will continue to be a major story in 2016 as it drives overall growth in video consumption. While this is good news overall, it presents a number of new challenges for publishers in 2016.

What this data shows is that video viewers are increasingly accessing content through multiple entry points throughout the day. These entry points, by the nature of their technology and context, offer unique user experiences. What works on desktop can be intrusive, clunky, and bandwidth-hogging on mobile. Those 37 minutes of desktop and connected-device viewing are more continuous than the hop-on/hop-off viewing habits of mobile.

Anonymized data from a publisher that began with a mobile-to-desktop split of 81/19 in September 2015 and moved to close to 50/50 by year-end by increasing overall desktop views.

Iris.TV shared with me data from one of their customers that publishes to both web and mobile environments and saw desktop views increase to the point where, by month four, the share of views was split evenly between desktop and mobile. There was an increase in Video Lift and a reduction in bounce rate across the board as well. Iris.TV measures the average video engagement in a viewing session as Video Lift, which is the number of recommended video views divided by the number of initially clicked views. Bounce rate is the measure of viewers exiting the viewing experience before completing the initially clicked video. Mobile bounce rates began at 84% but fell to 75% over time, while Video Lift increased from 22% to 39%.
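To make the two metrics concrete, here is a minimal sketch of how Video Lift and bounce rate, as defined above, could be computed from session data. The field names and the compute_metrics helper are hypothetical illustrations, not Iris.TV's actual API.

```python
def compute_metrics(sessions):
    """Each session dict records the recommended views watched after the
    initially clicked video, and whether that initial video completed."""
    initial_views = len(sessions)
    recommended_views = sum(s["recommended_views"] for s in sessions)
    bounces = sum(1 for s in sessions if not s["completed_initial"])

    video_lift = recommended_views / initial_views   # recommended / initial clicks
    bounce_rate = bounces / initial_views            # exits before completion
    return video_lift, bounce_rate

# Toy example matching the end-state mobile numbers in the post:
# 100 sessions, 39 recommended views, 75 bounces
sessions = (
    [{"recommended_views": 39, "completed_initial": True}]
    + [{"recommended_views": 0, "completed_initial": False}] * 75
    + [{"recommended_views": 0, "completed_initial": True}] * 24
)
lift, bounce = compute_metrics(sessions)
print(f"Video Lift: {lift:.0%}, bounce rate: {bounce:.0%}")
# -> Video Lift: 39%, bounce rate: 75%
```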

By better engaging the user on mobile, publishers can organically drive them to desktop, where there is higher video adoption and lower bounce, which translates into more videos completed per session and more ads served. The data shows that publishers can grow overall value by programming content across devices to engage viewers. User engagement is a critical driver of adoption and retention across platforms, but especially on mobile. For the most part, programming a consistent user experience across devices has been the domain of the TVE/SVOD/OTT offerings from Netflix, Amazon, and the like. Viewers of films and serialized TV are able to pick up where they left off across devices with authentication via logins. YouTube and a few other mobile apps have been able to offer cross-device experiences, but they are few and far between.

The portion of the market that should be innovating around cross-device programming is publishers of short-form ad-supported video. Upwards of 70% of short-form video views are driven by social media, followed by audience development and organic traffic. Social is a mobile medium, with Facebook leading the way. Publishers need to manage the mobile social video experience to drive engagement, but not become too dependent on these platforms for video. By not programming better on their O&O mobile web properties, they are leaving money on the table and missing opportunities to increase user engagement and retention across all entry points.

So what can publishers do? Budgets are tight and not everyone can invest in video-centric apps. But publishers can and should be using data-driven business intelligence to determine which videos perform well and in what context. Focusing on video performance with respect to content category, device, content length, social engagement, and time of day would improve the user experience.
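As a rough illustration of that kind of analysis, the sketch below aggregates completion rate by content category, device, and daypart. The field names are hypothetical; any video analytics export with per-view context could be sliced the same way.

```python
from collections import defaultdict

def performance_by_context(views):
    """Aggregate completion rate by (category, device, daypart).
    Dayparts here are simple 6-hour buckets, an arbitrary choice."""
    totals = defaultdict(lambda: [0, 0])  # context -> [completed, total]
    for v in views:
        key = (v["category"], v["device"], v["hour"] // 6)
        totals[key][0] += v["completed"]
        totals[key][1] += 1
    return {k: completed / total for k, (completed, total) in totals.items()}

# Hypothetical per-view records
views = [
    {"category": "news", "device": "mobile", "hour": 8, "completed": 1},
    {"category": "news", "device": "mobile", "hour": 9, "completed": 0},
    {"category": "news", "device": "desktop", "hour": 20, "completed": 1},
]
print(performance_by_context(views))
```

A publisher could then program each device and daypart with the categories and lengths that actually complete there, rather than pushing one lineup everywhere.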


Internet Measurement & Optimization Provider Cedexis Raises $22M In B Round

Cedexis, best known for their internet measurement and optimization platform, has raised $22M in a B round led by Ginko Ventures. Foxconn, Nokia Growth Partners (NGP), and Citrix Systems Ventures, as well as Cedexis’ Series A investors Advanced Technology Ventures and Madrona Ventures, also contributed to the funding. Cedexis has now raised $33M and has established itself as the company to watch in this segment of the market. Customers adopting interesting and complex cloud technologies are relying on the Cedexis platform, where QoE measurement is vital to their business.

While other vendors provide some of the features Cedexis has, none of them have the measurement data that Cedexis has, from 800+ enterprises and every major cloud and CDN provider in the world. Cedexis is at the forefront of enabling cloud-based application architectures that focus on end-user QoE as a core design consideration, and I’ve yet to come across a single Cedexis customer that doesn’t absolutely love what they do. If I had to make a list of the top five companies doing something truly unique and valuable tied to content delivery, I’d probably have Cedexis at the top of the list.

Cedexis plans to use the money to expand their sales, marketing and engineering headcount, adding about 20 new employees this year. The company also plans to expand their services and reach into Europe and Asia and add more solutions for video, Software-as-a-Service (SaaS) services and web application delivery. Cedexis is seeking Big Data, Cloud and Networking Infrastructure Engineering talent, as well as Sales and Marketing professionals with expertise in Enterprise SaaS software. Anyone interested in learning more about the positions Cedexis has open should visit their career page. You can also send me an email with your resume, and I’ll pass it along to the company.

 

Recent CDN Market Sizing Reports Are Flat Out Wrong: Terrible Data, Lots Of Errors

Over the past two months, I’ve seen quite a few market sizing reports released about the content delivery market. Unfortunately, the reports are really bad, with factual errors so obvious that it makes you wonder how any of the companies publishing them could expect anyone to buy them.

For example, multiple reports list vendors in the competitive section that haven’t been in the market in years. Cotendo was acquired by Akamai four years ago and should not be on any competitive list. Skytide, which isn’t even a CDN, was acquired over two years ago. Another report lists Telestream, Kaltura, Cisco and Adobe as leading CDNs, which of course they aren’t.

Many reports also spell vendor names wrong. They don’t know if a company name is one word or two, and don’t know how to properly capitalize it. That might seem trivial, but it shows they really don’t know the market if they can’t even list vendor names properly.

Another problem is that these reports roll revenue from products and services that have nothing to do with CDN into their market sizing. For instance, one report gives the size of the CDN market but includes revenue from “data security”, “cloud storage”, “transcoding” and “digital rights management”. Those aren’t CDN services.

And the press releases promoting these reports sound like they were written by someone who doesn’t understand English. You don’t deliver “contents”. And I like this line that says, “The recent improvement in bandwidth cost has vastly increased the consumption of internet.” You don’t “consume” the Internet, you consume content over the Internet.

It’s also a dead giveaway that something is wrong with the report when they don’t list the name of the author who wrote it. If you don’t know the expertise of the analyst who wrote it, why would you buy it? And yet, these firms will gladly accept your money, usually $3K-$5K, and sell it to you anyway. Bottom line, don’t waste your money on any CDN report that doesn’t list who the analyst is, doesn’t list the leading vendors properly and doesn’t describe the methodology used to provide the market sizing numbers. Any good analyst will be happy to describe their methodology to you, before you buy the report.

So what is the size of the CDN market? It all depends on which CDN service you are looking at. The size of the market for video delivery is very different from the size of the market for dynamic content delivery, which is different from the app acceleration market. In 2015, the size of the CDN market for media and software delivery was $4 billion. Other segments of the market are harder to nail down, but using the right methodology, you can come up with good estimates. If you have questions on the size of any segment of the CDN market, feel free to reach out to me at any time. I’ll share what I have.

Samsung’s SmartThings Home Automation Platform Is So Unreliable, They Should Stop Selling It

In addition to being interested in streaming media technology, I also test out and use a lot of home automation platforms and security cameras. I have systems from Canary, Sharx, Logitech, D-Link, Nest, Arlo, Piper, Archos, Wemo, SmartThings, Wink, Wirelesstags.net and others that get well tested in my household. And while I don’t usually blog about home automation technology, my experience with the SmartThings platform has been so horrible, I wanted to make my comments public in the hopes that SmartThings, or someone from Samsung, will actually care enough to fix it.

For over a year now, SmartThings has struggled to do something simple: turn my lights on and off. I have three lights set up to go on at 4pm and scheduled to turn off at 11pm. And yet, twelve months later, SmartThings still isn’t reliable. Sometimes the lights work perfectly; other times they don’t work for days at a time. Sometimes only one will turn on and off, while the others don’t do anything. Making the problem worse, SmartThings, which is owned by Samsung, doesn’t even have a support number you can call for help. All support is done in a chat window, which makes the support process very cumbersome.

Months ago, after complaining on Twitter to the CEO of SmartThings, he had someone from support call me, and since then, I’ve had three different calls with support. And yet the lights still don’t turn on and off correctly. On each support call, they have acknowledged that something on their end isn’t working right, and they’ve never fixed it. I’ve been told that the issues I have are due to some “platform issues” as well as “outages” and that sometimes “devices act up.” After they think they have fixed it, they have emailed me to say that “things should be considerably smoother now than the last week or so,” except they aren’t.

The SmartThings platform simply does not work. And if I can’t rely on it to turn lights on and off, how could I ever trust it to handle security in my home? Another thing that doesn’t work right is their iPhone app. Every couple of days, it asks me to log in again, not keeping any of my info stored. And many times, the light icon will show green, indicating the light is on, when in fact it is off. Even SmartThings support told me that the way their system works, if an action does not trigger, it won’t then see any future scheduling you have set up. So if it breaks once, it won’t let any of the other actions you have scheduled fire. Of course, this makes no sense at all, and SmartThings support agreed that it is something they need to “work on”.

SmartThings advertises their platform as “intelligent living” and yet it can’t even turn lights on and off reliably. There is nothing smart or intelligent about their platform at all. It is hands-down the most unreliable technology in my house, other than the Nest smoke alarms, which I already sent back last year. Samsung should make SmartThings fix their platform, or just stop selling it altogether.

Facebook Details How Their CDN Works, Discusses Latency, Scaling, and Caching

In an interesting post yesterday, Facebook took to their blog to shed some light on how their CDN works for delivering live video for their recently released Live for Facebook Mentions offering. The post discusses how they handle scaling, latency, encoding, edge caches and proxies. You can read more about it here.

Streaming Video Alliance Ends 2015 With 38 Members, New Working Groups

In November, members of the Streaming Video Alliance met at Fox Studios in Hollywood to define the future of online video and celebrate the Alliance’s one-year anniversary.

SVA at Fox Studios, Nov 18, 2015

The alliance welcomed new members including Concurrent, Encompass Digital Media, IneoQuest, Mobolize, NBCUniversal, Verimatrix and Vubiquity. Total membership in the alliance now stands at 38, with more coming on board in the new year.

The association is hard at work evaluating subjects for working groups, including Ad Insertion and Audience Measurement, Client Dev Framework, Encryption/Privacy, Geo Caching, Accessibility and Scaling, amongst others. We will announce the formation of the final working groups in the new year. If you have any questions on joining the alliance, please reach out to me at any time.

Find Anomalies: It’s Time for CDNs to Use Machine Learning

CDNs play a vital role in how the web works; however, the volume, variety and velocity of log files that CDNs generate can make issue detection and mitigation exceedingly difficult. To overcome this data challenge, you need to first gain an understanding of your normal CDN patterns, and then identify activity that deviates from the norm, in real time. This is where machine learning can take CDN services to the next level.

The Challenges
CDN operators need to be notified immediately about sudden increases in bandwidth consumption at the PoPs or proxies in a network in order to take corrective action. The identification process begins by understanding which sources (customers) caused the abnormal peaks in bandwidth consumption. Without the ability to quickly obtain insights from fresh data, you won’t be able to foresee issues and prevent them from escalating into full-blown headaches.
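One simple way to flag the abnormal bandwidth peaks described above is a rolling z-score over per-customer bandwidth samples. This is a toy baseline sketch, not Anodot's algorithm; a production system would need seasonality handling and streaming updates.

```python
import statistics

def bandwidth_anomalies(samples, window=30, threshold=3.0):
    """Flag samples that deviate more than `threshold` standard
    deviations from the trailing window's mean."""
    alerts = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mean = statistics.fmean(history)
        stdev = statistics.pstdev(history)
        if stdev and abs(samples[i] - mean) > threshold * stdev:
            alerts.append(i)
    return alerts

# Steady traffic around 100 Mbps with one sudden spike at sample 45
traffic = [100 + (i % 5) for i in range(60)]
traffic[45] = 400
print(bandwidth_anomalies(traffic))  # -> [45]
```

Run per customer and per PoP, a detector like this points straight at which source caused the peak, instead of waiting for someone to query the logs after the fact.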

Another challenge arises from CDN system upgrades. In order to run A/B tests on specific segments of proxies, it is best to deploy system upgrades gradually. That way, you can better prevent the possibility of widespread errors. Today, CDN providers still lack the real-time visibility needed to address basic issues such as an increase in HTTP errors, IO access or cache churn rates. This pitfall results in delayed upgrade releases, which in turn directly impacts CDN providers’ pace of innovation.

I recently had an interesting discussion about these challenges with David Drai, founder of Anodot. In the past, Drai co-founded Cotendo, which built a system to optimize CDN consumption. In 2012, Cotendo was acquired by Akamai and Drai became Akamai’s CTO of EMEA. David said one of the main cases he remembers revolved around a version release. “We had this bug that we didn’t uncover during an A/B test of the version, and we released it to all network proxies. As the person in charge, it was a complete disaster.”

Log Analytics Is Just Not Good Enough
To cope with these challenges, CDN providers leverage log analytics systems that gather and record billions of transaction logs from the relevant proxies. In most cases, these tools are built in-house and are used to run queries and retrieve insights about network performance. But that doesn’t suffice. In some cases, these are legacy systems that don’t scale, and these tools are typically not intelligent enough to automatically provide results in real time. As a result, a generated report may be based on relatively old data, which in turn leads to delayed or outdated responses.

“Say a CDN operator wants to get information about a specific customer’s RPS consumption rate per proxy and per PoP for a whole month. To obtain this information, the log management solution needs to scan billions of customer logs and extract the desired customer’s transactions, which can take days,” Drai explained. Another challenge CDN providers face is visibility into the operator’s network performance. CDN providers use tools such as Keynote, Gomez and Catchpoint, which measure network latency, to switch to other providers if and when the need arises. However, although these solutions provide insights in real time, it is still a challenge to correlate current issues with an operator’s performance.

David says that “when dealing with CDN issues, time is of the essence. The user download rate of one of our gaming customers at Cotendo decreased by 10-15% due to an issue that took us almost a week to detect. In the world of CDN, that kind of delay can significantly damage the CDN provider’s reputation.” Last but not least, one of the main issues with most traditional analytics systems, as well as modern log management tools, is their reliance on manual, preconfigured dashboards, reports and alerts. In the dynamic world of CDN, there is no limit to the new issues and information that can be extracted from the vast amount of available data.

We Need a Different Approach
What if we could predict a bottleneck in an internet router not solely based on simple BGP rules, but on true science? Over the last decade, data analytics technologies have evolved from complex and cumbersome solutions to modern and flexible big data solutions such as Hadoop. Over the last few years, these big data technologies, including Cassandra and MongoDB, have gained the industry’s trust and have become an important component in every IT environment. The next step involves incorporating new analytics solutions that allow you to run queries on top of these data engines: think of Google Analytics for CDN providers.

However, with all of the great advancements being made in the realm of big data, the existing monitoring tools aren’t enough. As noted above, current monitoring tools rely on human analysts who define and create flat reports and dashboards. Even with numerous different reports, when it comes to CDN patterns, reality has proven that you can’t cover all cases and be notified in real time as abnormal behavior develops. It is simply not feasible when you are talking about tens of thousands of different data points in multiple dimensions.

The next step in the world of analytics is machine learning. The ultimate solution is to automate data-based learning, then develop insights and make relevant predictions. This new discipline involves running pattern recognition algorithms and predictive analytics. And while it may seem far-fetched, it is already in the works by Drai and his team at Anodot. They are aiming to solve the CDN challenges outlined above, as well as additional use cases that predictive analytics solutions can help with. David says Anodot’s algorithms learn and continuously refine what normal CDN behavior looks like, and can therefore send out alerts about anomalies and automatically correlate different data points. For example, the system will alert only if there is an increase in the number of HTTP errors across several proxies. “The key is zero human configurations.”
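The cross-proxy correlation David describes might look something like this. A hedged illustration only: the fixed baseline and proxy names are invented for the example, whereas a learned system like Anodot's would derive its thresholds from the data.

```python
def correlated_error_alert(error_rates, baseline=0.01, min_proxies=3):
    """error_rates maps proxy name -> current HTTP error rate.
    Raise one network-level alert only when several proxies exceed the
    baseline together, instead of one noisy alert per proxy."""
    anomalous = [p for p, rate in error_rates.items() if rate > baseline]
    if len(anomalous) >= min_proxies:
        return {"alert": "http_errors_network_wide",
                "proxies": sorted(anomalous)}
    return None  # an isolated blip on one proxy stays quiet

# Hypothetical snapshot of error rates across four proxies
rates = {"pop-nyc-1": 0.04, "pop-nyc-2": 0.05,
         "pop-lon-1": 0.03, "pop-sfo-1": 0.002}
print(correlated_error_alert(rates))
```

Collapsing correlated symptoms into one alert is what keeps an anomaly system usable at the scale of thousands of proxies.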

Final Note
In a previous article, I wrote about Apple’s multi-CDN strategy. Think about a world where advanced analytics systems predict bottlenecks and automatically route traffic to the most appropriate CDN and optimal proxy. Predictive analytics can be a great solution to the challenges CDN providers have faced for years, and it seems machine learning solutions such as Anodot will be able to take this industry to a new level, creating great new development opportunities in a world that has suffered from complexity and a lack of visibility for years.