Brightcove’s Jeremy Allaire To Step Down As CEO, Transition To Board Position

Brightcove just announced their Q4 and fiscal year 2012 earnings, with total revenue coming in at $88M for 2012. The company also announced that Jeremy Allaire will step down as CEO and turn the position over to David Mendels, Brightcove’s current President and COO. Jeremy will transition out of the CEO role by the end of Q1 and will become Executive Chairman of the Board.

While some are already asking me if Jeremy is being pushed out of the company, I have no reason to believe he is. Even though Brightcove’s stock price is near its 52-week low, I don’t think that has anything to do with the decision, as some have suggested. Jeremy is quoted as saying that David is “the right person to lead Brightcove into its next stage of growth,” and David was previously the SVP and GM of Adobe’s Business Productivity Unit, which generated over $1B in revenue. So the guy clearly knows something about building and ramping revenue.

While I haven’t yet had the chance to speak to either Jeremy or David about the news, this simply sounds like Brightcove taking the steps they think are best to grow the company over the coming years. Brightcove needs to ramp revenue and get to that next growth stage, and it sounds like they think David is the guy to do that. Plus, Jeremy’s been leading Brightcove for the past eight years, which is a long time to be CEO in any industry, and bringing in new blood isn’t always a bad thing. So while some are suggesting to me that this change is bad news for Brightcove, or that Jeremy stepping down is due to something negative at the company, I see no reason why that would be the case. They are putting an executive in the CEO role who has experience growing and managing products and services that generated $1B in revenue. I don’t see that as a bad thing.

For 2013, Brightcove says they expect revenue to be in the $102M to $105M range.

Amazon Launches New Cloud-Based Video Transcoding Service

This morning Amazon announced a new cloud-based video transcoding service called Amazon Elastic Transcoder. As with other AWS services, there are no contracts or monthly commitments; customers simply pay based on the number of minutes of content they transcode and the resolution of that content (SD vs. HD). Amazon is offering the transcoding service in multiple regions, including three in the U.S. (Oregon, Northern California, N. Virginia) as well as Ireland, Singapore and Tokyo. Pricing for transcoding content in standard definition (less than 720p) ranges from $0.015 per minute (one and a half cents) in the U.S.-based regions to $0.018 per minute in the Tokyo region. For HD content, Amazon charges 2x the SD rate. Amazon’s service is for on-demand content only and does not transcode live streams.
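
To put that pricing in context, here’s a quick back-of-the-envelope cost sketch. The rates are the U.S.-region launch prices mentioned above; the video duration is a hypothetical example, and it assumes billing is per minute of output, so a job with multiple output renditions is charged for each one:

```python
# Back-of-the-envelope Elastic Transcoder cost estimate (U.S.-region launch pricing).
# Rates are per minute of output; HD (720p and above) costs 2x the SD rate.
SD_RATE = 0.015          # $/min for sub-720p output in U.S. regions
HD_RATE = SD_RATE * 2    # $0.030/min for HD output

def job_cost(output_minutes, hd=False):
    """Estimated cost of a single output rendition."""
    return output_minutes * (HD_RATE if hd else SD_RATE)

# Hypothetical example: a 44-minute episode transcoded to one SD and one HD rendition.
minutes = 44
total = job_cost(minutes) + job_cost(minutes, hd=True)
print(f"Estimated cost: ${total:.2f}")  # -> Estimated cost: $1.98
```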

It was only a matter of time before Amazon added a cloud-based transcoding service to their offering, as it’s a natural way to enable content owners to get more of their content onto the Amazon network. But a lot of questions remain about how scalable Amazon’s new service is and how it works. On their FAQ page, Amazon says that “if a large number of jobs are received they are backlogged,” but they don’t say for how long. Amazon also hasn’t said if their transcoding service will output completed files to a third-party CDN or only to Amazon’s own CDN, CloudFront. [Update: You can only output files to an S3 bucket.] One limitation of the service is that it only supports H.264/AAC/MP4 as an output format. While those are the most popular formats today, customers who have legacy platforms or devices to support are limited in their choices. Also, the service does not currently offer the ability to create segmented output files, which is required for HLS streaming. Amazon says there is no SLA for the service at this time, so there’s no guarantee of what a customer gets from a quality or turnaround-time perspective.

From quickly testing the service this morning, all source video files first need to be stored in an S3 bucket, and completed files are delivered back to an S3 bucket. I don’t see any way you can upload videos directly from your computer without first putting them in an S3 account. Amazon’s service comes with a bunch of preset transcoding templates. You can also create your own custom templates, but twice when I tried to do that I got a page with the error “an error occurred when we tried to process your request.” Transcoding, however, was quick, with a 79.5MB, four-minute file taking just under five minutes to transcode and be delivered back to my S3 bucket.
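
For those who would rather script the workflow than use the console, here’s a minimal sketch of submitting a job with the AWS SDK for Python (boto3). The pipeline ID, S3 key names and region are placeholders, and it assumes you’ve already created a pipeline mapping an input bucket to an output bucket; the preset ID shown is the one AWS documents for its “Generic 720p” system preset, but verify it with ListPresets before relying on it:

```python
import boto3

# Assumes a pipeline (input bucket -> output bucket) was already created
# via the AWS console or the CreatePipeline API call.
transcoder = boto3.client("elastictranscoder", region_name="us-east-1")

response = transcoder.create_job(
    PipelineId="1111111111111-abcde1",          # placeholder pipeline ID
    Input={"Key": "uploads/source-video.mp4"},  # object in the pipeline's input bucket
    Outputs=[
        {
            "Key": "transcoded/source-video-720p.mp4",  # written to the output bucket
            "PresetId": "1351620000001-000010",         # "Generic 720p" system preset
        }
    ],
)
print("Submitted job:", response["Job"]["Id"])
```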

There are already a few cloud-based transcoding services built on top of AWS, like Encoding.com, that have been around for years and offer a reliable, guaranteed service with a support number you can call to ask questions and get help with your transcoding job. So I don’t see Amazon’s new service taking share away from anyone else in the market any time soon. Amazon’s service has a long way to go before it’s as easy to use or as robust as Encoding.com’s cloud-based service, but it makes sense for Amazon to offer it, as they have tens of thousands of CloudFront customers already and know a certain percentage of them need a cloud-based transcoding service that fits into the AWS ecosystem.

For those who want to try Amazon’s new transcoding service, the company is currently allowing anyone to transcode up to 20 minutes of content each month for free.

Looking For “How-To” Presentations at Streaming Media East

The program for the 2013 Streaming Media East show (#smeast), taking place May 21-22, is coming along nicely and is nearly complete. I’ll publish the advance program in about a week and will then start placing speakers for all of the round-table panels. But I still have room for a few more how-to presentations, which are stand-alone speaking spots that instruct the audience on how to do something. So if you’re interested in teaching the audience about a facet of the industry, reach out to me now with your proposal. Here is a list of the presentations already confirmed:

  • How To: Building a DASH-264 Client
  • How To: Choosing An Online Video Platform
  • How To: Best Practices For a Live Streaming Production
  • How To: Evaluating the Effectiveness Of Your H.264 Encoder
  • How To: Encoding Video for iDevices
  • How To: Choosing an Enterprise-Class Video Encoder
  • How To: Integrating Social Media With Webcasting at HuffPost Live
  • How To: Using VOD Content To Create A Live Channel

In addition to the how-to presentations, I’ve already confirmed speakers from HBO, Comcast, AOL, Yahoo!, YouTube, BBC, HuffPost, Boeing, Roku, Netflix, Viacom, Starz and others. The program is shaping up nicely and speakers are being confirmed much earlier than last year, so if you missed out on sending in a submission, reach out to me now.

HEVC (H.265) Adoption Is At Least Five Years Away For Consumer Content Services

High Efficiency Video Coding (HEVC), also referred to as H.265, is a video standard being developed jointly by the ISO/IEC and ITU-T standards bodies. HEVC planning began back in 2004, shortly after H.264 was finalized, and the topic has been getting a lot of exposure in the industry over the past few months. Many questions remain about HEVC, including how quickly it can be implemented into the current video ecosystem and when content owners will adopt it.

The Digital Media group at Frost & Sullivan, which I am part of as a Principal Analyst, has been covering HEVC extensively of late. Our lead analyst on the transcoding side, Avni Rambhia, has published three reports that include details on HEVC: the Global Broadcast and DTT Video Encoders Market, the Global Pay TV Video Encoders Market and the Global Media and Entertainment Video Transcoding Market. Based on what we have seen in the market and data we have collected from suppliers, here’s our take on why HEVC adoption for consumer services is at least five years away.

The current HEVC draft was put out in July 2012 and the standard is expected to be ratified shortly. MPEG-LA, the licensing coordinator for all MPEG technologies, put out a call for applicable patents in July of last year. Several companies already have early demos available, including Mitsubishi, NHK, Cyberlink, Broadcom, ATEME, Ericsson and Elemental Technologies, among others. Integration with chipsets is unlikely to begin until the standard is finalized, and even after that, production will probably not be initiated until a critical mass of demand is achieved. However, although decoding will eventually reach the point of chipset integration, software-based playback is not unrealistic given growing smartphone horsepower.

Video is a push industry: the community is constantly pushing the boundaries on resolution, compression efficiency and user experience to drive the media and entertainment ecosystem forward at ever-increasing speed. Two apparently disconnected use cases are the focus of market innovation this year: the explosion of video consumption on portable and personal devices, and rising investment in Ultra HD video at resolutions of 4K and even 8K. Each has its own challenges and technology requirements, but both share one crucial stumbling block – they cannot become global phenomena without a significant leap forward in video compression efficiency. MPEG-4 AVC is the de facto video compression standard today, and has played a key role in recent years in enabling Internet video, OTT services, IPTV and HD across all Pay TV services.

However, the technology has matured to the point where any advances are incremental, and prices are correspondingly seeing tremendous downward pressure. AVC is also not well positioned to enable Ultra HD transmissions in an economical fashion. Furthermore, with video accounting for as much as 90% of total bandwidth usage in North America during primetime – while less than a quarter of the Pay TV subscriber population watches on-demand OTT video – it is clear that video services cannot be served by AVC in the long-term. This is especially true because growth in HD-capable devices like tablets, and the rising trend of watching OTT content on HD and Ultra HD connected TVs, are further exacerbating an already challenging bandwidth situation.

HEVC, or H.265, has been heralded as the solution to this problem. Offering up to a 50% compression efficiency improvement over state-of-the-art AVC codecs, H.265 is poised to disrupt the video ecosystem – both for M&E and for enterprise applications – yet again. Vendor excitement around the technology is high, with a number of announcements of HEVC-enabled products at CES 2013 and energetic R&D efforts underway to develop encoder and decoder cores (both software and hardware) now that the standard is essentially finalized and development of a patent licensing program is underway. Amidst the fervent hype, it is easy to believe that HEVC is an immediate technology whose adoption curve will be soaring upwards in 2013 and 2014. But that’s not reality.
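
That said, the raw efficiency gain is worth quantifying, because it explains the vendor excitement. Here’s a simple illustration of what “up to 50%” could mean for a streaming service’s delivery volume; the bitrates and viewing hours below are hypothetical round numbers, not measured values, and real-world HEVC savings will vary by content and encoder:

```python
# Illustration of HEVC's claimed ~50% bitrate savings over AVC.
AVC_MBPS = 6.0               # hypothetical 1080p AVC stream
HEVC_MBPS = AVC_MBPS * 0.5   # similar quality at roughly half the bitrate (claimed)

def terabytes_delivered(bitrate_mbps, viewing_hours):
    """Total data delivered: Mbps -> MB/sec -> TB over the given viewing hours."""
    megabytes = bitrate_mbps / 8 * 3600 * viewing_hours
    return megabytes / 1_000_000  # MB -> TB (decimal units)

hours = 2_000_000  # hypothetical monthly viewing hours for a mid-size service
avc_tb = terabytes_delivered(AVC_MBPS, hours)
hevc_tb = terabytes_delivered(HEVC_MBPS, hours)
print(f"AVC: {avc_tb:,.0f} TB, HEVC: {hevc_tb:,.0f} TB, "
      f"savings: {avc_tb - hevc_tb:,.0f} TB/month")
# -> AVC: 5,400 TB, HEVC: 2,700 TB, savings: 2,700 TB/month
```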

One can draw a parallel with the adoption curve of MPEG-4 AVC as it gradually encroached on the supremacy of MPEG-2. We believe that while token adoptions – such as incorporation into DVB standards for terrestrial broadcasting – will occur in the short-term, and a few channels may also be launched by 2015, a critical mass of adoption will not begin to occur until at least 2016. History indicates this – even a decade after the launch of AVC, MPEG-2 remains a formidable force in Pay TV (particularly cable), owing to the massive footprint of legacy equipment, such as set-top boxes and transmission infrastructure, that is all designed to work with MPEG-2 video.

Cost also remains an issue – many Pay TV operators in regions like Africa, Asia and Latin America are choosing MPEG-2 rather than AVC because of the significantly lower cost of customer premises equipment (CPE) and video encoders. Considering the massive wave of investment in AVC equipment that we have seen in the last two years, we expect at least five more years of equipment life before economically stressed broadcasters and service providers will consider systemic upgrades. Any video technology touches many components as it travels from glass to glass, such as cameras, NLE systems, video indexing systems, statistical multiplexers, satellite transponders, head-ends and (perhaps most importantly) CPE.

Similarly, on the OTT side, transcoders, file formats, streaming protocols, streaming servers, content protection systems, network optimization platforms and end devices all need to support HEVC before an end-to-end solution becomes broadly viable. In their ongoing endeavor to fight commoditization and drive demand through technological disruption, vendors of video technology and consumer electronics devices alike are engaged in fast and furious product development around HEVC, with many announcements already made and several more significant milestones expected throughout 2013.

Silicon vendors are also looking toward the technology, but at a somewhat more sedate pace, both to maintain profitability levels on existing AVC chipsets and because of the considerable challenges of achieving real-time, power-efficient encoding and decoding of HEVC content (particularly at higher resolutions). Any large-scale migration from AVC to HEVC will take time, much as the transition from MPEG-2 to AVC is still very much an ongoing process.

One can also draw parallels between 3D and HEVC, as technologies that were aggressively marketed before a robust content pipeline was in place. Video technology without compelling content is like a painting without adequate illumination – it’s very hard for a viewer to see the value. 3D has, by wide consensus, failed to realize its expected potential, partly because too much was expected too soon, but also because the content community was not able to find a viable business model to justify the expense and disruption, and users failed to see value in the technology – even as television set prices dropped dramatically while vendors struggled to establish demand at expected levels.

From a service provider perspective, the industry has only just overcome the alphabet soup of fragmented video formats to converge around AVC. Interoperability standards like UltraViolet and MPEG-DASH, streaming technologies like HLS and Smooth Streaming, and pretty much every digital terrestrial transmission and cable standard today all embrace AVC. AVC has been the technology to break the walled-garden mentality that pervaded online video during much of the late 90s and early 2000s, with vendors like Adobe, Apple, Google, Microsoft, Rovi and Sorenson Media all embracing AVC in the interest of optimizing OTT video, despite the fact that each owns its own proprietary video compression technology.

Many of these vendors – particularly those who provide core codec components that power other vendors – are in the process of developing HEVC software cores to power the inevitable innovation the new technology will catalyze. At the same time, they acknowledge that their service provider customers are loath to disrupt a delivery ecosystem that has only just settled down around converged solutions focused more on delivery and quality of experience than on just connecting the dots.

In terms of service rollouts, Frost & Sullivan expects closed-loop solutions such as enterprise video conferencing (from vendors like Cisco Systems and Vidyo) and Ultra HD broadcast services in the Far East (powered by vendors like Cyberlink, Rovi and Samsung) to be among the earliest HEVC-based services to roll out. Video-on-demand services for low-bandwidth environments, such as HD video delivery over cellular networks, are also likely to be early adopters of HEVC, due to the significant potential for operating expense savings and growing demand for a higher quality of experience over increasingly stressed mobile networks. These are expected to rely on software decoders in the short-term, with more power-efficient hardware decoders or video co-processors expected to become available in the 2015 timeframe.

Satellite DTH service providers are also expected to leverage HEVC to roll out Ultra HD channels in the 2014-2015 timeframe, although it is hard to predict the level of uptake they will see in the early years of the technology. Similarly, some pilot DTT channels are expected to roll out in the 2015 timeframe, but the level of uptake remains to be seen. High-end encoding vendors such as ATEME, Ericsson, Fujitsu and NTT, and high-end CPE vendors such as Technicolor, are all beginning to add HEVC to their product portfolios. All things considered, while certain applications will embrace HEVC much sooner than the norm and HEVC encoding and decoding cores should mature by 2014, we expect it will be around 2017 before a comprehensive ecosystem of first-generation HEVC-enabled products comes to market. Furthermore, we expect AVC to remain in widespread use even in 2018, although it will definitely be considered a commodity technology at that point – much as MPEG-2 is today.

The key takeaway from all of this is that HEVC won’t be adopted as quickly as some may think, and those who bet big on HEVC too early are going to lose that bet. We’ve done a lot of work at Frost & Sullivan on the topic of HEVC and, in addition to the three reports we’ve already published, we’ve done a lot of private research on HEVC for clients. If you’re looking to get more details on HEVC technology, get copies of the reports I mentioned, or need any custom research on the HEVC market, please feel free to reach out to me.

Netflix To Keynote Content Delivery Summit, May 20th NYC

I’m very pleased to announce that Ken Florance, VP of Content Delivery at Netflix, will be the opening keynote speaker at our annual Content Delivery Summit. Taking place Monday, May 20th, at the Hilton hotel in NYC, the summit has become the go-to event to learn about the technology behind web acceleration and media delivery infrastructure. Telcos, carriers, ISPs, MSOs, major content owners and CDNs all gather to present case studies on real-world deployments, see demos of new technology platforms and discuss business model considerations for web acceleration and media delivery.

During Netflix’s keynote, Ken will highlight the company’s Open Connect Content Delivery Network and provide insight into how it works and what the future holds for delivering video via the last mile. In addition, the call for speakers for the Content Delivery Summit is still open for the next three weeks. So get your speaking submission in ASAP if you want to be considered. Some of the topics the event will cover include:

  • Over-The-Top Video Delivery
  • Dynamic Site Acceleration
  • Transparent Caching
  • Application Acceleration
  • CDN Economics & Business Models
  • Managed/Licensed CDN
  • Analytics & Cloud Intelligence
  • Front-End Optimization
  • The Video Ecosystem
  • Telco CDN Deployments
  • Mobile Content Acceleration
  • Optimizing Web Applications
  • The Business of CDN Federation
  • CDN Pricing & Volume Data

Registration for the summit is now open, and if you register early and use the special promo code DR13, you can get a ticket and attend the event for only $395. So don’t lose out on the discount – register early. #cdnsummit

ASUS and Netgear Announce New Streaming Boxes, $130 & $150 Price Points

There are a lot of options in the market for consumers looking to spend $100 or less on a dedicated streaming box, including devices from Apple, Roku, Boxee, Western Digital, Netgear, Sony, Vizio and D-Link. But that’s not stopping ASUS from jumping into the market with their announcement of a new device called the Qube, which will be more expensive than the other devices, retailing for $150 when it launches in March.

The box comes with the Google TV platform, supports Netflix and Amazon Instant Video at launch and includes 50GB of storage space via the ASUS cloud platform. It has two USB ports, HDMI in and out, ethernet, IR out and 4GB of flash storage. The Qube’s interface functions via a rotating on-screen cube, which from the looks of it appears to be a really bad UI. ASUS wanted to be creative by making the interface a cube, just like the shape of the box, but it appears they picked form over function. I haven’t gotten hands-on with the box just yet, so maybe the UI works better than I think, but from what I have seen in this video demo, I don’t think users will find the interface very practical.

In addition to ASUS’s new device, Netgear has also announced a new box called the NeoTV PRIME. The box is similar to their NeoTV line, which includes the standard, MAX and PRO models, but the PRIME comes with the Google TV platform. It’s also DLNA compatible, supports playback from external USB drives and has a two-sided remote control with a built-in keyboard. While Netgear said the device is currently available for sale and retails for $130, Amazon shows the device being released on February 22nd.

With ASUS and Netgear’s new boxes and the recently announced box from Hisense, here’s my updated list of streaming boxes:

  • Apple TV
  • ASUS Qube with Google TV (coming March 2013)
  • Boxee TV
  • D-Link MovieNite Plus
  • Hisense Pulse with Google TV
  • Microsoft Xbox 360
  • Netgear NeoTV (3 models)
  • Netgear NeoTV PRIME with Google TV (coming Feb 2013)
  • Nintendo Wii U
  • RCA Streaming Media Player DSB772E
  • Roku (4 models + Roku Streaming Stick)
  • Sony PlayStation 3
  • Sony SMP-N200
  • Sony NSZ-GS7 Internet Player with Google TV
  • Vizio Co-Star with Google TV
  • Western Digital WD TV Play (coming Feb 2013)
  • Western Digital WD TV Live Hub

You can check out my comparison chart of streaming devices at www.streamingmediadevices.com. The chart is currently undergoing an update and I’ll be posting the latest revision soon.

Insiders Detail Accounting Irregularities At KIT Digital, Rumors Of A Possible SEC Fraud Investigation

[Updated Dec. 27 2017: Former KIT Digital CEO found guilty of manipulating shares]

[Updated Sept. 8 2015: Former KIT Digital CEO and CFO Arrested: Charged With Accounting Fraud]

Industry vendor KIT digital has had a lot of problems of late, but it looks as if things are about to get worse. Over the past few weeks, multiple sources have detailed for me the lack of controls KIT had in place to properly account for their financials. Some suggested to me that KIT went so far as to make up revenue that didn’t exist, with one person telling me they thought that up to $80M in reported revenue wasn’t real. Other sources tell me they believe enough fraud will be uncovered that the Federal government is likely to open an investigation and, presumably, prosecute some of KIT’s executives.

I think it is important to point out that I don’t have access to KIT’s books and can’t verify on paper the details I have been given, but some of the information comes from employees who were inside KIT’s finance department at the time and had direct knowledge of what was taking place. In addition, several law firms have filed securities class action lawsuits that appear to coincide with the information I have been given, and they cite some of the same specific details I have been told, from their own sources.

While the SEC did not return my request for more details on what they may be looking into with regards to KIT, the company is no stranger to the SEC. Their former CEO is being investigated for insider trades, the company delayed the release of its 10-Q and, in November, KIT announced they would have to restate earnings for the past three years. The company told shareholders to “no longer rely upon the Company’s previously issued financial statements,” stating that the irregularities stemmed from “revenue related to certain perpetual software license agreements entered into by the prior management team in 2010 and 2011.” While that’s a fancy way of saying KIT didn’t account for revenue properly, insiders tell me the simpler explanation is that KIT made up revenue that did not exist and counted revenue from contracts that were cancelled or expired.

It’s important to note that on March 30th, 2012, KIT disclosed in a regulatory filing that their accounting firm at the time, Grant Thornton, noted a “material weakness” in the company’s internal controls over financial reporting, saying that “KIT digital and Subsidiaries has not maintained effective internal control over financial reporting as of December 31, 2011.” And between KIT reporting Q4 results and filing their 2011 10-K, $2.14M in cash disappeared that the company could not account for.

While these accounting problems could simply be attributed to incompetence and negligence, insiders tell me that some of it was deliberate, with the intent to change KIT’s numbers, which would make it fraud. One of the most revealing details is that some of KIT’s senior management purposely kept employees from installing company-wide business accounting software, in particular a solution from NetSuite. Two former employees told me that some of KIT’s executives instructed them that they needed to be able to “massage the numbers each quarter” and have “more control over the numbers we show.” Instead of using a company-wide program to manage KIT’s financials, each office would deliver Excel documents to KIT’s headquarters, which would then have to manually combine the numbers from at least ten different spreadsheets.

Multiple people also told me of payments KIT made from what some employees called offshore accounts to people KIT management told them to send money to, without any kind of invoicing or tracking of what was being paid. Others told me that some of those who got these transfers were actual KIT employees, with the company describing the payments as “commission” checks, even though the revenue they were being paid for came from customers who had not paid their bills or from contracts that had long expired. Whether or not these payments were illegal I don’t know, but it clearly shows a pattern of financial abuse at KIT and a lack of control over accounting. And it raises a lot of questions when cash has disappeared and wire transfers are being made from offshore accounts with no record keeping.

While KIT recently said that a large part of the confusion around its financials is about how they accounted for revenue generated from professional services versus SaaS platform license fees, many say KIT constantly bent the rules to try to get away with as much as possible. Others also told me that one of the reasons KIT fired their original accounting firm was that it would not agree with how KIT was recognizing revenue. Some have also told me that when they questioned accounting practices they believed KIT’s accounting firm would not approve, select KIT executives told them they had a good working relationship with the outside accounting firm and it would let them do what they wanted.

If you think about all the red flags at KIT, it’s really amazing just how many there have been, yet a lot of investors still wanted to believe the story, which is their own fault. KIT has missed its own revenue guidance, taken goodwill write-offs, missed or delayed multiple SEC filings, restructured their board and management multiple times, delayed putting out public news for days, fired two accounting firms, defaulted on a debt covenant, had cash disappear and acquired more than ten companies. This company has been screaming warnings for a long time.

None of the financial problems with KIT’s business really come as any surprise to me, as every time I looked at KIT’s technology it didn’t work as advertised. I would get conflicting information from the company, and even when they would walk me through demos personally, stuff would not work. Some of KIT’s customers told me they were never moved to any KIT platform and were still running on legacy systems from companies KIT acquired, like Multicast Media and theFeedroom, even though KIT was telling everyone that all these systems had been integrated into one platform. Remember the VX-one platform? You don’t hear about that anymore.

On one of my calls with various members of KIT’s management and product team, KIT’s CEO Kaleil Isaza Tuzman gave me names of customers who were using their platform, only to be corrected by another employee on the phone who said that wasn’t accurate. This type of behavior was common at KIT. The company excelled at trying to get away with as much as possible, until someone noticed and questioned them about it. KIT used logos of customers they didn’t have contracts with on the customer page of their website, made up their own terminology to describe things and skated around simple, direct questions.

Whenever I questioned KIT’s CEO about things that didn’t make sense, I would get responses full of marketing language, or I’d be told that it was “complicated” and that I would not be able to understand it since I was not an accountant. I was also once told that since KIT was an international company and I was based in the U.S., I could not understand how things were done overseas. There was nothing transparent about KIT or the way they did business, and the example Kaleil Isaza Tuzman set was something others followed. They had a pattern of denying, deflecting and trying to confuse the questioner with a drawn-out, generic, high-level marketing response, something that should have raised a lot of red flags.

While many don’t want to talk about it openly, KIT has always been a black eye on this industry and a bad representation of it. The sooner they disappear, the better. With their cash reserves running out and employees being let go – 300 employees four months ago and 90 more last month – KIT’s time is about up. No one will put more money into the company and it will go under; it’s just a matter of when and how. Some have suggested that KIT could sell off their technology in pieces or sell the company outright, but KIT’s management has been saying this would happen for more than two years with no results. Multiple firms I have spoken with who buy distressed companies and sell off the pieces have told me they looked at KIT and didn’t find anything worth buying.

With most of this well-known and these details easy to come by, I don’t know why anyone ever wanted to invest in KIT. When you put money into a company, you’re not really putting faith in the company itself, but rather in the executives running it. The fact that KIT raised so much money shows there are still plenty of people out there who can be wowed by a presentation and fooled by people who present well and stir up emotions, and greed, for quick money. With KIT’s stock now practically worthless, trading at $0.45 when less than a year ago it was at $12.65 a share, a lot of people have lost a lot of money. But in this instance, no one should feel sorry for any investor, as the warning signs have always been there. As for the Wall Street firms that put big money into KIT and gave them lots of financing, they should not be trusted either; they didn’t do the type of fact checking and intelligence gathering they should have.

When KIT’s stock was high, I would get lots of emails from investors telling me how stupid I was not to like KIT and why it was worth more than Brightcove’s market cap, and many were angry at me for not wanting to believe the hype. Some even got hostile and threatening when I first started pointing out problems with KIT, back in 2008, in a back-and-forth exchange I had with Kaleil Isaza Tuzman in the comments section of this Gigaom post. While some investors probably think I like what happened to KIT, I don’t. Nothing good comes from having a company in our industry that makes others look bad, that makes people lose their money or that doesn’t deliver what was promised to customers. But the problem is, those investors didn’t do their homework. They didn’t talk to KIT customers, didn’t get demos of KIT’s products, didn’t ask the right questions or check the facts. The warning signs with KIT have been there since 2008, and even before that when it was ROO Media. If anything good comes from this, I hope it’s that investors will learn not to be so blindly trusting of people and do better due diligence.

Whether all of these problems add up to KIT’s numbers being false and a case of fraud is too early to know. KIT may not announce the restatement of financials for many more months, or even a year from now, so the story on KIT will have a few more chapters. But whatever the results, the company is dead; it’s running out of cash and it won’t be around much longer.

I want to wrap up this piece by making a very important statement. This post is not a reflection of everyone who has ever worked at KIT. The company has had a lot of employees and executives over the years, some of them my friends, and I believe the vast majority of them wanted to work for a professional company and were not involved in the negative actions that have taken place. It’s a shame that some executives, board members and investors ruined their opportunity for long-term job success and made a bad name for the company, but that should not be a reflection on the employees who were not involved. Those are the people who have lost out in what has happened and hopefully they get opportunities with professional vendors who won’t hold working at KIT against them.

If I can assist anyone who is looking for a new job, please reach out to me and I will see if I can help. Also, if any current or former employees want to talk to me about what went on at KIT, I’d like to hear from you. As with this post, none of my sources will be named and I’m also not naming the executives they worked with.

Disclaimer: I have never bought, sold or traded a single share of stock in any public company. While I am willing to do media interviews about this story, I will not disclose any sources, nor will I detail by name which KIT executives were mentioned to me by my sources.