
AppDynamics Blog




When Is Research Not Research?


This week, a new APM “research” report was released by Research in Action, a small independent research and consulting firm in Germany that consists of a single individual. The report is titled ‘Next Generation Application Performance Management: The Top 20 Vendor Scope: Global 2015’.

First, it’s important to state that AppDynamics declined to participate in the research conducted by Research In Action, under which we were given the option of spending €15-20k to distribute the report. We declined because in 2013 Research In Action published a similar report, ‘Deep Application Transaction Management’, which contained a number of inaccuracies. This made us wary of taking part in further research or briefings with the company, given our concerns about the quality of its analysis.

Nevertheless, we take any report or article about AppDynamics or the market seriously. We support fair, fact-based research into this market, as there are a number of vendors with differing solutions. Upon analysis, however, we are again dismayed by the research conducted by Research In Action and worry that it is misleading to any business evaluating APM solutions or the market as a whole. Our reasoning follows:

The Vendor Selection Matrix Is Misleading

We first became concerned about this report when one of our customers mistook the Vendor Selection Matrix used in the report for Gartner’s much-respected Magic Quadrant, a mistake some of our competitors would likely welcome. We were naturally concerned to hear this, having neither seen the results nor been offered the chance to comment on preliminary findings prior to publication, as is standard practice with every other research company in my experience.

Other analyst firms have found alternative and distinctive ways to depict this kind of information, such as the Forrester Wave or the IDC MarketScape, which help differentiate each firm’s analysis. Unfortunately, this report’s matrix bears such a close resemblance to the Magic Quadrant that it can easily mislead.

The Evaluation Criteria Results Don’t Sync With Analysis Done By Other Research Companies

Vendors were evaluated on two dimensions, each broken into four sub-criteria. A number of the results jump out as inaccurate, for example:

  1. Market Share and Growth. In this blog, my colleague Jonah Kowall analyzes Gartner’s 2014 APM market growth figures. In the research, the author states: “There are more than 400 active software and SaaS vendors generating around $4.6B in annual revenue”. This number is significantly higher than published numbers from IDC ($3B) and Gartner ($2.6B). Gartner also concludes that AppDynamics’ market share growth was 246.5%. Equally, IDC’s 2014 APM market share report tells a similar story, showing that AppDynamics is growing faster than any other APM vendor listed. The explosive growth of AppDynamics reported by these other analysts calls the low score and comments in these areas from Research in Action into question.
  2. Customer Satisfaction and Mindshare. Here at AppDynamics we are fanatical about customer satisfaction. We pride ourselves on our customer support function and invest heavily in this area. This is why we have a Net Promoter Score (an accepted industry measure of customer satisfaction) of 85 and have maintained a score of over 80 for each successive six-month period over the last three fiscal years. The low scores attributed in this area are therefore again surprising.
  3. Company Viability and Execution Capabilities. Along with our market share growth, we have consistently achieved strong financial growth. In fiscal year 2015 we doubled our bookings to over $150 million, and in 2014 we saw a 175% increase in bookings. Based on these figures, our company viability and execution should be beyond question; they are also why we are backed by leading investors such as Greylock, Lightspeed, Kleiner Perkins Caufield & Byers (KPCB), Institutional Venture Partners, Battery Ventures, ClearBridge and Sands Capital. Again, it’s hard to understand how Research in Action arrived at the scores in this category. The author also includes vendors in the research that are clearly not APM vendors. Micro Focus is a mainframe company that does not offer any APM products, yet it is included with oddly large revenues in the market. Similarly, network performance monitoring vendor Infovista, which focuses on carrier network instrumentation, is included. If the author expands APM to include network vendors such as Infovista, it would be a disservice not to include market share leaders like Netscout, JDSU, and Network Instruments in this research.
  4. Forecasted Revenue and Market Share Percentage. The report includes a table of forecasted revenue and market share percentage for the vendors analyzed. AppDynamics has not provided any figures or information for Research In Action to base these estimates upon, making their validity questionable, especially given the market share growth numbers from Gartner and IDC discussed in (1).
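The Net Promoter Score mentioned in point 2 is computed from 0-10 “likelihood to recommend” ratings: the percentage of promoters (9-10) minus the percentage of detractors (0-6). A minimal sketch of the calculation, using an entirely hypothetical sample of ratings:

```python
def net_promoter_score(ratings):
    """Compute NPS from a list of 0-10 'likelihood to recommend' ratings.

    NPS = % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count toward the total but toward neither group.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical sample of 20 responses: 18 promoters, 1 passive, 1 detractor
sample = [10] * 12 + [9] * 6 + [8] + [3]
print(net_promoter_score(sample))  # → 85
```

Because detractors subtract directly from the score, sustaining a value above 80 requires that almost every respondent be a promoter, which is what makes the report’s low customer-satisfaction rating so surprising.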

Analysis Of AppDynamics Is Out Of Sync With Our Publicly Stated Growth

Given the points above regarding the assessment criteria, it is difficult to understand how any conclusion could state that “AppDynamics has recently come to a more or less abrupt halt”. This statement is highly inaccurate based on publicly reported facts.

The analysis also includes a quote from an unidentified AppDynamics customer. We are not sure which of our customers Research In Action allegedly spoke with; however, the quote states that “a customer” is concerned about the accounting and reporting principles of the company. Like any private company, we report only limited financial figures to the market, so it is hard to understand how or why anyone would call our accounting principles into question.

The Primary Survey-Based Methodology Is Dubious And Unscientific

The report claims to be ‘unique’ in that it uses a ‘survey-based’ methodology for comparative vendor evaluation. The summary then states that ‘Roughly 60% of evaluation results are based on enterprise buyers’ survey results’.

So what does ‘roughly’ mean here? When I was an analyst, I understood that a fair process for assessing or analyzing results requires a methodology that is scientific and complies with a prescribed set of rules. The survey consisted of two parts: a telephone survey of 900 IT buyers in Q1 2015 and an online survey of 700 IT buyers in Q2 2015.

These are two separate surveys, and the methodology does not make clear how the telephone survey was structured or carried out. As a former analyst in this space, I would expect the questions used in the two survey types to differ. In a telephone interview, the convention is to ask open questions, for example, “So tell me what features would be important to you in an APM solution?”, while in an online survey you would use more closed questions, for example, “Pick the five features, in order, that you would look for in an APM solution”.

It therefore seems illogical to combine the results into a single set of conclusions, as doing so would be unscientific. Yet it appears Research in Action has done precisely this: the report includes a chart that supposedly shows the key requirements for a ‘Next Generation Application Performance Management solution’, generated from results that seemingly combine both data sets (1,600 IT managers). If so, the survey results on which the conclusions rest are not scientifically valid.
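To illustrate the pooling problem with entirely made-up numbers: before the two samples can be combined, the open telephone answers must be coded into the online survey’s closed categories, and that coding step is an analyst’s subjective judgment rather than a measurement. A sketch (all feature names, answer strings, and counts below are hypothetical):

```python
# Online survey: respondents picked from a fixed list of features.
online_picks = {"analytics": 420, "auto-discovery": 380, "alerting": 300}
online_n = 700

# Telephone survey: free-text answers, coded into categories after the fact.
# This mapping is a judgment call -- two coders could map answers differently.
coding = {
    "I want to see what's slow": "analytics",
    "page me when things break": "alerting",
    "something easy to use": None,  # fits no closed category; silently dropped
}

phone_answers = (["I want to see what's slow"] * 500
                 + ["page me when things break"] * 400)
phone_n = 900

# Pooling the two as one 1,600-person sample treats the subjective coding
# as if it were equivalent to the closed-form picks.
pooled_alerting = (online_picks["alerting"]
                   + sum(1 for a in phone_answers
                         if coding.get(a) == "alerting")) / (online_n + phone_n)
print(f"pooled 'alerting' share: {pooled_alerting:.0%}")
```

The pooled percentage looks precise, but it depends entirely on how the free-text answers were coded, which is exactly why combining the two instruments into one headline chart is methodologically suspect.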

The result is that the analyst will likely have had to interpret some answers subjectively. As such, it is hard to believe that even 60% of the evaluation was based on buyer survey results, or frankly that the results are valid at all. Further, the sourcing of data for the charts is inconsistent: the methodology states that the surveys targeted IT buyers, but some charts reference ‘IT and Business Managers’ while others state ‘IT Managers’, so it is not obvious how the segmentation was done. On top of this, as a global report, surely the surveys should have been conducted globally? Again, this is not clear.


So when is research not research? For those unfamiliar with the world of analysts, there is an interesting blog post by Aneel Lakhani, a former Gartner research director, which describes the various types of analysts and the content they produce.

“Many analyst firms are pay-for-play. Many will write white papers and ‘advertorials’ that are underwritten by vendors. I find these (after too much experience) highly suspect and particularly worthless.”

Our investigations have led us to believe that other vendor(s) have likely undermined the independence of Research in Action, and by extension the report itself.

At AppDynamics, we are in full agreement with Aneel that the deficiencies outlined above unfortunately render this report worthless, as it is not a fair and unbiased evaluation. Despite AppDynamics being labeled a “Market Leader” in the paper, we therefore cannot recommend it to our readers as a good source of information if you are evaluating the APM marketplace.

The post When Is Research Not Research? appeared first on Application Performance Monitoring Blog | AppDynamics.

