I draw a distinction between analysts who do the work needed to provide value to the market and those who do something else.
Those who fall into the latter camp I call anal-ysts. They will often be found endorsing some piece of ‘research’ which has been funded by a vendor. If you know how these things work, then what often starts out as a good idea quickly becomes sullied by commercial interests.
The anal-yst will claim pure-as-the-driven-snow independence. Those of us who know better quietly chortle as we think of ways to eviscerate research outcomes. Customers are wising up to this tired tactic, and so while we now see ‘research papers’ larded with disclosure around who’s paying the bill, what we don’t see is the extent to which the research sponsor is tweaking the research.
This lack of transparency is of concern because without knowing where editorial control starts and stops, there really is no way of telling whether the work is biased or not. The gut-check reaction is that it must be biased, even when that turns out to be unjustified.
There is a commercial reality. It is getting harder and harder for analysts to develop original, data-driven research. The reason? Companies on both the buy and sell sides don’t want to pay for the outcomes.
That leads to a race to the bottom where the only viable research model is work done for vendor commercial purposes. Don’t get me wrong, research isn’t dead, but the economics are harder to justify.
Standards fall, and in some cases the questions that form the backbone of vendor-sponsored research are so obviously designed to deliver a specific outcome that you almost have to ask who wrote the question framework. Or, in some cases, who approved it. Take your pick.
Net-net, we learn very little of actionable value or anything that adds substantially to the body of knowledge that’s freely available on the interwebs.
An anal-yst example
I’ve had my run-ins with IDC before so there is nothing new here. They claim independence but what is perhaps less clear is that services are bundled to create the impression of independence when in reality there is an element of you scratch my back, I fill your purse.
Nowadays, I mostly ignore their commercial stuff, but the latest material to cross my desk came from a rather persistent PR person. I give her credit for doing her job the best way she knows how, but the outcome fails on multiple fronts.
Problem 1 – throttling access
I get an email that says:
You can download the full report here:
Except you can’t, unless you’re prepared to hand over a good amount of information. I’m a media person, not a prospect, yet they want me to fill in forms? Two can play that game, so I made up all the information I submitted. I won’t be the only one.
Next, the report is 100% IDC copyright, so I can’t quote a thing out of it. Great.
Problem 2 – stale data
I noticed that while the report is dated July 2014, most of the report data is dated August 2013. This is mobile we’re talking about, and a year in that market is like several dog years. What I endeavored to parse in 2013 is largely irrelevant today. You get a sense of the vintage from this piece out of the infographic Kony helpfully prepared to go alongside the research.
40% to support Blackberry out of a population of 400? I find that astonishing at a time when I regularly see die-hard Blackberry users ditching for iPhone or Samsung devices (mostly). That was almost unheard of a year ago. I’ve yet to see a Windows phone in the enterprise. If Microsoft’s recent results are anything to go by, that’s not a surprise. But then I spend a disproportionate amount of time with forward-thinking businesses, so my observations will be skewed.
Unfortunately, the research provides almost no demographic data against which to assess the methodological soundness of the anal-yst’s report. I am betting the data’s vintage plays heavily into this result.
This is important because the mix of assets a company holds or manages speaks directly to the topic in question. Similarly, there is no discussion around the BYOD debate, where the sands are constantly shifting. This recent report on BYOD backlash is a case in point. Coincidentally, while the meat of that story doesn’t of itself lend credence to a generalized Blackberry resurgence, it adds fuel to the BYOD bonfire.
Problem 3 – stating the obvious
Moving on, we see other attention-grabbing but meaningless statements. Check this one:
How many times have you heard this rallying cry in the last two years? Better still, how many vendor sales pitches have you heard in the last two years that DIDN’T have ‘enterprise-led strategy’ somewhere in the slide deck? Definitely one for the ‘No shit Sherlock’ department methinks.
Problem 4 – misleading representation
And then there are other issues with the infographic, which draws directly upon the research. Check this out:
This graphic makes no sense whatsoever, and the scale chosen ensures you focus upon the best results. Tim Kitchen, who is programme manager for innovation at BCS, The Chartered Institute for IT, made this observation on Facebook:
Two negative grades and only one positive distorting the data? Inconsistent use of ‘our’? Presumably impenetrable definitions of business units or departments? Gotta assume the scale was originally defined to contextualise each colour clump and subsequently warped…but this doesn’t seem to be the case…so suspect they just meant ‘number of respondents, not %…but even that doesn’t look quite right. Ho hum.
What is truly astonishing is that this infographic element is a direct lift – with some style tweaking – from the IDC report. Now, if someone like Kitchen doesn’t get it, or can find holes, then you’re in trouble.
I have a different take. The way they’ve used the stacked chart suggests that mobile software deployments are almost always wildly successful. But they’re not, and in that sense the graphic is incredibly misleading. You have to unstack the charts in order to get a better perspective – assuming, of course, they are correctly labeled in the first place.
The four problems I have identified are surprisingly common. Forget the fact that IDC provides a ton of narrative around the research which, while copyrighted, could be of some value in the public domain. The fact this is vendor sponsored means it is useless to any other vendor. Why not just drop it into the public domain and be done with it? At least that provides the opportunity to start a conversation around the topic. Right now, all you’re getting is me eviscerating the thing.
Why don’t the analyst firms do more to sense-test what the vendor wants? IDC does enough of these to understand the construct. In order to deliver value, you have to show how the data fits together, not just call out some headline-grabbing numbers. In that sense, they’re doing their client no good at all, adding to their anal-yst credentials.
Having signup forms you can game is a surefire indication that your marketing is off base. The right way to do it is to acknowledge the completed form with an email message that provides a link. Simply letting me at the report, having gone to the trouble of trying to identify me, is a guaranteed way of ensuring valuable data is lost.
PR should take some responsibility. They see these things day in, day out, and should have enough information to understand what will fly and what won’t. Wonky data will never fly. Their chances of getting anything other than rehashed PR out of this are virtually nil because of the problems inherent in the model. Worse still, when someone like me turns up, what started out as a good idea suddenly becomes wholly questionable.