A few years ago, I asked if the enterprise has a fake news problem. The short answer is: kind of. The good news? We work in an industry where our careers are defined by the success of complex, high-stakes projects. We simply can't afford to get duped too often.
Enterprise software is not an impulse purchase - and there is plenty of informed content, including peer review sites, that gives a realistic view of vendors before a deal can be closed.
The bad news? Our industry is also chock full of well-funded media and marketing organizations. These folks are very good at reframing news in the interests of one vendor over another.
Tech vendors with deep pockets have a vested interest in promoting insatiable technological optimism - the shiny new toy syndrome. If we're not careful, we'll find ourselves sinking big dollars into unproven solutions, long before they are mature enough (see: early blockchain adopters). We risk losing track of the people and process issues at the heart of our problems.
When the news cycle comes fast and furious, making the right project choices can be tough. Consider the furious pace of news that unfolded around Zoom this spring, amidst heavy adoption. Is it secure? How has the company responded? Do Zoom's changes apply to all versions of the product? Is end-to-end encryption across all dial-ins a realistic expectation? And what about Zoom's potential ties to Chinese markets and data centers?
Almost every week, a new Zoom glitch or PR gaffe came out. Pieces also came out praising Zoom's responses. Diginomica contributor Kurt Marko did an excellent job detailing this story and taking a firm position. Still, you can sympathize with any software buyer assessing Zoom's pros and cons amidst that amount of media noise. The story of Zoom's pursuit of enterprise-grade security - and trust - is still unfolding.
The necessity of puncturing hype balloons is here to stay. But: we can all become savvier at evaluating data and questioning vendor hype. Last time around, I detailed two ways of doing that:
- sharpen our BS filters
- break out of our filter bubbles, into more open conversations
Recently, I ran into one of the best BS filters I've seen: scientist Carl Sagan's Baloney Detection Kit. But how much of Sagan's kit applies to the enterprise? Sagan, of course, honed his BS detector amidst scientific inquiry. In her Brain Pickings piece on Sagan, Maria Popova writes:
Carl Sagan’s rules for critical thinking offer cognitive fortification against propaganda, pseudoscience, and general falsehood.
Sagan's fine art of baloney detection - an enterprise review
Popova pulled Sagan's rules from his book, The Demon-Haunted World: Science as a Candle in the Dark, via the chapter "The Fine Art of Baloney Detection." Here are nine rules of scientific inquiry Sagan believes we can apply to everyday life - along with my take on the enterprise relevance of each.
Wherever possible there must be independent confirmation of the “facts.”
Yep - never trust a single source on any issue exclusively - build an information network that cross-checks itself.
Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
Indeed. We can do this by seeking out communities with diverse constituents, where issues are debated openly.
Arguments from authority carry little weight — "authorities" have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
Push back on "enterprise guru syndrome." Put ideas and tech to the real-world test.
Spin more than one hypothesis. If there's something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among "multiple working hypotheses," has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
Avoid reliance on one vendor or services partner, no matter how much we're invested in them, or their platform. Check out my diginomica series on why independent consultants matter.
Try not to get overly attached to a hypothesis just because it's yours. It's only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don't, others will.
Rejecting easy/lazy enterprise narratives is a good start. "This vendor is legacy", or "Customers love this SaaS vendor" - simplistic sentiments don't help us. Most vendors have strengths and weaknesses; one size never fits all. Beware: industry analysts can become attached to their narratives around the rise or fall of certain vendors - another habit to push back against.
Quantify. If whatever it is you're explaining has some measure, some numerical quantity attached to it, you'll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
This is the biggest gap between Sagan's tips and our world. So much of the quantified "research" in the enterprise market is vendor-funded. That doesn't make it useless, but the data can be framed in self-serving ways. Cross-checking reports from multiple sources helps, as do peer-based discussions to sense-test results. Wariness of quantified data is just as important as the data itself. Take, for example, the conflicting reports on whether remote workers are more or less productive. Three more from Sagan:
If there's a chain of argument, every link in the chain must work (including the premise) — not just most of them.
Occam's Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.
Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle — an electron, say — in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.
"You must be able to check assertions out." Seek out those who challenge your views. Find those who tell you what you don't want to hear. Build a network of advisors across companies and roles.
On disclosure, tech media noise and the pursuit of context
I'm betting that if Sagan were alive today, he'd have more to say about "junk science" and the perils of scientific research bought and paid for. As you polish your enterprise BS filter, that's an aspect you'll want to home in on.
Last time around, I picked my top issues that prevent enterprise clarity. They included:
- Vendor-funded or otherwise biased stories tend to get disproportionate exposure on social networks.
- Lack of disclosure can obscure the financial ties between "research" reports and media coverage.
- Wall Street regularly misunderstands enterprise software, causing stock fluctuations that don't reflect the long-term health of the vendor.
- Fast-moving stories, such as the aforementioned Zoom rollercoaster, can be obscured by vendor PR campaigns or social hashtag frenzies.
- The big tech news outlets are primarily chasing eyeballs/ad revenues, and therefore cater to what you are most likely to click on - regardless of whether the article gives you a better context for your project.
I think what we're all after, really, is a context - one that helps us absorb a wide range of data and apply it. We want a context that is flexible enough to shift quickly, one that can wade through noisy news cycles, and one that balances a hefty dose of skepticism with curiosity about what genuine technical innovation can accomplish.
Each of us who writes for diginomica would put this differently. We don't always get it right either. But I can tell you my colleagues here talk a lot about these issues, and about our obligation to provide a context that helps people see through the enterprise BS, and find the use cases that matter. Improving ourselves while improving our projects is probably a lofty goal, but it beats the heck out of flogging smart locks.