Just as “war is too important to be left to the military,” so is it now clear that the cloud is too important to be left to the IT department.
Paradoxically, of course, the most rigorous IT skills are as vital to victory in the cloud as a general’s skills are on the battlefield – but asking specialists to win a war is not the same as asking them to choose its time or define its purpose.
The time for the cloud is now, whether or not an IT department thinks the cloud is sufficiently ‘mature’; the purpose of the cloud is not merely to speed the pace, or cut the cost, of our past conceptions of what IT does or what IT is for.
If you treat the cloud as merely a way to modernize your IT, that’s all you’ll get. No matter how creatively and competently that modernization is done, it will not be enough. Broadening the perspective requires enlarging the community of stakeholders and lengthening the list of criteria for success.
Don’t be confused by the obvious
It’s both easy and common to see the cloud as merely the straightforward evolution of IT, along a familiar path defined by oft-misquoted “laws.” One can argue, with abundant justification, that what we call “the cloud” is just the inevitable result of the crossing of curves of demand, capability, and cost.
In particular, processor speeds have clearly risen over the past 30 years at a compound annual growth rate of roughly 25% – a trend many misrepresent as “Moore’s Law,” ignoring Moore’s actual statement, which concerned the number of transistors on a chip, not its computational power. Whatever it is called, however, this is a phenomenal record of sustained improvement – until it is placed in perspective against the rise of connectivity.
Bandwidth to homes and offices has skyrocketed, from 300-bps modems to 30-Mbps fiber connections: an annual growth rate of roughly 45%. It follows that at various points along the way, an application that in 1983 could not be adequately responsive unless it ran on the user’s desktop would have crossed a threshold and become more useful as a remotely delivered service.
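The comparison can be verified with a few lines of arithmetic. The sketch below uses only the figures quoted above (300 bps to 30 Mbps, a 25% processor CAGR, a 30-year span) and the standard compound-annual-growth-rate formula:

```python
# Back-of-the-envelope check of the growth rates cited in the text.
# Inputs (300 bps in 1983, 30 Mbps today, 25% CPU CAGR, 30 years)
# come from the article; the arithmetic is the standard CAGR formula.

def cagr(start, end, years):
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

YEARS = 30
bandwidth_growth = cagr(300, 30_000_000, YEARS)  # 300 bps -> 30 Mbps

print(f"Bandwidth CAGR: {bandwidth_growth:.1%}")           # ~46.8% per year
print(f"Bandwidth multiple: {30_000_000 / 300:,.0f}x")     # 100,000x overall
print(f"Processor multiple at 25%/yr: {1.25 ** YEARS:,.0f}x")  # ~808x overall
```

The gap is stark: at these rates, the same 30 years that multiplied processing power roughly 800-fold multiplied bandwidth 100,000-fold – which is exactly why the threshold described above keeps being crossed.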
Any number of business tasks can benefit from always-current access to continually validated data, not in a batch mode of periodic updates but simply as a matter of constant connection to shared truth. Any number of collaborative efforts can work more smoothly when synchronization is not an occasional operation, but a continuous and automatic behavior. Everything from in-car navigation systems to concurrently editable documents bears witness to this change.
Throughout the workplace, we can see the aggregate result in rapid de-emphasis on desktop PCs—often burning more than 100 watts, perhaps with further support from a ziggurat of icecap-melting middleware servers—replaced by tools that do their work, more efficiently, elsewhere as commanded by a thin-client tablet or smartphone via wireless connections.
It might seem, in consequence, that cloud adoption is mainly a matter of re-architecting applications to make secure and efficient use of the shared facilities that are now the more attractive environment for many workloads. Beware, however, the temptation to measure progress in cloud adoption as the percentage by which the application portfolio has been thus migrated. It’s an easy measure of what is, but it ignores what could have been.
“The Cloud” is the canvas, not the picture
If an organization measures its progress in terms of application cloudification, it may briefly enjoy the resulting illusion of innovation – only to wake up one morning and discover that its IT stack has become a lean, mean, but irrelevant machine. The cloud, it turns out, is not a work in itself, but only a medium for what’s actually a strategic objective: superior connection.
How can connection truly transform a process? To borrow the verb du jour from higher education, connection lets us “flip” the process, turn it inside out, rather than merely speeding up an unsustainable mess.
In education, the notion is to flip the roles of classroom and study room. Rather than bringing students together for mass, one-to-many delivery of lectures, why not let those students receive that lecture content at the time and place most convenient to them?
- Why not let any student replay any segment of the lecture as needed, rather than risking the embarrassment (and consuming the time) that goes with any request for repetition in front of dozens or hundreds of others?
- Why not embed quick pop-up questions in the lecture material, not to test the student, but to test the lecture and determine if key points are being successfully communicated?
- Why not take the resulting data, and use it to guide small-group follow-up sessions to focus on what the largest fraction of students understood least well?
This is not merely “making digital” the existing process, but redesigning the process around the availability of connection and the ability to offer content on demand to different people at different times. If someone wants to call that a “cloud lecture capability,” so be it.
Is this just a conversation about education? Clearly not. Consider the same approach to medicine. Rather than bringing patients to a doctor’s office for performance of routine tests, most of which will most of the time report a result of “test not needed,” why not place basic measurement capability at the edge of the system instead of the center? Why not collect enough information to detect a change in a patient’s condition that calls for a more elaborate test, and only in that event schedule an office visit and conduct a more elaborate analysis?
The same approach can be taken to industrial and consumer-product settings. Rather than changing the oil in a car every N months – where this is a proxy for estimating M miles of driving causing Q degrees of wear and tear – why not build a simple oil-condition sensor into the vehicle: a detector that unambiguously says when an oil change is needed, and reports that state to the service provider to trigger an invitation for an appointment?
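The edge-measurement pattern described above can be sketched in a few lines: a cheap local reading is evaluated continuously, and the expensive central action (an office visit, a service appointment) is triggered only when a threshold is crossed. The class, field names, and threshold below are hypothetical illustrations, not any vendor’s API:

```python
# Minimal sketch of threshold-triggered edge monitoring, assuming a
# hypothetical normalized sensor reading (0.0 = fresh oil, 1.0 = fully
# degraded). All names and values here are illustrative.

from dataclasses import dataclass

@dataclass
class EdgeMonitor:
    threshold: float          # condition level that warrants central action
    alert_sent: bool = False  # ensures one invitation, not a nagging stream

    def ingest(self, reading: float) -> bool:
        """Return True only the first time a reading crosses the threshold."""
        if reading >= self.threshold and not self.alert_sent:
            self.alert_sent = True
            return True
        return False

# Oil-condition example: only the third reading triggers an appointment.
oil = EdgeMonitor(threshold=0.8)
for degradation in (0.2, 0.5, 0.81, 0.9):
    if oil.ingest(degradation):
        print("Oil change needed -- notify service provider")
```

The design choice worth noting is that the decision lives at the edge; the center hears nothing until there is something actionable to hear, which is precisely what makes the economics of the next paragraph work.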
Why should automakers take on the cost of putting this added hardware in the car? Because in many cases, routine scheduled maintenance is now included in the price of the vehicle for the first several years after initial sale. Doing fewer unnecessary procedures therefore lowers both the cost to the automaker and the nuisance to the vehicle owner, without compromising the lifetime or performance of the vehicle: a win-win scenario.
Again, it’s important not to think too timidly about the transformations thus enabled. When data on machine behavior is collected, unobtrusively and accurately, the resulting insights can be significant. Pre-failure signatures can be discovered that allow proactive service before there has even been a visible symptom, let alone a breakdown of a factory machine or a home appliance – or a human body.
The very nature of a service relationship can be elevated from damage control to proactive customer care. In a world of social network connection, where the consistently delighted customer who sings a product’s praises is the only credible “convincer” for prospective buyers, this is not an option. It is a necessity.
None of these things is an obvious evolution of IT as we have known it. These are transformations of process and redefinitions of product. If one shops for cloud services to cut the cost of familiar IT activities, these novel opportunities will not call attention to themselves: they will only be recognized when competitors turn them into crushingly effective tools for stealing away the only customers worth having, which is to say those who are eager to pay more for a superior experience.
The existence of such customers is a proven fact. Research published by American Express shows a growing fraction of the customer base preferring better service even at a premium price: in 2010, 58% of customers accepted the idea of a 9% premium; by 2012, that group had grown into a profitability lode of 66% of U.S. consumers willing to pay an average premium of 13%.
The cloud is the only canvas on which this picture can be painted. Without the cloud’s asymptotic approach toward connection everywhere; computation at peak workload intensity without the need to own peak capacity; and instant global deployment of experimental prototypes of new engagement models, these things will not merely happen more slowly. They will not happen at all.
Engage the entire organization
As we ravenously clamor for vast volumes of rich data, and as we struggle against fundamental physical limits of storage density and computational speed, we are inexorably forced into super-scale service hubs as the only places where Moore’s Law still reigns – enabled by spectacular growth in the speed and ubiquity of our connections.
Even though this triumph of bandwidth over core count is a measurable fact, it should not be the driver of cloud strategy, any more than the invention of new weapons demands finding a war that offers opportunities to test them. The purpose of IT is not to store things or compute things, but to achieve business (or scientific, or governmental, or educational, or medical…) outcomes.
Lines of code written, petabytes stored, and cycles executed are not outputs produced, but activities performed – and to borrow from Peter Drucker, it is the cardinal sin of the talented technologist to do more efficiently what should not be done at all.
Let’s talk, then, across the entire organization—not just within the walls of the IT department—about what to seek, and what to get, from the cloud.