A favourite word of mine - not least because, to the best of my knowledge, it is one of my own creation - is de-cerebralisation. Basically, it means making things a ‘no-brainer’ to understand, to use, and to get benefit from in a timely manner.
It is something that IT businesses have not been good at. Indeed, they collectively seem to have revelled in the increasing levels of complexity associated with their products and services. And when a major cloud services provider can offer around 1,700,000 different service configurations, each with its own pricing, it is maybe time to call a halt.
In practice, trying to stop the actual complexities that underpin what the technology can now offer users is not possible, as well as being very counter-productive. But there certainly is a case for vendors to start moving what they offer users up the levels of abstraction, away from pure technology for its own sake and on to functionality. What is more, that functionality needs to relate far more directly to what their customers do as a business, and to why a given product or service might be useful to them.
Even before the arrival of the Coronavirus pandemic, 2020 was showing itself to be the year the penny started to drop for the vendors, and the pandemic has only added to that pressure. With all businesses having to change and adapt very fast, tech vendors have realised that clarity, in the users’ own terms, is the way forward. So it has been good to see that this year has been the one where simplification and redirection of the message started to sink into the mindsets of technology vendors.
Should CIOs now admit ‘defeat’ and push the job of understanding that level of minute detail to the likes of the major – and not so major, niche specialist – systems integrators? Indeed, is this really where the Cloud Service Providers themselves should be heading, providing services and support that are genuinely tailored to the needs of their customers rather than to the cost-effective benefit of their own revenue streams?
Why? Early in the year ‘Technology Live’, a series of tech seminars, highlighted the tech complexity issue by promoting some sessions that were so abstruse not even the speakers really understood what they were talking about. The value of quoting that story here is that it still sums up the growing need for much greater simplicity. Then a motif popped into my head: CIOs are a bit like sea captains – they need to know where they are on the sea, which direction they are heading and how much sea is beneath them. But what if the ‘sea’ is fractal? Many are getting dragged into such waters and finding that the sea has no bottom. Like a fractal, as they approach the limit of what they think they can find, there is always more of it, plunging them into ever more detailed complexity.
I would say that the bigger problem is complexity and the cycle of creative destruction that is getting shorter and shorter. The idea of substituting one technology for another, of rebooting/refreshing technology much faster, upgrading, migrating. Many of these things are not well done. We're not good at lifecycle management.
Why? This was how Dheeraj Pandey, retiring CEO of hyper-converged services specialist Nutanix, set out one of the key problems that building cloud services has created. Pulling together, onto coherent platforms, the types of services that make functional sense, and making them available on all the main cloud service providers, was an obvious move. In addition, making software licences portable gives users the chance to move tasks to the most appropriate service – including on-premise – without tripping over licensing legalities. Adding Microsoft Azure to the existing Google Cloud and Amazon AWS availability was the obvious last step to take.
In many ways this may look like another ‘businesses partner up’ story, but to me it also sets out what could be an important worked example of how the ‘platform-ization’ of cloud-delivered services will create complete service packages, where CIOs need to pay less and less heed to the specification of the required technologies and much more to the efficacy of achieving the required business goals. And in a fast-changing business world, that can only help improve overall business agility.
Why? This is a good practical example of what is possible – and of how the role of the CIO is changing from ‘head of applied technologies’ to ‘broker of available services’. China’s Alibaba Cloud service is the technical backbone of the Alibaba online retailing giant, and its services are available in a range of options, from access to the raw technology through to a complete, soup-to-nuts sales and marketing platform. By partnering with Aryaka to exploit the latter’s software-defined WAN technologies, Alibaba gives users the chance to manage where and how their own networks running on Alibaba are connected, orchestrated, optimised and secured, together with the always-essential analytics, all delivered as a packaged, tailorable service.
We'll let you have the products you choose across your enterprise and it's our job to work with them; it's not your job to make your products work with us. And secondly, your data should be wherever you decided to put your data, you don't need to move it into our format or into our database or into our cloud. You can leave it where you want. And just imagine if you didn't have to re-tool your enterprise to match Appian. You would be free of the leverage trap that big tech has set for you.
Why? This is a classic example of moving the goalposts CIOs normally aim for to a new part of the field, set on higher, easier-to-see ground. As Appian CEO Matt Calkins observed, CIOs can fall into the trap that, if all they have is a hammer, then everything looks like a nail. This often means approaching every problem in the way they know best, with the technology they understand most. For Appian, which mixes low-code application building with AI, workflow and Robotic Process Automation, the next level of abstraction is to make its platform open, so that users can still exploit their favourite tools where they feel the need, but set within a far wider, more elevated environment. In an interesting analogy, Calkins likens the modern workflow to the operating system of automation, where digital workers and human workers are like the applications that plug into that O/S.
[NoOps] is a term that usually creates a lot of friction, or let's put a positive spin on it, a lot of interest. The whole autonomous cloud project consists of two levels: the enabling of the technology, and obviously Keptn is a key part here, but also the continuing cultural change in an organisation. We want to help people build a system where most of the manual processes they are doing today and have been doing over the last 20/30 years, are automated.
Why? As the complexity of cloud systems grows, one simple truth emerges as an important sub-text to the ‘gee-whizz’ glamour of what the technology can achieve: managing such systems reliably in a modern production environment is now just about beyond the wit of any individual human, and indeed most groups of them working together. Any event that steps outside their experience promptly stalls the management process while the inevitable committee decides what to do. It is the contention of Dynatrace that this growing problem has not yet been met by the appearance of suitable management tools, which has led the company to create a NoOps capability that captures the lessons learned and the best practices developed during its own journey towards building a NoOps environment: in effect, an autonomous cloud service known as Keptn.
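The idea of automating away the committee can be illustrated with a minimal sketch. All names here are hypothetical, not Keptn’s or Dynatrace’s actual API: an automated quality gate evaluates a deployment against a service-level objective and remediates on its own, rather than stalling while humans decide.

```python
# Minimal sketch of an automated quality gate with self-remediation.
# All names and thresholds are illustrative assumptions, not a real API.
from dataclasses import dataclass


@dataclass
class Deployment:
    version: str
    error_rate: float  # fraction of failed requests observed in production


ERROR_RATE_SLO = 0.01  # hypothetical SLO: at most 1% of requests may fail


def evaluate_and_remediate(current: Deployment, previous: Deployment) -> Deployment:
    """Keep the new version only if it meets the SLO; otherwise roll back."""
    if current.error_rate <= ERROR_RATE_SLO:
        return current   # quality gate passed: promote automatically
    return previous      # gate failed: automated rollback, no meeting required


# Usage: a bad release is reverted without manual intervention.
good = Deployment("v1.4", error_rate=0.002)
bad = Deployment("v1.5", error_rate=0.08)
print(evaluate_and_remediate(bad, good).version)  # v1.4 - rolled back
```

The point of the sketch is cultural as much as technical: the decision rule is captured once, as code, instead of being re-litigated by a committee every time an incident occurs.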
The reality is that people always have a unique set of comfort levels around how they want to see their data…What's happened in the analytics market is a lot of people have tried to create the perfect analytics tool, and what they find is that it works really well for a particular set of users. But it's pretty shitty when it goes outside of that. People bolt onto it, and it becomes a big, bloated mess…So with hyper-converged analytics, we're embracing the fact that there are all these different personas out there.
Why? Sometimes companies make acquisitions for wider reasons than just buying market share, and instead look towards changing market dynamics. This, arguably, was the case when TIBCO set out to acquire Information Builders. For TIBCO COO Matt Quinn, the dynamic is about shifting the emphasis away from databases and the technology underpinning data, and towards the data itself being treated as an addressable, exploitable entity. The technical issues of doing analytics have broadly been solved; the issue now is helping users achieve much better results by improving and simplifying the way they decide what it is they want to analyse; why they want to analyse it; where to find it; how to pull different but relevant streams of it together; and how to deliver results that are exploitable.
Should a wrong decision be made and you have an impact assessment in place, then you can revert back to it and actually say, ‘okay, why didn’t you follow these? Because if you were to follow these recommendations, then some of those things would have been avoided’.
Why? There is certainly a danger that an act of regulation could be used by any nation state to exercise its particular political or cultural biases. It is their prerogative, they are legitimised to do so, and indeed some are moving in that direction. This is, however, quite possibly the worst move any nation state can make, for the more that do it, the more complex any form of modern communications will become. Dr Konstantinos Komaitis, Senior Director of Policy, Strategy and Development at the Internet Society, suggested it is time for business managers to step outside of the straitjacket of arguing about which technology to use, and then which technology should be used to manage that technology, and then what technology should be used to monitor the technology managing the technology you are using but are starting to have doubts about – and onward into nano-granularity of technologies. One important tool here would be the availability of independently conducted impact assessments, as both a possible brake and a fall-back position.
We look at the data fabric as two things. It brings the simplicity and flexibility of the cloud to the data center and it is also about taking optimisation and enterprise data services to the public cloud as well. It gives people the ability to feel more confident and more secure in terms of running some of their mission-critical applications inside the big public cloud providers.
Why? Back in the days of the three-yearly update of major applications and software tools, each occurrence was followed by the task of optimising the software both to the hardware being used and to the specific requirements those applications and tools were intended to meet. Each time this was a long, hard and complex job that staff were eventually glad to see the back of. But that regime is fast disappearing as cloud services allow rapid creation of applications, fast turn-round of applications in use, and applications with a lifecycle lasting only a few weeks. Optimising for this, so that the best level of service and performance can be achieved as quickly as possible, requires a new approach. Storage systems specialist NetApp is now pitching continuous optimisation as an answer. This is part of the company’s move to providing users with a new data fabric, as the company’s Chief Technology Evangelist, Matt Watts, explained.
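The shift from one-off tuning to continuous optimisation can be sketched in a few lines. This is an illustrative assumption, not NetApp’s actual data-fabric API: a background loop re-tiers data as observed access patterns change, instead of a single manual optimisation pass per release cycle.

```python
# Minimal sketch of continuous optimisation via automatic storage tiering.
# Names and the threshold are illustrative assumptions, not a vendor API.

HOT_THRESHOLD = 100  # hypothetical: reads/day above which data belongs on fast storage


def retier(volumes: dict[str, int]) -> dict[str, str]:
    """Map each volume to a storage tier based on its observed read rate."""
    return {
        name: "ssd" if reads_per_day >= HOT_THRESHOLD else "object-store"
        for name, reads_per_day in volumes.items()
    }


# Usage: run on every telemetry interval rather than once per upgrade cycle.
placement = retier({"billing-db": 5000, "2019-archive": 3})
print(placement)  # {'billing-db': 'ssd', '2019-archive': 'object-store'}
```

The design point is that the optimisation decision is re-evaluated continuously against live telemetry, so a volume that cools off migrates to cheaper storage without anyone scheduling the work.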
We see it as a way to increase the size of the analytics talent pool in organizations and scale data-driven decision-making without having to dramatically increase the number of data scientists. By building capabilities such as best-practices templates, automated analysis and plain-language interpretability into our tools, we make it possible for users to do analyses that previously would have required advanced skills. This frees up data scientists to work on highly complex projects and serve as a best-practices team.
Why? The results of simplification in action can be seen in the way that, for a few years, Big Data analytics was seen as the bleeding edge of technology. Now the talk is of finding candidates for the role of ‘citizen analyst’ in the many thousands of businesses that could readily exploit democratized data if the right additional tools are available. SAS CEO Jim Goodnight is keen to show that such tools are now available from SAS as a cloud service via Microsoft Azure. These are no experimental toys. He stressed that one important aspect of data democratization would be the overriding importance of maintaining rigorous data and analytic governance, as well as ensuring ongoing staff training on best practices for analysis. He saw this as critical to ensuring the right questions are asked, the right data is used, and that ultimately the work leads to usable results and better decisions.
Cloud is definitely not a technology issue, there are a lot of people and process changes required before modernisation begins.
Why? Three surveys earlier in the year, from Nutanix, CAST and Mulesoft, coincided to demonstrate why any steps towards simplifying technology and climbing the levels of abstraction make it easier for users to identify which technologies go beyond the hype and add real, usable value. This is particularly important for those looking to digitally transform their businesses. Back in 2017 Gartner Research was issuing clarion calls for CIOs to “embrace the urgency of digital transformation”, and these surveys pointed to a potential stalling of the transformation process. The obvious short answer is a slide into the well-documented Gartner Trough of Disillusionment, but they also give some evidence as to how that trough emerges. There are certainly vagaries in the marketplace itself, and sufficient business uncertainties to make run-away-and-hide seem like a plausible business strategy. But these reports also showed that many businesses were still unclear about what they hoped to achieve or how to achieve it, while those they should turn to – the systems vendors – were still far too prone to playing ‘confuse-a-cat’ with the hokum of technological detail. Progress has been made this year, but much more is still needed on both sides.