An interesting, and quite possibly fundamental, question emerged in Paris last week. It was not stated in the way I will put it here, but this is the essence of it: now that the use of cloud services has become widely accepted as the baseline for all digital transformation projects, are businesses opening up their operations to increasing levels of danger?
The question was the running, implicit sub-text of just about all the presentations at the annual conference of CAST Software, held in the French capital. It kept cropping up because the company’s area of specialism is Software Intelligence. That, unfortunately, is a term open to some free-form interpretation, and it is not helped by the leap to prominence of Artificial and Augmented Intelligence tools. Software Intelligence is not the same thing, though AI and machine learning do form components of its mix.
The subject was even addressed at the conference by Chairman and CEO, Vincent Delaroche, with a glancing blow off the side of the analyst community:
One difficulty is that the company does not fit into the typical boxes that analysts create. It is not about security, or analytics, or narrow subjects like that. It is about knowing your software.
With cloud services come a couple of increasingly 'given' realities. One is that many of the applications, tools and service components used are built on open source code. The other is that many of those applications and tools are now written using agile code development practices in Dev/Ops environments.
The advantages of these are well-known. They are extremely cost-effective in most cases, they have expanded granularity of choices so users are far more likely to be able to source just what they want, and they have shrunk development times to shadows of their former selves.
There is a growing awareness, however, that they carry a bunch of downsides, ranging from the switch of risk from the developers to the users, to a dog’s breakfast of legal issues around factors such as the multiplicity of open source licensing options often found in the same application.
That sentence is easy enough to write, but for any user, finding out how this might affect its software and applications portfolio could be a major problem. It could be an even bigger problem to assume it doesn’t matter.
Finding those problem areas in software, and learning what can, and ought to, be done to resolve or remove them, is the intelligence that CAST’s software tools set out to provide.
Until a couple of years ago CAST’s customer base was amongst the larger global corporations, especially in areas such as finance and pharmaceuticals, where dependence on high quality IT services is high. But since 2017 there has been a marked upward trend in the contribution made to the company’s revenues by the larger consultancy organizations such as Boston Consulting Group and EY (Ernst and Young as was). In 2017 such companies provided 10% of the revenue, while last year this had doubled to 20%. Within two years Delaroche expects this share to rise to 50%.
This is a sign that the growth in cloud services will bring with it a commensurate growth in both open source and agile development problems, coupled with a widening use of such services by businesses without the skills base to tackle the problems likely to follow in their trail. And even those that escape being victims will likely see the sense of taking proactive action to avoid being hit. This in turn should provide a new market for the consultancy companies equipped to analyse customer applications portfolios and remediate any problems found.
Use Agile and open source – and be out of control
The need for businesses to be able to examine the inner workings of their applications and set them against known comparative metrics is a subject now rearing its head when it comes to using agile development approaches.
Harold van Heeringen, a senior consultant at Metri, a Dutch software performance measurement advisory firm, is an expert in working with Agile development teams to improve their performance. He told delegates of some of the issues starting to emerge with this approach. Despite agile becoming one of the 'must have' development options, especially for rapidly changing cloud services, there are aspects of it that are not all good, especially when it comes to measuring and managing developer/supplier performance and the shift in ownership of the financial risks involved:
The old ways of application development were based on time and materials, essentially body shops where the financial risk is on the supplier side. This was followed by fixed price contracts, where the financial risk is still mainly on the development suppliers, but is now shared with the customer if they are poor at evaluating potential development costs. But with agile development nearly all the financial risk is now on the customer’s side. Power goes to the development teams and they decide how they do the work. So they are potentially unmanageable. In addition, there is a lack of standardised metrics with which to measure the jobs.
He now sees senior managers sliding down the famous Gartner Hype Cycle towards the Trough of Disillusionment about agile development as they see no way to control it. This is why Metri has decided to partner with CAST to access tools that measure code function points in an internationally recognised, standard way, and even automate the process:
It allows users to work with suppliers to measure the performance of the software product and the development process. It also allows them to target areas that need attention and those that can be de-emphasised.
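To make function point measurement a little more concrete, here is a minimal sketch of counting unadjusted function points using the standard IFPUG component types and weights. The sample inventory is invented for illustration, and automated tools such as CAST's work from the code itself rather than from a hand-built list like this:

```python
# Toy sketch of IFPUG-style unadjusted function point counting.
# The component types and weights are the standard IFPUG values;
# the sample inventory below is purely hypothetical.

# Weight table: component type -> (low, average, high) complexity weights
WEIGHTS = {
    "EI":  (3, 4, 6),    # external inputs
    "EO":  (4, 5, 7),    # external outputs
    "EQ":  (3, 4, 6),    # external inquiries
    "ILF": (7, 10, 15),  # internal logical files
    "EIF": (5, 7, 10),   # external interface files
}
COMPLEXITY = {"low": 0, "average": 1, "high": 2}

def unadjusted_fp(inventory):
    """inventory: list of (component_type, complexity) tuples."""
    return sum(WEIGHTS[ctype][COMPLEXITY[cplx]] for ctype, cplx in inventory)

# A hypothetical mini-application: two simple inputs, one average report,
# one simple query, and one simple internal data store.
sample = [("EI", "low"), ("EI", "low"), ("EO", "average"),
          ("EQ", "low"), ("ILF", "low")]
print(unadjusted_fp(sample))  # 3 + 3 + 5 + 3 + 7 = 21
```

Because the weights are fixed by an international standard, two parties measuring the same application should arrive at the same number, which is exactly what makes the metric usable in supplier negotiations.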
The issue with open source was addressed by Roberto di Cosmo, Head of Software Heritage at the French Institute for Research in Computer Science and Automation (INRIA), with a simple observation on an increasingly common saying:
If software is eating the world, then open source is eating the software world.
According to a recent survey by Dev/Ops business Sonatype, between 80% and 90% of new application code employs reused components, and they can come from just about anywhere. Di Cosmo said:
It is a kind of a mess. Do you know where your software comes from? The stuff you acquire, the stuff you ship? As an example, INRIA recently found an app with 14 instances of a compression library, and each one was a different version. And there would be legal liability on all of them, if the wrong one gets used.
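The multiple-versions problem di Cosmo describes is straightforward to sketch: given a flattened component inventory of the kind a software composition analysis tool might extract (the names and versions below are hypothetical), grouping by name immediately surfaces anything shipped in more than one version.

```python
from collections import defaultdict

# Hypothetical flattened dependency inventory for a built application.
dependencies = [
    ("zlib", "1.2.8"), ("zlib", "1.2.11"), ("zlib", "1.3.1"),
    ("libpng", "1.6.40"), ("openssl", "3.0.13"),
]

def duplicate_versions(deps):
    """Group components by name; report any shipped in more than one version."""
    versions = defaultdict(set)
    for name, version in deps:
        versions[name].add(version)
    # Note: sorting here is lexicographic, not semantic-version order.
    return {name: sorted(v) for name, v in versions.items() if len(v) > 1}

print(duplicate_versions(dependencies))
# {'zlib': ['1.2.11', '1.2.8', '1.3.1']}
```

The hard part in practice is not this grouping step but building the inventory in the first place, since reused components are often copied into a codebase with no manifest recording what they are.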
His job has therefore been created to collect the source code of every open source application and component ever written and make it all freely available. So far, he has collected 200TB of separate blobs from 85 million projects, covering 5.5 million unique code items. The system has built-in deduplication; intrinsic, un-forgeable identifiers at all levels; and simplified traceability of important subjects such as licencing.
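Those intrinsic identifiers, and the deduplication they enable, can be illustrated with the git-style content hash that Software Heritage's identifiers build on for file contents: the ID is computed from the bytes themselves, so identical content always gets the same ID regardless of where it was found.

```python
import hashlib

def blob_id(content: bytes) -> str:
    """Git-style content hash, the scheme Software Heritage identifiers
    build on for file contents: SHA-1 over a small header plus the bytes."""
    header = b"blob " + str(len(content)).encode() + b"\0"
    return hashlib.sha1(header + content).hexdigest()

# Identical bytes always map to the same identifier, wherever they came
# from; that is what makes the ID intrinsic and deduplication automatic.
store = {}  # a content-addressed store: re-inserting a duplicate is a no-op
for blob in (b"print('hello')\n", b"print('hello')\n", b"x = 1\n"):
    store[blob_id(blob)] = blob
print(len(store))  # 2 unique blobs stored, not 3
```

Because the identifier is derived from the content rather than assigned by any authority, it cannot be forged without breaking the hash function, and anyone can recompute it independently to verify what they have.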
“I feel I am building the code equivalent of the Library of Alexandria.”
Open source licencing and its relationship to commercial use, especially in large global organisations, is fast becoming a subject over which many of them could trip up. The need to identify the licence type and date of every open source component used in an application or systems management environment is now paramount, for a company may easily find that what it felt sure was its own IP, as vested in the whole application, is in fact owned by several different open source licence holders.
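As a rough illustration of what a licence inventory involves, the sketch below scans a source tree for the standard SPDX-License-Identifier convention. It is only a first pass: real composition-analysis tools also match full licence texts, precisely because declared headers can be missing, and CAST's own approach to this is not something the conference detailed.

```python
import re
from pathlib import Path

# Minimal licence inventory pass, assuming components declare their licence
# with the standard SPDX-License-Identifier header convention.
SPDX_RE = re.compile(r"SPDX-License-Identifier:\s*([\w\.\-\+ ()]+)")

def licences_in_tree(root):
    """Map each file under root to the SPDX licence ID it declares, if any."""
    found = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            head = path.read_text(errors="ignore")[:2048]  # headers come first
        except OSError:
            continue
        match = SPDX_RE.search(head)
        if match:
            found[str(path)] = match.group(1).strip()
    return found
```

Even this naive scan, run over a real application, tends to reveal the multiplicity of licences the article describes, which is the point at which the legal questions start.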
There were unconfirmed hints at the conference that CAST may be in the process of addressing this issue so that the real licence status of all open source components can be identified, even if written details in the source code have been erased.
Curating the new
The fact that applications development now plays one of the most important, central roles in the activities of all businesses means that management, measurement and control are all crucial to understanding its efficacy within those businesses. It follows that apps development can no longer be considered an 'art' (though that can still sometimes have its place). Now it is the process around which most business activity hinges, and it has to be effectively curated, well understood and well managed.
Abhijit Lahiri, Senior Vice President and Chief of Transformation with Tech Mahindra, outlined for delegates ways in which this issue can now be addressed:
Customers now want to see the business outcomes before anything is written. This is standard in many other industries and has to be the way with applications, with ways of overseeing the process, the results of each step, and assessments of the quality of the components. It also means setting all that against wider value judgements such as the cost of the process or the time it takes.
He sees building an effective curation process as a vital step, particularly with the growth of cloud services, where as much as 80% of the software required can be built on the containerised re-use of components. He acknowledged that this will often involve multiple suppliers, which in turn highlights the need for curation and transparent management. This then allows every contributor to understand what he called the "flight path" of any project, and to check and monitor its progress.
There is also a need to upskill people, despite any natural reticence towards such an investment. He offered an interesting anecdote to make that point:
A CFO says to a CEO, 'What happens if we invest in developing our people and they leave?' The CEO responds, 'What happens if we don’t and they stay?'
The key here, he suggested, is micro-upskilling: running many project-focused short courses on specific topics, making use of AI-enabled teaching tools, and overseen by what he called "full stack Scrum commandos", small teams of multi-skilled people with the aim of getting projects right first time.
This curation process is also leading major global corporates to focus development in large in-house development centres, as observed by Malay Shah, Executive Director with EY.
These are where the companies have decided to build their own resources for development work. They are not just centres for core products, either, but also conduct R&D on software applications and technologies, often ending up as producers of patented developments. It is also perhaps worth noting that 43% of these centres are located in India, with the Philippines next at 13% and Poland at 7%.
The goal now is to build global capability centres that can become digital Centres of Excellence, and trusted partners for third party customers, a step that is only possible because they now have the ability to measure and analyse code quality, cloud readiness and code structures, as well as provide developer team guidance.
Many will no doubt talk about the cramping of innovation and flexibility, just as agile development and the widespread re-use of open source components threaten an explosion of new application development. But at the same time, business managers need to remain aware that those same applications will not be optional extras and nice-to-have fripperies. They will now form the heart, soul and muscles of what makes the business tick. For many, those applications will be the business.
If you have no idea how they work, what logical, operational or security frailties they have, or who actually owns vital components and could quite happily stop your applications from being used at all, then your business could well be riding for a fall.