One of the turns of phrase already commonplace in the lexicon of edge computing is ‘the edge is going to eat the data center’. A little thought about the changes that edge computing is likely to bring with it certainly makes that a possible scenario, but is it likely?
At the recent conference hosted by Zededa, the question was put to the test from the start with an opening keynote panel session, featuring some of the great and good among the sector’s early entrants, who set about trying to pronounce judgement on the assertion.
The panellists were Keith Basil, VP of Cloud Native Infrastructure for Linux distribution house, SUSE; Muneyb Minhazuddin, VP of Edge Computing at VMware; Tony Shakib, General Manager of Azure IoT at Microsoft; and Said Ouissal, Zededa’s Founder and CEO.
The moderator was Christian Renault, IoT Research Director at analyst firm 451 Research, who set out the key point at issue: will edge computing create a completely decentralised, distributed environment, with the cloud as it is currently known fading into the background, or will some new point of equilibrium be established?
The collective wisdom of the panellists came down in favour of the latter, but this outcome would in turn be likely to create a new requirement for CIOs in the shape of some detailed thinking and business analysis, as that point of equilibrium will be different for every individual company. There is unlikely to be a single metric that defines it for any and all businesses. This suggests that CIOs and others in the C-suite will need to undertake significant new analysis of business processes, technologies and market changes.
Some obvious divisions
Some of the pointers the panellists identified are broadly obvious. For example, with the cloud offering effectively unlimited compute and storage capabilities, it is the right place for the most demanding workloads, such as training machine learning models. Applying those models, however, is likely to be a different issue, with many applications running at the furthest edges of operational networks, where real-time control of manufacturing tools and processes cannot accommodate the latency the cloud introduces. What this will require, therefore, is a well-orchestrated way to distribute workloads so they run where it makes most operational and economic sense.
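That placement logic can be sketched in a few lines. The function and thresholds below are purely illustrative assumptions, not any vendor’s actual orchestrator: the point is simply that latency tolerance and compute demand pull workloads in different directions.

```python
# Illustrative only: a toy placement rule for deciding where a workload runs.
# The threshold (10 ms) and the three destinations are hypothetical assumptions.

def place_workload(latency_budget_ms: float, needs_large_compute: bool) -> str:
    """Return 'edge', 'cloud' or 'data center' for a workload.

    latency_budget_ms: maximum tolerable round-trip time for the workload.
    needs_large_compute: True for heavy jobs such as ML model training.
    """
    if latency_budget_ms < 10:          # real-time control loops must stay local
        return "edge"
    if needs_large_compute:             # training-style jobs suit the cloud
        return "cloud"
    return "data center"                # everything else stays on-premises

print(place_workload(1, False))    # real-time machine control
print(place_workload(500, True))   # ML model training
print(place_workload(100, False))  # routine back-office job
```

A real orchestrator would of course weigh many more factors (data gravity, cost, sovereignty), but the shape of the decision is the same.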
Minhazuddin offered up some statistics from VMware research across its user base which suggest that in the next five years, workload distribution will split somewhere along the lines of 30% to the data center, 40% to public cloud-based services and applications, and 30% out at the edge. Right now, deployments at the edge stand at about 5%. So while the edge won’t ‘eat’ the cloud, it will change it, as Zededa’s Ouissal observed.
“I think the edge is an architectural change of the cloud. Now, will it change the business models? Potentially. Will it change technologies? For sure. Will it change the companies that are going to be leading in this space? Absolutely, because it’s one of the fastest shifts we’re going to be seeing. The cloud started in 2005 and look at how much it’s now taking out of the budgets today. I think edge is going to go faster.”
This does raise an interesting side question for the user community. There are still many businesses for which getting into the cloud and implementing business transformation remain on the ‘to do’ list, so the arrival of the edge means they risk being left further behind. For some this will not matter: their business model is tailored to the needs of their marketplace and changes only slowly. But in practice it is a rare business that operates in a marketplace that does not change, and an even rarer one where that state remains constant and self-sustaining. This is particularly the case following the changes forced on businesses and markets by the last 18 months of the pandemic.
Industrial sector’s ‘old’ is becoming the new ‘new’
In this context, the panel did agree that one of the important drivers of fast edge growth is the fact that industrial sectors which traditionally lag behind in adopting the latest technologies are now amongst the bleeding-edge users. With their widespread use of sensors, monitors and control/actuation devices, the need to connect their assets to each other and to cloud services is making the edge real, and making those sectors the main drivers of its current growth.
The current growth of edge computing can be seen as a move into digital and business transformation as the technology is available and the value can be seen and grasped. What can be expected, however, is that this growth will also lead to the development of a number of optimized Kubernetes-based architectures capable of driving a range of established workloads.
One of the most important, and quite possibly most complex, decisions that edge computing throws up concerns autonomy and ‘air-gapped’ systems. For a large number of edge applications, the ability to work autonomously will be vital, not least because the cost, inconvenience and bottlenecking that stem from requiring constant connectivity with a traditional data center will have serious negative impacts on business. Indeed, in another part of the conference, a clear indication emerged of one of the major changes likely to affect the current perception of cloud services and data center operations.
Large, centralized data centers will be incapable of providing anything like the real-time control needed for edge-based operations. Instead, the estimate is now that practical real-time operations require a maximum ‘round trip’ time – including all processing required right at the edge and processing required at a data center – of 1 millisecond. In practice, this will equate to a data center no more than 10 miles away: a significantly different architecture from major, multi-client data centers serving continental land masses. That points to most businesses remaining with – or returning to – the on-premises model (which will include localized, bare-metal co-location service providers) for the 30% share of overall workload expected to be dedicated to edge computing.
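A back-of-the-envelope check shows why a 1 ms round-trip budget pins the data center so close. This is a sketch only, assuming signals travel through optical fibre at roughly 200,000 km/s (about two-thirds of the speed of light in vacuum):

```python
# Rough latency-budget arithmetic for the 1 ms round-trip figure quoted above.
# The fibre propagation speed is an assumption (~2/3 the speed of light).

FIBRE_SPEED_KM_PER_S = 200_000        # approximate signal speed in optical fibre
DISTANCE_MILES = 10
DISTANCE_KM = DISTANCE_MILES * 1.609  # ~16.1 km one way

round_trip_km = 2 * DISTANCE_KM
propagation_ms = round_trip_km / FIBRE_SPEED_KM_PER_S * 1000

print(f"Round-trip propagation over {DISTANCE_MILES} miles: {propagation_ms:.2f} ms")
print(f"Budget left for processing out of 1 ms: {1 - propagation_ms:.2f} ms")
```

Even at 10 miles, propagation alone consumes roughly 0.16 ms of the budget, leaving well under a millisecond for all the processing at both ends – and that is before any switching or queueing delay, which is why the distance cannot stretch much further.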
This ability to operate in an ‘air-gapped’ mode, with increasing levels of autonomy, will also be an essential element of all edge compute architectures, even where it is not specifically required. It will be necessary because the edge will contain many applications where danger to life, limb, property or the business itself is a tangible risk. The obvious risk here is an unplanned, uncontrolled air-gap incident – and these do happen with reasonable regularity. Workflows will, therefore, need to be designed either to operate with an air-gap as the default architecture, or to default to instant shut-down without impact on operators, processes or the business.
One last factor to note is that there is one area where the edge is not like the cloud. The latter is heavily geared to a limited set of technologies – essentially x86-architecture hardware running variants of Linux. Out at the edge, there is a wide range of different old and new hardware and software technologies, many of them obscurely proprietary. The other anomaly from an IT perspective is that some 90% of current control systems run Windows, with many still running heavily modified versions of Windows XP.
The watchword for CIOs, therefore, will be accommodating co-existence, which is where conference host Zededa is claiming a niche for itself. It has introduced a lightweight operating system called EVE, designed to work with a wide range of both new and old management, reporting and control systems, including old PCs already in place or being re-purposed for this new role.
Two fundamentals for CIOs to keep in mind seem obvious here. The first is that, while the statement ‘the edge will eat the data center’ is a great, if over-stated, vox-pop, there is a chance that your business might just be the case where it becomes the literal truth of the matter. What is important is that there will be a shift: on average, around 30% of a company’s workload will move from being best run in the cloud to being best run at the edge – and that edge may well be more logically considered ‘on-prem’ than anywhere else. The way it divides up – finding that point of equilibrium – will be different for every company, but it will be important for CIOs to find it, and get it right, otherwise the operations of a business could well get seriously out of kilter.
The second fundamental is that this change is likely to happen quickly – far quicker than the time the cloud took to move centre stage. Accommodating the change will be more significant than many might imagine, for while edge computing can appear to be just an extension of the cloud as we now know it, in practice it is something quite different.