Much of the real work, and much of the important data, that go to make up a successful business are generated out at the edge of its network or domain – or, as it used to be called in my young day, the coalface. Moving the data generated at that coalface back to a central point for processing used to be the only way to find out what was happening down there, and how the business was doing.
As businesses get bigger, however, that data gets further away and grows in volume. It is very easy to start to lose touch with it all, as well as for it to become increasingly expensive to move the vast quantities of data now generated back up to head office. It is slow and wasteful, not least because a reasonably large percentage of the data moved need not travel – and indeed can probably be deleted once collected and reported.
This is why moving much of the data processing out to the edge is becoming the key infrastructure challenge most businesses are going to face. It is also the reason a large number of vendors, of all descriptions, are adding the phrase ‘we do edge’ to their marketing spiel. In practice, however, getting there is usually going to be more complex than installing a few new applications, as well as expensive and time-consuming to sort out if you get it wrong.
For many businesses, that complexity is going to mean help is needed, and there are not many vendors out there with the portfolio of products and services to have a good feel for what is required in each individual circumstance, or the range of partnerships with other vendors needed to specify a complete package, provide it and, if required, manage it into the future.
One such is Dell. Its Infrastructure Solutions Group has a portfolio that spans everything from cloud infrastructure through distributed server technology and network devices to, indeed, endpoints. According to its UK Vice President Tim Loake, one of the biggest problems that needs to be faced is not even technical: it is physical.
To get on, a platform is useful
Dell is pulling all of these services, products, and partnerships together into what is, in effect, a single SKU: its recently introduced Native Edge Platform. This is designed to provide users with a central operational management framework and console, allowing design and deployment blueprints to be pushed out to remote infrastructure in a zero-trust way. Loake explains:
Our portfolio gives us some insight into some of the challenges that organisations will face as they try to take advantage of the edge opportunity and then think about how they actually implement that without it costing the earth. The single biggest issue with getting to a distributed edge infrastructure is that most of the places that you want to put edge technology, you won't have IT staff. You may not have people at all, which gives you a problem when it comes to set-up, deployment, maintenance, administration, and so on.
The second biggest issue is security/reliability, especially as it runs counter to the traditional notion of centralising data as the best security approach. Working with data directly at or close to source, out at the edge, creates a new range of attack points that are often in intrinsically insecure locations.
Loake’s view is that Dell now has the technology and product portfolio to fit most of the use cases businesses will have today for deploying into an edge environment, which may well be inhospitable. The physical location alone can be challenging, with fresh-air cooling and varying degrees of temperature, moisture, humidity, and dust to contend with:
And you have to think about how you protect your cables from the rats, which has been a unique challenge – it can seem like an exercise in futility. But we have borrowed from our experience and connections with the telco providers to deal with exactly some of those issues. We're partnered up with BT, with Vodafone, and all the telcos in one way or another. And in some cases in the edge space as well, because they do have a lot of the distributed network infrastructure that obviously will connect these devices together.
The Native Edge Platform shortens, and often eliminates, many of the set-up processes that edge computing introduces. So if a device is changed – through failure, upgrade or change of purpose – plugging it in to the network will allow it to dial directly into the Operation Centre and download its personality, including all the applications it needs to operate. This means the device can be shipped directly from its manufacturer’s factory and installed by local staff without special training. So long as the device has power and a network connection, the Platform can work with everything from bare metal upwards. Devices can also be refreshed as and when required, even if that means multiple times per day.
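The zero-touch flow described above – a replacement box phoning home to a central operations centre and pulling down its ‘personality’ – can be sketched in miniature. This is purely illustrative: every class and field name below is invented, and it is not Dell’s actual Native Edge API.

```python
# Hypothetical sketch of zero-touch edge provisioning, loosely modelled on
# the flow described in the article. Names are invented for illustration.

from dataclasses import dataclass


@dataclass
class Personality:
    """Everything a device needs to do its job: a role plus its app list."""
    role: str
    applications: list


class OperationsCentre:
    """Central registry mapping device serial numbers to blueprints."""

    def __init__(self):
        self._blueprints = {}

    def register_blueprint(self, serial: str, personality: Personality):
        self._blueprints[serial] = personality

    def fetch_personality(self, serial: str) -> Personality:
        # In reality this would be an authenticated, zero-trust exchange;
        # here it is a simple lookup.
        return self._blueprints[serial]


class EdgeDevice:
    """A bare-metal box fresh from the factory: only a serial number."""

    def __init__(self, serial: str):
        self.serial = serial
        self.personality = None

    def first_boot(self, centre: OperationsCentre):
        # On first power-up the device 'phones home', downloads its
        # personality and installs the applications it lists.
        self.personality = centre.fetch_personality(self.serial)
        return [f"installed {app}" for app in self.personality.applications]
```

The point of the pattern is that swapping hardware needs no on-site expertise: a replacement device registered against the same blueprint configures itself identically on first boot.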
But as Loake points out, Dell also has one of the more comprehensive sets of deployment options available, ranging from direct sales to consumer end users, through the channel, and out to major data center installations, as well as working with both ISVs and services organisations. When it comes to edge computing, however, the company is keen to make its technology deployable by anybody capable of servicing a machine tool.
Increasingly, this now also means Dell becoming involved with the vendors in the design of those tools and systems that require monitoring and/or management. To that end, it now has an OEM business for designing, building and integrating the necessary IT systems and services.
Edge and the Windows analogy
The appearance of the Native Edge Platform at this time raises an interesting question that could have a long-term impact on how edge compute services develop, and especially how they get deployed. Edge devices are going to be deployed in their tens of millions, and that is probably a severe under-estimate. They are also going to have to work with a far more diverse range of existing (and future) technologies than anything before in IT. In addition, it is clear that no single vendor is going to be able to claim to ‘do it all’. Making edge work effectively will require a growing degree of standardisation, at least in terms of creating readily collaborative environments.
This then comes back to what might be called the ‘Microsoft Question’ – is there a need in edge computing for an equivalent of Microsoft Windows, or indeed MS-DOS, i.e. a standard environment or platform which provides applications developers with a common interface both to end users and to the available compute and storage facilities? This created the notion that developers could ‘write once, run on many’. There are already several options in the edge applications arena that can handle that, especially in terms of the same core application running in the same way both on ‘full-fat’ data center instances and as ‘ultra-skinny’ versions on minimal resources right down at the farthest edges of a company’s domain.
But there is still the much wider issue of pulling together dynamically-variable, technology-agnostic, collaborative environments that can be (to under-value what will be a vital task) effectively ‘thrown together’ in a strong analogy of consumer users loading a PC, Apple Mac or mobile phone with apps and tools and expecting them to work together properly. Right now, at the edge, that still requires serious amounts of skilled technical ability applied over a long haul.
Is there, I asked Loake, a need for something similar at the edge? And could it be that other vendors with capabilities similar to Dell’s might come together to start mapping out common rules and tools, so that ‘collaboration’ could lead to users building business services that are greater than the sum of the parts? The Native Edge Platform, being a cloud service based on open software, is already well set up for such a role. He replied:
Interestingly, the Native Edge Platform, although it is built as a control plane, uses open standards across the board. So we use Ansible, and Redfish – we plug into those existing technologies – and we use packaging standards for the application. So you can package your application using a VMware streaming technology, or you can use regular transform files. You can package your applications to be deployed by the Native Edge Platform using existing standards. So although we have a control plane there that allows us to do certain things from a distribution and a management point of view, being open it is designed to plug in third-party software technology. And that's one of Dell's guiding principles when we build our technologies: that we are an open standards company.
And we have our leaders from a product group perspective sitting on many of the open standards boards, as well as developing those standards in conjunction with HPE, Kyndryl, AMD and Intel, as well as many others. So we absolutely believe in open standards. And we have those APIs all built into the open standards that already exist.
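To give a flavour of one of the open standards Loake mentions: DMTF's Redfish exposes server hardware as JSON resources linked by `@odata.id` URLs. The snippet below walks a Redfish-style ‘Systems’ collection; the JSON shape follows the published Redfish schema, but the payload itself is invented for illustration, and this is not code from Dell's platform.

```python
# Parsing a DMTF Redfish 'Systems' collection. The response structure
# (@odata.id links, Members array) follows the public Redfish schema;
# the actual system names here are made up.

import json

SAMPLE_RESPONSE = json.dumps({
    "@odata.id": "/redfish/v1/Systems",
    "Name": "Computer System Collection",
    "Members@odata.count": 2,
    "Members": [
        {"@odata.id": "/redfish/v1/Systems/edge-node-1"},
        {"@odata.id": "/redfish/v1/Systems/edge-node-2"},
    ],
})


def member_urls(collection_json: str) -> list:
    """Return the resource URL of every system in a Redfish collection."""
    doc = json.loads(collection_json)
    return [m["@odata.id"] for m in doc.get("Members", [])]
```

Because every Redfish-compliant vendor returns the same shape, a management tool written against this schema can enumerate and administer boxes from any of them – which is precisely the interoperability argument being made above.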
Edge computing is bound to end up vitally important for the majority of businesses, but it is also likely to prove a pain for many of them trying to get it established and running smoothly. Some won’t have sufficient staff with the right skillsets, while for others the sheer size and complexity may well leave them flummoxed. Having an outside source of expertise for the process will, for many, prove to be an important option.