Engagement, functions, resources - a connected digital enterprise architecture

Phil Wainewright, September 10, 2018
Connected digital technologies are reshaping enterprise architectures around a platform of engagement, functions and resources to support next-gen applications

Enterprise technology architectures are moving towards what I recently called a seismic shift — but I've given few details. This post aims to rectify that omission. The new model I see emerging is centered on three core centers of gravity — engagement, functions, resources. This is how connected digital technologies are reshaping IT to enable a new world of frictionless enterprise.

I used to think of this as a vertical stack of three horizontal layers, but on reflection, the accompanying Venn diagram is a better way of representing how these three sets of components interact. The concept of tiers or layers is part of the legacy baggage of enterprise IT, harking back to a time when data sat in databases, functions were processed by application servers, and people interacted with an application interface.

Connected digital computing architectures

In most traditional enterprise applications — think classic ERP, CRM or HCM — all three tiers were part of the same stack. Getting at the data or interacting with a function meant grappling with the entire stack. But the advent of connected digital computing architectures means that's no longer the case.

In modern enterprise IT architectures, all of these tiers are being decomposed and their components made available via APIs.

Instead of being baked into specific applications and their underlying databases, both data and functions are delivered as loosely coupled services, accessed via any number of different engagement modes. In addition, many system-level resources that used to be application-specific, ranging from identity and access management to machine learning, are now routinely made available as autonomous system services.

This leads to what I've called a 'headless' approach to application architectures. People no longer have to load and open a specific application to access its functions and resources. Instead, they can do so from a separate engagement layer, such as a conversational messaging app or a voice assistant, which fetches answers and actions on their behalf.
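To make the 'headless' pattern concrete, here is a minimal sketch in Python. The service names and functions are invented placeholders, not any real vendor's API; the point is that the engagement layer (a toy chat handler) never loads a monolithic application, but composes answers from loosely coupled services on the user's behalf.

```python
def inventory_service(sku: str) -> int:
    """Stands in for a decoupled business function exposed via an API."""
    stock = {"A100": 42, "B200": 0}
    return stock.get(sku, 0)

def identity_service(user: str) -> bool:
    """Stands in for an autonomous system resource (access management)."""
    return user in {"alice", "bob"}

def chat_handler(user: str, message: str) -> str:
    """The engagement layer: parses a conversational request and
    fetches answers from independent services, with no app to 'open'."""
    if not identity_service(user):
        return "Sorry, I don't recognize you."
    if message.startswith("stock "):
        sku = message.split(" ", 1)[1]
        count = inventory_service(sku)
        return f"{sku}: {count} in stock"
    return "I can check stock for you, e.g. 'stock A100'."

print(chat_handler("alice", "stock A100"))  # A100: 42 in stock
```

Note that the chat handler owns no data and no business logic of its own; swap the messaging front end for a voice assistant and the services underneath are untouched.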

Engagement with headless applications

The emergence of this separate engagement layer directly accessing a whole range of business logic, process functions, system services and data sources has a number of far-reaching ramifications. This is not just a matter of adding conversation as a new interface option alongside conventional web and mobile apps. It makes it possible to completely refactor the underlying applications and systems. As I wrote last year when presenting several examples of headless applications:

In all of these examples, the traditional bundle of functionality that makes up an enterprise application has been broken down into separate components that are then recombined in new ways to provide a different, more streamlined outcome that wasn’t possible without the new technology. This is a phenomenon known to economists as unbundling and rebundling and it’s invariably a harbinger of disruptive innovation in a given field as new patterns of consumption become possible.

The rise of messaging applications such as Slack and Microsoft Teams provides just one example of these new patterns of consumption in the engagement layer. These messaging and collaboration platforms are increasingly becoming hosts to workflow automation, linking different functions and resources together to automate process flows. These ad hoc, low-code/no-code process integrations become mini-applications in themselves, but with far more flexibility than traditional custom-built applications.
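The shape of these mini-applications can be sketched in a few lines: each step is an independent function or resource, and the 'workflow' is simply an ordered pipeline that links them. The step names and payload below are illustrative assumptions, not a real platform's API.

```python
from functools import reduce

def new_hire_record(data):
    """Step 1: create an HR record (stand-in for an HCM function)."""
    data["record_id"] = "HR-001"
    return data

def provision_accounts(data):
    """Step 2: provision accounts (stand-in for an identity resource)."""
    data["email"] = data["name"].lower() + "@example.com"
    return data

def notify_channel(data):
    """Step 3: post to a team channel (stand-in for a messaging API)."""
    data["notified"] = True
    return data

def run_workflow(steps, payload):
    """Chain independent services into an ad hoc mini-application."""
    return reduce(lambda acc, step: step(acc), steps, payload)

onboarding = [new_hire_record, provision_accounts, notify_channel]
result = run_workflow(onboarding, {"name": "Ada"})
print(result)
```

Because each step is just a call against a service, reordering or swapping steps is a configuration change rather than a rebuild, which is where the flexibility over traditional custom-built applications comes from.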

Coaching networks in the digital enterprise

In addition, there's an as-yet little-noticed side-effect of moving engagement into this conversational computing layer. It converges two domains that have previously always been separate — people's conversations and behavior with each other, and their digital interactions with computers. This provides raw material for the iterative feedback loop that characterizes the emerging phenomenon of coaching networks, a phrase coined by Gordon Ritter of VC firm Emergence Capital. In coaching networks, computers analyze behavioral data to help humans achieve better results:

The computers therefore have to learn how to understand people, whether by listening to their voices, interpreting their messages, watching video, or collecting sensor data from around them. This adds a huge volume of new data that was never previously available digitally. At the same time, computers can use speech and messaging to guide people in the course of what they’re already doing, instead of having them turn away to type at a keyboard or point and click on a screen.

Ritter believes that this provides the foundation for an entirely new generation of digital enterprise applications, in which machines will analyze behavior and provide guidance to help people achieve better results, while continuously learning from human adaptation.

A platform for next-gen enterprise applications

The opening up of functions and resources underneath the engagement layer provides the platform on which such next-gen enterprise applications can flourish. Here, two separate strands of evolution are coming together.

On the one hand, there's the 'APIfication' of traditional enterprise technology stacks, a phenomenon that's been the subject of several conversations I've had with MuleSoft founder Ross Mason. He and MuleSoft (now part of Salesforce) speak of three layers of APIs — experience, process and system — which form a similar template to my own triumvirate, except that MuleSoft's, as you might expect, hews much more closely to the existing enterprise IT architecture that most of its customers still run.
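An illustrative sketch of that three-layer model follows, with the direction of the calls running experience → process → system. All function names and data here are invented for illustration; they are not MuleSoft APIs.

```python
def system_api_orders(customer_id):
    """System layer: raw access to a record store (here, a dict
    standing in for a database or legacy system of record)."""
    orders = {"c1": [{"id": 1, "total": 20.0}, {"id": 2, "total": 5.0}]}
    return orders.get(customer_id, [])

def process_api_order_summary(customer_id):
    """Process layer: composes system APIs into a business operation."""
    orders = system_api_orders(customer_id)
    return {"count": len(orders), "total": sum(o["total"] for o in orders)}

def experience_api_dashboard(customer_id):
    """Experience layer: shapes the result for one engagement channel."""
    summary = process_api_order_summary(customer_id)
    return f"{summary['count']} orders, ${summary['total']:.2f} total"

print(experience_api_dashboard("c1"))  # 2 orders, $25.00 total
```

The value of the layering is that a new engagement channel — say, a chat bot instead of a dashboard — only needs a new experience API; the process and system layers are reused as-is.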

But now look at the architecture you're likely to use if you want to build brand new digital applications. It's all about microservices, increasingly served up as serverless functions. This is just another way of meshing service APIs together, and it was certainly the strategic direction at this summer's Google Cloud Next conference.
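For readers who haven't written one, the serverless function shape is worth seeing: a stateless handler invoked once per event, with no server process of your own to manage. The event format and handler signature below are generic assumptions for illustration, not tied to any specific cloud provider.

```python
import json

def handler(event, context=None):
    """A stateless, per-invocation function: receives an event,
    returns a response, holds no state between calls."""
    body = json.loads(event.get("body", "{}"))
    name = body.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"greeting": f"Hello, {name}"}),
    }

# Simulate one invocation by the platform:
resp = handler({"body": json.dumps({"name": "Phil"})})
print(resp["body"])  # {"greeting": "Hello, Phil"}
```

Statelessness is the key property: because each invocation is independent, the platform can scale the function from zero to thousands of concurrent copies, which is what makes it natural to consume other people's functions as readily as your own.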

Serverless functions and resources

There's even a debate developing, with serverless purists arguing that using third-party functions built to operate at scale is always preferable to building your own. As someone who spent many years making the same argument about cloud, I see the irony:

As the history of private cloud shows, there will be many who persist in following their old models out of habit and resistance to change. But others — like MuleSoft's customers — will have no choice but to transition slowly because they're stuck with legacy baggage that they can't junk yet.

Final takeaway

The most important takeaway is to work within an architectural framework that looks forward to a managed network of autonomous microservices rather than backwards to a rigid hierarchy of micro-managed APIs.

Ultimately, I suspect that what I've called functions and resources will look more and more like each other, but for now I feel it's helpful to think of them separately. In my mind, functions provide business or operational logic of the type that traditionally sits in the application layer, while resources are raw data sources or systems logic. But the circles overlap because none of these are precise demarcations.

Perhaps the most important message, therefore, is to leave old preconceptions behind when building applications in this new world. This is the emerging foundation for a new generation of enterprise IT that's built to support the real-time connections and constant change of frictionless enterprise and XaaS business models.
