Frictionless Enterprise - the evolution of the digital user experience

Phil Wainewright, January 6, 2023
Summary:
We start the New Year with the next in our series on the journey to Frictionless Enterprise, as Phil Wainewright examines the ongoing evolution of the digital user experience.

Young female engineer at work with digital UX (© metamorworks - shutterstock)

The previous chapter in this series on Frictionless Enterprise examined the underlying technology architecture that shapes digital strategy. Now our focus turns to the people who use it, without whom the technology would have no purpose. This chapter explores the dramatic and ongoing changes in how we interact with digital resources, while the next chapter will look at the wider impact on how we work and how organizations manage people at work.

As with every other aspect of Frictionless Enterprise, digital connection is the fundamental driver of the transformation of the User Experience (UX) we'll describe in this chapter. That transformation is taking place in two dimensions. First of all, connected digital technologies have dramatically expanded the volume and range of information and resources that can be conveyed to users, wherever they happen to be. Secondly, these technologies are expanding the scope of the UX to encompass far more of the user's context and surroundings.

Evolution of the digital UX

For the first fifty years of the history of computing, interaction was almost entirely in the form of numbers and text. Icons and images arrived with the introduction of the Apple Mac in 1984 and Microsoft Windows in the early 1990s, which popularized a more flexible Graphical User Interface (GUI) over the earlier alphanumeric standard. But the landmark event in the evolution of a truly digital UX came with the launch of the Apple iPhone in 2007. This ground-breaking device embodied the combination of connectivity and digital technology in ways that I suspect not even its creators fully realized at the time.

Equipping the phone with a powerful on-board computer, touch screen and Internet connectivity made it possible to connect mobile users to enterprise data and resources with far less friction than ever before — fulfilling essential characteristics of Frictionless Enterprise such as the ability to operate on real-time data and resources, available anywhere on-demand. The addition of a camera, GPS and other sensors provided mechanisms for instrumenting the user's environment and reactions to a far greater extent than earlier mobile devices, boosting other key characteristics such as adaptability and collaboration.

The rise of cloud computing and pervasive connectivity enabled a further crucial development — the separation of the User Experience from a specific device. Pioneering apps such as Evernote enabled users to update the same document whether they were using a desktop PC or a mobile device and switch seamlessly between the two. Applications and their data were no longer tied to a specific device platform. This was the beginning of a User Experience that is delivered from the network rather than from any individual device.

The popularity of social media and messaging apps brought new ways for consumers to interact with the brands and retailers they buy from, fueling demand for what the industry calls an omnichannel experience — in other words, the ability for a user to interact through whatever channel is most convenient for them at the time, and then continue the conversation across messaging, voice, in-person, or otherwise. The need to deliver this highly personalized consumer User Experience from the network rather than constraining it to a specific channel or platform in turn led to a new wave of composable Digital Experience (DX) platforms in the Business-to-Consumer (B2C) sector, with Business-to-Business (B2B) not far behind.

Digital twins - instrumenting our environment

In the industrial sphere, the emergence of low-cost sensor technology and wireless capabilities created much excitement around a concept known as the Internet of Things (IoT), which involves the collection of environmental and operational data from machines, devices and their surroundings to monitor and optimize performance. In its first iteration, this aimed to collect as much data as possible from a given environment and then figure out how it could be used. A more mature approach has since focused on collecting data for specific purposes, such as analyzing operational data from a piece of machinery to detect when it might need maintenance or repair in advance of an actual breakdown, or to discover opportunities to enhance performance or efficiency by changing how it runs. This digital instrumentation makes it possible to bring the XaaS Effect to physical assets. Manufacturers such as Rolls-Royce use the word servitization to describe this use of continuous data collection to enable improved customer service.
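As a loose illustration of this more purposeful approach, predictive maintenance can start with something as simple as flagging a machine whose rolling average reading drifts outside its expected operating band. The sensor name, thresholds and values below are invented for the sketch, not drawn from any vendor's system:

```python
# Minimal predictive-maintenance sketch: flag a machine for inspection
# when its rolling average vibration drifts beyond an expected band.
# Baseline, tolerance and readings are illustrative values only.
from collections import deque

class VibrationMonitor:
    def __init__(self, baseline=2.0, tolerance=0.5, window=5):
        self.baseline = baseline    # expected vibration level (mm/s)
        self.tolerance = tolerance  # allowed drift before alerting
        self.readings = deque(maxlen=window)

    def ingest(self, value):
        """Record a reading; return True if maintenance should be scheduled."""
        self.readings.append(value)
        avg = sum(self.readings) / len(self.readings)
        return abs(avg - self.baseline) > self.tolerance

monitor = VibrationMonitor()
stream = [2.1, 2.0, 2.2, 2.9, 3.1, 3.4]  # readings drifting upward
alerts = [monitor.ingest(v) for v in stream]  # alert fires on final reading
```

In practice the analysis would run over richer telemetry and learned models rather than a fixed threshold, but the pattern — continuous ingestion, comparison against expected behavior, proactive alerting — is the same one that underpins servitization.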

One concept that's commonplace in these scenarios is the notion of a 'digital twin' — a virtual representation of the physical object, which can be used to analyze the operational data as it's collected and then run simulations to help predict and plan future behavior or outcomes. A digital twin might represent an individual component or piece of equipment, or in some cases it might represent an entire factory floor or some other ecosystem of many different participants and their surroundings. Take the example of Samsara, a vendor that specializes in collecting IoT data from operations, such as a fleet of refrigerated trucks, and then analyzing it using AI to find opportunities to improve fuel efficiency, or to coach drivers to reduce accidents.

Digital context - digitizing our experience

People don't explicitly talk about building a 'digital twin' of the user and their environment, but that's effectively the consequence of digitizing the user experience. While the overt purpose is to enable more frictionless operation by delivering real-time data and resources on-demand, wherever the user may be, it also brings the capability to digitize the user's interactions and aspects of their surrounding environment. There doesn't have to be anything sinister about this, provided the user is aware that it's happening and consents to this use of their data. Expanding the scope of the UX to instrument the user's interactions and surroundings makes it possible to improve engagement and reduce friction even further through better automation.

One could even argue that greater digitization of the UX has the effect of further humanizing the experience, in the sense that it allows us to interact with computing in a more natural way, starting from what we see, feel and say. Instead of asking mobile workers to fill out cumbersome forms on their iPhones to complete a work order or status check, why not start from an image, some voice commentary, their geolocation and biometrically confirmed identity? Laurent Gasser, CEO of Wizy, a French startup whose platform enables image-centric mobile apps for field workers, comments:

Historically, process has gone from paper to forms, and from web forms to mobile forms. But anyways, there was a form you complete with information. We think it should change.

You should start with the picture, extract information from the picture, and then start your form, or start your process, already pre-completed with 50% of the data in it.
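The image-first workflow Gasser describes can be sketched as below. Here `extract_fields` merely stands in for whatever image-analysis service performs the real extraction, and the field names are hypothetical:

```python
# Sketch of an image-first work order: device metadata and (hypothetical)
# image analysis pre-fill the form before the worker ever sees it.
def extract_fields(photo):
    """Stand-in for an image-analysis service; here it is stubbed
    with fixed values purely for illustration."""
    return {"asset_id": "PUMP-0042", "condition": "corroded flange"}

def prefill_work_order(photo, gps, user_id):
    form = {
        "asset_id": None,        # filled from the image
        "condition": None,       # filled from the image
        "location": gps,         # filled from device geolocation
        "reported_by": user_id,  # filled from confirmed identity
        "description": None,     # left for the worker to complete
    }
    form.update(extract_fields(photo))
    return form

order = prefill_work_order(photo=b"...", gps=(48.86, 2.35), user_id="u-17")
```

Most of the form arrives already populated from the picture, the geolocation and the user's identity, leaving the worker to supply only what the system couldn't infer — the "50% pre-completed" starting point Gasser describes.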

In a consumer context, businesses use geolocation, sentiment analysis and other digital UX clues to make more relevant suggestions, for example when the user is visiting a sports stadium or shopping mall. Artificial Intelligence (AI) plays an important role in analyzing this contextual data and tailoring the experience accordingly, as we'll explore further below.

The Metaverse is ours, not theirs

In the past year or two, people have started talking up the Metaverse as if it's an entirely new way of interacting with computing, but in reality it's just a new word for what we've always done. All computing operates on digital representations of the physical world. The only thing that's changed in the past seven-odd decades of computing, right from the very first punch cards and alphanumeric green screens, has been the scope of those representations, and how the computers show them back to us. The Metaverse is simply a richer depiction of the emerging digital twin of the real world, representing either our real-world environment or a version of it that we've imagined. The Metaverse is not some separate digital territory to be parceled up and colonized. If we must use the word, it simply describes a further elaboration in the constant progress towards a more expansive user experience.

So why are vendors trying to persuade us that the Metaverse is something entirely new? One of the constants of the history of computing is that technology vendors have always aspired to gain proprietary control over those representations. In fact, the entire history has seen a back-and-forth tug-of-war between vendors attempting to monopolize how their customers experience computing and the customers themselves, whose interests are best served by a healthy degree of competition between the vendors they deal with. Just as IBM once tried to own the enterprise computing landscape, just as Microsoft attempted to own the personal computing landscape, just as AOL, Compuserve and others tried to turn the Internet into a walled garden to which they controlled the access, now Meta, Microsoft, Apple and others aspire to own the Metaverse and charge rents to all who use it. Like all of their predecessors, they will fail, because connectivity demands openness and a free market demands choice.

A common fallacy behind concepts such as the Metaverse, as these vendors are presenting it, is that the user is perceived as a passive object of computing — something to control, manage and monetize. The success of Frictionless Enterprise depends on the user being an active participant, and the proper purpose of the digital user experience is to engage and empower them.

Conversational computing and the multi-sense UX

Enterprise computing has seen a huge rise over the past few years in the use of messaging apps as a platform for interacting with digital resources. Chatbots and voice assistants have harnessed AI to help surface data and actions from other applications into the messaging layer, making it possible for the user to interact with those resources using natural language in the flow of their work, rather than having to negotiate many separate application UIs. This trend is known as conversational computing, as I explained here:

Thanks to the rise of AI-powered voice interfaces and messaging chatbots, conversation is becoming the new frontier of how people interact with computing ... What makes this significant is that we can get a response from the application without ever having to leave the conversational layer — and we can converse with multiple applications all from the same platform.
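A minimal sketch of that conversational layer: incoming messages are matched to intents and routed to the backing applications, with the reply delivered in the same conversation. The intents, keywords and handlers below are invented for illustration:

```python
# Toy conversational router: match a message to an intent, call the
# backing application, and answer within the conversational layer.
# Keywords and handler responses are illustrative only.
def check_inventory(msg):
    return "Inventory: 12 units in stock"  # would query the ERP system

def book_leave(msg):
    return "Leave request submitted"       # would call the HR system

INTENTS = [
    (("stock", "inventory"), check_inventory),
    (("leave", "holiday", "vacation"), book_leave),
]

def respond(message):
    words = message.lower().split()
    for keywords, handler in INTENTS:
        if any(k in words for k in keywords):
            return handler(message)
    return "Sorry, I didn't understand that."

reply = respond("How much stock do we have?")
```

Real conversational platforms replace the keyword matching with trained language models, but the shape is the same: the user converses with multiple applications without ever leaving the messaging layer.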

This is part of the humanization of the digital experience that I mentioned earlier, and there's still some way to go in its evolution. Whereas five years ago digital conversations typically took place in chat or voice, today video is far more commonplace, whether in web meetings or in the rising use of video recordings to share information and feedback. As Phil Libin, previously CEO of Evernote, which we mentioned earlier, and now the founding CEO of video presentation app mmhmm, told me a year ago when speaking about the future of video meetings:

Like the dot-com days, when the Internet came and embedded itself into the fabric of every business transaction, I think now video is going to be embedded in the fabric of every business transaction, even when people are walking around doing stuff in person.

Jeetu Patel, who as EVP & GM of Security and Collaboration at Cisco looks after the Webex video meetings platform, talks about making the experience of meeting digitally 10x better than meeting in person:

My contention would be that when you are actually meeting someone in person, that you will still want Webex turned on, because it can do things that you might not otherwise have been able to do as effectively through sitting and talking to them in a complete analog form.

The "10x better" comes as a result of digitizing the interactions so that the digital platform can then augment the user experience with tools such as automated transcription, simultaneous translation, fetching relevant documents and knowledge, suggesting next actions, and so on. I've even speculated that the utility of this kind of digital augmentation might see people wearing some kind of VR headset in the future to enhance the experience in certain web meetings. Moving beyond digital representations of text, voice and static images, the UX becomes 3D and multi-sense, digitizing more of the user's context in order to serve up a better all-round experience.

Experience workers, experience economy

One highly significant side-effect of moving to a far more mobile and digitized User Experience is that it brings more knowledge and capabilities to the frontline workers in an organization who directly serve customers. This shifts the center of gravity of enterprise computing away from the back office and onto the experience layer, as digital transformation reinvents formerly analog patterns of work, engagement and commerce. As I wrote shortly before the onset of the pandemic about the rise of the experience worker:

Connecting these firstline workers into enterprise systems gives them the knowledge they need to support customer success at the point of consumption. Whether they're retail associates enabling a tailored shopping experience or service engineers suggesting product enhancements to R&D teams, they become a far more valuable asset to the organization. Meanwhile, their itinerant, mobile pattern of work is spreading to other roles across the organization. After all, desks only exist because people needed to organize paper or sit in front of a static computing device. Digitally connected information and knowledge, available wherever we are, allow all of us to be productive on our feet.

This brings us into territory that we'll explore in more detail in the next chapter, but the key point to note here is that the expansion of the user experience is enabling completely new patterns of work that many of us began to explore during the pandemic, but have not yet fully embraced. Here's my verdict at the time on the concept of hybrid working:

The most important feature of hybrid working is not that some people are in the office and some people aren't, but that everyone is using digital technology to stay connected. Hybrid working marks a shift from the old world of mostly analog-only encounters to a new future in which all aspects of teamwork are digitally augmented.

AI and digital augmentation

We can't talk about digital augmentation without mentioning Artificial Intelligence (AI) and Machine Learning (ML), which are integral technologies to realizing the full potential of the digital user experience. Digitizing the UX opens up the user's interactions to on-the-fly analysis using AI and ML, making it possible to detect the topic and direction of the conversation and suggest useful knowledge or appropriate next steps to achieve a goal faster. This is using AI to help humans work smarter rather than replacing them entirely. As Rick Nucci, CEO of Guru, whose software surfaces useful knowledge in the flow of work, explains:

The risk is, we're going to rush as an industry and say, 'Let's put algorithms in front of our customers and get rid of the customer service agents and sales people.' The reality is that the technology is not going to understand critical things like empathy that is required to have a great customer experience.

This type of technology makes humans better at their jobs and this is key in how we think this evolves.

AI can also analyze patterns of work and suggest new ways to achieve goals more effectively, as I've argued in relation to digital teamwork:

In the same way that today artificial intelligence can listen in on contact center interactions and suggest alternative lines of conversation or actions, in the future a digital teamwork tool could suggest an alternative way of setting up a new task that might improve results.
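The kind of in-flow suggestion described above can be approximated, very crudely, by scoring knowledge-base articles against the words of the live conversation. The articles and their contents here are made up for the sketch; production tools would use semantic models rather than word overlap:

```python
# Crude sketch of surfacing knowledge in the flow of work: rank
# knowledge-base articles by word overlap with the live conversation.
# Article titles and bodies are invented for illustration.
KNOWLEDGE_BASE = {
    "Refund policy": "how to issue a refund for a returned order",
    "Shipping delays": "what to tell customers about shipping delays",
    "Password reset": "steps to reset a customer account password",
}

def suggest_article(conversation):
    convo_words = set(conversation.lower().split())
    def score(item):
        _, body = item
        return len(convo_words & set(body.lower().split()))
    title, _ = max(KNOWLEDGE_BASE.items(), key=score)
    return title

best = suggest_article("Customer is asking about a refund on their order")
```

The agent stays in the conversation while the most relevant knowledge is pushed to them, rather than having to break off and search a separate system — AI helping humans work smarter rather than replacing them.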

Gordon Ritter, an early investor in Salesforce and founder and General Partner at VC firm Emergence Capital, promotes a concept the firm calls Coaching Networks, in which machines and humans collaborate in an iterative feedback loop, with humans contributing the creativity and originality, while the machines analyze behavior and outcomes to suggest what works best. As he recently explained:

The software acts as a real-time, on-the-job coach, guiding users to successful outcomes and gathering new data that are fed back into the system. 

After five years of researching and investing in this model, Emergence has concluded that the best results come from analyzing the richest sets of UX data, provided that the model is informed with a strong contextual framework within which to analyze a more expansive UX. In this way, the digital UX becomes the platform for creating a new generation of enterprise application vendors whose digital workflow is constantly improved by studying its users — taking the XaaS model of engage-monitor-improve to a whole new level.

User empowerment

None of this works without proactive users who are willing participants in this digital landscape. The success of Frictionless Enterprise is contingent upon breaking away from old industrial models of work, in which individuals were seen as simply another cog in the machine. These old models sought to eliminate human variability in the service of predictable, repeatable processes, but their downside was an inability to adapt to change. Today's more sophisticated and powerful technologies are better able to assimilate and flex with change, and therefore can accommodate and build on all the richness of human variability. Yes, humans make mistakes, get tired and have their own agendas, but they have been tuned by millions of years of evolution to rapidly invent and apply new patterns of understanding and behavior, something computers can't even dream of.

The emerging digital UX is one that empowers users and works best when it gives them agency in how they achieve successful outcomes. How organizations can attract, nurture and manage this new generation of users will be the subject of our next chapter.

This is the fifth chapter in a series of seven exploring the journey to Frictionless Enterprise:

You can find all of these articles as they're published at our Frictionless Enterprise archive index. To get notifications as new content appears, you can either follow the RSS feed for that page, keep in touch with us on Twitter and LinkedIn, or sign up for our fortnightly Frictionless Enterprise email newsletter, with the option of a free download of The XaaS Effect d·book.
