Decision Intelligence - closing the gap between data and leadership

Chris Middleton, September 30, 2022
Summary:
A pioneer in Decision Intelligence explains how leaders can not only use data to make good decisions, but also model the consequences of their actions


Recent events in the UK have demonstrated the importance of scenario planning and stress testing: often overlooked disciplines that are increasingly supported by the emerging technology of Decision Intelligence (DI). This is the new outcome-based branch of machine learning and AI that is about supporting decision-making across every part of the business.

In case you missed the UK’s self-inflicted problems this month, the government’s mini-budget, designed to address the financial crisis and kickstart growth, instead caused a run on the pound, a ticking off from the International Monetary Fund, and £65 billion of emergency measures from the Bank of England to protect pension funds – underwritten by the government.

Either the leadership intended this outcome, or it didn’t consider the repercussions of its decisions, which would be a failure of both management and data analysis. At the time of writing, the British Prime Minister and Chancellor continue to blame everyone but themselves, despite mounting evidence that they made a mistake: a management style that is more about belief than data.

The Bank of England, meanwhile, is stress-testing eight UK banks against a “severe but plausible” scenario next year, to see whether they could survive 6% interest rates, a 5% fall in GDP, a doubling of unemployment, 17% inflation, and a 31% collapse in house prices. Clearly, this is an organization that is thinking ahead and imagining worst-case scenarios.

So, why don’t more organizations do this and gather data about the future? Why does so much data analysis only consider the present moment or the past, rather than existential threats or emerging opportunities? Why are companies focused on the ‘short now’ of today’s share price, newspaper headlines, or social shares, rather than what writer Stewart Brand calls ‘the long now’, which considers the present moment in terms of years and centuries rather than milliseconds? Stewardship rather than instant reward.

A quick glance at your smartphone reveals that retail, travel, banking, insurance, news, entertainment, transport, and healthcare are just some of the markets that have been changed beyond recognition this century. So, why didn’t more established brands see the future coming and model it?

Intelligent response 

Computer scientist, author, and machine learning specialist Professor Lorien Pratt is co-founder and Chief Scientist of Decision Intelligence software provider Quantellia LLC. This self-described “technology nerd girl” taught herself to code in the 1970s and, alongside her work on Decision Intelligence, is a prime mover of the discipline of Transfer Learning.

This is the branch of machine learning that applies lessons from solving one problem to finding solutions to different but related ones. (For example, heart disease diagnosis in India versus the same challenge in Norway, or speech recognition with American and British accents: different but related domains for a neural network.)
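To make that concrete, here is a minimal sketch of how transfer learning is commonly done in practice: reuse a network pretrained on one domain and fine-tune only a new output layer for a related task. This is an illustration, not Pratt's or Quantellia's code, and the PyTorch model and two-class head are assumptions chosen for brevity.

```python
# Minimal transfer-learning sketch: reuse a network trained on one domain
# and fine-tune only its final layer for a related one.
import torch
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on a large source domain (ImageNet here).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the transferred feature extractor so its learned representations are kept.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head for the new, related target task
# (e.g. two classes: condition present / absent in a new population).
model.fc = nn.Linear(model.fc.in_features, 2)

# Only the new head's parameters are updated during fine-tuning.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```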

In her view, the big challenge facing most organizations is that while they may be applying artificial intelligence and machine learning to their data, few really understand these technologies. And the vast data repositories that they have at their disposal are either poorly understood or, worse, poorly deployed.  

She explains:

[In the past] when I interviewed C-level executives, I learned that some couldn't even spell artificial intelligence, let alone use it. And that was a really big surprise: that there was something in the chain from innovative technology development to trillion-dollar decisions that was missing.

I assumed that the bigger and weightier the decision, the more rigour would be applied to it, and the more formal data would be used to back it up. So, I was really surprised to learn that the opposite is true. As managers become more senior, their decisions get less and less formal [and less based on facts].

The cult of the maverick CEO – or politician – is something with which we are all familiar: the notion that charisma and personality, or ideology and belief somehow outweigh competence and expertise, even if the data says otherwise.

She says:

Yeah, we tend to follow the person who has the greatest charisma, the greatest assertion, rather than the one who can make the best rational argument.

And a lot of these people came up before we had all the data and evidence. That was functional for a while, because humans are good at creating packages of best behaviours. It’s what helps us survive in the Arctic and other places.

But what's happening now is VUCA: volatility, uncertainty, complexity, and ambiguity. Things are changing, but the pace of change is faster than the old mechanisms are effective for. They're using these old methods – you know, the five rules or 45 rules of how to manage or make decisions, which might have worked when things were changing less often.

But what we're seeing now is, those organizations that start being evidence based and data driven are doing much better, but there is not a lot of baked-in knowledge about how to do that.

The problem is a cultural gap, she says:

It’s between the leaders and the data and analytics groups; we see it in both the commercial world and science. For example, I'm working with a national space administration, which has millions of dollars for Earth imaging on where the bushfires and floods are. They're doing AI image recognition for predicting climate change. But by the time the project makes it to decision makers on the ground, they are still using paper maps. There's this incredible gap between the data people and the decision makers.

So, why is that, and what can organizations do about it? She says:

The main reason is that decision makers think in terms of the actions that they use to achieve outcomes. But when you talk to a data person, they don’t talk about actions and outcomes, they talk about insights and answers, so there's this knowledge mismatch.

For most decision makers, it's about what business outcome they can achieve. Say their stakeholders are holding them accountable for EBITDA over the next 12 months. And the data says, ‘Here’s the EBITDA of my competitor’. But how do I change that into an action? And the data is a hidden secret. Nobody likes to talk about it, but data preparation is really hard. These are the gaps that we try to address with decision intelligence.

Where AI fits in

Is this type of technology making its way out of the research lab and into deployment quickly enough? Pratt's response leaves us in no doubt: 

Abso-freaking-lutely not! That’s my dream. And the t-shirt has been well earned over 10 years of trying to convince the US to do translational work with machine learning. But instead, we've got huge resources devoted to AI risk and safety. And it’s fine to do those things, but they don't understand that they’re presenting the data and AI to users in a form they can't digest. It's like they’re trying to make the flour and the water better, but they don’t know how to make the cake.

So, what practical steps do business and IT leaders need to take to better connect their data with their decisions? Unexpectedly, Pratt says:

Number one, leave the data out of the room at the start of the conversation. That's a really hard thing to do when you have data people whose entire careers have been reinforced by their expertise. Number two, they should mature their people, processes, and technologies.

Get a diverse group of people into the room and you find that there are as many ideations of what the outcome should be as there are people in the room. That’s because nobody takes the time to get teams, companies, and organizations on the same page as their outcomes. And the third part is about process and interventions.

You need a diagram: it's equivalent to walking into a building site to build a skyscraper, but there's no blueprint for it. There is no visual metaphor for the electricians, plumbers, and end-users to get on the same page. You need a picture of the actions on the left and the outcomes on the right, and a diagram of the important middle part: how the actions lead to the outcomes.

That's where the AI fits in, it’s where the data and AI connect. Clarity is important, because what’s happening is that the data people are guessing all the time about what decisions they're supporting. They're wasting huge resources and they're not providing what's needed.
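The “actions on the left, outcomes on the right” picture Pratt describes can be represented very simply in code. The sketch below is a hypothetical illustration of such an action-to-outcome diagram; the node names, link strengths, and data structure are invented for this example and are not Quantellia's software.

```python
# A toy causal decision diagram: actions -> intermediate mechanisms -> outcomes.
# The middle links are where data and AI models would plug in.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str                                      # "action", "intermediate", or "outcome"
    children: list = field(default_factory=list)   # list of (node, effect_strength)

def link(cause: Node, effect: Node, strength: float) -> None:
    """Record that `cause` influences `effect` with a given (illustrative) strength."""
    cause.children.append((effect, strength))

# Left side: actions the decision maker controls.
cut_prices = Node("Cut prices", "action")
hire_sales = Node("Hire more sales staff", "action")

# Middle: mechanisms connecting actions to outcomes.
market_share = Node("Market share", "intermediate")
cost_base    = Node("Cost base", "intermediate")

# Right side: the outcome stakeholders hold leadership accountable for.
ebitda = Node("EBITDA over the next 12 months", "outcome")

link(cut_prices, market_share, +0.4)
link(cut_prices, ebitda,       -0.2)   # margin pressure
link(hire_sales, market_share, +0.3)
link(hire_sales, cost_base,    +0.5)
link(market_share, ebitda,     +0.6)
link(cost_base,    ebitda,     -0.5)
```

Even a toy diagram like this gives the electricians, plumbers, and end-users of a decision a shared blueprint: everyone can see which levers exist and which outcomes they are supposed to move.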

She adds:

Some organizations expect you to have a PhD before you can do AI. But a PhD is about inventing new algorithms and we don't need new algorithms. We don't need new Formula One cars, we just need the drivers. You can take a smart software engineer and, with the right training, turn them into a kick-ass machine learning person in a couple of weeks.

The Chief Decision Officer

So, if Professor Pratt could build an ideal organization to cope with today’s complex challenges – all that VUCA and FUD that are stalking the world – what would it look like?

I would add a Chief Decision Officer. The Harvard Business Review says the modern organization is a decision factory, and this new role is to ensure that the data people's work is well integrated with the decision makers at the operational, strategic, and tactical levels. And I would empower him – no, let's say her – at the C-level.

By contrast, the Chief Data Officer is, I think, a misguided concept, because data is the chocolate chips, not the chocolate cake. It should be about the decisions that we support.

In some traditional, top-down organizations, might the CEO feel personally threatened by a Chief Decision Officer?

It’s brilliant that you said that, because this is precisely why it's not happening. Many of the current rash of senior executives are old enough that they came up without evidence-based or data-driven decision making, so they consider it a threat. The antibodies activate when you talk about bringing decision intelligence into an organization.

I just worked with a G20 central bank, and they said, ‘OK, we're supposed to be more evidence based and data driven, but nobody at the non-technical level knows what that means’. So, they tell the analytics group, ‘Get us more data’, but the analytics group is maxed out producing more and more data, and again, it's in this indigestible form. It's just charts. It's not ‘If you take this action, this will be the outcome’.

Let's use modern computer simulations, like we do with airline pilots in complex environments. Use simulations to crash the company 20 times in a red-team [attacking] environment. But most organizations don’t know how to do that. And decision intelligence helps them do it.

At the monthly management meeting where you go through the numbers, don’t just look at the backwards-facing numbers, but stress test your forward-facing decisions. Have a digital boardroom where you run simulations of the different choices you make and how they might play out.
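The kind of “crash the company in simulation” exercise Pratt describes can be prototyped with a simple Monte Carlo loop: run a candidate decision through many randomised scenarios and count how often the business survives. The sketch below is a deliberately crude illustration; every parameter (demand swing, inflation range, starting cash) is an invented assumption, not a real model.

```python
# Toy stress test: evaluate a pricing decision across thousands of random scenarios.
import random

def simulate_year(price_rise_pct: float) -> float:
    """Return end-of-year cash (in £m) for one randomly drawn scenario."""
    demand_shock   = random.gauss(0.0, 0.10)        # roughly +/- 10% demand swing
    cost_inflation = random.uniform(0.02, 0.17)     # 2% to 17% input-cost inflation
    # Higher prices lift revenue per sale but suppress demand.
    revenue = 100 * (1 + price_rise_pct / 100) * (1 - 0.5 * price_rise_pct / 100 + demand_shock)
    costs   = 90 * (1 + cost_inflation)
    return 20 + revenue - costs                     # starting cash + net result

def stress_test(price_rise_pct: float, runs: int = 10_000) -> float:
    """Share of scenarios in which the company stays solvent."""
    survived = sum(simulate_year(price_rise_pct) > 0 for _ in range(runs))
    return survived / runs

for rise in (0, 5, 10):
    print(f"Price rise {rise:>2}% -> survives {stress_test(rise):.0%} of scenarios")
```

In a real “digital boardroom” the scenario generator would be far richer, but the principle is the same: compare the survival rate of each candidate decision before committing to it, rather than after.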

Is she worried that, thanks to the cloud and social platforms, we increasingly live in a world of surface noise and clickbait, rather than the in-depth information we should be reading? She says:

I love that question, as it brings up a really important point. We're at a unique moment in history where we've got this cognitive load, this information overload, which makes us more knee-jerk, right? But at the same time, the consequences of our actions – because of globalization, because we're so interconnected – have these ripple effects that go out into time and space at a distance they've never done before.

The great problems we face – poverty, conflict, climate, COVID, the status of women – impact the status of water, food, security, policing, and law, so there's this bouncing around between domains and the consequences of our actions.

The challenge for leaders is being able to think about the consequences of their policies as they ripple through complex systems. We have to be much smarter about this, when we draw pictures of these ‘action to outcome’ chains.

She adds:

I'm certain I'm right. Decision Intelligence is the Artificial Intelligence for the 21st century. It is where AI comes out of the lab and meets decision makers where they're at, which is thinking in terms of actions and outcomes. And having data that is not impossible to process and govern.

My take

With companies like Google searching for thousands of Decision Intelligence specialists, and the technology now on Gartner’s radar, there is certainly a groundswell of interest in the concept.

In the meantime, just look at the news headlines. If only more organizations – including governments – crashed the economy in simulations and then took alternative decisions, rather than hitting the ‘Believe!’ button and just seeing what happens.

But to play devil’s advocate, sometimes a maverick idea can be a good one, even if there is no data to support it. Be prepared, though, to accept that you might be completely wrong. Ploughing on regardless as counter-evidence piles up isn’t good leadership.
