Ten10 Academy - why soft human skills take precedence in the AI Age

Chris Middleton, May 29, 2024
Summary:
Here's a tech training and skills consultancy on why organizations’ focus must stay on human beings as AI spreads - and is sometimes implemented with only short-term goals in mind.

People carrying speech bubble icons sitting in a row (© Rawpixel.com - Shutterstock)

Soft skills and human intuition will be critical assets in the AI Age. That’s according to Ash Gawthorp, Chief Academy Officer at London- and US-based digital skills consultancy Ten10.

He says:

This need has become more and more acute over the years. The demand today is not just for tech skills, but also for well-rounded individuals with the right soft skills to be able to communicate, collaborate, and connect with a wide variety of different people.

It's all well and good having a great technical idea, but if you can't get people on board and take that idea forward - take people on the journey with you - it's not especially useful.

The company was founded in 2007, but a Damascene conversion to the need for soft skills, such as teamwork, communication, and problem solving, occurred in 2012. He explains:

We had hired people with computer science degrees to put into tech roles as, say, junior developers, but we came to realize that a university education had not given them any kind of training in the practical aspects they needed to hit the ground running. 

The London Olympics had just finished when we started thinking about this. We thought, ‘We're bringing people into the tech business, but we are first having to teach them practical tech skills, which is doable, but then we are also having to teach them the soft skills they need to succeed, which is much harder.’

Not only that, but we were also having to explain the reason for those soft skills! A lot of individuals would say, ‘I want to be a tech person, so I don't need to know how to communicate!’

He adds:

But I was always aware of people in the industry who didn't have a tech degree, necessarily. Some of them didn’t go to university - some barely went to school, in fact - but they had an amazing ability to see a solution when others couldn't, and to get people on board with their ideas and take them forward.

Often these individuals were in senior roles within tech companies, but there was no structure to how they got there.

Hence his company began focusing on producing skilled, rounded individuals to work in the tech sector, and not just on people who can code or build a motherboard. And as AI becomes a strategic boardroom issue for most large organizations, that guiding principle now seems prescient.

Pulling up the drawbridge

More and more tasks are being automated by AI, meaning that the focus is shifting away from the nuts and bolts of programming and towards more practical business applications - a process accelerated by no-code and low-code systems.

However, as I explored in my recent interview with OpenText CEO Mark Barrenechea, the flipside of this is that rungs are being removed from the tech career ladder - and in other industries too - making it harder for many people to bridge the gap between junior staffer and experienced professional. 

So, what does Gawthorp make of Barrenechea’s boast that OpenText, for example, no longer hires junior coders “because the AI can do that”?

He says:

That idea that we can do away with junior people is something that I see mirrored in a lot of industries, not just tech, with the impact of AI.

Many industries have long relied on having a groundswell of junior people at the bottom to do the bread-and-butter work, to cut their teeth and gain essential industry experience - whether that's copywriters in the creative sector, or junior lawyers, and so on.

But people can only become those experienced professionals, whatever industry they're in, through gaining that experience. They can then become the people who will, hopefully, still act as gatekeepers for AI-generated content. So, if you remove that, then… what? Are you sitting in your castle and pulling up the drawbridge? I think that will create a very self-sustaining problem.

Then there is all the discussion around AI regulation and how we can somehow prevent that scenario from happening. That's one point. But with the skills gap widening, if we're not careful we will end up with the ‘uber experts’ at the top and the entire junior rung disappearing altogether. 

I think AI is very much in danger of doing that.

Gawthorp has a CEO story of his own. He continues:

I was chatting with the Chief Executive of a company that has an AI-based prediction tool for project management and forecasting. And we were talking about the kinds of skills they need for their data science guys. And he was saying, ‘I don't interview anyone unless they've got a Maths degree, a Master's, and a PhD. And unless they have produced three research papers that have been peer reviewed and published in a top-tier publication.’ 

And I was thinking, ‘Wow, that's a really small pool of people - which universities will continue to produce, of course.’ But I feel that, if we're not careful, we're going to be looking at a scenario where most people in an organization just don't have any deep knowledge.

And the gap between the people who are creating these models and everybody else will be so great that it almost appears like magic. You know, there's that great quote from sociobiologist Edward O Wilson. He said that the problem with humanity is, ‘We have Palaeolithic emotions, mediaeval institutions, and godlike technology.’

It’s not magic

As I have noted in previous diginomica reports, AI is being sold to us not with the promise of reversing climate change, curing cancer, or preventing nuclear catastrophe, but of performing reams of creative tasks for free, negating the need to value human skill and invest in talent and experience.

As author Benjamin Labatut puts it in his extraordinary book ‘When We Cease to Understand the World’ (2020, Pushkin Press – highly recommended):

The sudden realization that it was Mathematics […] which was changing our world to the point where, in a couple of decades at most, we would simply not be able to grasp what being human really meant.

We simply scamper around it like apes, toying and playing with it, but with no true understanding.

So, what is Ten10’s approach? Gawthorp says:

The first thing we do is try to debunk AI, in a sense. We say to people, ‘This is not magic, it’s just maths. It might look like magic, in that you can give it a text prompt and it will produce an image or write some words, but fundamentally it’s just mathematics.’

It’s like that experiment that every kid does in school. Take everybody in the class, measure how tall they are, note their shoe size, and plot that on a graph. Any nine-year-old can eyeball that data and plot a line through it, then measure the height of somebody from the next class and predict their shoe size with a reasonable degree of accuracy. 

At some level, that’s all machine learning is doing, whether it’s sound, medical data, or geospatial data. We find that really helps people ground the technology.

And that will become more and more important in the education sector. Because if we are not careful, a child will ask their smart device something, rather than their teacher, and it will give them the answer. At that point, they will assume these things are all-knowing, and AI will have become like magic - no longer something that needs to be understood.
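To ground Gawthorp’s classroom analogy, here is a minimal sketch in Python - with figures invented purely for illustration - of the line-fitting exercise he describes: an ordinary least-squares fit of shoe size against height, then a prediction for a pupil from the next class.

# Gawthorp's classroom analogy: fit a straight line to (height, shoe size)
# pairs by ordinary least squares, then predict a new pupil's shoe size.
# All figures are invented for illustration.
heights = [130, 135, 140, 145, 150, 155]      # cm
shoe_sizes = [1.0, 1.5, 2.5, 3.0, 4.0, 4.5]   # UK child sizes
n = len(heights)
mean_h = sum(heights) / n
mean_s = sum(shoe_sizes) / n
# Least-squares slope and intercept of the fitted line.
slope = (sum((h - mean_h) * (s - mean_s) for h, s in zip(heights, shoe_sizes))
         / sum((h - mean_h) ** 2 for h in heights))
intercept = mean_s - slope * mean_h

def predict(height_cm):
    # Read the fitted line at the new pupil's height.
    return intercept + slope * height_cm

print(f"Predicted shoe size for 148 cm: {predict(148):.1f}")  # roughly 3.6

Nothing here is magic: the ‘model’ is just two numbers, a slope and an intercept, learned from the data - which is exactly the point Gawthorp makes to demystify the technology.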

And once the machine seems all-knowing, our worst instincts may take over – those Palaeolithic emotions that Wilson described – and we risk becoming ever more passive, supine, and disinclined to find things out for ourselves, acquire new skills, and be creative. Does Gawthorp agree?

What we do with our academy, especially in the base layer of AI training, comes down to understanding what it is and what it isn't – what its limitations are, and where you can and can’t use it. 

We had a case recently where we needed to work out what the security clearance requirements were for a particular customer. So, I said to one guy, ‘Go away and look at this’. But he just asked ChatGPT and came back 30 seconds later with what seemed like a well-formed answer.

So, I said, ‘No. Go to the source. Go to the government organization that's looking for these people and look at their criteria, look at what you need to do to meet their security requirements.’ In this case, the AI wasn't hallucinating, as such, but it wasn't providing complete information. So, we might have sent people to work in that organization who would have failed the security clearance further down the line. 

Somehow, we need to get this principle into the heads of everybody in the population. Because even if an AI becomes completely and factually correct, bias is far harder to pick up on.

As I have previously explored, for years Industry 4.0 vendors have been claiming that their technology will automate boring tasks so that humans can be more creative. But with Large Language Models and generative systems, AI is being popularized by doing the exact opposite: automating creativity and freeing people up to serve the machine.

Witness the local paper in my city whose ‘AI-assisted reporters’ simply reword the news stories that ChatGPT writes to make them sound more human – a low-grade support role rather than an industry investing in its own future and the next generation of human talent.

Deborah Biscomb is Ten10’s Head of Marketing, which she acknowledges makes her an AI user herself. She says:

If I look at it from my perspective, it enables us to write more content, but I still have somebody who is a very good copywriter in his own right. And he is very good at doing that critical thinking.

That’s what we have to encourage people to do. But that is a slightly different world to the one we are looking at right now.

My take

Indeed. The words ‘I want to be a tech person, so I don't need to know how to communicate!’ are piercing in this context. 

A few years ago, I had the privilege of spending the day at singer, musician, and technologist Peter Gabriel’s Real World recording complex, at Box, near Bath, in rural England. While there, I asked in-house engineer Marco Migliari what skills this high-tech, state-of-the-art music and media facility looked for in junior recruits. 

Migliari explained that the very last thing they want is someone who locks himself away with a laptop; an isolated soul who is focused on hardware, software, and coding is no use to them, he said. Technical skills can be taught on the job and learned from experienced professionals from day to day. 

Instead, what they look for is people who are brilliant at working with human beings - individuals with great emotional intelligence, communication, interpersonal skills, and problem-solving abilities. All those soft skills that are often absent from a world that is focused on tech.

In the AI Age, those words will seem more and more prophetic. But one challenge is this: is the education system really set up to encourage critical thinkers and brilliant, emotionally intelligent communicators - skills that many teachers observe are vanishing from society?

And the flipside of that challenge is equally important. Employers need to invest in people too, and not relegate their human staff to low-grade support roles, fine-tuning the output of AIs that few will even understand. 

Meanwhile, vendors have a critical task too: create systems that solve the world’s most urgent, existential problems, and not play to people’s worst instincts in order to lure customers into a lazy, dependency-based relationship.

Perhaps above all, think strategically and for the long term in the AI Age, and set aside short-term tactics. Invest in your human resources.
