Friday Rant - academia still drinking the algo-pop despite the A-Levels fiasco

By Chris Middleton, August 28, 2020
One British university is using AI and automation to solve all the problems caused by AI and automation. Or is it?


(Image credit: Pixabay)

The 2020 English A-Level results fiasco has become such a byword for government incompetence that software companies across the globe have been queuing up to tell us that their algorithms are better designed than Whitehall’s.

Let’s hope they are. But the real lesson from last week’s cock-up is this: the first thing that gets automated in any organization is not some repetitive task or a human job – they both come later; it’s the organization’s assumptions about the world. 

If those assumptions are wrong, then there are consequences. And when those assumptions are hardcoded into an algorithm, the bias is both automated and given a false veneer of digital neutrality.
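The point can be made concrete with a toy sketch. To be clear, this is an illustration of the principle only, not Ofqual’s actual standardisation model: the `moderate_grade` function and its grade-capping rule are invented here. It shows how one hardcoded assumption – that this year’s cohort will mirror past cohorts – overrides individual merit automatically:

```python
# Toy illustration (NOT Ofqual's real model): an algorithm that "moderates"
# a teacher-assessed grade against a school's historical results.
# The baked-in assumption: this year's students will resemble past students.

def moderate_grade(teacher_grade: str, school_history: list[str]) -> str:
    """Cap a student's grade at the best grade the school achieved historically."""
    order = ["U", "E", "D", "C", "B", "A", "A*"]
    best_historical = max(school_history, key=order.index)
    # The automated bias: a strong student at a historically
    # weaker school is downgraded, whatever their own work shows.
    if order.index(teacher_grade) > order.index(best_historical):
        return best_historical
    return teacher_grade

# A top student at a school that never produced better than a B is capped at B:
print(moderate_grade("A*", ["C", "B", "D", "B"]))  # -> B
```

However neutral the code looks, the assumption inside it does all the work – which is the column’s point about what gets automated first.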

This much we now know. Universities will still be wrestling with the knock-on effects of this entirely avoidable catastrophe for years to come. So it is perhaps an inappropriate moment for one of them to announce that it is using yet more algorithms – AI and intelligent automation – to help deal with the fallout. Or at least to manage its admin at this time of unwanted stress. 

The timing sets off a klaxon in my head that says, ‘Beware! Organization marketing itself against the backdrop of a crisis!’ Always a risky strategy, as any media training course will tell you, but anyway...

In practice

Andrew Proctor is Pro Vice-Chancellor (Digital) at Staffordshire University. He has been doing the rounds this week talking about how intelligent automation can help universities “streamline their operations and provide improved experiences for students – particularly in light of the current challenges higher education institutions are facing with new admissions.” It’s a brave move in 2020, with Proctor hailing “the exciting opportunity” for Staffordshire to be “really leading the way” on this with its software partner:

We've got a big, multi-year programme of embedding this in our new operating model at the university, so we are very excited to be, I think, the first higher education institution to partner with [UK automation and digital workforce provider] Blue Prism. Education is a challenging environment to be operating in at the moment, even more so over the past few weeks. But one of the things that we've always been mindful of at our university, is that the ability to be agile, lean where appropriate, and much more responsive to market conditions is vital to our future direction.

Proctor describes students as “customers” of the university, which he says wants to avoid the “steady state” of business as usual:

In the past, our operating model was very ‘siloed’ and process-centric rather than outcome focused, so we fell victim to just ‘serving the machine’. But we are now trying to adopt more digital culture and thinking, and to become a lot more ‘outside in’.

It’s hard for me to conceive of someone sounding more digital than this outside of a vendor’s PowerPoint presentation. Indeed, it’s unusual for a university spokesman to have slurped this much Kool-Aid – algo-pop, perhaps – in the language he uses to describe technology in education. But Proctor explains:

University staff would much rather be spending more time with students. We quickly spotted that the use of some really compelling automation and AI technology seemed like a good answer to some of these questions. In terms of our digital strategy, the use of AI and automation is one of our core pillars.

Fair enough. It’s just the context that is uncomfortable: a national failure of automation, albeit one that Staffordshire can hardly be blamed for itself. Yet despite the rank incompetence on display in Whitehall of late, Proctor goes out of his way not to criticise the government. Asked how he believes the Department for Education handled the crisis, he talks up Staffordshire’s track record instead:

We take a human-centric approach to the use of automation and AI, in that what we are trying to achieve is using technology to free up time for people to pursue more rewarding work and make more of their human connectivity. Rather than AI and technology being seen as this cold, distancing thing, we think that the use of automation can actually bring students together more successfully, and our teaching staff as well. If we can free people up from mundane, transactional activities, then we can promote more social ones, more contact time, and generally make the student experience, and the staff experience, more rewarding.

A perennial problem is that customers often respond to such technology by slashing internal costs and human jobs rather than by making their organizations smarter. So in this sense, it’s heartening to hear a customer talking like a software marketer, if only because it suggests he understands the product. Yet it’s still uncomfortable in the current context.

But what about the message itself? Arguably, the idea that all employees need to be freed from the tyranny of repetitive tasks does a disservice to some excellent human workers. Some people love repetitive tasks and logical processes, or actively seek them out. It’s why some of them become administrators. And what’s wrong with silos anyway? Aren’t they just places where we store things in depth? Do we all have to live in a world of surface noise? Proctor says:

As an ambitious university, we are really putting the value on people being creative and driving forward, adding as much value to our students as possible. But there is still room for a bit of transactional activity, particularly in customer-facing areas like student support. The ability to spend more time with students, our customers, is really important to us.

There’s that word again. But are students really ‘customers’ in some higher education superstore – one staffed by creative, outlier thinkers who want to hang out with the cool kids – while AI-powered digital employees (bots) mail out the invoices? Isn’t this a bit reductive? Isn’t this how we ended up in this mess in the first place?  Proctor responds:

Let’s unpick that. It’s a debate that academia has quite often. It’s fair to say there has been quite a bit of marketisation in higher education. Ultimately, students are paying us for services and they expect high-quality provision for their tuition fees.  People who interact with a product or service have high expectations. They expect things to be tailored, flexed [sic], and personalised. And higher education has been very slow to respond to this. So we are really comfortable with referring to our students as customers. We don’t see it as a dirty word. It’s the kind of university we are trying to build. We want one that is agile, with a mentality that business as usual isn’t always good enough. We need to be constantly looking at what our students are doing, so we can react to them as quickly as possible.


On the subject of which, how has the algorithm crisis affected Staffordshire itself? Proctor says: 

The [A-Level results] algorithm was designed, by people, to achieve a certain outcome. I think it was the design that was potentially flawed and needed to change – as it has done – rather than the technology being at fault itself.

I find that an odd answer by any standards, suggesting that technology somehow exists outside of itself and the people who design it. Plus the algorithm hasn’t ‘changed’; it’s more that its recommendations have largely been ignored.  He continues:

It does have a knock-on impact for us. Some students may decide we are no longer their first choice, and it may mean that students who were going to join a university elsewhere will now join us as their first choice. It’s such a fluid environment. It’s really challenging to get a grip on. There’s this narrative of universities being driven purely by the finance, trying to ‘hoover’ up as many students as possible. That is absolutely not what we are trying to do.

Leaving aside that we didn’t suggest it was, the reality is still that the situation is so complex – with the university taking an enlightened and proactive approach in offering opportunities to students from disadvantaged backgrounds – that the AI can’t actually help.

But then it turns out that the university’s intelligent automation system, all those AI-infused digital workers, won’t actually be working on these problems at all. At least, not this year. Proctor explains:

It’s not something that is helping us clear the admissions process at the moment. What I can say is that if this had happened in a year’s time, it [the technology] would have freed up a lot more time for people to be making phone calls and responding to emails more quickly. But we are a year away from that. It’s one of those things, if we had started this process a year earlier.... There is a lot of administration work in admissions. And we are starting this [automation project] with more of a back-office function.

So there you have it. In the meantime, marketing yourself against the backdrop of a crisis: do you see now why it gets problematic?