Staying ahead of the robots - how should we approach skills development?

By Jon Reed, April 11, 2017
Summary:
It's been two years since I addressed the challenge robotic automation poses to skills development. Plenty has happened since. Recent blog posts show that utopians and alarmists have more in common than they think. Here are some recent posts of note - and how my own thinking has changed.

The "how freaked out should we be about robotic automation?" debate is tiresome. Does it really matter if we are alarmists and see a crisis coming, or if we are utopians and see machines enabling humans to live a life of creative fulfillment?

Answer: no. Because I think we all agree:

  • The AI/automation changes coming are profound, and will change jobs and careers forever (even if we dispute the time frame in which this will happen, or the exact scope of the change).
  • Our educational system is poorly equipped to prepare students for careers amidst the machines.

We must frame the threats machines pose correctly

The real dilemma is: what do we do about it? On Quartz, Dave and Helen Edwards frame the problem:

We’re all getting used to the thought that in a not-so-distant future, competition for jobs won’t just be other humans, it will also be an intelligent robot, self-driving car, or other artificial agent. But in our gut, we know this can’t be the full truth, that there’s a more nuanced story. We at least believe that elite human skills will remain valuable even as automation eats the world. The hard part is figuring out which ones will be the most valuable and where they will be the most prized. (emphasis mine, from The skills your kids should cultivate to be competitive in the age of automation).

It's been almost two years since I broke down the keys to keeping your skills one step ahead of the robots. One nugget from that piece was Gil Press' How Knowledge Workers Can Save Their Jobs In The “Bring Your Own Robot” Age, where he offered job advice amongst the robots from authors Thomas Davenport and Julia Kirby:

  • Find a job a computer cannot do (step up)
  • Choose a career where specific human characteristics (e.g., empathy) are needed (step aside)
  • Monitor and modify the work of computers (step in)
  • Find a specialty that wouldn't be economical to automate (step narrowly)
  • Develop the next generation of computing and AI tools (step forward)

Yeah, the "steps" thing is jingle cheese, but that's a useful framework. Last week on diginomica, Denis Pombriant continued his worthwhile series on robots and jobs with Robots versus jobs - it's the polymath skillset stupid.

We need "elastic" skill sets

Pombriant is more optimistic on machines and jobs than I am, though he is far from a utopian. He cites a "sobering" study from Daron Acemoglu of M.I.T. and Pascual Restrepo of Boston University, “Robots and Jobs: Evidence from US Labor Markets” published by the National Bureau of Economic Research:

I regard the information as valid but believe it needs further analysis and not simply accepted at face value—at least not immediately. That automation claims jobs is indisputable and a part of how the U.S. model of capitalism is played out. Further, it’s not just blue-collar jobs that are in the lurch.

Pombriant is a solutions-oriented type of guy - in fact he has challenged me to do more in this area - so what actions does he advise? Pombriant warns that the biggest danger falls to workers who have a rigid skill set - one that would be difficult to evolve. We need "elastic" skill sets. He uses a dockworker example:

The dockworker’s skillset is not that elastic and more to the point, it’s hard to imagine retraining for years while the bills pile up. So the idea of creating new jobs might be valid in the aggregate but for individuals the shift represents a crisis that defines the difference between a recession and a depression.

But our current schooling isn't exactly built for "elastic" careers. Pombriant:

At the heart of this discussion is the reality that we need a better education system, one that provides skills capable of morphing to suit the changing workspace. Rather than encouraging the development of narrow skillsets that can (and ultimately will) be commoditized, we need to be laying the groundwork that encourages the development of a polymath mindset.

Pombriant only scratches the surface of how our educational system could create polymaths, but he references Vinnie Mirchandani's excellent book, The New Polymath, which documents the polymath approach companies use to "compound" technologies - a concept that extends to individuals as well.

STEM education isn't a cure-all

Dave and Helen Edwards take a position similar to Pombriant's by pointing out the limitations of STEM education as a cure-all:

As a parent, this can be a particularly vexing problem when thinking about how to advise your kids. Common wisdom–learn to code, cultivate empathy, study STEM–isn’t especially useful because it isn’t specific enough about what it takes to stay ahead of the robots for years to come. Many of the major advances in AI are happening in just these fields: Machine learning will ultimately eliminate a lot of coding work, perceptive and emotional AI is developing fast, machines are already good at math.

The Edwards studied the thirty jobs with the least likelihood of being automated, concluding that these jobs could be grouped into four categories:

  • People - jobs that rely on strong interpersonal skills like chief executives, school psychologists, social work teachers, and supervisors of a variety of trades.
  • Numbers - jobs that apply math to business problems, like economists, management analysts, and treasurers.
  • Bugs and bad things - human health-related jobs, like allergists, immunologists, and microbiologists.
  • Spaces and structures - jobs that manage the physical world, like engineers and environmental scientists.

I'm not sure about the "numbers" one - the only no-brainer of the four is the "people" section. I'm also baffled by the exclusion of creators, inventors, and artists. Granted, many creators - myself included - are weird enough that we have to create our own jobs, but that's always been the case.

But the Edwards' conclusion is interesting: what ties these jobs together isn't the people-centric aspects, but the unpredictability. Dealing with unpredictable people, strange environments, complex and changing industries, ambiguous data - that's where people should continue to excel over machines:

For instance, our people cluster doesn’t include just any job that handles people, it includes jobs that deal with people in unpredictable environments like school psychologists and supervisors of firefighters and repairers. And our bugs and bad things cluster doesn’t include just any health-care job, it includes jobs that handle complex relationships between ecological systems and human health like allergists, epidemiologists, and microbiologists.

Alas, the Edwards stop short of saying how they will bring this message back to parenting and/or schooling.

My take - dexterity amidst the machines is what matters

When I first read Pombriant's prior post about emotional intelligence being a primary differentiator of humans over machines, I wrote:

As Pombriant says, it might not be long before machines diagnose cancer and suggest treatment. The good human doctor still does that one big thing: sits down with the patient and reaches common ground on what’s next. Will robots nudge professionals with crummy bedside manner to the unemployment line?

But even Pombriant's blog title (note the use of the word "yet") concedes that emotional IQ might not be a difference maker forever (Emotional intelligence, the empathy no robot can (yet) emulate).

Edwards and Pombriant are advocating a similar concept but in different timeframes. In each case, it's about having the skills to adapt - either quickly or over a longer period - to changing and unpredictable circumstances. The question of what educational program would cultivate such skills is now on deck.

As a creative misfit who "enterprised" my skills somehow, I will admit that the liberal arts education I've been cynical about for years had helpful breadth, and held my creative/analytical feet to the fire. Whether that type of education would work for those who are less into creative output but who want to be highly "elastic," I'm not sure.

I'm still a believer in mastery - the question is what to master. My best friend from college once said to me:

I'm afraid of becoming mediocre by mastering useless things.

We can now revise that statement: I'm afraid of becoming unemployable by mastering things a machine might soon learn, with more consistency and, probably, a better attitude than I have.

The notion of a "polymath" is viable but few of us will master more than a few things - and clearly, what we choose to master will make all the difference. If our mastery is too narrow, and lacks a foundation of adaptability, critical thinking and emotional IQ/soft skills, we could be in trouble.

I expect the lifelong learners who relentlessly consume new ideas and "sandbox" their skills to have an advantage, in part because they correct the defects of flawed or obsolete schooling. In Den Howlett's piece with Vishal Sikka, CEO of Infosys (Demystifying AI, ML, DL with Vishal Sikka and real world examples), Sikka says:

The important point, the incredibly important point is that it is not about AI replacing us or about us becoming redundant or AI killing us and all this kind of nonsense, it is about our ability to work with autonomous and semi-autonomous machines.

That means "staying ahead of the robots" goes from worthy ambition to bad advice real fast. The next generation will be much more fluid in their interactions with intelligent machines. The resulting jobs will not be the same ones we worry about losing today.

That said, I'm rooting for champion Chinese Go player Ke Jie in his upcoming face-off with DeepMind’s AlphaGo program. Yes, that domino is going to fall soon, but in the meantime, I can enjoy the resistance. Humans are nothing if not stubborn - a trait machines can be programmed to emulate, perhaps to their detriment.

Updated 8:00 am, March 12 with a few headings and small clarifications for readability.