What Apple iPhone X taught me about AI. Hint – I’m already bored

SUMMARY:

Apple just amp’d the AI hype with incremental improvements that might accelerate the run towards a consumer AI winter. Thank goodness.

Unlike those who slavishly follow the goings-on at Apple, I chose to wait for the inevitable tsunami of opinion, bloviation and adoration over the launch of iPhone X.

What caught my eye was Apple falling into the AI pool with all the enthusiasm of a child getting their first present from Santa, yet leaving me shrugging and (almost) mumbling ‘meh.’ As an iPhone 6 Plus user I’m just not ‘getting’ enough to make me go ‘wow.’ And certainly not at an eye-watering price point of $999 (plus sales tax) or £1,000 in old money.

Apart from the fact that much of what Apple announced is little more than playing catch-up with Samsung in particular – think OLED and the edge-to-edge glass I hate in the Galaxy models – the conflation of machine learning with facial recognition, pitched as a feature that “lets Face ID adapt to physical changes in your appearance over time,” struck me as fundamentally weak.

Will my face change THAT much in the 18-month half-life of the iPhone X? I guess that might be true if I were a 10-12 year old or a Pimply Young Thing, but for Apple’s core buying audience?

I am more concerned that, with Apple having blessed ‘AI’ with its massive promotional reach, we will see more weak or barely intelligent use cases thrown forward as ‘AI’ functions that are somehow going to change the world when they’re nothing of the sort. I’m not alone.

My inbox this morning contained a prescient piece from Derrick Harris, author of the excellent ARCHITECT newsletter, in which he said (my emphasis added):

I’m just going to come right out and say it: The excitement over Apple’s new AI chip and facial-recognition security feature seems overblown. It’s just so … utilitarian.

Don’t get me wrong: Apple’s strategy is very smart from a UX perspective, assuming the Neural Engine gets a full workout. Use deep learning to power tasks like facial recognition (I’m not convinced this is a game-changer), NLP for text messages, image processing and, of course, Siri. In the process, save battery life by minimizing use of the main CPU and GPU, thus making users that much happier.

But at some point, we might expect the company that all but invented the current image of consumer AI—with Siri—to do something else revolutionary. We might be waiting longer than we think. And not just from Apple, but from Google, Samsung and everybody else, too.

The fact of the matter is that while the field of AI—and deep learning, in particular—is advancing like mad, its strong suit is still in what we might call machine perception. Vision, speech, hearing, language, stuff like that. Which is why we now have devices like the Amazon Echo and Google Home, and phones that can understand us, predict our next words and recognize who’s in our photos. Most of these things we could do before; now we can do them easier or better.

However, there’s still a long way to go before these areas are perfected, and it’s not entirely clear where the next big idea in consumer AI will come from or when it will hit the mainstream. If you look at most consumer applications of AI and machine learning today, they’re really step improvements over the status quo. I think a lot of people would say companies are spending an awful lot of time on AI research so we can add funny mustaches to selfies.
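Harris’s ‘machine perception’ point is easy to make concrete. On iOS 11 – the release shipping alongside the iPhone X – the face detection he mentions is a handful of framework calls. A minimal sketch using Apple’s Vision framework (plain face detection only; the TrueDepth-based Face ID pipeline is not exposed this way):

```swift
import UIKit
import Vision

// Count the faces in a photo using Apple's Vision framework (iOS 11+).
// A minimal sketch of 'machine perception' as a commodity API call.
func countFaces(in image: UIImage, completion: @escaping (Int) -> Void) {
    guard let cgImage = image.cgImage else {
        completion(0)
        return
    }
    // One request object is all it takes to get face rectangles back.
    let request = VNDetectFaceRectanglesRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        completion(faces.count)
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

That a single request object gets you face detection is exactly Harris’s point: perception has become a utility, not a differentiator.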

This is worth unpacking because Harris is pointing up some of my major bugbears with the white-hot hype around AI.

Siri – at least in my experience – is not a patch on Amazon’s Echo, which itself does not feel like much of an improvement over the speech tech I saw developed by Lernout & Hauspie and Nuance back in the 90s. The fact that Siri fails to understand me at least 15-20% of the time makes it worse than the L&H algorithms, which managed a speech-to-text recognition accuracy of 92-95%. Of course, some of what Siri/Alexa thinks I say is comical, and as consumers we forgive very easily on matters that don’t matter. But it’s not like that in enterprise land.
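For the record on those numbers: speech accuracy is conventionally quoted as one minus the word error rate (WER), the token-level edit distance between what was said and what was transcribed. A minimal sketch of that calculation – my own illustration, not L&H’s, Nuance’s or Apple’s code:

```swift
// Word error rate: edit distance over word tokens, divided by reference length.
// A rough, self-contained sketch for reading quoted accuracy figures;
// real speech benchmarks normalise transcripts far more carefully.
func wordErrorRate(reference: String, hypothesis: String) -> Double {
    let ref = reference.lowercased().split(separator: " ").map(String.init)
    let hyp = hypothesis.lowercased().split(separator: " ").map(String.init)
    guard !ref.isEmpty else { return hyp.isEmpty ? 0 : 1 }
    guard !hyp.isEmpty else { return 1 }
    // Single-row Levenshtein distance over the word tokens.
    var row = Array(0...hyp.count)
    for i in 1...ref.count {
        var prev = row[0]
        row[0] = i
        for j in 1...hyp.count {
            let current = row[j]
            row[j] = ref[i - 1] == hyp[j - 1]
                ? prev
                : min(prev, row[j - 1], row[j]) + 1
            prev = current
        }
    }
    return Double(row[hyp.count]) / Double(ref.count)
}
```

On that yardstick, 92-95% accuracy means a WER of roughly 0.05-0.08, while an assistant that misses me 15-20% of the time sits at 0.15-0.20 or worse.

Harris goes on (my emphasis added):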

That’s not an indictment of anything, but rather, I would argue, a function of how research works. You don’t change the world overnight, and even promising experimental results can take a long time to make it from lab to production. When they do, I have an easier time seeing game-changing applications of AI in fields like manufacturing, logistics, media and health care than in pure consumer spaces. And even there, we’re talking more about process automation and optimization than about, say, anything that would strike an outside observer as revolutionary. At least in the near term.

I agree. Harris’s point about incremental change among all mobile handset makers is well made. The real problem comes when CXOs start buying up iPhone Xs like the latest flavor of Ben & Jerry’s. They will drive adoption and ask questions of their development teams that won’t be easily answered.

For instance, developing for yet another screen size – in this case 5.8 inches, with a notch cut into it – is anything but trivial, and I already hear the developer jungle drums beating furiously about what this might mean for their applications in the world of Apple.
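To give a flavor of what those drums are about: iOS 11 introduces safeAreaLayoutGuide, and the expected pattern – sketched here on my own assumptions, not from any Apple migration guide – is to pin layouts to the safe area rather than hard-code point sizes, so the notch and the new aspect ratio stop being special cases:

```swift
import UIKit

// A hedged sketch: pin content to the safe area so layouts adapt to the
// iPhone X's notch and 5.8" aspect ratio instead of assuming fixed sizes.
final class AdaptiveViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let content = UIView()
        content.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(content)
        // safeAreaLayoutGuide arrives in iOS 11, alongside the iPhone X.
        let guide = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            content.topAnchor.constraint(equalTo: guide.topAnchor),
            content.leadingAnchor.constraint(equalTo: guide.leadingAnchor),
            content.trailingAnchor.constraint(equalTo: guide.trailingAnchor),
            content.bottomAnchor.constraint(equalTo: guide.bottomAnchor)
        ])
    }
}
```

Simple enough in isolation; the pain is retrofitting it across every screen of an existing app.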

My take

So much for the consumerization of tech. Apple just brought that to a shuddering halt. At least as it relates to AI.

My hope is that Apple will see this kind of critique as something of a wake-up call, and that enterprise people will start to call out the incrementalism that plagues much of what we see touted as AI.

I take what Harris says about advances in manufacturing and the like as real. We have seen plenty of examples in the recent past, and I look forward to more in the future.

In the meantime, and unless my iPhone craps out, I’ll be giving the iPhone X a pass.

P.S. In other news, I’ve just got my hands on an HP Chromebook 13 G1. Given Apple was silent on its laptop range, this may well be my next machine.

Image credit - Apple
