Sometimes I get a slow burn going. I have one of those on AI and the future of creativity. Yes, this preoccupation goes well beyond the enterprise - but it certainly includes the enterprise context.
I believe that hand-crafted content still impacts enterprise projects. So yeah, I get my dander up when I run into technophiles who think AI can displace this. But are they entirely wrong?
Let me bring you back to early 2021, when I got a delightful PR pitch about 30-second videos. The pitch included one of my biggest marketing trigger-phrases, "snackable content." My not-so-restrained response?
Me to PR on "30 second videos"
"I'm not a fan of snackable content. I wish you luck but I'm dedicated to proving in-depth, immersive and interactive content is still indispensable. We aren't going to fix technology projects, or our vexing social problems, with 30 second snacks."
— Jon Reed (@jonerp) February 11, 2021
The plot thickens. Around that same time, I ran into this unsettling article, AI will soon outperform us in disciplines we thought were uniquely human. The publisher, techradar.com, stirred the viral pot further, employing the social media title, "AI is on the verge of mastering the creative arts." My spleen vent was immediate:
AI is on the verge of mastering the creative arts https://t.co/UJq8UyA8W1
"Thanks to developments in AI, the days of the human creative may be numbered"
-> the most ridiculous piece of AI hype I've seen in a long time. May require a full Friday rant type deconstruction.
— Jon Reed (@jonerp) January 24, 2021
Can AI make human creativity obsolete?
Articles like this are easy to mock - and for good reason. The more you learn about AI, the more you understand: what machines are good at threatens some human work, but not all. Human creativity is one of the most difficult things for machines to conquer - much harder, by the way, than emotional companionship, where our standards are quite a bit lower (and, perhaps, our needs greater).
Granted, when it comes to the logistics of content creation - including enterprise content - AI is achieving a firm foothold. But it's about augmenting the creative effort, not displacing it. AI can help automate the grunt work of content production. I've written about the impact of machine transcripts from Otter.ai on my content workflow (How an AI service won me over by becoming an AI platform - the Otter.ai machine learning transcription example). Thanks to a machine, we'll do an audio version of this article, and make it available in dyslexia mode as well.
But "augmented AI" is entirely different from the viral-seeking sensationalism of "AI is on the verge of mastering the creative arts," where supposedly the entire creative process is at risk. I was hardly the only one to take umbrage:
I guess these so called #AI technologies are replacing those tasks which had no soul in them to begin with. No purpose. No inspiration.
— Pɾҽɱ Kυɱαɾ Aραɾαɳʝι 🏡😷🤖💬🦾🎫 (@prem_k) January 24, 2021
By the time I appeared on the Intersections show with Brian Solis and John Kao on February 11th, the implications of the article had settled in (my segment begins at 25:00). Turns out, the piece was not so easy to dismiss after all. By that time, it had dawned on me: the author, Joel Khalili, buried the lede.
Is content engagement now the proxy for creative quality?
The unsettling point is NOT that AI is a threat to today's artists - that premise is absurd.
Watch: This AI is playing an infinite bass solo on YouTube https://t.co/0CE4QdeSRg
"If Frank Zappa's endless guitar solos somehow leave your earbuds craving more, music-hackers Dadabots are here to satisfy your auditory desires."
-> AI will never catch Zappa, technofanboy
— Jon Reed (@jonerp) December 23, 2020
Here's where it all shifts. Khalili frames the content performance argument around "engagement." In other words: what if the goal of content is not to change hearts and minds, but simply to "engage" people? After all, in a social media context, the level of "engagement" can be measured.
If engagement is the chosen KPI, isn't the door open to all kinds of "snackable" content - fleeting-but-measurable content moments, each of which can be followed by the next sensationalized bite? If so, that's the type of content AI is potentially capable of creating. Consider the supposedly scary "GPT-3 authored" article in The Guardian. The problem was the essay's overall logical flow; individual paragraphs were provocative, and occasionally effective (GPT-3 also had quite a bit of editing help).
Khalili takes it further: if engagement holds attention, then do we need to rethink artistic quality? Or, as Khalili puts it, "What is quality, anyway?"
No longer will quality be a subjective matter, up for debate, but rather assessed based on hard metrics such as time-on-page and finish rate.
This process is already playing out in digital media, where snackable content more likely to generate impressions takes precedence over in-depth reporting, and where hyperbolic headlines outperform purely descriptive ones.
'Content publishers will increasingly rely on technologies for analysing user engagement, rather than defining a criteria for the quality of the content itself,' Dirik predicts.
And then, the line that sent a chill up the ol' spine:
'Reader engagement will ultimately become a proxy for quality.'
That, folks, is our AI content wake-up call. Look at the consequences of amoral social engagement on, say, vaccination facts-versus-fiction. It's a dangerous cliff Khalili is leading us toward:
AI can create content snacks, summaries, and sensational conspiratorial BS. To conflate the latter with art because it gets engagement is a precipice we don't need to peer over
— Jon Reed (@jonerp) January 25, 2021
To his credit, Khalili raises the bias problem in AI content-run-amok. Yet he still seems fairly gleeful about his premise. Does his redefinition of "quality" via engagement, and (AI-created) content snacks, threaten hand-crafted creative content? If we say "yes," that's a tragic concession: we would be accepting life in a distracted dystopia of factually ambiguous snacks. But the cultural impact of AI content is beyond this post. On deck here is a different question: is this a viable threat to B2B content creators?
My take - B2B content has a different definition of quality, and AI can't reach it
I don't believe enterprise professionals can live on content "snacks" alone - there is too much at stake. As I wrote in Does the enterprise have a fake news problem?
Misunderstanding enterprise data causes a range of problems, from inconvenient to career-altering:
- Time wasted during a hectic day, sidetracked by inaccurate info
- Misunderstandings or distrust between vendor and customer, due to sensationalized stories, tech hype and lack of open dialogue
- Projects extended or derailed due to poor choices in vendors/partners, or lack of industry context
Bad or biased information has dire project consequences. That imposes its own content quality standard - and AI-created "content snacks" won't be enough.
Despite my grouchfest, there is room for snackable content in the enterprise. A thirty-second customer video testimonial can be perfect. That leaves open the question of whether AI could "create" that type of snack. Not yet, but AI could certainly help to produce, tag, and distribute such content. As I later wrote in Honing our enterprise BS filters:
A few years ago, I asked if the enterprise has a fake news problem. The short answer is: kind of. The good news? We work in an industry where our careers are defined by the success of complex, high-stakes projects. We simply can't afford to get duped too often.
In my enterprise video shows, it often takes an hour to unravel a topic like avoiding HR project mistakes, or the do's and don'ts of retail projects. Research indicates B2B buyers want independent viewpoints. Project managers want informed opinions, not just facts. All facts come with bias - that includes "facts" put in trapezoids or quadrants or waves. The best way to inoculate against bias? Diverse perspectives. Buyers want to hear expert takes from analysts. They want hands-on views from consultants - and from their peers. AI can't create that type of sustained interaction. Only a community of peers can support that.
Examine how we get software projects over the finish line. Or: look at what enterprise buyers need to make a project decision. When we drill into that, the idealization of "engagement" for its own sake falls apart. Yes, we can measure it, but is it taking us anywhere? If a supply chain vendor gets "social engagement" from a post about brewing beer in bathtubs during lockdowns, does that advance their cause?
In the enterprise, content has a different definition of quality as well - but it's not about engagement, or even entertainment. It's about relevance. You can be an average writer, but if you're a subject matter expert with project know-how, your content is relevant. It might even be essential. We don't need enterprise art; we need project success. Content fits in here.
Social engagement needs a context. Just because we can measure a "bump in brand conversations" doesn't mean that measurement did a thing. We need to identify the content engagements that moved buyers closer to decisions - and those that did not. Which types of content earn us topic authority, or customer trust? When does "social engagement" hurt trust - for example, if we post something tone-deaf about a security breach, as T-Mobile recently did? Yes, that definitely boosted engagement. But not in a way that helps the brand.
When I appeared on Intersections, I got the impression Brian Solis found my withering view of today's artistic economy a bit of a downer. I can't help that - I believe we should worry greatly about the future of capital-intensive creativity, particularly things like documentary film production and investigative journalism. Not to mention, for music lovers, whether we'll see production-intensive studio masterpieces like Dark Side of the Moon again. But that's not an AI problem; that's a creative/digital economy discussion.
However, I do derive hope as well. Individuals - and brave organizations - can step back from the obsession with social virality and "engagement." As I said on Intersections:
I'm putting my creative economy book through Grammarly right now, and it's incredibly helpful. Otter.ai has been huge for me on the transcription side. So there are great AI tools you can use. But this is more than a tool conversation - we can define art in many ways.
In my mind, the core definition is that you take a courageous excavation inside yourself. Find the things that resonate and the talents you might have, and create a discipline towards mastery. Experiment and pursue those talents in a way that gives expression to the world for your unique point of view.
Brands can do that too. Either way, that excavation process is uniquely human, not possible for "AI," and not limited to exceptional artists either. I added:
We can all undertake that in our own way. One thing I tell people about social media is: you're going to the party too soon. Go to the social media party after you've done this work, and created meaningful, immersive content that matters to you.
It's a cycle of deep work. There's a whole process I've defined around deep work that involves reflection, research, curation, which you can begin to share publicly, and then your own creative process. And you can actually differentiate yourself from others who haven't undertaken that quest.
And yes, organizations can fund and support this too. I did a piece on how a software company uses deep work models to support programmers who really need that immersive time. So this kind of ties in with your whole detox theme you had earlier, which is: 'Okay, now that we've created some silence, let's do something with that silence.' You know, let's do something that changes our lives - and hopefully changes our companies' futures as well.
If you want to go further down this rabbit hole with me, check my 2013 diginomica post, The career-defining consequences of value productivity, which remains my fave piece I've written here, 1,500 or so pieces later (though perhaps the Woody Allen example hasn't aged well). It's basically a juxtaposition of task and value productivity. None of us are getting paid to answer email, so the problem remains. And yes, creative efforts are at the center:
We are all under a constant influx of interruptions that threaten our ability to create the deliverables that transform our own value.
That surely hasn't changed. But there is one more missing piece: the issue of context. Enterprise vendors seem to think that AI can solve for personalized context. I happen to think that's a problematic overreach. This directly implicates content, as in: which content should be served up, and when. It deserves a further review, but my coffee is running low, and probably yours as well. That's for another time.
Revised, Sept 4, 7am UK time, with a number of minor tweaks for reading clarity, and the addition of resource links.