How generative AI is enabling the greatest ever theft/opportunity - delete as applicable

George Lawton, June 1, 2023
Opportunity or heist - generative AI throws up an awful lot of, as yet, unanswered questions...


As excited as the industry is about generative AI, less understood or discussed is its decompositional aspect: the ability to distill elements from content and data that are not currently covered by copyright or case law. This allows generative AI to capture new kinds of value from our data that no one had previously considered.

Companies may potentially lock this behind new walled gardens. Some types of this new value mined at scale by generative AI include ideas, likeness, insights, application features, product utility, style, plot, logic, and ingredients.

My recent piece on types of AI hallucinations included some thoughts from University of British Columbia Professor Naomi Klein, suggesting that one of the biggest hallucinations is the belief that the current crop of AI is sentient or will necessarily be beneficial.

Unfortunately, I forgot to include a link to her recent article, AI machines aren’t ‘hallucinating.’ But their makers are. I don’t agree with everything she said, but she makes some interesting points. One particularly provocative observation was that benevolent stories about generative AI are

...powerful and enticing cover stories for what may turn out to be the largest and most consequential theft in human history.

As with the opening of any new frontier, one person’s heist is another’s opportunity. Klein touched on some aspects of this, but it seemed worthwhile to contextualize the various dimensions of this puzzle.

Many older controversies have not yet scaled up with generative AI, and neither have many “innovative” Silicon Valley business models. The picture will certainly evolve as we learn new nuances and discover new examples of this changing landscape. Let’s walk through a few of these:


Ideas

Google reads all the web pages and books, which everyone is happy with because Google provides a path to your front door. Google’s upcoming generative AI search feature is starting to place a summary distilled from one or more pages above the links, potentially reducing the need to visit the page and view the ads that pay the rent. This might save web surfers time but destroy the current ecosystem built on click-throughs and page views. As disagreeable as this may be to creators, it may be preferable to the alternative: surfers migrating to new generative AI competitors with no links at all.


Likeness

Generative AI copies an individual’s voice or image and repackages it into a new song, trading card, or image that drives traffic, ad revenue, or other benefits to the creator of the mash-up. Worse, scammers can mimic your grandson or a friend asking to borrow money for an emergency. One poor fellow in China was nearly scammed out of about $500,000 this way.


Chris Middleton recently reported that Levi Strauss planned to use generative AI to emulate diversity, cut costs, and reduce the need for ad reshoots with different body types. Middleton said:

That a company with 2021 revenues of $5.7 billion thinks denying opportunities to real humans of ‘every body type, age, size, and skin tone’ by using AI to increase perceived diversity is yet another example of the convoluted thinking that already characterizes the AI gold rush.


Insights

Watson and Crick are famous for discovering the structure of DNA’s double helix. Less often mentioned is that this insight arose from an X-ray image produced in the lab of rival scientist Rosalind Franklin, who often gets only a side note. Now, what happens when generative AI can make these kinds of discoveries at scale, using data that no one thought to look at, and allow the operators to take credit for the new discoveries or revenue-generating opportunities?


A corollary to the previous example is what happens when someone distills your writings, clicks, and likes on Facebook to sell you stuff more efficiently or help political candidates target their ads and messages more effectively. Cambridge Analytica happened long before generative AI provided the means to send you personalized ads convincing you to vote for a candidate you otherwise might not have. Facebook’s new generative AI advertising platform promises to “help optimize campaign results, personalize ads by matching them to the right people at the right time and ultimately help advertisers save time and money.”


Application features

Apple incorporated the search functionality of a smaller third-party app, called Watson, into its own Sherlock search tool. The move killed the smaller developer but made Apple products a little better. Today, the practice of copying innovative app functionality into legacy apps is called Sherlocking. Generative AI will make it easier to decompose the new features of innovative apps and bolt them into the legacy leaders. Even if done badly, this can spell the end for scrappy innovators.

Product utility

Amazon has deep insight into the performance of the millions of products sold across its platforms. Although it has long denied reverse engineering its suppliers’ products, a few reports have suggested otherwise. The result is that companies bear all the liabilities and burdens of testing out new products, only to have the best ones essentially stolen. Companies can choose to avoid selling on platforms that may compete with them. However, generative AI will make it increasingly easy to mine the public data needed to distill competitors’ successes, generate knockoffs, and then scale up on a more efficient distribution infrastructure. This is similar to Sherlocking, but for physical things and the business models around them.


Style

When you go to ChatGPT, you can ask for a story in the style of Shakespeare, Mark Twain, or anyone else. The same goes for pictures in the style of Dalí, Rembrandt, or others. While the stories and images these people created are copyrightable, their style is not. But the only way to capture the style is to distill the essence of the copyrighted originals using AI. This is an essential aspect of several industry copyright lawsuits brought by artists and Getty Images against generative AI companies.


Plot

Beatrix Potter is currently being accused of cultural appropriation for having reframed folk stories told by enslaved people into some of her most popular tales, such as Peter Rabbit, written in 1901. A big component of the current writers’ strike in the US is the concern that AI could steal writers’ best plots and repurpose them for the top media brands without credit or royalties. What happens when generative AI can decompose the most popular plots and generate new versions, more resonant with different cultures, at scale?


Logic

After someone has been in an industry for a while, they start to pick up insight into the best ways of doing things. One concern is that the logic in apps posted to GitHub and other code repositories is not copyrightable, even if the actual code is. This is the gist of a class action lawsuit against OpenAI, GitHub, and Microsoft alleging “software piracy on an unprecedented scale.” Code is just the first frontier of easily mineable logic. What happens when decompositional AI can watch how we work to decipher what we look for and how it affects our decisions?


Ingredients

Rumor has it that the Coca-Cola recipe is locked in a safe and seen by only a few living individuals. What happens when generative AI gets better at decomposing the recipes of things? Meta has already developed a tool to reverse-engineer ingredients from photos of food. Imagine what such tools could reverse engineer once food scientists start training them on chemical spectrograms to copy Coke. Ditto for the billions of other products that no one expected to be reverse engineered with tools that did not exist when they were released.

My take

So, generative AI is coming for your assets that never looked like assets. Or maybe your generative AI is coming for someone else’s assets, and it is all perfectly legal – sort of, for now. I remember the early days of the Internet, when you could register a domain name for only $7 a year! Eventually, registering someone else’s name to make a packet came to be called cybersquatting and was outlawed.

It’s also important to note that this phenomenon – whether we call it a heist or an opportunity – speaks to the growing call for data dignity. How do we bring rights, royalties, and recognition for the humans creating the value that AI mines into this conversation?

It’s very clearly a Wild West as people look to fence off the value of things in light of the rapid innovation in large language models. We need to think about a practical and just way to fence this new landscape before we quietly accept that all the forms of our unrecognized assets have slipped behind new paywalls.
