A pattern is emerging. From software to workplaces, the same three-stage decay repeats: first they serve you, then they serve through you, and finally they serve themselves. Understanding this pattern, and our response to it, may be the defining challenge of our time.
An Emerging Pattern
MIT's latest research on AI adoption reveals a stark reality: despite $30-40 billion in enterprise investments, 95% of generative AI pilot projects deliver no measurable profit-and-loss impact. The reason isn't technical limitations—it's human ones.
It reminds me of the panic of buying toys at Christmas. In our excitement to see the joy when they are unwrapped, we ignore the small print that says “Batteries not included”.
We are the batteries.
This idea stopped me cold, not because it's surprising, but because it captures something I've been seeing everywhere. Cory Doctorow calls it "enshittification", the three-stage pattern where platforms are first good to their users, then abuse users to make things better for business customers, and finally abuse those business customers to claw back all the value for themselves.
It's not just digital platforms. I found myself with too many overlapping apps on both my Mac and phone. Each has a unique feature, but over time that feature has become buried under extensions and “upgrades”. They pile on feature after feature, weighing down something initially useful with the digital equivalent of fat until it becomes obese. It started with Evernote, which I have used for years, gradually accommodating its ever-increasing complexity until, suddenly, I no longer wanted to. The trigger was a renewal notice, and as I moved to cancel, their immediate reaction was a huge discount if I stayed. A discount triggered not by loyalty, but by desertion. And it’s not just Evernote: in the same week, I’ve seen the same response from insurance companies, and even The Economist.
The Great Extraction
This isn't accidental. Over fifty years, we've systematically shifted from creation-based to extraction-based models. Globalisation lets us extract value from cheap labour. Financial deregulation lets us extract value from debt and speculation. Now technology promises to extract value from human attention, creativity, and judgment.
The common thread: leadership as optimisation rather than creation. Find inefficiencies, squeeze harder, move on. Vision and values became marketing copy, while the real work shifted to what I call "mechanically-recovered-meat strategies"—reworking existing value into new forms rather than creating anything genuinely new.
When AI Needs Human Craft
Here's where it gets interesting. The MIT study found that while only 40% of companies had invested in formal AI subscriptions, over 90% of employees were already using tools like ChatGPT to do their work: a "shadow AI economy" in which individuals found value that their organisations couldn't.
Why? Because AI, despite the breathless headlines, isn't magic. It's a tool that amplifies human judgment, creativity, and craft. The organisations failing with AI are treating it like another extraction opportunity—a way to replace people rather than enable them.
Arthur C. Clarke warned us: "Any sufficiently advanced technology is indistinguishable from magic." However, Clarke also said that the only way to discover the limits of the possible is to venture beyond them. The limits of AI aren't technical—they're human. They're about craft, judgment, and the irreplaceable ability to recognise what matters.
The Craft Response
So what do we do when everything around us becomes extractive?
Choose your domain carefully. Focus on creation over extraction. Make something that matters, that contributes, that you'd be proud to explain to your grandchildren.
Set boundaries consciously. Stop saying yes by default to systems designed to extract from you. Question renewal notices, subscription traps, and "efficiency" measures that really mean doing more for less.
Cultivate your craft. In a world of algorithmic optimisation, human judgment becomes more valuable, not less. The organisations succeeding with AI aren't automating creativity—they're amplifying it.
Find your people. The shadow AI economy reveals an important truth: individuals working with effective tools and a clear purpose consistently outperform organisations trying to extract efficiency from flawed systems.
Design for emergence, not extraction. Ask not "how can I squeeze more from this?" but "what wants to emerge here that I can serve?"
The Antidote
We're reminded that "the forty-hour week is there for a reason; it gets the best work from people." The role of AI isn't to extract more hours from human attention; it's to give us space to create something worthwhile with the time it gives back.

That space, between what AI can optimise and what humans can create, is where the future lives. Not in the extraction of more efficiency from broken systems, but in the craft of building better ones.
The question isn't whether we can avoid enshittification. It's whether we can choose to create instead of extract, to serve something larger rather than optimise something smaller.
Ultimately, that's a decision only humans can make. No algorithm required.
** Cory Doctorow launched his Kickstarter for the book he is writing on enshittification yesterday. I’m supporting it. Please consider it too; we need voices like his.