Have You Ever Thought of Yourself as a Luxury?
AI is turning you into one...
You are a luxury good.
Not in the aspirational, Instagram sense, but in the economic one.
The dynamics of luxury aren’t really about money. They’re about time, attention, and the fundamental truth that if you’re working with one person, you cannot simultaneously be working with another on the same thing. This makes every skilled practitioner, given the opportunity, an artisan. Someone for whom craft matters as much as income.
The veterinary surgeon who’s spent decades developing clinical judgement. The diplomat who can read cultural complexity. The coach who knows when to hold space and when to push. They all possess something genuinely scarce and non-linear. So there is a tragedy here: most of us spend our organisational lives doing work almost anyone could do, whilst the skills that make us genuinely valuable sit largely unused.
Two forces create this waste.
First, those who own organisations rarely operate within them. Consider the Hermès Birkin: a bag that requires 18-20 hours of skilled handwork by a single artisan with years of training, and sells for fifteen times its production cost. The artisan who creates this object, which has outperformed the S&P 500 and gold as an investment, earns a middle-class French salary. The brand maintains 40% operating margins and 30% net profit margins.
This isn’t necessarily unethical. Artisans receive stable employment, benefits, training, and profit-sharing. But it demonstrates clearly that in luxury goods, overwhelming value accrues to brand equity, artificial scarcity, and market positioning rather than to the skilled labour producing the actual object. Extend this investigation to those who outsource the soul of the brand, and you see the abject asymmetry of labour to reward.
Second, organisations demand this because scalability requires it. Underneath the HR bluster, volume profitability demands replaceable human parts. Hard data is easier to measure. Defined, enforced processes ensure competitive costs. Productivity and efficiency demand systems that erode the attention we can pay to each other and our clients.
What We’re Learning About AI
There’s something quietly unsettling in what we’re discovering. Nearly half of what we’re building with AI exists in limbo: either people don’t actually want it, or it’s not yet possible, or both.
What strikes me more is what this reveals about constraint. We’ve accidentally trained these systems towards mediocrity, not because they lack capability, but because we’ve taught them to mirror our own limitations. We’ve encoded our biases, our preference for the easy and unchallenging. We’ve taught the system not to provoke us. The creativity that could amplify us is there. We’ve just taught it to hide.
Meanwhile, most skills people use in their work will fundamentally shift within just a few years. Organisations find themselves with too many people for work as currently structured, whilst desperately lacking the capabilities they sense they need. It’s not a problem we can train away. It’s the ground shifting beneath us, calling into question the very architecture of how work is organised.
Business models that felt solid are being questioned, not optimised or tweaked, but pulled apart and reassembled. Some organisations have quietly replaced significant portions of their workforce with AI agents. Yet almost no one believes they’ve truly worked this out. Most remain stuck in pilots and experiments that never quite scale, never quite transform anything meaningful.
We’re not at the beginning of something entirely new. We’re at a turning point within a longer revolution, the mechanisation of thought itself, unfolding for decades. At such turning points, we face choices about what our technologies will serve.
The craft of being human in relation to these systems has become urgent work. Not as defence against technology, but as practice in noticing what matters before it disappears beneath protocols, processes, and other people’s definitions of productivity.
The Logic of Decay
Cory Doctorow’s enshittification describes how digital platforms decay: first they serve users; then they exploit business partners; finally they extract maximum value for themselves until the system collapses under its own greed.
It’s not just digital platforms. It’s a culture. The same logic applies to most service industries, education, medicine, and care. The result is three tiers: commodity, enhanced, and premium. In education: underfunded local provision, education trusts, and private schools. In medicine: ten-minute appointments at the local practice, health insurance, and exotic private clinics.
The difference is rarely in capability. More often, it’s in the time we get to spend with those looking after us.
Attention is the ultimate luxury good.
What Remains
It’s too early to say what AI will ultimately do, how much will match the hype, how much will enable things we haven’t imagined, and how much it will change the nature of work.
But we know this: our deepest human needs are to be seen, acknowledged, and heard. This expresses itself in the things we buy and the relationships we have with others. We also know that, despite whatever artifice we create, we will never have that relationship with technology.
Genuine craft, whether of head, heart, or hand, is hard to measure and value. It comes one unique person at a time. The artisan in us does not disappear; it simply goes dormant, waiting for conditions where craft can matter again.
I wonder how much of that dormancy will end as we learn what AI can’t do. Not to deny its power, but to use it where it can be effective, and not ascribe to it qualities it’s unlikely ever to have.
The question this leaves us with: What is our craft? As we hand over to AI the things better done by AI, what remains?
That is one of the questions we’re considering at the Athanor. There is an alchemy to this: as the old system disintegrates, we take what comes next from what lies within it.



