May 2026 · Essay

The Great Leverage Inversion

Post-Scarcity Economics and Software Development

In our era of eras, the "Era of AI" is distinct for its prevailing mood. A sheer vibe hangs over us all, like Schrödinger's Recession: this is either a massive "AI bubble" or the beginning of the "take-off" phase of the singularity, depending on your outlook on life and whether you like long walks on the beach or prefer hikes in the woods. You notice it in the hallway chatter and overheard cafeteria banter, and in the subtle and not-so-subtle jockeying to be the company AI guru. Silicon Valley, the harbinger of capitalism's cataclysmic glow-up, is paradoxically driving both the perceived outcome and the subsequent hand-wringing, as captured in the excellent dissection in Jasmine Sun's recent opinion piece in the Times. Her essay profiles individuals who believe their own work is establishing a permanent underclass, an inevitable conclusion forced by the prevailing market economics of the day.

One choice quote — one part hyperbole, two parts "shit, they've got a point":

"There's only a matter of time before GPT-7 comes out and eats all software and you can no longer build a software company. Or the best version of Tesla Optimus comes out," and can perform all physical labor as well. In that world, this year is a human's "last chance to be a part of the innovation."

This vibe shift, as captured in Sun's essay, isn't a reactionary response or a misplaced emotion, but a natural reflex to a phenomenon I'm calling a leverage inversion. Knowledge workers — and software developers in particular — are uniquely positioned in the labor market due to the nature of the work, and as a result are losing access to the levers workers have historically used to apply a countervailing pressure against the gravity at the center of the economy: capital, and the leverage it holds over managing scarcity.

Underpinning this inversion is the biggest and most ambitious capital-intensive project in human history. We're living through the industrialization of cognitive labor. Whether or not these AI models have even the faintest glimmer of self-awareness, we know these computers can actually think. Think — not 100% like a human brain — but... close enough. It's close enough that it can predictably produce real "white collar" work. And so, we've managed to bottle that up on all these computer chips, and we put all these computer chips together, and we connect them all up in huge, massive, power-hungry warehouses, and we put them to work. It's a literal "thinking factory." You insert even the most meager idea or question into it, and the "goblins" at the factory shovel tokens into the forge, smelting together "context" and the ghosts of humanity's summed thinking, threading them back together into much bigger thoughts, occasionally some new ideas, and other times, answers to really hard questions. "Small inputs, big outputs."

Some of those questions can be really expensive to get an answer to. They might be something like "What does an app that can solve this one really specific problem for me look like?" Answering that question might cost thousands of dollars, in part because the answer involves asking and answering a bunch of smaller questions. We haven't yet reached the point where we can one-shot building a real enterprise-grade app, so you need to break it up into a bunch of smaller tasks and loop over the whole thing over and over again with what's being called an "Agentic Software Development Life Cycle."
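
To make that loop a little more concrete, here's a minimal sketch of the shape of an agentic loop, in Python. Everything in it is illustrative: the ask_agent and verify stubs stand in for whatever coding agent and test harness you'd actually wire up, not any particular tool's API.

  # A hypothetical "Agentic SDLC" loop: break a big ask into tasks, let an agent
  # attempt each one, verify the result, and retry until it passes or we give up.
  from dataclasses import dataclass

  @dataclass
  class Task:
      description: str
      attempts: int = 0
      done: bool = False

  def ask_agent(prompt: str) -> str:
      """Stand-in for a real agent call (Claude Code, an API, etc.)."""
      return f"<patch for: {prompt}>"

  def verify(output: str) -> bool:
      """Stand-in for the feedback loop: run tests, linters, a review pass."""
      return bool(output)  # pretend the checks passed

  def agentic_sdlc(big_idea: str, subtasks: list[str], max_attempts: int = 3) -> list[Task]:
      tasks = [Task(description=d) for d in subtasks]
      for task in tasks:
          while not task.done and task.attempts < max_attempts:
              task.attempts += 1
              result = ask_agent(f"{big_idea}\n\nCurrent task: {task.description}")
              task.done = verify(result)
      return tasks

  if __name__ == "__main__":
      for t in agentic_sdlc("Build a markdown editor",
                            ["Set up the project", "Editor pane", "Preview pane"]):
          print(t.description, "done" if t.done else "gave up", f"({t.attempts} attempt(s))")

The real loops are messier (context management, sub-agents, human review gates), but the shape is the same: decompose, attempt, verify, repeat.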

I recently built a markdown editor, one part fun side project, one part experiment in pushing "Agentic SDLCs" to their limits. I'd say I succeeded... (I'm writing this in that app right now)! I asked Claude how much the raw token usage that went toward building the app would cost at today's API rates. Using session metadata, plus estimates for the missing late-December/early-January data gap based on my typical coding hours, this was the response:

Extrapolated to full Dec–May:

  • Standard API: ~$6,500
  • Enterprise: ~$26,000

Vs what you actually paid: ~$500 (5 mo × $100 Max 5x).

  • Subsidy at standard API: ~13×
  • Subsidy at 4× enterprise: ~50×

Thanks to the subsidies, I spent twenty-six grand worth of Anthropic's compute! Think about that for a second: I was able to extract $26,000 of labor for $500. But I'm sure you can see the flaw in that sentence. That's $26k in token extraction, not labor extraction. I'm exploiting Anthropic's largesse here, not some poor humans. The whole project took advantage of that "factory of goblins," and I effectively had a personal software agency, with designers, QA, senior programmers, principal engineers, project managers — the whole nine yards.
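
For anyone who wants to check the napkin, the subsidy figures above are just division over Claude's estimates (rough estimates, not invoices):

  # Back-of-napkin subsidy math using the estimates quoted above.
  paid = 5 * 100                  # five months of the $100/mo Max plan -> $500
  standard_api = 6_500            # extrapolated standard-API cost estimate
  enterprise = standard_api * 4   # the "4x enterprise" rate -> ~$26,000

  print(f"Subsidy at standard API: ~{standard_api / paid:.0f}x")  # ~13x
  print(f"Subsidy at enterprise:   ~{enterprise / paid:.0f}x")    # ~52x, call it ~50x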

I previously worked at a boutique creative agency, and owned my own once upon a time, so I feel well positioned to back-of-napkin the math on what it would've actually cost to hire humans (you have to feed them so much!) to do this for me. Assuming a project manager, art director, design lead, engineering lead, two developers, and one QA — hired directly — this would have easily cost $1-1.5 million and taken 12 months. Assuming the overhead of running an actual agency, you're looking at upwards of $2 million.

Now the math starts to sink in. Let's say I mathed totally wrong, and we're extracting even just a fraction of that. Extracting $500k worth of labor for $500, in half the time it would've otherwise taken, is a wild proposition.

The fundamental physics underpinning capitalist economies is being upended, but this isn't the first time we've been here. Ms. Sun writes, "[a]t the same time as A.I. erodes ordinary workers' leverage, it may concentrate power and wealth in large companies and the U.S. government." What's that AI trope, "we've seen this movie before"?

The industrial revolution accelerated the production of material goods to a scale never seen before, automating work that had previously required manual physical labor. We went from artisans hand-crafting things to factories pumping out whatever we wanted.

Everything that happened during the industrial revolution cost a great deal of money (what we'd today call "seed" or "series" funding), and we've been locked in this hyper-growth cycle ever since, born out of the leverage industry provides. "Small inputs, big outputs." Industrial factories have a high one-time start-up cost to get going and relatively nominal carrying costs. Once Henry Ford built his factory, he was cranking out cars all day long.

The start-up costs of the capital-intensive projects of late-19th/early-20th-century industrialization aren't dissimilar from those of producing "frontier" AI models. Billions of dollars are pouring into the production of these cutting-edge "thinking factories," and billions more are required to run them. Which brings us to the interesting economic dynamics of this moment.

Capital is extractive by its very nature. (This is what I'd call a "flat fact.") Whether it's pulling rare minerals from the earth, hard work from the sweat of a brow, ideas from a knowledge worker's head, creativity from the artist, or attention from eyeballs, the entire project is predicated on the idea that money is exchanged as the token for that value. Surplus and scarcity intermingle and dance, in a delicate play we call "the economy."

In what feels like a "subtweet" rebuttal aimed at Jasmine Sun, The New York Times's Ezra Klein posits a positive counter to the "AI doomers," and thankfully, his essay mostly rests on and pre-chews "What will be scarce?" by Alex Imas. Imas's argument is sharp, well-reasoned, and well-sourced, but there's also a critical ingredient missing from this hot-take sandwich.

Imas names the central phenomenon, much of which we've covered already, but draws a different conclusion. He pulls from historical data covering the industrialization era, alongside modern examples of industrialization in Taiwan. Prior to industrialization, labor predominantly goes toward agriculture. With the emergence of large-scale automation and factory farming, 40% of the farming labor force shifts toward factories and offices. Economists call this "structural change." (I know, what a technical term.) It has a clear and repeatable pattern: the automated sector's slice of the economy shrinks even as it produces greater abundance, and spending and employment then shift toward demand for more and better things in different, frequently emergent sectors of the economy.

Imas flags many of the possible dark scenarios, including my favorite: a world of "Intolerable Abundance." This is a truly post-scarcity economy, one which undermines our ability to use labor as the chief mediator of the social contract. Absent the ability to exchange labor for value — the argument goes — social organization, income distribution, and democracy itself lose the gravity that keeps them in place. Poor little rich world.

Mostly, however, Imas envisions a world that transitions to a "relational economy" undergirded by "mimetic desire," both of which deserve a thorough unpacking. This is a "post-commodity" economy, where the cost of commodity production drops to cheap or near zero, and surplus income is redirected to existing or emergent relational sectors: education, childcare, healthcare, therapy, craftsmanship, and community — and eventually new, heretofore unimagined industries such as "experience designers, human-AI collaboration artists, and provenance certifiers" (whatever those are...). Ultimately, this resembles the phenomenon economists identified in the structural change that came with agricultural automation.

Mimetic desire, Imas believes, is the piece of human nature that will drive us there. In short, people actually gravitate toward scarcity and novelty when buying things. Mimetic desire sits somewhere at the intersection of "coveting stuff" and exclusivity. It's the reason NFTs had their moment in the sun, and the reason people will pay a premium for designer goods, even if they're just partly hand-assembled junk from overseas factories. This is the impulse that would drive individuals to take surplus income, now freed by AI automation, and funnel it into exclusive relational experiences. One thing I personally find horrifying: it's those exact sectors Imas highlights that have a strong history of suffering a perversion of incentives when profit becomes the dominant motive. American privatized healthcare and higher education need no explanation.

But I digress; this is an essay about software development specifically, and software is such a heavy component of the economy that I worry Imas's and Klein's sunnier viewpoints collapse under its weight. Software is structurally different from every sector Imas considers, and both he and Klein leapfrog that fact entirely. Commodity production is fundamentally physical in nature; software is a purely digital endeavor. While servers and compute bear some of the cost, ultimately those costs are nominal, and thanks to Moore's Law, they shrink and shrink. The scarcity that drives software production, as I outlined earlier, is labor scarcity: the scarcity of the specialization contained in the knowledge worker. All of Imas's models are based on physical resource transitions.

Furthermore, software has eaten the world, as the saying goes. The relational sector ceases to be a refuge when all of our relations are mediated through an increasingly digital world. Every single sector Imas highlights is now heavily represented in the digital space. Relying on the relational sector here places a high premium on the last mile of human interaction, and these digital platforms are inherently alienating. Coffee shop conversations happen on Reddit, but you're likely talking to an AI bot thanks to the "Dead Internet." Talkspace makes therapy feel like you're talking to someone at a bus stop, and telehealth visits increasingly become the predominant mode of interaction with doctors, to say nothing of the fact that people like Klein are first hitting ChatGPT to look up their symptoms. The craftsmanship is found on Etsy, but half of what you're sorting through there is "artisanal" Alibaba junk as well. eBooks replace browsing for books at the bookshop and chatting up your local librarian. The human element here isn't gone, but I think both Imas and Klein over-index on the possibility that Artificial Intelligence will somehow lead to a renaissance in the civic space.

The mimetic argument also begins to fall apart when you look more closely at how AI changes the fundamental way software is not just produced, but customized. I truly believe that we'll see the phenomenon Imas identifies in the realm of software. The exclusivity of having an app tailor-made for your precise needs is the exact impulse I believe will launch a million boutique and niche software projects, just like my markdown app. But what happens when the mythical GPT-7 arrives, and that impulse is just one prompt away from fully realized software?

This will then be compounded and accelerated to an incredible degree as open-source models catch up with the frontier models. It's still unclear what the cadence for closing the gap will look like, but it's more a "when" than an "if." Even if they lag by 1-2 years, the open-source models of the future will generate output that surpasses today's frontier models, and the barrier to entry is likely to be a $500 Mac Mini.
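
As a taste of how low that barrier already is, here's a minimal sketch of prompting an open-weight model served locally. It assumes you're running something like Ollama (which exposes a local HTTP API on port 11434 by default) and have already pulled a model; the model name below is just a placeholder.

  # Minimal sketch: prompt a locally hosted open-weight model via Ollama's
  # default local HTTP API. Assumes Ollama is running and a model has been pulled.
  import json
  import urllib.request

  def ask_local_model(prompt: str, model: str = "llama3") -> str:
      payload = json.dumps({
          "model": model,      # placeholder; swap in whatever model you've pulled
          "prompt": prompt,
          "stream": False,     # ask for one JSON response instead of a token stream
      }).encode("utf-8")
      req = urllib.request.Request(
          "http://localhost:11434/api/generate",
          data=payload,
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          return json.loads(resp.read())["response"]

  if __name__ == "__main__":
      print(ask_local_model("Sketch the file structure for a tiny markdown editor."))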

When all of this is combined, Imas's framework starts to point to more starkly dystopian outcomes, even if just taken on its own terms. A widening wealth gap and a displaced labor force collide, creating a world where the lifestyles of the secure upper/middle class may resemble those of today's billionaires — a parallel Imas himself draws.

Where does this leave the disenfranchised software developer?

Software developers have held some of the highest-leverage professions, possibly in history. I don't think developers were ever quite "artisans" on the level of the tailor or woodworker displaced by the industrial revolution (my apologies to those whose titles ever contained "rockstar," "ninja," or "guru"). But their work does represent a truly mind-boggling depth and breadth of specialization, while requiring hard-to-quantify traits like "taste." The mythical "unicorn" of the software industry took many forms, and they commanded wild six- and even seven-figure sums for the privilege of tapping their insights, covering topics ranging from architecture to how to coerce the CSS box model into cooperating.

Setting aside the "artisanal" nature of the work, it's hard to argue that the profession doesn't now feel less like slowly carving something out of stone and more like laying bricks. Laying bricks is still incredibly valuable work, but this is where the track record of American industry starts to look demonstrably and unequivocally spotty. When the output of labor is commoditized and starts to look like the movement of fungible building blocks, the power dynamics start tilting away from workers. Goodbye nap pods and free snacks, hello in-house surveillance state.

The United States went through a long and even bloody period transitioning from the Gilded Age into the post-WW2 era of prosperity. What many call the "Golden Age" of this country — where kids went to college to pursue dreams and then work in a randomly unrelated field, where there were two cars in every garage and color televisions and TV dinners — was built atop an era of turmoil and tumult. May 1st, a workers' holiday the U.S. refuses to acknowledge (but the literal rest of the world does), was a celebration of the Haymarket anarchists who were unjustly framed, and subsequently murdered, all in the pursuit of this radical idea known as the "8-hour workday." The period from 1880 to 1940 is littered with stories of truth stranger than fiction: privatized police forces (see: Pinkerton) attacking protestors, the actual National Guard being called in against striking workers. Henry Ford's Sociological Department inspected workers' home lives as a condition of his supposedly progressive $5 workday, and workers deemed "morally unfit" were denied bonuses and given six months to shape up.

All the basic worker rights you likely take for granted were hard-won, never granted.

Ezra Klein may have a point about the destination. The world has a tendency not to get worse, in the sense that "progress" affords us an increasing quality of life over time. The abundance of goods, leisure, and health for even the poorest among us would seem like the stuff of kings to a feudal peasant. (This isn't to diminish the very real struggles marginalized groups in our society face; there's still much work to be done.)

Artificial Intelligence carries with it a truly beautiful potential, the kind that's driving the aforementioned billions in investment and has billionaires waxing poetic. I believe we stand on the threshold of what I would call "undeniable post-scarcity." Scarcity, as noted by Imas, is at the center of all economics. Some would argue we've been primed for post-scarcity economies since the 1970s, but for the first time we have concepts like "Universal Basic Income" entering the window of acceptable public discourse. GPT-7 and Tesla's Optimus might be the things that put you out of work, not because you don't have a job, but because nobody has a job anymore — not in the traditional sense. But what the billionaire giveth, the billionaire taketh away; Sam Altman says he no longer believes UBI is necessary, and instead offers us vague platitudes about "collective alignment," whatever that means.

Focusing too much on the destination ignores the potentially fraught path to it, along with the genuinely unique circumstances of our time. In the wake of the industrial revolution, workers used leverage as the countervailing force. The general strike, the kind that brought in the National Guard to bust some heads together, was one of the levers pulled to reassert the balance between the monied classes and the workers atop whom the whole of society rested. This time, however, leverage is the exact thing being engineered away by Artificial Intelligence.

The story of the Software Developer is the perfect illustration of this inversion, which I believe is coming for all knowledge workers. The claim that the Silicon Valley elite's "doom and gloom" is hyperbole tends to center on the idea that there's something unique about software development: that it's easy to automate entirely away because of something inherent to the development of software that other forms of knowledge work don't share. As someone on the frontlines of these Agentic SDLCs, I can assure you the same approaches can — and will — be applied to every other field in due time. Claude Code — which gains most of its impressive capabilities from its Command Line Interface and the "harness" that powers it — has a general-purpose, non-coder version in the form of Claude Cowork. Each week, Anthropic releases a new set of capabilities aimed at a particular industry: finance, healthcare, law, graphic design...

The question demands asking: if what we're really seeing is a Great Leverage Inversion, a potentially unprecedented phenomenon, what does humanity do to ensure we come out on the other end of the "new industrial revolution" looking more like Star Trek and less like Star Wars? If the "vibe" seems like it's been off lately, maybe it's because no one's found an answer to that question, or, worse, we're stuck pretending the question either doesn't exist or doesn't need asking.

— End