Competing in a world of AI abundance: how to win when code has no value
Economics begins with scarcity. Land, labor, and capital have always been limited, and societies have been organized around how best to allocate them. The theory of abundance argues that artificial intelligence is eroding scarcity in labor and in the input that animates it: intelligence. Tasks that once demanded months of work by skilled professionals — writing software, analyzing data, drafting contracts, designing campaigns — can now be done in minutes, replicated infinitely, and adapted at no meaningful cost.
The effects of this shift ripple far beyond software. As AI reduces the marginal cost of intelligence-based work, it threatens to automate away millions of jobs, displacing workers in sectors from customer service to law. The abundance of digital labor also creates pressure on wages, polarizing labor markets between those who supervise, govern, and direct AI systems and those who are displaced by them. Society will need to confront the consequences: new definitions of meaningful work, new safety nets, and new forms of inequality. At the same time, AI-driven robotics and scientific discovery could push abundance into the physical world — lowering the cost of housing, medicine, and energy, and shifting scarcity into areas like trust, distribution, and governance.
Each of these transformations deserves deep analysis. But to make the discussion concrete, this article will focus on software, the first domain where abundance is already being felt. By examining how AI reduces the marginal cost of software to nearly zero, we can see more clearly how industries are unmade and how new winners emerge. Future essays will return to labor markets, society, and the broader economy.
Software has always been unusual in its economics. Copying it has been nearly free, but creating and maintaining it has been expensive. The SaaS model rested on this asymmetry: companies absorbed high fixed costs to design and maintain software, then scaled to millions of users at high margins. AI destabilizes this model. By automating coding, debugging, and maintenance, it drives the fixed cost of creation toward zero. With both creation and distribution essentially free, the foundations of the industry shift.
That shift raises the central question: in a world where software itself is abundant, how does a software company compete, and how can it win? The answer lies in recognizing where scarcity has reappeared.
One arena is what might be called the outcome economy. If anyone can build a tool, the tool itself no longer commands a price. What is scarce is the ability to credibly guarantee results. Many vendors will claim their product improves conversions or resolves support tickets faster. Very few will be able to measure those outcomes rigorously, integrate into workflows to prove them, and assume risk if they fail. The scarce good is not outcomes in the abstract, but trustworthy accountability: verifiable, contractually guaranteed results.
Another arena is data. Code may be free, but proprietary datasets remain deeply scarce. They make outcomes measurable, they enable personalization, they fuel trust, and they drive speed. More importantly, they compound: usage generates data, which improves performance, which attracts more users, which generates more data. Companies already recognize this dynamic. Salesforce’s restriction of AI access to historical Slack data is a vivid example: strategically, it locks in a moat. Only Salesforce can use Slack’s conversational history to power its models, turning that dataset into both a competitive fortress and a guarantee of trust to its customers.
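The compounding dynamic described above (usage generates data, data improves the product, a better product attracts more users) can be sketched as a toy simulation. Every parameter below, from the growth rates to the square-root quality curve, is an illustrative assumption chosen to show the shape of the loop, not an empirical estimate:

```python
# Toy model of the data flywheel: usage generates data, data improves
# product quality, and quality attracts more users. All parameter values
# are illustrative assumptions.

def simulate_flywheel(steps: int,
                      users: float = 1_000.0,
                      data: float = 0.0,
                      base_growth: float = 0.01,
                      data_per_user: float = 5.0,
                      quality_elasticity: float = 0.05):
    """Return a per-step history of (users, data, quality)."""
    history = []
    for _ in range(steps):
        data += users * data_per_user   # usage generates data
        quality = data ** 0.5           # diminishing returns on raw data
        # quality lifts the growth rate, saturating as quality gets large
        growth = base_growth + quality_elasticity * quality / (quality + 10_000)
        users *= 1 + growth             # better product attracts more users
        history.append((users, data, quality))
    return history

history = simulate_flywheel(steps=40)
```

The point of the sketch is that the growth *rate* itself rises over time: each step's user growth feeds the data stock, which raises quality, which raises the next step's growth. A late entrant with identical code but no accumulated data starts the loop from zero, which is why the dataset, not the software, is the moat.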
Closely related to data are context and attention. Not all data is equal; raw data is abundant, but context-rich data that maps to user preferences and workflows is scarce. It is what makes personalization meaningful, allowing software to become uniquely indispensable to each user. Attention is scarce for the same reason. With infinite personalized tools vying for mindshare, the battle shifts to who can maintain durable engagement and weave themselves into daily routines.
A fourth arena is the explosion of niche applications. When software creation is practically costless, problems once considered too small to matter suddenly get solved. Millions of micro-apps and AI agents will emerge — some serving whole industries, some only one company, and some even a single individual. But this proliferation creates fragmentation, and fragmentation shifts scarcity to those who can provide trust, integration, and distribution. This is where Ben Thompson’s aggregation theory becomes crucial. Thompson showed how internet-era power shifted from producers to aggregators — platforms like Google, Facebook, and Apple that organize abundance and control demand. The same logic applies here: the winners in the AI era will be those who aggregate countless micro-apps into cohesive ecosystems. They will orchestrate secure environments with identity, billing, compliance, and interoperability baked in, becoming the trusted gateway through which abundance flows.
Finally, in a world of infinite competitors, speed itself becomes scarce. When anyone can generate an app overnight, the half-life of innovation shrinks dramatically. The scarce factor is the organizational ability to learn, adapt, and deploy faster than the rest of the market. It is not just speed of shipping that matters — code is abundant — but speed of learning, iteration, and trust-building. Companies that close feedback loops faster — translating usage data into improvements and then into stronger customer relationships — will pull ahead in ways others cannot replicate.
This is how scarcity reappears when software becomes free: it migrates into domains where abundance cannot follow. And it is in these domains that companies must compete.
In a world of zero-cost software, five scarcities define the new competitive order. Not all of them are equal, and their relative importance determines the depth of the moat they provide:
Data sits at the foundation. It is the hardest to replicate, compounds over time, and fuels everything else.
Trust and Integration come next. Platforms that aggregate abundance into secure, reliable ecosystems become irreplaceable gateways.
Accountability follows, powerful because customers will pay for outcomes, but dependent on the data and trust that make outcomes measurable and credible.
Speed is critical in the short run, enabling firms to outpace infinite competitors, but it erodes quickly unless anchored by stronger foundations.
Context and Attention are valuable for personalization and stickiness, but the most fragile, as attention is fickle and context loses strength without the anchor of proprietary data.
Software companies will not win by producing code — anyone can do that. They will win by mastering these scarcities in order of their strategic depth. Those who secure data and trust will build enduring advantages; those who rely only on speed or attention will find their lead fleeting. In the age of abundance, the hierarchy of scarcity will decide the hierarchy of winners.
Suggested Reading
Sam Altman, “Moore’s Law for Everything” (2021) — outlines how AI and robotics could make most goods radically cheaper, driving a future of abundance (moores.samaltman.com).
OpenAI blog: “Planning for AGI and beyond” (2023) — explores abundance, inequality, and safety in a world of artificial general intelligence (openai.com).
Herbert Simon, “Designing Organizations for an Information-Rich World” (1971) — classic essay introducing the idea that in abundance, attention becomes scarce.
Ben Thompson, Aggregation Theory (Stratechery, 2015–present) — the definitive modern analysis of how aggregators organize abundance and capture demand.
Nick Srnicek, Platform Capitalism — a critical take on how platforms build power through data and ecosystems.