This is a future retrospective, written as if from 2029 looking back. It builds on the ideas in The Rehydration Flip. All events, outcomes, and corporate misfortunes depicted herein are entirely fictional. Any resemblance to actual quarterly earnings is purely coincidental. The views expressed here are solely those of the author and do not represent the views or positions of any institution the author is affiliated with.
June 17, 2029
I spent this morning approving a security patch that "shipped" to 9,142 codebases.
Not nine thousand servers.
Nine thousand implementations.
Same product. Same company. Same quarter. Same vulnerability.
Nine thousand variations because, somewhere between 2026 and now, we collectively decided that "one codebase to serve everyone" was a luxury we could not afford.
And then we discovered the hidden cost of that choice:
You can mint software like coins.
But you still have to update it like a civilization.
When implementation became abundant, maintenance became the scarce resource.
We won speed. Then we discovered speed has a winter.
A grounding note, because I am old enough to have déjà vu and this still felt new
I grew up in the 90s.
I remember when "knowing computers" meant you could explain what the machine was doing, top to bottom, without leaving the room. I watched assembly give way to compilers, watched compilers give way to frameworks, watched frameworks give way to "just call the API." Then the web became platforms. The phone became the platform. Cloud became gravity. SaaS became default. Social networks taught everyone the same cruel lesson: distribution is not a feature. It is the business.
So yes, I have seen abstraction shifts before. Plenty of them.
But living through the past few years did not feel like "the next abstraction."
It felt like the first time the thing that changed was not how we write code.
It was what code is.
Because in every previous era (assembly, C, Java, cloud, containers), code stayed sacred. Code was still the source of truth. We wrapped it, layered it, deployed it differently, but in the end: the code was The Thing.
Then intent came along and stole that role.
And once intent became the thing, everything downstream had to reorganize: teams, pricing, compliance, jobs, hardware... even what "maintenance" meant.
That is why it felt new.
An obscure essay in 2026 tried to name three concepts for what was happening — a law, a trilemma, and a theorem. Most people ignored it. The essay got the important stuff right. But frameworks are just frameworks. What unfolds from them is never that simple.

Rehydration Law: The dominant cost shifts from producing code to rehydrating the understanding needed to change it safely.
Forkability Trilemma: Pick two: one shared upstream, high variability, low rehydration cost.
Intent-Compilation Theorem: When generating is cheaper than understanding, compile from intent instead of maintaining a universal platform.
Everything that follows is what those ideas did to us in practice — and what they did not warn us about.
But to understand how we got here, you have to understand what we lost.
The part we did not say out loud in 2026: code used to be art

In the 2000s and 2010s, there was a kind of romance to it, especially if you came up through open source, or you learned to love the shape of a clean system.
Code was... expressive.
You could read a file and feel the author in it:
- the taste in naming
- the elegance in a function boundary
- the restraint in what they chose not to build
- the little structural decisions that made the whole thing breathe
We argued about style because style mattered.
We argued about architecture because architecture lasted.
We debated "good code" the way people debate good writing.
And the economy reflected that, too.
When code is hard to produce, the people who can produce it well have leverage. When code is scarce, it is valuable. When it takes time and judgement, it behaves like craft.
That is what code was for a while: craft. Sometimes art.
Then the supply curve collapsed.
And the art got industrialized.
Not because we became soulless.
Because we were rational.
So what replaced it?
The quiet revolution: intent became the ultimate abstraction

If you want the cleanest sentence for what happened, it is this:
Intent became the ultimate abstraction, and it changed the unit of software.
Not "files."
Not "services."
Not even "APIs."
The unit became: an intent bundle.
Meaning: what you want, what you allow, how you prove it, and how you recover when you are wrong.
Specs. Tests. Policies. Permissions. Procedures. Interfaces. Evidence.
What mattered was the layering:
- Declared intent: what humans said they wanted.
- Encoded intent: what got formalized into specs and tests.
- Enforced intent: what gates actually verified at deploy time.
- Observed intent: what the system demonstrably did in production.
When those four layers agreed, everything worked. When they drifted apart, you got the kind of failures that made 2028 so painful.
But in 2026, most people only talked about the first two. The industry had not yet learned that the gap between declared and observed is where systems go to die.
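One way to picture the four layers is as a structure you can actually check for drift. A minimal sketch, assuming each layer is summarized as a fingerprint (say, a hash of the relevant artifacts); the class and field names are invented for illustration:

```python
from dataclasses import dataclass


@dataclass
class IntentLayers:
    """The four layers of intent. Each field holds a stand-in
    fingerprint (e.g. a hash of the relevant artifacts); when
    adjacent layers disagree, the system has started to drift."""

    declared: str  # what humans said they wanted
    encoded: str   # what got formalized into specs and tests
    enforced: str  # what gates actually verified at deploy time
    observed: str  # what the system demonstrably did in production

    def drift(self) -> list[str]:
        """Name the adjacent layer pairs that disagree."""
        pairs = [
            ("declared", "encoded"),
            ("encoded", "enforced"),
            ("enforced", "observed"),
        ]
        return [
            f"{a}!={b}"
            for a, b in pairs
            if getattr(self, a) != getattr(self, b)
        ]
```

An empty `drift()` list is the healthy case; a mismatch between `enforced` and `observed` is exactly the declared-versus-observed gap the industry had not yet learned to fear.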
In practice, a mature intent bundle looked like this:
```
/intent
  spec.md              # outcomes and invariants
  policy.rego          # permissions and constraints
  tests/
    golden_paths.yaml
    property_tests.py
  runbook.md           # recovery and rollback
  evidence/
    threat_model.md
    audit_receipts/
```

If a variant could not rebuild from this folder, it was an orphan waiting to happen.
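Checking whether a variant is an orphan in waiting reduces to checking whether its bundle is complete. A minimal sketch, assuming a layout like the one above; the required file names are illustrative, not a standard:

```python
from pathlib import Path

# Paths a rehydratable intent bundle is expected to carry.
# Illustrative names mirroring the layout above, not a standard.
REQUIRED = [
    "spec.md",
    "policy.rego",
    "tests",
    "runbook.md",
    "evidence/threat_model.md",
]


def missing_parts(bundle: Path) -> list[str]:
    """Return the required paths absent from an intent bundle."""
    return [p for p in REQUIRED if not (bundle / p).exists()]


def is_orphan_risk(bundle: Path) -> bool:
    """A variant whose bundle cannot rebuild it is an orphan waiting
    to happen; any missing part makes rehydration a gamble."""
    return bool(missing_parts(bundle))
```

Run against a fleet, a check like this is the difference between knowing your rebuild paths and discovering them during an incident.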
Once you have that full stack, code stops being sacred. It becomes output.
And that one conceptual move rewired everything:
- Code stopped being something you hand-crafted and started being something you compile.
- Reviews stopped being "is this elegant?" and became "does this intent + these gates actually constrain behavior?"
- Architecture stopped being "how do we scale a platform?" and became "how do we keep rehydration cheap across a fleet of specializations?"
In hindsight, you can see the whole era as a fight over one question:
Are we building software... or are we building an intent compiler that itself builds software?
The winners answered "compiler," whether they admitted it or not.
Technically, it looked like this:
In the craft era, you shipped a codebase.
In the industrial era, you shipped a production process.
Of course, people saw this coming. And they had objections. Good ones.
Lie #1: the bugs will save us
You remember the objection. Everyone does.
"Come on. LLMs write buggy code. This is gonna be a shitshow."
They were right about the bugs.
They were wrong about the conclusion.
LLMs never became reliable the way people meant back then, and the truth is they were never designed to be. They got better, sure: cleaner output, fewer obvious mistakes, more plausible reasoning. But the real shift was not that models became trustworthy.
We stopped trying to trust the model directly.
We learned to trust something else:
- gates
- boundaries
- provenance
- rollback
- and the ugly discipline of "prove it or it does not ship"
Bugs did not prevent the flip.
Bugs shaped the flip into the only form that could survive: constrain, verify, regenerate.
That was the Rehydration Law turning from a phrase into a budget.
The second lie was more practical. And more dangerous, because it was half right.
Lie #2: the tests will save us
We did add tests.
Then lint rules.
Then policy checks.
Then sandboxing.
Then signed build receipts.
Then provenance trails.
Then "you cannot deploy unless the intent bundle is complete."
We built a bureaucracy and called it "verification infrastructure," because that sounded less embarrassing.
It worked.
And once it worked, the next step became inevitable, because engineers are engineers and we optimize what hurts:
If we can generate the variant we need, and the gates can keep it safe... why are we dragging a giant general platform around at all?
That is the moment the Forkability Trilemma stopped being a triangle and became a quarterly decision.
The signs were everywhere. We just were not reading them yet.
Signals we ignored

I keep a list. I call it "Things that were obvious later."
- Oct 2026: "$0 Seat Licenses." Billing moved to audited actions. Switching costs collapsed.
- Jan 2027: "Procurement Standardizes the Build Threat." Customers could credibly threaten to regenerate. Margins cratered.
- Apr 2027: "Philippines BPO Sector Hits 40% Unemployment." Voice agents did not need to be perfect. They needed to be cheap.
- Sep 2027: "Deloitte Launches Intent Compliance Practice." The Big Four smelled blood. Paperwork became product.
- Feb 2028: "GitHub Deprecates Pull Requests for Intent Diffs." Code review became intent review overnight.
- Mar 2028: "Generator Template Ships Fleet-Wide Vulnerability." One shared template. Thousands of affected artifacts.
- Jul 2028: "Stack Overflow Closes Public Q&A." The last knowledge commons went behind an intent paywall.
- Nov 2028: "ForkOps Funding Boom." Patch speed became the product.
- Dec 2028: "McKinsey Sued Over Shared Intent Template." The lawsuits were not about code. They were about who wrote the spec.
- Jan 2029: "EU Mandates Intent Audits for Critical Infrastructure." Governments started claiming sovereignty over the abstraction layer.
- Apr 2029: "Google Achieves Full-Fleet Rehydration in Under an Hour." The gap between prepared and unprepared became unsurvivable.
If you read those as dystopian, I get it.
If you read them through the three concepts, they are just the Intent-Compilation Theorem and the Rehydration Law taking turns.
But headlines are abstract. Living through 2026 was not.
2026 felt like cheating
In late 2025, early 2026, a competent engineer with a good agent setup felt ten times stronger. Not because they suddenly became smarter, but because the cost of implementation dropped so fast it barely registered.
It started in the corners: internal tools, scripts, dashboards, integration glue. The stuff you never put on a roadmap because it was "too much effort for too little reward."
Then it moved into customer features. Then into product lines.
The first time I watched a serious org rebuild something "overnight," it was not cinematic. It was mundane. (Cloudflare rebuilt the Next.js API surface in a week — one engineer, $1,100 in tokens. That was February 2026. It only got faster from there.)
- product writes a paragraph
- agent reads the repo + specs
- code changes
- tests appear
- diff looks reasonable
- deploy happens
The first surprise was the speed. We had a name for it: vibe coding. Ship on feel. Trust the output. Do not look too hard at what is underneath.
It felt like freedom.
The second surprise was the question nobody asked loudly enough:
"Okay. Now patch it."
Speed is intoxicating. Continuity is boring.
And boring is where most systems die.
Here is how we learned that the hard way.
November 2026: the meeting where we made the wrong call
My team is debating whether to go all-in on intent-first specialization.
We have a classic platform: plugins, configs, a feature matrix we are proud of and secretly terrified of.

One of our best engineers, smart, cautious, allergic to hype, says:
"Are you guys out of your minds? We start generating variants, we are never going to patch them. We will create thousands of snowflakes and drown."
He was right about the risk.
He was wrong about the conclusion.
We did it anyway. Not because we were reckless. Because a customer forced our hand.
They wanted a bespoke workflow that our platform could technically support... but only by adding so many toggles, exceptions, and "enterprise mode" shims that the whole system became harder to explain.
We tried to do it "the right way."
We extended the plugin interface.
We designed the config.
We wrote the docs.
We added the flags.
It got worse.
And then a less senior engineer, more pragmatic, less emotionally invested in the platform, said the sentence that eventually became obvious everywhere:
"This is not a config problem. This is a compilation problem."
The room went quiet. Then the boss:
"Fuck it. Fork it. Make it simple and small enough to understand and maintain."
We took the kernel, wrote the intent bundle (specs + policies + tests), generated a specialization, and shipped it behind strict gates.
It went out faster than the platform version.
It was easier to audit.
We could explain it to a new engineer in an afternoon.
Every customer migrated within a quarter.
That was the day "forking" stopped sounding like rebellion and started sounding like hygiene.
We did not eliminate patching risk. We postponed the bill.
And we paid it later, with interest.
But first, something else had to die.
2027 was the year configuration started to feel embarrassing
The warmth of 2026 was fading. You could feel the first frost in the architecture meetings.
Configuration is not just options. It is a promise:
"All combinations should work."
Once you make that promise, behavior lives in the cross-product of code × config × environment.
That is rehydration poison.
So the industry did what the laws predict:
- push variability out of runtime config
- pull it into build-time specialization
- replace "enable feature" with "compile variant"
- replace "plugin marketplace" with "intent bundles + strict gates"
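The arithmetic behind "rehydration poison" is worth making concrete. A quick sketch; the numbers are illustrative, not from any real system:

```python
# Runtime configuration: behavior lives in the cross-product.
# Illustrative numbers, not from any real system.
flags = 20          # independent boolean feature flags
environments = 4    # e.g. dev, staging, prod, on-prem
runtime_states = (2 ** flags) * environments  # 4_194_304 combinations

# Build-time specialization: each variant pins one point in that
# space, so verification cost scales with the fleet you actually
# ship, not with the cross-product you implicitly promised.
variants = 9_142    # the fleet size from the opening of this essay

assert variants < runtime_states
```

Twenty toggles is a modest platform, and it already promises four million behaviors nobody will ever test. A fleet of nine thousand compiled variants is large, but it is at least enumerable.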
That is when the story stopped being "software architecture" and became "economy."
Because once code becomes industrial, markets treat it like industrial output:
- abundant supply
- falling margins
- brutal competition
- consolidation around distribution and trust
- regulation showing up late and angry
And when code becomes a commodity, the question stops being "who builds it?" It becomes "who survives it?"
The broader context: winners, losers, and why prices fell like a rock
Here is what the three concepts meant in practice:
When implementation becomes abundant, differentiation collapses.
Value migrates to distribution, trust, constraints, and data.
And the moment code stopped behaving like craft, the market did what markets do:
It competed away the margin.
Winners
Distribution. Microsoft sold the default path, from VS Code to GitHub to Azure. Relentlessly. Cloudflare turned the edge into the quarantine lane for forks and became air traffic control for machine traffic, sitting between agents and the internet.
Systems of record. Screens were dead. SAP and the other systems of record became besieged survivors: everything that could leave did leave, but the data stayed because migration still broke payroll and audits. They won by keeping users hostage, not by being good.
Verification and trust. GitHub build receipts became passports; production refused artifacts without an attestation chain. The winners were the pipeline owners and registries, plus the provenance startups nobody had heard of in 2026 that sold liability coverage by the build. Proof.
Hardware. Apple and Qualcomm turned laptops into local workhorses for routine loops: diffs, refactors, tests, and migrations. Frontier reasoning still ran on NVIDIA in the cloud, rented by the minute, because the hard parts stayed hard. The money was in the loop.
Documentation culture. Trello died. Basecamp died. Project boards became noise that agents could read but had no use for. Today the only money-making product in the Atlassian suite is Confluence. Wikis became executable intent.
Losers
UI-moat SaaS. The UI stopped being the place where work happened. BI tools died first — Tableau, Looker, Power BI — because "show me a chart" is a prompt, not a product. Jira became an audit database. Figma became an export format for models, and the design layoffs pulled the floor out from the rest.
Offshore support centers. When support automation crossed 95%, the remaining 5% became a trust problem, not a cost problem. Companies hired a few local veterans for escalations and deleted the rest of the org chart. Snapped.
Implementation-only firms. Accenture lived by pivoting to AI governance, procurement, and verification, because someone had to sign the liability. The casualties were the staff-augmentation firms and regional body shops that had nothing to sell once keyboards stopped being scarce. Nobody paid $500k for an integration that an agent could ship in an hour. Their margins went to zero overnight.
Traditional coding education. Bootcamps imploded because they taught the part that got automated. The four-year degree from a fancy school stopped being a signal. What mattered was whether you could specify behavior precisely enough that a machine could build it and a gate could verify it. Some people learned that in a semester. Some never learned it at all.
The surprise
In 2026 everyone thought the moat was code. It was trust. Forks were free; liability was not. The orgs that could prove what they shipped kept shipping.
That was the macro view. But on the ground, forking was not a strategy deck. It was a daily reality.
What we actually mean when we say fork
In 2026, "forking" sounded like Git drama.
Now it mostly means something simpler:
a different intent bundle produces a different artifact.
- kernel stays small: permissions, sandboxing, logging, deployment scaffolding
- intent bundle carries meaning: specs, policies, tests, procedures, interfaces
- code becomes compiled output: persisted if useful, regenerated if needed
So yes, we have thousands of "codebases." But what we really have is:
a fleet of compiled specializations.
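The kernel/bundle split can be pictured as a pure function: artifact = compile(kernel, intent bundle). A conceptual sketch, where a digest stands in for generated code; nothing here is a real API:

```python
import hashlib


def compile_variant(kernel: str, intent_bundle: str) -> str:
    """Conceptual: the artifact is a deterministic function of a
    small shared kernel plus a per-variant intent bundle. A SHA-256
    digest stands in for the generated code."""
    material = kernel + "\n" + intent_bundle
    return hashlib.sha256(material.encode()).hexdigest()


# The kernel stays small and shared across the whole fleet.
kernel = "permissions + sandboxing + logging + deploy scaffolding"

# Different intent bundles -> different artifacts: a fleet.
a = compile_variant(kernel, "spec: retail checkout; policy: PCI")
b = compile_variant(kernel, "spec: hospital intake; policy: HIPAA")
assert a != b

# Same inputs -> same artifact: the reproducibility the gates demand.
assert a == compile_variant(kernel, "spec: retail checkout; policy: PCI")
```

The second assertion is the whole bargain: if regeneration is deterministic, the code itself can be treated as a cache, persisted if useful and regenerated if needed.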
And that brings us back to rehydration, because the bargain was:
We will accept many artifacts... in exchange for cheaper understanding.
It held.
Until intent itself started to drift.
Business goals changed but old constraints stayed. Legal interpretations evolved but tests stayed frozen. Edge cases accumulated until the specs described a system that no longer existed.
The compiler incident was loud. Intent drift was silent. And silent failures compound.
The real winter is not caused by bad code. It is caused by stale intent.
The most dystopian part was not layoffs — it was orphaned software
Layoffs happened. In waves. Not everywhere, not all at once. Enough to change the culture.
But the darker thing was not "AI took jobs."
It was the acceleration of forgetting.
In the old world, you could keep legacy systems alive with institutional memory and a few heroic engineers.
In the new world, systems survived if they had:
- complete intent bundles
- reproducible builds
- verifiable constraints
- patch paths
- provenance
The rest became orphans. The vibe-coded artifacts. The ones built on feel instead of intent:
- fork fleets with no patch strategy
- internal tools that could not rebuild
- "temporary" dashboards that became compliance-critical
- overnight builds that became permanent dependencies
We could create software faster than we could remember what we created.
Not the speed. The forgetting. That is what haunted people.
When code was craft, you could read it like a work.
When code became industrial output, you had to manage it like inventory.
Nobody told us it would feel like grief.
And then the thing we feared most actually happened.
Spring 2028: the compiler incident (the week winter began)

A popular generator template shipped a subtle permission escalation. Not cartoon malware. A "helpful default" that widened capability in a way nobody noticed immediately.
Monday. The security bulletin hit inboxes at 6 a.m. Pacific: "Severity: Critical. Exploit: in the wild. Affected: all fleets using BaseGen templates v4.2+." The reflex was automatic: regenerate.
Wednesday. Regeneration was the problem. Half the fleet failed verification. Intent bundles had drifted. A quarter could not rebuild at all — source of truth incomplete. The remaining quarter rebuilt, but behavior shifted in ways that did not look like security bugs.
Until production started paging at 2 a.m.
Saturday. The internal postmortem opened with this line:
"Root cause: not the vulnerability. Root cause: 2,300 intent bundles with no verified rebuild path. We do not have a code problem. We have a memory problem."
Remember the cautious engineer from the 2026 meeting? The one who warned about drowning in snowflakes? He ran the incident bridge that week. On day four he sent a message to the team:
"We are not drowning in snowflakes. We are drowning in missing receipts."
He was right both times. In 2026, he was right about the risk. In 2028, he was right about the shape.
That week ForkOps stopped being a cute term and became a budget line item.
Not because we loved forks. Because the alternative was the old world: giant platforms with unbounded rehydration cost. The blizzard had arrived, and there was no going back inside.
We were trapped inside the trilemma we chose.
And the companies that could patch across fleets, fast, started to dominate.
Patch speed became trust.
Trust became pricing power.
That is how the market stabilized after the race to the bottom:
Features commoditized. Prices fell.
Then governance became the premium tier.
Trust became stratified. Three tiers: unverified and cheap. Verified but lightly constrained. Fully audited with provenance trails.
Companies stopped differentiating on features. They differentiated on guarantee levels.
Lloyd's started pricing intent maturity and patch throughput. Software risk became actuarial.
But the market was not the only thing that noticed intent had become valuable. Governments noticed too.
August 2028: intent had become territory

The compiler incident showed us code could fail at scale. What came next was worse: intent could be captured.
By late 2028, large enterprises were outsourcing intent bundle construction to McKinsey, Accenture, and a dozen template vendors. Standardized policies. Pre-approved compliance bundles. Industry "best practice" constraints.
Entire sectors converged on identical intent templates.
Then a regulatory shift exposed what was hiding inside: subtle biases in risk tolerance and liability allocation, baked into the templates themselves.
The lawsuits were not about code. They were about who wrote the spec.
Intent, it turned out, was not neutral infrastructure. It was a political surface.
We should have seen this coming. The signs were there as early as February 2026, when US War Secretary Pete Hegseth gave Anthropic an ultimatum: drop your safety constraints or lose your Pentagon contract. The fight was framed as national security versus corporate ethics. The real precedent was structural:
A government was asserting the right to define what an AI system is allowed to intend.
That was not a procurement dispute. That was a sovereignty claim over the abstraction layer.
The pattern has spread. The EU, the UK, and California now mandate "intent audits." Defense contractors require pre-approved constraint templates. The companies that own dominant intent standards have more leverage than the ones that own the models.
Departments fought over constraint ownership. Legal teams gained architectural veto power.
The spec review meeting became the most political room in the building.
Intent did not just replace code as the unit of production. It replaced code as the thing worth fighting over.
Which raises the question everyone kept asking: if code is industrial and intent is political, where did the craft actually go?
The last job standing was the one nobody wanted

The intent librarian.
Not librarian like "organize folders."
Librarian like:
- curate the intent
- curate the constraints
- curate the procedures
- curate the evidence
Because in a world where you can generate a thousand implementations in a day, the only durable advantage is:
- what you can prove
- what you can reproduce
- what you can remember
This is also where the "art" went.
It did not disappear. It moved.
The art moved from implementation to:
- choosing the right constraints
- designing interfaces that do not leak footguns
- writing tests that actually capture the real-world behavior
- shaping intent so it compiles into something safe and understandable
Code became industrial.
Intent became the new canvas.
The cautious engineer from 2026? The one who warned about snowflakes, who ran the incident bridge in 2028? He became our first intent librarian. Turns out the person most paranoid about the risks was the best person to curate the constraints.
He was the first. He was not the last. And the advice he would give his 2026 self is simpler than you would expect.
What I wish I had known in 2026

Looking back, the advice is embarrassingly simple. I just did not have the scars yet to believe it.
Do not worship "forking." Do not worship "vibe coding." Do not worship "minimalism." Worship rehydration throughput.
If I could go back and tape one checklist to the wall, it would be this:
- Can a new engineer rebuild any variant from intent alone, in under a day?
- Can you regenerate with reproducible, signed outputs?
- Can you diff intent bundles meaningfully, not just code?
- Can you patch 90% of the fleet inside 48 hours without bypassing gates?
- Can you prove least privilege after regeneration, not just assume it?
- Can you answer "what changed and why" for any artifact, at any point in time?
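Taped to the wall, the checklist is prose; in practice it became a scorecard. A hypothetical sketch with the thresholds from the list above; the field names are invented:

```python
from dataclasses import dataclass


@dataclass
class VariantHealth:
    """Rehydration signals for one variant. Thresholds mirror the
    checklist above; the field names are invented."""

    rebuild_hours: float          # new engineer, from intent alone
    reproducible: bool            # reproducible, signed outputs
    intent_diffable: bool         # intent bundles diff meaningfully
    patch_hours: float            # time to patch through the gates
    least_privilege_proved: bool  # proved after regeneration
    provenance_complete: bool     # "what changed and why" answerable

    def rehydratable(self) -> bool:
        return (
            self.rebuild_hours <= 24
            and self.reproducible
            and self.intent_diffable
            and self.patch_hours <= 48
            and self.least_privilege_proved
            and self.provenance_complete
        )


def fleet_patchable_within_48h(fleet: list[VariantHealth]) -> bool:
    """The 90%-of-fleet-in-48-hours test from the checklist."""
    patched = sum(v.patch_hours <= 48 for v in fleet)
    return patched >= 0.9 * len(fleet)
```

Worship rehydration throughput means exactly this: the number that matters is not how many variants you can generate but how many of them pass `rehydratable()` today.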
The flip was never about code being cheap. It was, and still is, about understanding being expensive.
And once intent became the ultimate abstraction, code became manufacturing output.
But intent was never salvation. It was relocation. We moved the bottleneck from implementation to meaning. And meaning turned out to be the one thing you cannot generate, cannot automate, and cannot afford to get wrong.
Control shifted from creators to curators. The most powerful institutions are not those that write the most code. They are those that best define what code is allowed to be.
The ones who built the factory early (intent, gates, boundaries, provenance) get speed without winter.
The rest of us get speed too.
Just the way we got it in 2028:
Fast.
Cold.
And hard to undo.
Related thinking
This essay did not emerge in a vacuum. Several people and projects were exploring adjacent ideas, and I want to acknowledge their work as part of the broader conversation:
- ContractSpec pushed the "spec-first compiler" framing early, with deterministic generation from contracts, signed outputs, and policy decision points.
- Roberto B.'s "Programming in the Age of AI: From Code to Intent" arrived at a strikingly similar structure: humans write intent and constraints, AI generates, trust shifts to verification.
- The VariantSync project at Ulm and the PaReco work (ESEC/FSE 2022) studied exactly the clone-and-own patch propagation problem that became "ForkOps" in this story, years before AI made it worse.
- Kuhlemann et al.'s "Patching Product Line Programs" (2010) described the specific failure mode where a bug in the generation mechanism propagates to an entire fleet — our "compiler incident," fifteen years early.
- a16z's "Trillion Dollar AI Software Development Stack" mapped the market reorganization around verification and governance that this essay narrates as lived experience.
- Citrini Research's "The 2028 Global Intelligence Crisis" is a great read that takes the future-retrospective format in a different direction: macroeconomic contagion from AI displacement. I found it after I started writing this piece. The two are disconnected in origin but share the conviction that scenario-building is the honest way to reason about what is coming.
This essay builds on The Rehydration Flip, which introduced the three concepts. What it adds is the causal chain: intent-first specialization leads to variant fleets, variant fleets create patch propagation economics, and patch economics become the new basis for trust and pricing. The winter is not in any one of those steps. It is in the compounding.
