Sachin Got It Right, But He Stopped Too Soon
Last week, Sachin published an essay called "Vibe Coding and the Maker Movement" in his newsletter Technically. It's the best thing I've read about vibe coding. He argues that vibe coding is consumption, the expenditure of a surplus intelligence that exists whether you use it or not. He connects it to the Maker Movement of 2005-2015, to Fred Turner's academic work on Puritan self-transformation narratives, and to Joel Spolsky's strategy of commoditizing your complement. He names the hypomania, that state where you genuinely are more productive but you've lost the ability to distinguish between "this is good" and "I feel good making this." If you haven't read it, stop and go read it. Everything below assumes you have.
I agree with almost all of it. But I think Sachin stopped one step short.
He ends with four value-capture strategies: develop taste, generate attention, give gifts, capture signal. These are directions, not instructions. If you're a founder who just spent the last six months vibe-coding your way through a product, reading "develop taste as a residue of expenditure" is like being told to "just be more strategic." You know he's right. You just don't know what to do Monday morning.
This essay is the Monday morning part.
But first, the data, because the situation is more urgent than Sachin's measured tone suggests.
Everyone adopted the tool. Almost nobody trusts it.[1] And the people using it most are, by their own admission, spending more time cleaning up after it than they'd spend doing the work themselves. This is the situation Sachin diagnosed as consumption. I want to talk about what you can extract from the combustion.
Who Actually Captures Value from Vibe Coding?
The margin window is the 18-to-36-month period in any technology transition where production costs have already collapsed but market prices haven't adjusted. It's the gap between what things cost to make and what people will still pay for them. Every major technology shift has produced one. And we are sitting in the middle of one right now.
Here's what the margin window looks like for vibe coding in 2026: A solo founder can build, in a weekend, a tool that replaces a SaaS product a company was paying $50,000 a year for. They can charge $20 a month. The margins are obscene because customers are still pricing against the old cost structure: teams of engineers, months of development, enterprise sales cycles. The buyer's reference price hasn't caught up to the builder's production cost.
This is not a secret. Everyone reading this has either done it or watched someone do it. The question is: how long does it last?
Sachin cited Joel Spolsky's "commoditize your complement" strategy to explain the macro picture. Smart companies try to make their complements cheap: Microsoft commoditized PC hardware to sell more Windows, IBM commoditized add-in cards to sell more PCs.[2] Anthropic, OpenAI, and Google are doing the same thing to code. By making code generation nearly free, they commoditize the application layer (your layer) and make the model layer more valuable by comparison. The Gwern extension of Spolsky's argument puts it bluntly: when you commoditize a complement, the value gets captured by whoever controls the layer that's hardest to substitute.[3]
At the macro level, that's the model providers. But at the micro level (your level, the founder building something this month) the complement being commoditized is code itself. And when code is the cheap complement, what's the product layer that benefits?
Judgment about what to build and for whom.
The margin window closes through three mechanisms, all of which are already in motion:
- Incumbents integrate. The SaaS company you're undercutting will add AI to their own product. They have the distribution, the brand, the existing customer relationships. When they catch up on capability (and they will), your price advantage evaporates.
- The market floods. If you can vibe-code a CRM alternative in a weekend, so can a thousand other people. Differentiation by capability disappears when everyone has the same capability.
- Price expectations adjust. Customers learn that things are cheaper to build now. They stop paying 2024 prices for 2026 products. Your margins compress to something normal.
None of this means don't build. It means build with the awareness that you're in a window. The margins you're capturing today are a function of a temporary information asymmetry, not a permanent competitive advantage. What you do with those margins, what you invest them in while the window is open, is the whole game.
What Is the Judgment Premium?
Sachin identifies "taste" as a residue of expenditure. You develop it by making a lot of things and paying attention to which ones felt alive and which ones felt dead. He compares it to the protagonist of William Gibson's Pattern Recognition: someone with such finely tuned aesthetic instincts that companies hire them simply to say yes or no.
I want to draw a sharper line. Taste is aesthetic. It tells you what feels right. But there's a more operational version of the same capacity, and it's what founders actually need. I'm calling it judgment.
Judgment says: "This feature will retain users because I've watched 50 prototypes fail and I know the failure modes." Judgment says: "The AI is hallucinating this API call and I know to check because I've seen it do this three times before in this domain." Judgment says: "This is a weekend project, not a company, because the distribution problem is unsolvable at this price point."
Taste and judgment differ in three ways that matter:
- Taste is felt. Judgment is tested. You can validate judgment against outcomes. Did the feature retain users? Did the API call work? Was the distribution problem real? This makes judgment accumulative in a way taste isn't. You can be wrong, learn, and update.
- Taste is personal. Judgment is transferable. You can teach someone your judgment about a domain. You can write it down. You can sell it as consulting. Taste is harder to decompose into transferable knowledge.
- Taste is developed through exposure. Judgment is developed through failure. You develop taste by seeing a lot of things. You develop judgment by building a lot of things and watching them break in specific ways. The second process requires what Sachin identified as the missing "scenius," a feedback loop, but the feedback loop doesn't have to come from a community. It can come from your own systematic practice.
This distinction matters because of the trust collapse. The Red Hat assessment that "vibe coding amplifies experts but amplifies novices' mistakes in dangerous directions" is really a statement about judgment.[4] The experts have it. The novices don't. And the tool doesn't care. It generates the same confident-looking code for both. The 60% trust figure isn't about AI being unreliable. It's about builders lacking the judgment to evaluate AI output. The developers who do trust it are the ones who've accumulated enough domain-specific failure data to know when to trust and when to override.
As Dave Kiss argues, the experienced developers who survived the first year of vibe coding didn't do it through blind acceptance.[5] They evolved into skilled supervisors who knew when the AI reached for the wrong solution. That supervisory skill is judgment. It's the new premium.
The good news: judgment is buildable. The bad news: it's built from crapjects, the throwaway objects the Maker Movement produced by the thousand and vibe coding now produces by the weekend. Every failed prototype, every hallucinated API call, every feature nobody used: they're all deposits in your judgment account. But only if you're tracking the deposits. Which brings us to the exhaust problem.
How Do You Capture Value from Failed Prototypes?
The most important section of Sachin's essay got the least development. He wrote about "signal capture before upstream absorption," the idea that every vibe coding session produces informational exhaust that currently flows upstream to model providers for free. Your prompts, your iterations, your corrections all become training data. You are performing unpaid labor for the infrastructure layer every time you build something.
A recent paper in Humanities and Social Sciences Communications formalizes this as "algorithmic surplus value," where AI systems function as what Marx called dead labour, reorganizing production by compressing necessary labour time and enclosing informational rents.[6] The takeaway for builders is less academic than it sounds: you are generating value with every vibe coding session, and almost all of it is being captured by someone else.
I want to break "signal" into three specific types of exhaust, because they require different capture strategies:
Failure Patterns
What the model gets wrong in your specific domain. If you're building fintech tools, your running log of "Claude hallucinated this banking API, misunderstood this compliance requirement, generated this vulnerability" is proprietary domain knowledge. It's the kind of information companies pay consultants $500/hour for. Right now you're generating it for free and letting it evaporate after each session.
User Signals
Vibe coding lets you run ten experiments in the time it used to take to run one. That's a market research engine, but only if you capture the data. How did people react to the prototype? What feature did they ask for that you didn't build? What did they ignore that you thought was the whole point? One person running vibe-coded experiments systematically generates more user insight in a month than most startups generate in a quarter. The problem is that the experiments happen so fast that the learning gets lost in the speed.
Workflow Recipes
The prompt chains, architecture patterns, and iteration sequences that actually work in your domain. "I start with this system prompt, scaffold with this structure, and the model reliably produces working output when I constrain it this way." These workflow recipes are operational IP. They're the difference between a 2-hour build and a 2-day build, and they're specific enough to your domain that they don't generalize easily. Which means they're defensible.
Most builders let all three types of exhaust dissipate. It floats upstream as free training data. One developer's three months spent vibe-coding a Pomodoro app produced enormous exhaust (all the wrong turns, dead ends, framework switches, debugging spirals), but none of it was captured.[7] What remained was a Pomodoro app (a crapject) and nothing else: no failure log, no user data, no recipe documentation.
The fix isn't sophisticated. It's just consistent. A spreadsheet. A daily log. Three columns: what the model got wrong, how users reacted, what workflow pattern worked. The point is systematization, not sophistication. The exhaust is already being produced. You just have to stop letting it vent.
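The mechanics really are that small. Here's a sketch in Python of the three-column log as an append-only CSV; the filename, field names, and example entries are mine, not from any particular tool:

```python
import csv
from datetime import date
from pathlib import Path

LOG = Path("exhaust_log.csv")  # hypothetical filename; any spreadsheet works
FIELDS = ["date", "model_failure", "user_reaction", "workflow_pattern"]

def log_session(model_failure: str, user_reaction: str, workflow_pattern: str) -> None:
    """Append one row per vibe coding session: the three exhaust types."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "model_failure": model_failure,
            "user_reaction": user_reaction,
            "workflow_pattern": workflow_pattern,
        })

# One session, one row. Illustrative entries only.
log_session(
    "Hallucinated a Stripe webhook field that doesn't exist",
    "Two testers ignored the dashboard and only used export",
    "Constraining output to a JSON schema stopped the drift",
)
```

Thirty seconds at the end of a session, and the exhaust stops venting.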
What Happens When the Code Breaks?
There's an economic reality that Sachin's essay doesn't address, probably because it's ugly: the maintenance cliff.
Vibe-coded products hit an unmaintainability wall around month three to six. The numbers tell the story: code churn up 41%, refactoring collapsed to under 10% of changed lines.[1] What this means practically is that AI-generated codebases accumulate technical debt at a rate that makes them disposable. Not theoretically disposable, but actually, literally disposable. At some point it becomes cheaper to rebuild from scratch than to maintain what you have.
This sounds catastrophic if you're thinking like a traditional software company. It sounds like an advantage if you're thinking like a vibe coder who understands the margin window.
Here's the counterintuitive strategy: don't fight the cliff. Plan for it.
If rebuilding from scratch costs 1/10th what it used to (and with current tools, it does) then the rational strategy might be: build disposable, rebuild quarterly, and invest in distribution and brand instead of code quality. Your moat was never the code. The code is the commodity, remember? Your moat is:
- Your users (who don't care what's under the hood)
- Your judgment (which survives any rebuild)
- Your captured exhaust (which makes the next build faster and better)
All three are durable. The code isn't. Stop investing in the thing that expires.
This framing explains something that's been confusing people: why some solo founders seem to ship impossibly fast while others get stuck in debugging spirals. The fast ones aren't better at vibe coding. They've accepted the disposability of the output and optimized for the things that persist across rebuilds. They document the what, not the how. They keep architecture simple enough that a rebuild is a weekend, not a quarter. They treat each version as a draft, not a manuscript.
The Operating Manual
Everything above is framework. Here's what to do with it.
- Run the margin test. Before building anything, ask: "Would someone pay 10x my price to get this from an incumbent?" If a company is paying $500/month for a tool you could replace at $50/month, you're in the margin window. Build fast, charge now, iterate later. If the incumbent is already cheap or free, you're not in the window. The margin has already been competed away. Move on.
- Keep a crapject journal. After every failed prototype (and most will fail), write three sentences: what you tried, why it failed, what you'd try differently. Date it. This is your judgment database. It compounds. After 30 entries, you'll start noticing patterns you couldn't see at entry 5. After 100 entries, you'll be able to predict failure modes before you build. That prediction capability is the judgment premium, and it's worth more than any single product you'll ship.
- Capture your exhaust. Log three things from every vibe coding session: one model failure (what it got wrong and in what context), one user reaction (how someone responded to what you shipped), and one workflow pattern (a prompt or approach that reliably worked). A spreadsheet is fine. A Notion table is fine. The format doesn't matter. The consistency does. You're building a proprietary dataset that gets more valuable with every entry and that nobody upstream can replicate, because it's specific to your domain and your users.
- Plan for the cliff. Design every project assuming you'll rebuild it from scratch in 90 days. This means: keep architecture simple (monolith over microservices, SQLite over Postgres, server-rendered over SPA). Document the product decisions, not the implementation details. Invest in the things that survive a rebuild: your brand, your distribution channels, your user relationships, your exhaust logs. Treat the code as ephemeral. Because it is.
- Sell the judgment, not the product. Your crapjects are R&D. The valuable output isn't the app. It's your accumulated understanding of the problem space. Write about what you learned. Consult on what you know. Let the next build benefit from the last one's failure. The founders who will be standing when the margin window closes aren't the ones with the most products. They're the ones with the deepest judgment and the most captured exhaust. The products are how you get there. They're not the destination.
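None of these practices needs tooling beyond a spreadsheet, but the judgment-database idea has one concrete payoff worth showing: once each journal entry carries a failure tag, the patterns you'd otherwise notice at entry 30 surface mechanically. A sketch with invented entries and tags:

```python
from collections import Counter

# Hypothetical crapject journal: (project, failure-mode tag) pairs,
# tagged at write time per the journal practice above.
journal = [
    ("fintech dashboard", "hallucinated-api"),
    ("invoice parser", "no-distribution"),
    ("compliance checker", "hallucinated-api"),
    ("crm-lite", "no-distribution"),
    ("webhook relay", "hallucinated-api"),
]

# Count recurring failure modes; the most common ones are the
# failure predictions you get to make before the next build.
modes = Counter(tag for _, tag in journal)
for tag, count in modes.most_common():
    print(f"{tag}: {count}")
```

With five entries this prints `hallucinated-api: 3` at the top; with a hundred, it's the prediction capability the journal exists to build.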
Your Crapjects Are Capital
Sachin closed his essay with a line I keep coming back to: "Consumption doesn't have to be passive. Surplus can be spent well."
He's right. But I want to be more specific about what "spent well" looks like.
It looks like someone who built 30 things, shipped 5, and learned from all 30 why the 5 worked. It looks like a founder whose competitive advantage isn't code (it was never code) but the judgment, the exhaust, and the distribution they accumulated while everyone else was chasing the next prototype. It looks like someone who understood that the margin window was temporary and used it to build something that lasts: not a product, but a position.
The Maker Movement produced a lot of crapjects. Most of them were worthless. But the people who made them, the ones who paid attention to what they were learning, went on to build real companies, lead real teams, make real contributions. The objects didn't matter. The capability that developed through the making did.
Vibe coding can work the same way, but only if you're deliberate about it. The scenius isn't coming. Nobody is going to build you a protected playground where you can develop judgment at leisure. The tools went straight to production and the margin window is already open and closing. What you do in this window, whether you spend the surplus wisely, whether you capture the exhaust, whether you build judgment alongside products, determines whether your crapjects were waste or capital.
I know which one I'm betting on.
Sources
- [1] Hashnode, "The State of Vibe Coding in 2026: Adoption Won, Now What?"
- [2] Joel Spolsky, "Strategy Letter V: The Economics of Open Source" (2002)
- [3] Gwern, "Laws of Tech: Commoditize Your Complement"
- [4] Red Hat, "The Uncomfortable Truth About Vibe Coding" (2026)
- [5] Dave Kiss, "Stop Calling It Vibe Coding" (2026)
- [6] Humanities and Social Sciences Communications, "The Transformation from Human Surplus Value to AI Algorithmic Surplus Value" (2025)
- [7] Calvin Ku, "It Took Me Three Months to Vibe Code a Simple Pomodoro App" (2026)