Prelude

I have spent the last decade creating complexity. We all have. We built towers of abstraction. We invented problems so we could sell the solutions. We convinced ourselves that to put a button on a screen, we needed a build step, a virtual DOM, three state management libraries, and a hydration strategy.

It was madness. Necessary madness, perhaps, but madness nonetheless.

I've been looking at the raw HTML my latest Rust-based agent generated this morning. It is clean. It is semantic. It loads instantly. There is no hydration gap because there is nothing to hydrate. It effectively renders the last five years of my frontend career obsolete.

The industry is clinging to tools that were designed to help humans manage complexity. But humans aren't writing the code anymore.

The Orthodoxy

Go to any tech conference. Read any junior developer's CV. The religion is the same. You learn JavaScript. Then you learn a framework. React, Vue, Svelte. It doesn't matter which one.

The orthodoxy states that the browser is a hostile environment. We believe that raw DOM manipulation is too difficult for mere mortals. We believe that we need a "component model" to keep our sanity. We treat the user interface as a function of state, which sounds profound until you realise the sheer computational weight we drag along to prove it.

This belief system created a massive ecosystem. We have "React Developers" rather than software engineers. We have entire companies built around hosting specific frameworks. We have a generation of coders who have never written a raw SQL query or a line of uncompiled CSS.

The argument is always about "Developer Experience" (DX). We accept slower load times and massive bundles because "it's easier to maintain." We accept the fragility of the node_modules black hole because "the ecosystem is rich."

We are optimizing for the human typist. We are building tooling to make it easier for a person to type code into a text editor.

That constraint just evaporated.

The Cracks

The cracks started appearing when I watched an agent write a complete dashboard in thirty seconds. Not using a component library. Not importing a design system. It just wrote the code.

I asked it to change the layout. It didn't refactor a component tree. It didn't fight with useEffect dependencies. It just rewrote the output.

The "language wars" we used to fight are over. The debate used to be about which syntax was more expressive for humans. Now the debate is about which language allows the AI to reason best, and which language executes fastest.

We are seeing a bifurcation. On one side, we have Python. It is the language of experimentation. The playtime language. As noted in recent industry analysis, Python dominates AI experimentation because humans find it easy to prototype in Python.

But here is what most people miss: Python's dominance is a temporary artifact of human limitations, not a fundamental truth about AI development.

The models do not "think" in Python. They think in tokens. They can generate Rust as easily as Python. The only reason Python dominates today is that humans are still in the loop, and humans find Python easier to read and debug.

Remove the human from the loop, and Python's advantage evaporates.

On the other side, we have Rust. The runtime language. The production language. If the AI is writing the code, why would we choose a slow, interpreted language for execution? We wouldn't. We would choose Rust.

As models improve at generating correct Rust on the first attempt, the need for Python's "forgiving" syntax disappears. The playtime is ending. The runtime is all that matters.

That is the backend story. The frontend story is even more brutal.

The same logic that kills Python's dominance kills JavaScript frameworks. React, Vue, Svelte. All of them exist because humans needed help managing complexity. The "intermediate framework" (the layer designed to make JavaScript palatable to humans) becomes technical debt the moment you install it.

Consider the metrics we used to care about. Lines of code. Commits per day. Now, we are looking at something entirely different. We are looking at token efficiency.

The Token Economy

Every character costs money. Every token costs time.

If I ask an AI to "build a button," and it has to scaffold a React component, import statements, prop definitions, and type interfaces, it is burning tokens on boilerplate. It is wasting context window space on the framework's overhead.

If I ask it to generate an HTML string, it is lean. It is fast.
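
To make that concrete, here is the lean end of the spectrum. A sketch in Rust, purely illustrative (the `button` helper is invented for the example): the entire "component" is one format string.

```rust
// The entire "component": one function, one format string. No imports,
// no props interface, no build step. (Illustrative sketch.)
fn button(label: &str, action: &str) -> String {
    format!(r#"<button type="button" data-action="{action}">{label}</button>"#)
}

fn main() {
    println!("{}", button("Save", "save-draft"));
}
```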

We are moving toward a world where 2,000-token-per-second generation is the benchmark for flow state. In this world, verbosity is a sin. React is verbose. Angular is verbose.

The efficiency of the input and the output is the new "Big O" notation. Optimizing token usage is not just about cost. It is about speed. It is about the latency between thought and execution.

The Agent Does Not Care About Your DX

I built a system recently where the frontend is disposable. The user asks for a view. The agent generates the HTML. The browser renders it. When the state changes, the agent generates new HTML.
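
A minimal sketch of that loop, using nothing but the standard library. The `agent_generate_html` function is my hypothetical stand-in for the model call; the stub returns a canned page so the sketch actually runs.

```rust
use std::io::{Read, Write};
use std::net::TcpListener;

// Hypothetical stand-in for the model call. The real version would ship
// the request and current state to the agent and get a full document back.
fn agent_generate_html(request_line: &str) -> String {
    format!("<!doctype html><html><body><h1>View for: {request_line}</h1></body></html>")
}

fn main() -> std::io::Result<()> {
    let listener = TcpListener::bind("127.0.0.1:8080")?;
    for stream in listener.incoming() {
        let mut stream = stream?;
        let mut buf = [0u8; 4096];
        let n = stream.read(&mut buf)?;
        let request = String::from_utf8_lossy(&buf[..n]);

        // Every request gets a freshly generated document. No cached
        // bundle, no client-side state to reconcile.
        let body = agent_generate_html(request.lines().next().unwrap_or(""));
        let response = format!(
            "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\nContent-Length: {}\r\n\r\n{}",
            body.len(),
            body
        );
        stream.write_all(response.as_bytes())?;
    }
    Ok(())
}
```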

"But that's slow!" I hear you scream.

Is it? Or is your perception of speed warped by the latency of the DOM diffing algorithms you've been forced to use?

When you remove the framework, the browser is incredibly fast. The AI doesn't need "Hot Module Reloading" because it doesn't make syntax errors. It doesn't need "Prettier" because it formats perfectly every time.

We are entering a post-frontend-framework world. The complexity is shifting. It is moving away from the client-side bundle and into the orchestration layer.

The Deeper Truth

The future of software is not "coding." It is architecture.

The code snippets we treasure are becoming ephemeral artifacts. I don't care about the implementation details of the button anymore. I care about the system that decides the button should exist.

This brings us to the rise of Rust.

Today, Python handles the orchestration layer while Rust handles the production runtime. Tomorrow, Rust handles both.

Rust is the perfect target for AI generation. Why?

  1. Correctness: If it compiles, it usually works. AI struggles with logical consistency over long contexts. The strictness of the Rust compiler acts as a hard guardrail against the model's hallucinations.
  2. Efficiency: We are running inference on expensive GPUs. We cannot afford to waste CPU cycles on the runtime logic. Rust offers C++-level performance with memory safety.
  3. Type Safety as Context: The strong type system of Rust provides excellent context for the LLM. It constrains the search space for the next token. (See the sketch after this list.)
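
Here is what point three looks like in practice. A sketch with hypothetical names: the idea is that the type definitions themselves get spliced into the prompt.

```rust
// One definition serves the compiler, the database layer, and the
// model's context window. (Sketch; struct and names are hypothetical.)
#[allow(dead_code)]
struct Invoice {
    id: u64,
    customer: String,
    total_cents: i64, // money as integer cents, no float drift
    paid: bool,
}

// Splice the type definition into the prompt so the model generates
// code against the real schema instead of a guessed one.
const SCHEMA: &str =
    "struct Invoice { id: u64, customer: String, total_cents: i64, paid: bool }";

fn build_prompt(task: &str) -> String {
    format!("Given this schema:\n{SCHEMA}\n\nTask: {task}")
}

fn main() {
    println!("{}", build_prompt("write SQL listing unpaid invoices"));
}
```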

Here is how my mental model has shifted.

Old Paradigm (The Human Coder):

"INPUT: User wants a dashboard."
"PROCESS: I need to install a charting library. I need to create a component. I need to manage state."
"OUTPUT: A 5MB JavaScript bundle."

New Paradigm (The AI Orchestrator):

"INPUT: User wants a dashboard."
"CONTEXT: Schema is defined in Rust structs."
"ACTION: AI generates SQL query for data."
"ACTION: AI generates raw SVG for charts."
"ACTION: AI generates semantic HTML wrapper."
"OUTPUT: A 50kb HTML file served by a Rust binary."

This is not theory. This is not speculation. This is what I am running in production right now.
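
Stripped to a sketch, the orchestration glue is small. The `generate_*` functions below are hypothetical stand-ins for model calls, stubbed so the example compiles and runs end to end:

```rust
// Hypothetical glue for the pipeline above. Each generate_* function is
// a stand-in for a model call, stubbed so the sketch runs end to end.
fn generate_sql(_request: &str) -> String {
    "SELECT day, total FROM sales ORDER BY day".into()
}

fn generate_svg(_rows: &str) -> String {
    r#"<svg viewBox="0 0 100 40"><!-- bars go here --></svg>"#.into()
}

fn generate_html(chart: &str) -> String {
    format!("<!doctype html><main><h1>Sales</h1>{chart}</main>")
}

fn main() {
    let sql = generate_sql("User wants a dashboard");
    let rows = format!("rows from: {sql}"); // stand-in for executing the query
    let page = generate_html(&generate_svg(&rows));
    println!("{page}"); // one small document, no bundle in sight
}
```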

The Proof is in the Page Load

This very blog you are reading is the proof. Go ahead. Open your dev tools. Check the network tab.

There is no React. No Vue. No Svelte. No hydration. No bundle splitting. No lazy loading of JavaScript chunks. There is HTML. There is CSS. There is a Rust binary serving it all.

The entire site is generated by AI and served by a simple Rust templating system. Every page loads in milliseconds. The Lighthouse score is effectively perfect. Not because I spent weeks optimising bundle sizes or configuring code splitting. Because there is nothing to optimise. The complexity simply does not exist.
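
How simple is "a simple Rust templating system"? This sketch is not my actual code, and the struct is hypothetical, but the whole idea fits in one function:

```rust
// The entire "templating system": slot generated content into a shell.
// (Sketch; struct and field names are hypothetical.)
struct Page {
    title: String,
    body_html: String,
}

fn render(page: &Page) -> String {
    format!(
        "<!doctype html><html><head><title>{}</title>\
         <link rel=\"stylesheet\" href=\"/site.css\"></head>\
         <body>{}</body></html>",
        page.title, page.body_html
    )
}

fn main() {
    let page = Page {
        title: "Hello".into(),
        body_html: "<main><p>Generated upstream by the agent.</p></main>".into(),
    };
    println!("{}", render(&page));
}
```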

Compare this to the average React application. Megabytes of JavaScript. Hydration waterfalls. Layout shifts. Loading spinners while your "Single Page Application" figures out how to render text on a screen.

I deleted all of it. The AI writes the content. Rust serves the HTML. The browser does what browsers have always done brilliantly: render documents.

This is faster, better, and more optimised than any React website ever built. Not because I am a better developer. Because I removed the parts that were slowing everything down in the first place.

The framework exists to bridge the gap between human intent and machine execution. When the machine handles the intent, the bridge is no longer required. In fact, it becomes a toll booth.

There are concerns, of course. Security vulnerabilities in AI-generated code are real. If you let an LLM write raw SQL without sanitisation, you deserve what happens to you. But this is where the expertise shifts.

We don't need developers who know how to center a div. We need developers who know how to verify that the agent didn't just open a backdoor. We need AI Orchestrators.
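
What does that verification look like? A deliberately minimal illustration, not a complete defence: a guard that refuses anything except a single read-only statement before AI-generated SQL touches a connection. A real system would layer a proper SQL parser and bound parameters on top.

```rust
// A deliberately minimal guard for AI-generated SQL: one statement,
// read-only, or it does not run. Keyword matching like this is crude
// (it will reject a SELECT that merely mentions "update"), which is
// the right direction to fail in.
fn verify_readonly(sql: &str) -> Result<&str, String> {
    let stmt = sql.trim().trim_end_matches(';').trim();
    if stmt.contains(';') {
        return Err("multiple statements rejected".into());
    }
    let lower = stmt.to_lowercase();
    if !lower.starts_with("select") {
        return Err("only SELECT is permitted".into());
    }
    for banned in ["insert", "update", "delete", "drop", "alter", "grant"] {
        if lower.contains(banned) {
            return Err(format!("banned keyword: {banned}"));
        }
    }
    Ok(stmt)
}

fn main() {
    assert!(verify_readonly("SELECT * FROM invoices;").is_ok());
    assert!(verify_readonly("DROP TABLE invoices;").is_err());
    assert!(verify_readonly("SELECT 1; DELETE FROM invoices").is_err());
}
```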

The Death of the Toolchain

It's not just the frameworks that are dying. The entire ecosystem around them is becoming obsolete.

Think about what we've built. Webpack. Babel. ESLint. Prettier. TypeScript (the compiler, not the language). Jest. Cypress. Storybook. Each tool exists to solve a problem created by the previous tool. Each configuration file spawns three more.

I recently looked at a fresh Next.js project. Before writing a single line of business logic, I had: package.json, package-lock.json, tsconfig.json, next.config.js, tailwind.config.js, postcss.config.js, .eslintrc.json, .prettierrc, and a node_modules folder containing 847 packages.

Eight hundred and forty-seven packages. To render text on a screen.

This is not engineering. This is archaeology. We are digging through layers of sediment deposited by a decade of "best practices" that were never questioned because everyone was doing it.

The AI doesn't need any of it. It doesn't need Webpack because it generates complete files. It doesn't need Babel because it writes valid syntax. It doesn't need ESLint because it doesn't make the mistakes ESLint catches. It doesn't need Prettier because every output is perfectly formatted.

The toolchain was built to catch human errors and enforce human conventions. Remove the human, and the toolchain becomes a museum.

I am not saying these tools were bad. They were necessary. They served their purpose brilliantly. But the problem they solved no longer exists. The human typist who makes syntax errors and forgets semicolons is no longer the bottleneck.

The bottleneck is now the orchestration. The system design. The prompt engineering. And for that, we need entirely different tools.

The Rise of the Machine Whisperer

So what should you learn? If frameworks are dying and syntax is commoditised, where does value come from?

Here is my honest answer: become a machine whisperer.

Stop indexing on language. "I'm a JavaScript developer" is a dying identity. "I'm a Python developer" is only slightly better. The language is a detail. The model doesn't care which language you prefer. It will generate whichever one you ask for.

Stop indexing on frontend versus backend. That distinction made sense when humans had to specialise to manage complexity. The AI doesn't need to specialise. It generates the database query and the HTML in the same breath. "Full stack" is no longer an aspiration. It is the baseline.

Learn systems thinking. Understand how data flows through a system. Understand latency. Understand failure modes. Understand caching strategies. These are the things the AI cannot intuit because they require understanding the physical constraints of networks and hardware.

But here is the twist: you must also understand the detail.

The junior developer who dismisses syntax as "something the AI handles" will be useless. You need to read the generated code. You need to verify it. You need to spot when the model has hallucinated an elegant-looking function that will fail silently in production.

The senior engineer of the future operates at two levels simultaneously. They think in systems (architecture, data flow, user needs) and they verify in details (security, edge cases, performance). They do not write much code themselves, but they understand every line the machine writes.

This is the paradox of AI-assisted development. You need to know more, not less. You need broader knowledge, not deeper specialisation in a single framework. You need to understand enough about everything to supervise a machine that can generate anything.

The machine whisperer is not someone who prompts well. Anyone can prompt. The machine whisperer is someone who knows when the machine is wrong.

The Transition

I am not naive enough to think this happens overnight. There will be resistance. There always is when comfortable paradigms get threatened.

The framework vendors will tell you that AI-generated code is unreliable. They are not wrong, but they are missing the point. The code was never the product. The system is the product. The code is a disposable implementation detail.

The bootcamps will keep teaching React because that is what they know. Their graduates will struggle to find jobs because the market is saturated with component creators and starving for system thinkers. The mismatch will be painful but corrective.

The conference speakers will debate whether frameworks are "still relevant" for another three to five years. By the time consensus emerges, the early movers will have built and shipped entire products while the debaters were still arguing about state management.

Here is what the transition actually looks like in practice:

Legacy codebases don't disappear. React applications in production will run for years. Companies don't rewrite working systems on a blog post's recommendation.

But greenfield projects are where the shift begins. Every new project is a choice. And increasingly, the choice that makes sense is: skip the framework.

Start with a Rust backend. Let AI generate the HTML. Serve it fast. Keep it simple.

The developers who make this transition early will have an enormous advantage. They will be building systems while others are still configuring build tools. They will be shipping features while others are debugging hydration issues.

The transition is not about abandoning everything you know. It is about recognising which parts of your knowledge are timeless (systems thinking, security, architecture) and which parts are temporary (framework syntax, build tool configuration, component patterns).

Keep the timeless. Let go of the temporary.

The developers who cling to frameworks will find themselves in a shrinking market. Not because frameworks are bad, but because the market for framework expertise is contracting while the market for system expertise is expanding. Supply and demand. Basic economics.

I have watched brilliant engineers spend years mastering the intricacies of React Server Components, only to see the entire paradigm become irrelevant before they could fully deploy it. That is not a failure of their intelligence. It is a failure of investment strategy. They bet on the wrong layer of the stack.

The right layer is the system layer. Always has been. We just got distracted by the shiny abstractions in the middle.

The Objections

I can hear the objections forming. I've heard them all. Let me address them directly.

"But AI Can't Replace Real Understanding"

Correct. And I never said it could.

The argument is not that AI replaces understanding. The argument is that understanding needs to operate at a different level. You don't need to understand how to write a React component. You need to understand why a React component is the wrong abstraction in the first place.

The junior developer who memorised the useEffect dependency array rules is in trouble. The senior engineer who understands when client-side state is appropriate versus server-rendered content is more valuable than ever.

Understanding is not deprecated. Syntax memorisation is.

"But Frameworks Have Tooling and Ecosystem"

Yes. Tooling designed to help humans. Ecosystems built around human limitations.

The AI does not need Storybook to visualise components in isolation. It can generate the entire page in less time than Storybook takes to boot.

The "ecosystem" argument is circular. We built tools to manage the complexity of frameworks. The frameworks exist because we couldn't manage complexity without tools. Remove the human from the loop and the entire house of cards becomes unnecessary.

"But This Only Works for Simple Sites"

Ah, the "it doesn't scale" argument. Classic.

Define "complex." A dashboard with real-time data? The AI generates it. A form with validation? The AI generates it. An interactive data visualisation? The AI generates raw SVG faster than D3.js can parse its configuration.

The things we thought were complex were only complex because humans had to write them. The actual computational problem (turn data into pixels) was never that hard. We made it hard by adding layers of abstraction optimised for human comprehension.

I am running production systems with AI-generated UIs that would have taken teams weeks to build. Not because the AI is smarter. Because it doesn't carry the baggage of framework conventions.

"But What About Team Collaboration"

This is the most interesting objection, and it has merit.

Frameworks provide shared conventions. They give teams a common vocabulary. "It's a React component" communicates a lot of information quickly.

But consider what we are actually collaborating on. We are collaborating on the system, not the syntax. The architecture. The data flow. The user experience. The business logic.

A team can collaborate on prompt templates as easily as component APIs. The shared understanding shifts from "how we structure our React code" to "how we instruct the AI to generate what we need."

The collaboration doesn't disappear. It moves up the stack.
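
Concretely, a shared prompt template can be a typed, versioned artifact like any other. A sketch of what a team might check in, with every name and house rule hypothetical:

```rust
// A prompt template as a reviewable team artifact. Changing the house
// style becomes a pull request, not a Slack argument. (Hypothetical.)
struct ViewPrompt<'a> {
    user_request: &'a str,
    schema: &'a str,
}

impl ViewPrompt<'_> {
    fn render(&self) -> String {
        format!(
            "You generate complete, semantic HTML documents.\n\
             House rules: no external JavaScript, system fonts, WCAG AA contrast.\n\
             Schema:\n{}\n\nRequest: {}",
            self.schema, self.user_request
        )
    }
}

fn main() {
    let prompt = ViewPrompt {
        user_request: "a table of unpaid invoices",
        schema: "struct Invoice { id: u64, customer: String, paid: bool }",
    };
    println!("{}", prompt.render());
}
```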

Conclusion

I am done with intermediate frameworks.

I am done waiting for npm install to download half the internet just so I can render a list of items.

The future is raw. The future is intelligent. The future is a Rust binary serving HTML generated by a model that understands the user better than I do.

It is time to stop building tools for humans to write code. It is time to start building the systems that write the code for us.

We are returning to the metal. And it feels good.

The change is already here. The question is whether you see it.

This is not a prediction. This is an observation. The death of intermediate frameworks is not something that will happen. It is something that is happening. Right now. While the industry debates whether it is real.

Now if you will excuse me, I'm off to build stuff.