It’s time for modern CSS to kill the SPA
24th July, 2025

Native CSS transitions have quietly killed the strongest argument for client-side routing. Yet people keep building terrible apps instead of performant websites.
The app-like fallacy
“Make it feel like an app.”
At some point during the scoping process, someone says the words. A CMO. A digital lead. A brand manager. And with that single phrase, the architecture is locked in: it’ll be an SPA. Probably React. Maybe Vue. Almost certainly deployed on Vercel or Netlify, bundled with a headless CMS and a GraphQL API for good measure.
But the decision wasn’t really about architecture. It wasn’t even about performance, scalability, or content management. It was about interactions. About how the site would feel when you click around.
The assumption was simple: Seamless navigation requires us to build an app.
That assumption is now obsolete.
The false promise of SPAs
The reason SPAs became the default wasn’t because they were better. It was because, for a while, they were the only way to deliver something that felt fluid – something that didn’t flash white between pages or jank the scroll position.
But here’s the uncomfortable truth: most SPAs don’t actually deliver the polish they promise.
What you usually get is:
- A page transition that looks smooth, until you realise it’s just fading between two loading states
- Broken scroll restoration
- Inconsistent focus behaviour
- Delayed navigation while scripts rehydrate components
- Layout shift, content popping, or full-page skeletons
- A performance hit that’s entirely disproportionate to the effect
This isn’t theoretical. Look at most sites built with Next.js, Gatsby, or Nuxt. They’re shipping hundreds of kilobytes (often megabytes) of JavaScript just to fake native navigation. Routing logic, hydration code, loading spinners – all just to stitch together something that browsers already knew how to do natively.
Instead of smoothness, you get simulation. And instead of a fast, stable, SEO-friendly experience, you get a heavy JavaScript machine trying to recreate the native behaviour we threw away.
We’ve been adding mountains of JS to “feel” fast, while making everything slower.
An aside – I went deeper on this in JavaScript broke the web, where I outlined how our obsession with JS-first development is actively eroding the web’s foundations.
The web grew up
While we were busy reinventing navigation in JavaScript, the platform quietly solved the problem.
Modern browsers – Chromium-based ones like Chrome and Edge, and recent versions of Safari – now support native, declarative page transitions. With the View Transitions API, you can animate between two documents – including full page navigations – without needing a single line of JavaScript. (Firefox support is still in progress at the time of writing.)
Yes, really.
What we’re calling “modern CSS” here is shorthand for View Transitions, Speculation Rules, and a return to native browser features that were always designed to handle navigation, interaction, and layout. These capabilities let us build rich, seamless experiences – without rewriting the browser in JavaScript.
An aside – CSS is also declarative, resilient, expressive, scalable, and increasingly intuitive. It’s accessible to anyone who can write plain HTML. And that structural clarity reinforces everything I argued in Why semantic HTML still matters – that clean, meaningful markup is the bedrock of performance, maintainability, and machine readability.
That means you can:
- Fade between pages
- Animate shared elements (e.g. thumbnails → product detail)
- Maintain persistent elements like headers or navbars
- Do it all with real URLs, real page loads, and no JS routing hacks
Let’s make this concrete.
🔄 Basic cross-page fade transition
With just a few lines of CSS, you can trigger smooth visual transitions between pages.
On both the current and destination page, add:
@view-transition {
  navigation: auto;
}

::view-transition-old(root) {
  animation: fade-out 0.3s ease both;
}

::view-transition-new(root) {
  animation: fade-in 0.3s ease both;
}

@keyframes fade-out {
  to { opacity: 0; }
}

@keyframes fade-in {
  from { opacity: 0; }
}
That’s it. The browser handles the transition – no client-side routing, no hydration, no loading spinners.
🔁 Shared element transitions
Want to animate a thumbnail image into its full-size product counterpart on the next page?
No JavaScript needed – just assign the same view-transition-name to the element on both pages:
On the product listing page:
<a href="/product/red-shoes">
  <img src="/images/red-shoes-thumb.jpg" alt="Red shoes" style="view-transition-name: product-image;" />
</a>
On the product detail page:
<img src="/images/red-shoes-large.jpg" alt="Red shoes" style="view-transition-name: product-image;" />
The browser matches and animates the elements between navigations. You can animate position, scale, opacity, layout – all with CSS.
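The same mechanism can keep persistent chrome – headers, navbars – visually stable across navigations. A minimal sketch (the `site-header` name and class are illustrative, not from a specific site): give the element the same view-transition-name on every page so the browser snapshots it as its own group, then suppress that group’s animation so it never appears to move:

```css
/* On every page: isolate the header from the root cross-fade */
.site-header {
  view-transition-name: site-header;
}

/* Keep it visually static while the rest of the page transitions */
::view-transition-group(site-header) {
  animation: none;
}
```

Because the old and new header snapshots are identical, the element simply appears to persist – no client-side routing required.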
🤖 But what if I need JS-driven transitions?
You can manually trigger transitions inside a page too:
document.startViewTransition(() => {
  document.body.classList.toggle('dark-mode');
});
Perfect for things like tab toggles or theme switches – without needing a framework or hydration layer.
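Since not every browser ships the API yet, it’s worth wrapping the call in a feature check – the DOM update still runs either way, just without the animation. A minimal sketch (the helper name is ours, not a platform API):

```javascript
// Run a DOM update inside a view transition when supported,
// or apply it instantly otherwise.
function withViewTransition(update) {
  if (typeof document === 'undefined' ||
      typeof document.startViewTransition !== 'function') {
    update(); // graceful fallback: same end state, no animation
    return null;
  }
  return document.startViewTransition(update);
}

// Usage (in the browser):
// withViewTransition(() => document.body.classList.toggle('dark-mode'));
```

This is progressive enhancement in one function: unsupported browsers get an instant state change instead of an error.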
🔮 Speculation rules: instant navigation without JS
View Transitions make things smooth. But what about fast?
That’s where Speculation Rules come in. They let the browser prefetch or prerender full pages based on user behaviour – like hovering over or touching a link – before the user even clicks.
<script type="speculationrules">
{
  "prerender": [
    {
      "where": {
        "selector_matches": "a"
      }
    }
  ]
}
</script>
The result? Navigation that’s instant. No waiting. No loading. No spinners.
⚠️ A note of caution
Speculation Rules are a performance multiplier. On a lean site, they make things feel instant. But if your pages are slow, bloated, or JS-heavy, speculation just front-loads those costs.
If your site is bloated, speculation will still speculate – and the user pays the price.
That means wasted CPU, network bandwidth, and mobile battery – often for pages the user never even visits.
Use them carefully. On a fast site, they’re magic. On a slow one, they’re a trap.
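One way to stay on the safe side is to speculate less eagerly. A hedged sketch using the standard Speculation Rules syntax – the `.no-prerender` opt-out class is our own convention, not part of the spec: prefetch cheaply across the site, but only prerender on a stronger intent signal (the `"moderate"` eagerness roughly means a sustained hover):

```html
<script type="speculationrules">
{
  "prefetch": [
    {
      "where": { "href_matches": "/*" },
      "eagerness": "moderate"
    }
  ],
  "prerender": [
    {
      "where": {
        "and": [
          { "href_matches": "/*" },
          { "not": { "selector_matches": ".no-prerender" } }
        ]
      },
      "eagerness": "moderate"
    }
  ]
}
</script>
```

This keeps the instant-navigation feel while avoiding speculative loads of pages the user was never going to visit.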
Browsers want to help – if we let them
Modern browsers are smarter than ever. They’re constantly looking for ways to improve speed, responsiveness, and efficiency – but only if we let them.
One of the clearest examples is the Back/Forward Cache (bfcache), which allows entire pages to be snapshotted and restored instantly when users navigate back or forward.
It’s effectively free performance – but only for pages that behave. That means no rogue JavaScript, no intercepted navigation, no lifecycle chaos. Just clean, declarative architecture. Just HTML and CSS.
Unsurprisingly, this plays beautifully with a well-structured, multi-page site. But for most SPAs, it’s a non-starter. The very design patterns that define them – hijacked routing, client-side rendering, complex state management – break the assumptions that bfcache relies on.
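You can observe bfcache behaviour from JavaScript via the pageshow event: event.persisted is true when the page was restored from the cache rather than loaded fresh. A small sketch, useful for refreshing anything time-sensitive after a restore (the function name is ours):

```javascript
// pageshow fires on every navigation to a page; event.persisted is
// true only when the page was restored from the bfcache.
function classifyPageShow(event) {
  if (event.persisted) {
    // Restored from bfcache: refresh anything stale here,
    // e.g. a cart count or a session indicator.
    return 'bfcache-restore';
  }
  return 'fresh-load';
}

// In the browser:
// window.addEventListener('pageshow', (e) => classifyPageShow(e));
```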
This is a microcosm of a much bigger theme: browsers are evolving to reward simplicity and resilience. They’re building for the kind of web we should have been embracing all along. And SPAs are increasingly the odd ones out.
📊 SPA vs MPA: a performance reality check
Average Next.js marketing site
- JS bundle: 1–3 MB
- Time to Interactive (TTI): ~3.5–5s (depending on hydration strategy)
- Route transitions: simulated
- SEO: complex, fragile
- Scroll/anchor behaviour: unreliable
Modern MPA + View Transitions + Speculation Rules
- JS bundle: 0 KB (optional enhancements only)
- TTI: ~1s
- Route transitions: real, native
- SEO: trivial
- Scroll/focus/history: browser-default and perfect
Modern CSS doesn’t just replace SPA behaviour – it outperforms it.
Don’t build a website like it’s an app
Most websites aren’t apps.
They don’t need shared state. They don’t need client-side routing. They don’t need interactive components on every screen. But somewhere along the way, we stopped making the distinction.
Now we’re building ecommerce stores, documentation portals, marketing sites, and blogs using stacks designed for real-time collaborative UIs. It’s madness.
A homepage with six content blocks and a contact form doesn’t need hydration, suspense boundaries, and a rendering strategy.
It needs fast markup, clean URLs, and maybe – maybe – a bit of interactivity layered on top.
And yet, on every project:
- A stakeholder says, “make it feel like an app.”
- A dev team reaches for Next.js or Nuxt.
- Routing goes client-side.
- Performance falls off a cliff.
- Now you need edge functions, streaming, ISR, loading strategies, and a debugging plan.
- And somehow… it still feels slower than a regular link click and a CSS animation.
This isn’t about being anti-framework. It’s about being intentional.
Use React if you want. Use Tailwind, Vite, whatever. Just don’t ship it all to the browser unless you need to.
Build a site like a site. Use HTML. Use navigation. Use the platform.
It’s faster, simpler, and better for everyone.
Build for the web we have
SPAs were a clever solution to a temporary limitation. But that limitation no longer exists.
We now have:
- Native, declarative transitions between real pages
- Instantaneous prerendered navigation via Speculation Rules
- Graceful degradation
- Clean markup, fast loads, and real URLs
- A platform that wants to help – if we let it
If you’re still building your site as an SPA for the sake of “smoothness,” you’re solving a problem the browser already fixed – and you’re paying for it in complexity, performance, and maintainability.
Use modern server rendering. Use actual pages. Animate with CSS. Preload with intent. Ship less JavaScript.
Build like it’s 2025 – not like you’re trapped in a 2018 demo of Gatsby.
You’ll end up with faster sites, happier users, and fewer regrets.




Great article. Thanks for the alternative perspective.
It depends a lot on how the site is made. 10 years ago I made page transitions in JavaScript (optiosoft.ru), and the PageSpeed Insights test still shows 95% performance – all loading-speed indicators are excellent. But it’s good that a native API has appeared; it will be much more reliable and even faster.
I whole-heartedly agree with this take and I think more of us should make waves on this subject. We may just have a chance of nudging decision makers in the right direction, and fellow practitioners, consultants and advisers might also capitalize on this argument and push things where they should be: towards more simplicity and usage of a platform as it’s intended (and widely supported, natively).
This is a really interesting article. My toy project (a live demo at TinyApp.LucasCekan.com) actually uses a hybrid approach that touches on this debate.
The registration process is a traditional multi-page application (MPA) flow, but the login happens entirely on the current page, giving it a seamless, SPA-like feel. This was done without a heavy framework like Next.js, so the overhead is minimal.
This experience leads me to agree with your criticism of the heavy frameworks, but not of the SPA experience itself. For me, an “SPA” isn’t a framework – it’s that excellent, seamless UI/UX feel.
Reading this has made me realize that my own app could be improved. It doesn’t have good transitions for either the MPA or SPA-style components. The View Transitions API you mentioned seems like the perfect tool to enhance that hybrid experience, applying it to both sides of the app.
It seems the ideal isn’t a strict choice between MPA and SPA, but rather using the right technique for the right feature, focusing on the user experience first.
Awesome article! I love that every other sentence makes me feel more proud that I build websites with Astro.
I have been developing business applications for over 20 years using web technologies (and the almost ubiquity of the web browser) for the user interface. I have seen many trends come and go but held fast in the belief that, behind the hype, web standards held the key to success. In the last couple of years I have been encouraged by the rise in interest of “returning to the standards”.
One of my current projects is trialling HTMX (that shares some similarity with the Datastar approach Bob mentioned) to great success. Also in recent years HTML and CSS have become far more capable and continue to do so. We are also beginning to see the futility of using JS for everything everywhere.
JS is my favourite language and I like using React but they are not always the right tools, in fact increasingly I am coming to realise they are all too often the wrong choices.
For personal projects I use my own html-form[1], which is more bare-metal. It doesn’t require hydration like HTMX does (and maybe Datastar?). It’s focused more on progressive enhancement than on the SPA model.
I don’t know if it’s production-proof, but you can get a long way with it. I use vanilla HTML for triggers, or a web component if needed. It encourages simplicity while giving the user a better experience when a full page update isn’t needed.
[1]: https://github.com/jon49/htmf
This vision is great, but a bit too early for me.
If someone asks you to add a chat/Intercom widget to your website, you don’t want that widget to reset on every navigation. I’ve seen a nice MPA docs site where the support widget closed on every link click – a really bad UX.
This is just one example, but until these things are completely solved, I still think an SPA is a safer choice.
Looking forward to this “proposal” to prove me wrong in the future: https://github.com/explainers-by-googlers/companion-windows
Hey great article! What about if your intention was to wrap a PWA for the app stores? It’s my understanding that SPA was the way to go. Would love your input as I’m trying to be an intentional dev.
> Modern browsers – specifically Chromium-based ones like Chrome and Edge – now support native, declarative page transitions. With the View Transitions API, you can animate between two documents – including full page navigations – without needing a single line of JavaScript.
I believe web devs should support Firefox on principle, and very few developers can afford to just totally ignore Safari like you propose to.
You can’t in good faith write an entire article on how the native web platform in 2025 has all the things SPAs provide and not explicitly call out that Safari doesn’t support one of the two APIs you propose to replace SPAs with and only supports the other in the most recent version (unavailable to many on older devices).
I’m excited for the web of 2030 when these APIs are both available across all browsers. In the meantime, what you’re pushing for is for web devs to only pay attention to Chrome’s support for these features, which is an attitude that will kill the open web in the long run.
I want to try these new features, and I never used Next.js extensively, but I need to highlight that it originally got popular because it was fast at rendering complex sites. It wasn’t created by a tech behemoth, so it had to win adoption on performance, not advertisement. Maybe things have changed since, but you don’t really describe what has changed. Of course pure HTML is faster, but how many complex sites thrive without JS? And modern React frameworks leveraging RSC are again much closer to sending raw HTML while also hydrating delayed dynamic content. I don’t even like Next.js/React that much, but Next.js is literally doing server rendering, so I think you’re conflating things. Anyway, it’s cool to see the tips, thank you – I just wish there were fewer clickbait blanket statements in this.
What you’re highlighting is how broken modern apps are, and that’s a frontend dev skill problem.
In 2008 we had extremely fast sites that judiciously used API calls or did full page navigation.
The answer isn’t a 100% dumb frontend, nor is it 100% SPA. Both are the positions of devs who haven’t been around the block enough times. (And while things like Next.js are impressive, page performance peaked in the era of Backbone and Handlebars… React was never a performance solution; it was a developer-productivity solution.)
Monochromatic reasoning is a problem.
While I agree that browsers are better, and css is *much* better than it was in those days, the answer is not getting religion about a single-source approach.
The answer is better devs who are willing to put in the intellectual time and work to build a solution that meets the needs of the project.
Sounds good, but at this point so many companies have invested in React, Angular, etc.
I think we need more demos of server-side rendering and regular HTML. It’s 2025 and we still need app behaviour in forms – decision-tree-type behaviours.
Thanks for the interesting blog post, lots of useful information here. I’m wondering if you can provide a little bit more information about “Maintain persistent elements like headers or navbars”. How do we get persistent nav bars and headers? Does the browser just automatically do this if it sees that the HTML is exactly the same?
You can configure which elements your view transitions target (or don’t), and so, e.g., exclude a header navigation bar whilst transitioning only the body content.
If you are looking for more information on keeping nav bars and headers persistent, you might find this useful: https://vtbag.dev/fwvt/morphing/#exempt-elements-from-animations
I absolutely loved this and will consider it for my next app. I’ve been using React for a while and am sick of it. I think this also does away with the two-way binding between form inputs and the component useState members.
I remember when web development used to be enjoyable… and then SPAs reared their ugly heads, sucking the joy out of front-end work with their preposterously overwrought frameworks.
I cannot wait to share this article with the next person who dares to utter, “We should use for our next CRUD project!”
The article could also mention that SPAs require more effort in terms of web accessibility, and I’m not just talking about scroll restoration. When a browser loads a new page, it announces it to screen readers. With an SPA, people will often need to implement a live region and an autofocus on the page’s h1.
Absolutely! I touch on this in https://www.jonoalderson.com/conjecture/why-semantic-html-still-matters/ and https://www.jonoalderson.com/conjecture/javascript-broke-the-web-and-called-it-progress/.
Hi Jono, 👋
Motivating article, and definitely true for many websites. Thanks for spreading the word about how the View Transition API will change the way we build websites, and how users will tell old-school sites from modern ones in the future.
I understand that you included the first code snippet to show how easy it is to enable cross-document view transitions and style the pseudo-elements. For an example likely to be copied by people new to the API, you might consider
@view-transition {
  navigation: auto;
}

::view-transition-group(*) {
  animation-duration: 0.3s;
}
Not only is it shorter but, more importantly, it wouldn’t override the API’s default styling (inheritance, isolation) so carelessly.😏
Ah, great example! Thanks!
What would be a best practice prompt for Claude Code to use this modern MPA design pattern?
“use WordPress” 😅
Probably not many code examples scooped up by LLMs yet, so vibe coding might be out.
You can always provide a context where you describe how to use these new APIs.
I created a set of Github Copilot instructions for this after reading Jono’s post.
Tested it out today as have been frequently frustrated that Github Copilot with Sonnet 4 keeps going waaaaaay off track back to SPA style coding even on old PHP or HTML files. This ruleset seems to have done the trick so far 🙂
https://github.com/nickrenwick/mpa-first-guidelines
I absolutely agree, to a point. The amount of technical debt some projects accrue is insane. Interested to check out these things. I don’t think we’re out of the framework woods yet, but with the advances in JS, CSS, and now native browser behaviour, I’m optimistic we’ll return to a cleaner, more sensible DX.
This was a fantastic read Jono – great article and very insightful indeed.
I’m listening!!!
Well said.
Recently I’ve been thinking of the web browser as an operating system with native primitives and some pretty sensible defaults.
Wow! An essay from Jono I actually like and agree with! Congrats!
There’s an old fable about a man going on a journey (to SomeWhere), and as he meets travelers he gets steered to side stops and cute towns and scenic routes. The travel wears him out before he gets anywhere near SomeWhere. Every distraction was rationally justified… even with some saying that staying on the standard route was like wasting the trip, missing all of the charm by sticking to the sterile highways.
Sometimes… often?… the actual planned destination was the point of the entire journey. Without extreme stamina and extra resources, the divergent traveler ends up missing the whole purpose.
I called Edge SEO “Wedge SEO” because of the way devs sidelined knowledgeable SEO people by driving a wedge between those hoping to pursue new goals (which then demanded Edge techniques as a means of SEO survival) and those chasing the original goals… which were business goals. Merit-based SEO chasing performance got sidelined for “modern” solutions that cost a lot more time and resources, with complexity, and yet… didn’t often perform well.
With complexity hidden in the edge cloud tech, further compounding problems.
I want to believe! But Firefox support isn’t 100% yet:
View Transition API – Web APIs | MDN https://share.google/lHz2xma4MTlhMaTeQ
This is amazing. I really love the idea of defaulting to native browser features until there’s a real need for something more advanced.
Frameworks are great for developer experience — no doubt. But too often, they push you into shipping JS-heavy solutions that simulate what the browser already does natively.
Are there frameworks that offer the same great DX, but lean into these new native capabilities — things like View Transitions and Speculation Rules — and handle it all at build time? And ready for production and heavy traffic?
You should try Astro! It has support for View Transitions (even fallback support for Firefox), Speculation Rules (called client prerendering, with fallback support for Firefox and Safari) and a whole lot more!
https://astro.build
Exactly what I was thinking. Astro just got even more irresistible!
It seems to me that libraries like HTMX and Datastar follow the approach expressed in this article. You might want to take a look at them. Regards.
Oh, and I’ve already seen people complaining that those libraries are not used in real, production sites 🙂
Sorry, that was mildly rude to leave only that prior comment without more context. In general — I appreciate your advocacy for _just using the modern web because it’s awesome_. Thanks!
Just commenting in jest, but command-clicking the link to your “JavaScript broke the web (and called it progress)” article caused _this page_ to navigate alongside the standard “open in new tab” behavior. You may have broken the web.
Thanks – fixed! Ironic stupid JavaScript bug 😊
You almost sound like you’re anticipating Datastar.