It's time for modern CSS to kill the SPA

We’ve been adding mountains of JS to “feel” fast, while making everything slower.
I disagree that that is a bad thing. I’m not providing evidence here, so disagree if you wish. In my opinion, users don’t care if something is fast. They don’t take out stopwatches to time the page transitions. They care if something feels fast. They want the web page to react immediately when they issue an action, and have the impression that they’re not waiting for too long. Feeling fast is way more important than being fast, even if the feeling comes at a performance cost.
It takes about 120ms for a human to detect (not react to) a stimulus [1] (for the gamers: that’s about 8 FPS). So if your page responds to a user action within that time frame, it feels instantaneous.
If you want to try it yourself, paste this code into your browser console. Then click anywhere on the page and see if the delay of 125ms feels annoying to you.
  let isBlack = true;
  document.body.addEventListener('mousedown', () => {
    setTimeout(() => {
      document.body.style.backgroundColor = isBlack ? 'red' : 'black';
      isBlack = !isBlack;
    }, 125);
  });
I dunno. As a user, if you throw me code that takes 120ms for my CPU to execute, and tell me there is an alternative that takes 2x-3x less time, I would look for someone who provides that instead.
No, I want it to actually be fast, and to stop using unholy amounts of RAM for basic tasks. I am not pulling out a stopwatch, you are correct, but when I try to use one of my still perfectly functional but older systems, I can feel the effects of all this JavaScript bullshit.
Maybe I missed it, but how do I share state between page transitions? For instance, I’m currently displaying a user list. How do I carry the user over to the details page without re-fetching it, and more interestingly, without re-instantiating the User instance from the data?
I imagine (though I’m the first to admit that I don’t know every modern API by heart) I would have to move the user to local storage before allowing the browser to navigate. That sounds annoying, since I’m either persisting the whole user list, or I need to check which link the user clicked, prevent the navigation, store the relevant user, and then navigate manually.
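Something like this rough sketch is what I imagine I’d have to write (the selector, storage key, and data attribute are all invented for illustration):

  // Hypothetical: intercept the click, stash only the clicked user, then navigate.
  document.querySelectorAll('a.user-link').forEach((link) => {
    link.addEventListener('click', (event) => {
      event.preventDefault(); // stop the normal navigation
      const user = users.find((u) => String(u.id) === link.dataset.userId);
      sessionStorage.setItem('selectedUser', JSON.stringify(user));
      location.href = link.href; // now navigate manually
    });
  });

  // On the details page: parse it back, but any methods my User class had are gone.
  const user = new User(JSON.parse(sessionStorage.getItem('selectedUser')));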
With an SPA the user list just lives in a JS variable. When I’m on the user’s details page, I just find the relevant user in that list without even making another HTTP request.
How do I carry the user over to the details page without re-fetching it
why are you making this restriction?
without re-instantiating the User instance from the data
same here, why are you making this restriction? You're arbitrarily limiting your design to make sure that an SPA is the right choice. If you designed with a static page in mind, re-retrieving the user list would be instantaneous. You'd have the list partially cached or fully cached on the server and a retrieval would be instant, faster than any SPA would switch contexts. Why does re-instantiating the User matter at all? Don't store the entire state in the browser and you don't need to reinstantiate so much.
Using “users” both for the displayed data and for the person visiting the site gets confusing, so I’m calling the viewed data “customers” for this discussion.
How do I carry the [customer] over to the details page without re-fetching it
why are you making this restriction?
Because re-fetching data that the client already has is wasteful.
re-retrieving the [customer] list would be instantaneous
Nothing that goes over the wire is ever instantaneous. Even if I hit the cache, I’d still have a round-trip to confirm it’s still fresh.
faster than any SPA would switch contexts
For the apps I develop, latency is usually about 20ms. So you’re assuming that (given 1 billion instructions per second, which is on the very low end for a modern CPU) an SPA would need more than 20 million instructions to switch context?
Why does re-instantiating the [customer] matter at all?
Because it is the frontend’s responsibility to display data, not the backend’s. The backend will, for instance, only store the customer’s birthday, but users might be interested in their age. It is not the backend’s responsibility to mangle the data and swap out the birthday for their age.
This is why my customers aren’t just data objects, but have methods to get, for instance, their age. They might also have a method to check how much they should pay for a product, given their other data.
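A minimal sketch of what I mean (field names invented for illustration):

  class Customer {
    constructor(data) {
      this.id = data.id;
      this.birthday = new Date(data.birthday); // the backend only stores this
      this.discountRate = data.discountRate;
    }

    // Derived on the frontend; the backend never sends an 'age' field.
    get age() {
      const ms = Date.now() - this.birthday.getTime();
      return Math.floor(ms / (365.25 * 24 * 60 * 60 * 1000));
    }

    // What this customer pays for a product, given their other data.
    priceFor(product) {
      return product.basePrice * (1 - this.discountRate);
    }
  }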
If I weren’t writing an SPA, then showing the expected cost for buying a product would require displaying a form (that was always there, but hidden via CSS) and having the user enter the product. Then they’d send off that form (refreshing the page in the process, which means downloading 90% unchanged HTML again for no reason). This refresh cannot even be sensibly cached or prefetched, because there are over 200 products to choose from. Confirming the booking would refresh the page again (though this time prefetching is an option).
If the user wants to go back to the customer list, pick a different customer, and do the same process again, we’re at 4 more requests that only download data the client should already have.
Also notice that the backend had to render a <select> with 200 options just in case the user wanted to book a product. How would you facilitate searching this on a static page? Refresh the page again each time the user presses a button?

Compare this to an SPA:
The client downloads the instructions on how to draw the customer list, and the customer data. Then the client downloads the instructions on how to draw the customer details (without all the forms, because most of them won’t be needed, which saves bandwidth).
Then the user opens the ‘buy product’ form, which triggers two requests: one for the instructions on how to render the form, and one for the product list. The client can then compute the price itself (using the smart customer object).
If the user confirms, all that needs to be sent is a tiny “book this” message. The server doesn’t even need to respond with data because the client can compute all the changes itself.
If the user wants to do this process on another customer, or 100 customers, then it only needs to re-send the ‘book this’ messages for each customer, because all the rest is cached.
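Concretely, the ‘book this’ message could be as small as this sketch (endpoint and payload invented for illustration):

  // Fire off the tiny booking request; the client already knows what changed.
  async function book(customer, product) {
    await fetch('/api/bookings', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ customerId: customer.id, productId: product.id }),
    });
    // No response body needed: the client can compute the new state itself,
    // e.g. via customer.priceFor(product).
  }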
The customer list is a table with 2000 entries, sortable by 10 different columns. How would sorting work? Because the client doesn’t have access to the ‘raw’ data, only the HTML, it couldn’t sort that itself. Each sorting operation would need to be a request to the server and a re-render of the whole page. Do you expect the client to pre-load all 20-factorial sorting schemes?
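Whereas in the SPA, any of those orderings is one in-memory array sort away; a rough sketch (renderTable stands in for whatever redraws the table):

  // Sort the in-memory customer list by an arbitrary column; no server round-trip.
  function sortCustomers(customers, column, ascending = true) {
    const sorted = [...customers].sort((a, b) =>
      a[column] < b[column] ? -1 : a[column] > b[column] ? 1 : 0
    );
    return ascending ? sorted : sorted.reverse();
  }

  renderTable(sortCustomers(customers, 'age', false)); // hypothetical redraw helper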
I get what you are asking for but I don't think it is even necessary to have a list of users on the client in the first place using the MPA approach. I would guess the list of users is just available to the server, which pre-renders HTML out of it and provides it to the client.
So we’re back to fully static pages and rendering HTML on the server? That sounds like a terrible idea. I don’t want to preload 10 different pages (for opening various filtering forms, creation forms, more pages of the user list, different lengths of the user list, different orderings of the list, and all combinations of the above) just in case a user needs one of them, which they mostly don’t.
In your case the user list would be rendered by the server and the client wouldn’t care; it would receive a newly rendered copy when you changed pages.
It seems like their argument was all just sites that should have been fully static to begin with, and for some reason they’ve made it sound like that’s the main use of SPAs. It’s a silly article and I wouldn’t change anything I’m doing based on it. If your site is a content-based site (showing docs/articles/etc.) then you shouldn’t be using an SPA, especially just for page transitions. Otherwise you have a valid use for an SPA and should continue regardless of these new APIs.
If only "fullstack" or "typescript" devs weren't so scared of CSS. They can optimize a weird join, and they know the Big O notation of a function that operates on a list that'll never ever exceed a size of like 1000 items (so who the fuck cares), but as soon as you ask them about grid, layers, container queries, or even what things like Houdini can hopefully do one day, they just collectively shit themselves.
What corporations and enterprise software development practices have done to the web makes me crash out, sorry. Very few people who actually care about CSS are making a living wage making things with it as their job.
This would've been a much more exciting article if it were actually supported across the web and not just in Chromium...
It's already on Safari and Firefox Nightly too.
What would happen on unsupported browsers?
Hopefully the page would load just as well, but the transition would be less smooth.
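Assuming the article means cross-document view transitions, the opt-in is a single CSS at-rule, and browsers ignore at-rules they don't recognize, so unsupported browsers just navigate normally:

  /* Opt the page into animated cross-document navigation.
     Unsupported browsers skip this rule; the page still loads fine. */
  @view-transition {
    navigation: auto;
  }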
If that allows the website to render fast, without JS, on all browsers then it's more than worth it.
Yes, Firefox users would just miss the transitions until it's added to the browser. It's absolutely worth it, as JavaScript framework use has gotten out of hand, as the article says...