> Traditional frameworks hydrate entire pages with JavaScript. Even if you've got a simple blog post with one interactive widget, the whole page gets the JavaScript treatment. Astro flips this on its head. Your pages are static HTML by default, and only the bits that need interactivity become JavaScript "islands."
Back in my day we called this "progressive enhancement" (or even just "web pages"), and it was basically the only way we built websites with a bit of dynamic behavior. Then SPAs were "invented", and the "progressive enhancement" movement became something fewer and fewer people practiced.
Now it seems that is called JavaScript islands, but it's actually just good ol' web pages :) What is old is new again.
Bit of history for the new webdevs: https://en.wikipedia.org/wiki/Progressive_enhancement
You're making the opposite mistake: you're seeing someone's description of a tool's feature and, because it kinda sounds similar, confusing it with the way we've already done things, without even checking whether the tool is transformative.
Astro's main value prop is that it integrates with JS frameworks, lets them handle subtrees of the HTML, renders their initial state as a string, and then hydrates them on the client with data preloaded on the server.
TFA is trying to explain that value to someone who wants to use React/Svelte/Solid/Vue but only on a subset of their page while also preloading the data on the server.
It's not necessarily progressive enhancement because the HTML that loads before JS hydration doesn't need to work at all. It just matches the initial state of the JS once it hydrates. e.g. The <form> probably doesn't work at all without the JS that takes it over.
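For the curious, a minimal sketch of what that looks like in Astro (component name, file paths, and data are made up; `client:load` is one of Astro's real hydration directives):

```astro
---
// src/pages/index.astro — everything in this frontmatter runs on the server
import LikeButton from '../components/LikeButton.jsx';
const post = { title: 'Hello', likes: 3 };
---
<h1>{post.title}</h1>
<p>This heading and paragraph ship as plain HTML with zero JS.</p>

<!-- Only this subtree becomes an island: Astro renders its initial
     state to a string on the server, then ships and hydrates its JS
     on the client, with the server-loaded `likes` passed as a prop. -->
<LikeButton client:load likes={post.likes} />
```

`client:idle` and `client:visible` are alternative directives that defer the island's JS even further.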
These are the kind of details you miss when you're chomping at the bit to be cynical instead of curious.
>e.g. The <form> probably doesn't work at all without the JS that takes it over.
What a value!
I guess I may be chomping at the bit to be cynical, but I have quite a bit of experience in these fields, and I don't think Astro sounds especially transformative.
What is the value in first sending dysfunctional HTML and then fixing it with JS that executes later? If you're going to do that, you might as well do 100% JS. It would probably simplify things in the framework.
Sending functional HTML, and then only doing dynamic things dynamically, that's where the value is for web _apps_. So if what you point out is the value proposition for Astro, then I am not getting it, and don't see its value.
Dysfunctional HTML in the sense that interactivity is disabled, but visually it is rendered and therefore you can see the proper webpage immediately.
Just compare the two cases, assuming 100ms for the initial HTML loading and 200ms for JS loading and processing.
With full JS, you don't see anything for 300ms, the form does not exist (300ms is a noticeable delay for the human eye).
With frameworks such as Astro, after 100ms you already see the form. By the time you move the mouse and/or start interacting with it, the JS will probably be ready (because 200ms is almost instant in the context of starting an interaction).
This is not new at all, old-school server-side processing always did this. The advantage is writing the component only once, in one framework (React/Vue/whatever). The server-client passage is transparent to the developer, and that wasn't the case at all in old-school frameworks.
Note that I'm not saying this is good! But this is the value proposition of Astro and similar frameworks: transparent server-client context switching, with better performance as perceived by the user.
I'm sorry to go off-topic, but since when has hydrating web pages with JavaScript been a thing? First time I've heard the term.
Edit: according to WP history, around December 2020
https://en.wikipedia.org/w/index.php?title=Hydration_(web_de...
Second edit: "Streaming server rendering", "Progressive rehydration", "Partial rehydration", "Trisomorphic rendering"... Seems I woke up in a different universe today.
I didn't read their comment as particularly cynical, and at a high level they're still correct, but so are you.
I think your comment gets at a very specific and subtle nuance that is worth mentioning, namely that typically, if you were a progressive-enhancement purist, you'd have a fallback that did work; a form that submitted normally, a raw table that could be turned into an interactive graph, etc.
I don't think these details are mutually exclusive though, and it was certainly valid in those days to add something that didn't have a non-JS default rendering mode; it was just discouraged from being in the critical path. Early fancy "engineered" webapps like Flipboard got roasted for poorly re-implementing their core text content on top of canvas so they could reach 60fps scrolling, but if JS didn't work their content wasn't there, and they threw out a bunch of accessibility stuff that you'd otherwise get for free.
Now that I'm thinking back, it's hard to recall situations *at that time* where there would be both something you couldn't do without JavaScript and that couldn't also have a more raw/primitive version, but one example that comes to mind and would still be current are long-form explanations of concepts that can be visualized, such as those that make HN front page periodically. You would not tightly couple the raw text content to the visualization, you would enhance the content with an interactive visual, but not necessarily have fallback for it, and this would still be progressive enhancement.
Here's another good example from that time, which is actually only somewhat forward compatible (doesn't render on my Android), but the explanation still renders https://www.ibiblio.org/e-notes/webgl/gpu/fluid.htm
> you're chomping at the bit to be cynical instead of curious.
Nicely sums up a lot of interactions these days
Agreed! Astro is fantastic, but the biggest barrier I had in learning it was getting my head around the range of terms that developers who entered the workplace after 2010 have invented to describe "how the web works".
I really appreciate that innovation can sometimes reinvent things that already exist, out of pure ignorance and sometimes hubris. I've seen people make mountains out of molehills that take on a lore of their own, and then along comes someone new who doesn't know, or is a little too sure of themselves, and suddenly the problem is solved with something seemingly obvious.
I’m not sure javascript islands is that but I appreciate a new approach to an old pattern.
I agree that it's not a new concept by itself, but the way it's being done is much more elegant in my opinion.
I originally started as a web developer back when PHP+jQuery was the state of the art for interactive webpages, shortly before React and SPAs became a thing.
Looking back at it now, architecturally, the original approach was nicer; however, DX used to be horrible at the time. Remember trying to debug PHP on the frontend? I wouldn't want to go back to that. SPAs have their place, especially in customer dashboards or heavily interactive applications, but Astro finds a very nice balance: having your server and client code in one codebase, being able to define which is which, and not having to parse your data from whatever PHP is doing into your JavaScript code is a huge DX improvement.
> Remember trying to debug with PHP on the frontend? I wouldn’t want to go back to that.
I do remember that, all too well. Countless hours spent with templates in Symfony, or dealing with Zend Framework and all that jazz...
But as far as I remember, the issue was debuggability and testing of the templates themselves, which was easily managed by moving functionality out of the templates (lots of people put lots of logic into templates...) and then putting that behavior under unit tests. Basically being better at enforcing proper MVC split was the way to solve that.
The DX wasn't horrible outside of that, even early 2010s which was when I was dealing with that for the first time.
The main difference is as simple as modern web pages having on average far more interactivity.
More and more logic moved to the client (and JS) to handle the additional interactivity, creating new frameworks to solve the increasing problems.
At some point, the bottleneck became the context switching and data passing between the server and the client.
SPAs and tools like Astro propose themselves as a way to improve DX in this context, either by creating complete separation between the two worlds (SPAs) or by making the boundary transparent (Astro).
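A no-framework sketch of the server half of that handoff (all names hypothetical): render the component's initial state to an HTML string, and inline the same data for the client script to hydrate from instead of refetching it.

```javascript
// Hypothetical server-side render of a counter "island": the HTML matches
// the component's initial render, and the props are serialized alongside it
// so the client-side hydration code picks up exactly where the server left off.
function renderCounter({ count }) {
  return `<div id="counter"><button>Clicked ${count} times</button></div>`;
}

function renderPage(props) {
  return [
    renderCounter(props),
    // The client script reads this JSON instead of making another request.
    `<script type="application/json" id="counter-props">${JSON.stringify(props)}</script>`,
  ].join('\n');
}

console.log(renderPage({ count: 3 }));
```

The "transparent" frameworks automate exactly this serialization and the matching client-side pickup, so the developer never writes the boundary by hand.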
Well, that's a way to manage server-side logic, but your progressively-enhanced client-side logic (i.e. JS) still wasn't necessarily easy to debug, let alone being able to write unit tests for them.
> but your progressively-enhanced client-side logic (i.e. JS) still wasn't necessarily easy to debug, let alone being able to write unit tests for them
True, don't remember doing much unit testing of JavaScript at that point. Even with BackboneJS and RequireJS, manual testing was pretty much the approach, trying to make it easy to repeat things with temporary dev UIs and such, that were commented out before deploying (FTPing). Probably not until AngularJS v1 came around, with Miško spreading the gospel of testing for frontend applications, together with Karma and PhantomJS, did it feel like the ecosystem started to pick up on testing.
There wasn't as much JS to test. I built a progressively-enhanced SQLite GUI not too long ago to refresh my memory on the methodology, and I wound up with 50-ish lines of JS not counting Turbo. Fifty. It was a simple app, but it had the same style of partial DOM updates and feel that you would see from a SPA when doing form submissions and navigation.
Not usually, but in the context of
> Astro finds a very nice balance: having your server and client code in one codebase, being able to define which is which, and not having to parse your data from whatever PHP is doing into your JavaScript code is a huge DX improvement.
the point is pretty much that you can do more JS for rich client-side interactions in a much more elegant way, without throwing away the benefits of the "back in the day" approach where that's not needed.
Bingo.
Modern PHP development with Laravel is wildly effective and efficient.
Facebook brought React forth with influences from PHP's immediate context switching, and Laravel's Blade templates have brought a lot of React and Vue influences back to templating in a very useful way.
I remember when it was called AJAX. We have completely lost the plot.
It's why Alan Kay calls our industry a Cargo Cult and says it's much more akin to the fashion industry than engineering.
Absolutely.
Htmx and, even more so, Datastar have brought us back on track: hypermedia + AJAX with whatever backend language(s)/framework(s) you want. Whereas Astro is Astro.
I remember it was called DHTML. ( And the code never works perfectly on IE and Netscape / Mozilla )
DynamicDrive my beloved
The first web frameworks really got it right with stateless websites and server rendering.
This field (software) in general, and especially web stuff, has no memory. It’s a cascade of teens and twentysomethings rediscovering the same paradigms over and over again with different names.
I think we are overdue for a rediscovery of object oriented programming and OOP design patterns but it will be called something else. We just got through an era of rediscovering the mainframe and calling it “cloud native.”
Every now and then you do get a new thing like LLMs and diffusion models.
This. So much this. I'm not even 42 and I feel old when I read stuff like this. Like you say, there's no memory in the web software field.
Even if you're 10 years younger than that and have been doing webdev since your late teens, you've seen like three spins of the concept wheel come back around again and again.
Wasted effort reinventing things and rediscovering their limits and failure modes is a major downside of the ageism that is rampant in the industry. There is nobody around to say “actually this has been tried three times by three previous generations of devs and this is what they learned.”
Another one: WASM is a good VM spec and the VM implementations are good, but the ecosystem with its DSLs and component model is getting an over engineered complexity smell. Remind anyone of… say… J2EE? Good VM, good foundation, massive excess complexity higher up the stack.
Software engineering/development has had a long September.
https://en.wikipedia.org/wiki/Eternal_September
Back in your day, there wasn’t a developer experience by which you could build a website or web app (or both) as a single, cohesive unit, covering both front-end and back-end, while avoiding the hydration performance hit. Now we have Astro, Next.js with RSC, and probably at least a dozen more strong contenders.
WebObjects.
That is a perfect description of it, released 1996. So far ahead of its time it’s not even funny. Still one of the best programming environments I’ve ever used almost <checks calendar> 30 years later.
> build a website or web app (or both) as a single, cohesive unit, covering both front-end and back-end, while avoiding the hydration performance hit.
Isn't that basically just what Symfony + Twig does? Server-side rendering, and you can put JS in there if you want to. Example template:
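A minimal sketch of what such a Twig template looks like (variable names made up), rendered entirely server-side:

```twig
{# `posts` is passed in from the Symfony controller #}
<ul>
  {% for post in posts %}
    <li><a href="{{ post.url }}">{{ post.title }}</a></li>
  {% else %}
    <li>No posts yet.</li>
  {% endfor %}
</ul>
```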
The syntax is horrible, and seeing it today almost gives me the yuckies, but it seems like the same idea to me, just with different syntax more or less. I'm not saying it was better back then, they just seem like very similar ideas (which isn't necessarily bad either).

I find that syntax more palatable than what is going on in JSX, to be honest. I actually like having a separate declarative syntax for describing the tree structure of a page. Some languages manage to incorporate this seamlessly, for example SXML libraries in Schemes, but I have not seen it done in a mainstream language yet.
Yeah, not a huge fan of either, to be honest, but I'd put JSX slightly ahead of Twig, only because it's easier for someone who has written lots of HTML.
Best of them all has to be hiccup, I think: the smallest and most elegant way of describing HTML. Same template, but as a Clojure function returning hiccup:
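A hypothetical sketch (names made up), assuming a hiccup-style rendering library turns the returned data structure into the HTML string:

```clojure
;; Plain Clojure data: vectors for elements, maps for attributes.
(defn post-list [posts]
  [:ul
   (for [post posts]
     [:li [:a {:href (:url post)} (:title post)]])])
```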
Basically just lists/vectors with built-in data structures in them, all part of the programming language itself.

Why do you think that? You could absolutely "build a website covering FE and BE without the hydration performance hit" using standard practices of the time.
You'd render templates in Jade/Handlebars/EJS, break them down into page components, apply progressive enhancement via JS. Eventually we got DOM diffing libraries so you could render templates on the client and move to declarative logic. DX was arguably better than today as you could easily understand and inspect your entire stack, though tools weren't as flashy.
In the 2010-2015 era it was not uncommon to build entire interactive websites from scratch in under a day, as you wasted almost no time fighting your tools.
> there wasn’t a developer experience by which you could build a website or web app (or both) as a single, cohesive unit, covering both front-end and back-end
How much of the frontend and how much of the backend are we talking about? Contemporary JavaScript frameworks only cover a narrow band of the problem, and still require you to bootstrap the rest of the infrastructure on either side of the stack to have something substantial (e.g., more than just a personal blog with all of the content living inside of the repo as .md files).
> while avoiding the hydration performance hit
How are we solving that today with Islands or RSCs?
It can cover as much of the back-end that your front-end uses directly as you'd care to deploy as one service. Obviously if you're going to have a microservices architecture, you're not going to put all those services in the one Next.js app. But you can certainly build a hell of a lot more in a monolith than a personal blog serving a handful of .md files.
In terms of the front-end, there’s really no limit imposed by Next.js and it’s not limited to a narrow band of the problem (whatever that gibberish means), so I don’t know what you’re even talking about.
> How are we solving that today with Islands or RSCs?
Next.js/RSC solves it by loading JavaScript only for the parts of the page that are dynamic. The static parts of the page are never client-side rendered, whereas before RSC they were.
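Roughly, in Next.js App Router terms (component names and the `fetchPost` helper are hypothetical): everything is a Server Component by default and ships no JS; only files opting in with the `'use client'` directive get bundled and hydrated.

```tsx
// app/post/page.tsx — a Server Component: rendered to HTML, never shipped as JS
import LikeButton from './LikeButton';

export default async function PostPage() {
  const post = await fetchPost(); // hypothetical server-side data fetch
  return (
    <article>
      <h1>{post.title}</h1>
      <p>{post.body}</p>
      {/* Only this child is a client-side island */}
      <LikeButton initialLikes={post.likes} />
    </article>
  );
}
```

```tsx
// app/post/LikeButton.tsx — opts in to client-side JS and hydration
'use client';
import { useState } from 'react';

export default function LikeButton({ initialLikes }: { initialLikes: number }) {
  const [likes, setLikes] = useState(initialLikes);
  return <button onClick={() => setLikes(likes + 1)}>♥ {likes}</button>;
}
```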
WebObjects and ColdFusion come from the 90s
>Traditional frameworks hydrate
Dear God. In 20 years people will hire HTML experts as if they are COBOL experts today.
I look forward to that day, because it implies something better would have replaced it.
The headline is literally “returns to the fundamentals of the web” so that’s the point.