There are a bunch of utilities here that don't actually _do_ anything useful. The proxy in this example is used for nothing other than debug logs. The DOM utility layer just slightly reduces the lines of code needed to create a DOM node.
And then you end up with consumer code that is not actually declarative? The final code still directly manipulates the DOM. And this shows the simplest possible example - creating and removing nodes. The difficult part that libraries/frameworks solve is _updating_ the DOM at scale.
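To make that concrete, here is a sketch of the kind of thin DOM utility the comment describes, using plain objects instead of real DOM nodes so the shape is visible (a real version would call `document.createElement`; the helper name `h` and the sample data are illustrative). Even with the helper, the consumer code stays imperative:

```javascript
// Hypothetical helper of the kind such utility layers provide: it trims a
// few lines off node creation, but that is all it does.
function h(tag, props = {}, ...children) {
  return { tag, props, children }; // stand-in for document.createElement
}

const list = h('ul', { id: 'todos' },
  h('li', {}, 'buy milk'),
  h('li', {}, 'ship release'));

// "Updating" still means reaching into the tree and mutating it by hand:
list.children.push(h('li', {}, 'new item')); // add
list.children.splice(0, 1);                  // remove
```

Nothing about the helper makes updates declarative; it only shortens the creation step.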
For me the debate never reaches a conclusion, because different kinds of developers build fundamentally different kinds of products.
If you are building a website, a forum, or a generally document-based application with little to no interactivity (beyond, say, "play media"), then absolutely make a server-rendered HTML page and sprinkle it with a bit of JavaScript for accordions.
If what you are building is a complex editor (image, text), is highly interactive (with maps, and charts, and whatever), and users will generally spend a lot of time navigating between almost identical pages (basically, when there is no expectation that it should work with JavaScript disabled), then just build a purely client-rendered application in the framework of your choice.
To me the dispute comes when one bleeds into the other. I also think that mixed modes are abominations unless you truly have actual performance gains (maybe if you have 1B+ customers), which I'd argue is true for almost no one.
> For the life of me I don't understand why people absolutely insist on using JavaScript to render HTML. Backend frameworks do HTML just fine.
There’s an entire universe of front-end developers who don’t even know JavaScript. React is the only thing they’ve ever touched and they’re completely helpless without it.
Morphing the web user agent into something akin to an X11 server made total sense to me when I started doing so in 2000. If we developers had demanded a true distributed windowing system, we would have been spared this bag of hurt.
I remember demoing the Andrew Window Manager to colleagues in 1989 and them feeling like they had glimpsed the future. Alas, that future never came.
Rendering the whole DOM tree (instead of a VDOM) is fast. The slow part is attaching (committing) the elements to the document. For example, I have a test with 20,000 elements that takes <30ms to render, while attaching them takes 120ms.
Since the performance is mainly bound by the commit phase, this approach could do better with a DOM-merging library, or, hopefully, if we ever get a native API such as `document.replaceChildren(...App(), { merge: true })`.
Caveats:
Although it restores focus, that's not the only thing we need to preserve: we also need to preserve scroll position and cursor position.
So to work around that, I still have to step out of the fully declarative approach and just replace the part that changed. For example, here I had to manually mutate the DOM:
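The save-and-restore the comment alludes to can be sketched roughly as follows. The function name and the injected `doc`/`win` parameters are illustrative (they default to the browser globals) and are not taken from the linked code:

```javascript
// Save focus, scroll position, and cursor position around an arbitrary
// DOM-replacing update, then restore them afterwards. Illustrative sketch.
function withPreservedState(update, doc = document, win = window) {
  const active = doc.activeElement;
  const focusId = active && active.id;
  const cursor = active ? active.selectionStart : null; // inputs/textareas
  const { scrollX, scrollY } = win;

  update(); // e.g. doc.body.replaceChildren(...App())

  win.scrollTo(scrollX, scrollY);
  if (focusId) {
    const el = doc.getElementById(focusId);
    if (el) {
      el.focus();
      if (cursor != null && el.setSelectionRange)
        el.setSelectionRange(cursor, cursor);
    }
  }
}
```

Relying on the focused element's `id` surviving the re-render is itself a simplification; a real implementation would need a stabler way to re-identify the node.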
The real problem that VDOMs and more complex frameworks solve for me is dealing with much more complex state, e.g. lists.
When dealing with lists there are so many possible ways of updating them (full updates, insertion/removal at an index, update at an index, ...) that manually mounting and unmounting single items by hand gets unbearable. You must then do some kind of diffing at the framework level to get good performance and readable code.
I would like to see "VanillaJS" articles talk both more and more in depth about this problem.
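As a sketch of what "diffing at the framework level" means for lists, here is a minimal keyed diff: compare the old and new key arrays and emit the operations needed, instead of hand-writing mount/unmount logic for every update pattern. This is illustrative only; real reconcilers also detect moves, whereas here a moved key shows up as a remove plus an insert:

```javascript
// Compute the operations needed to turn the old keyed list into the new one.
function diffKeys(oldKeys, newKeys) {
  const ops = [];
  const oldSet = new Set(oldKeys);
  const newSet = new Set(newKeys);
  for (const k of oldKeys)
    if (!newSet.has(k)) ops.push({ op: 'remove', key: k });
  newKeys.forEach((k, i) => {
    if (!oldSet.has(k)) ops.push({ op: 'insert', key: k, index: i });
  });
  return ops;
}

diffKeys(['a', 'b', 'c'], ['a', 'c', 'd']);
// → [{ op: 'remove', key: 'b' }, { op: 'insert', key: 'd', index: 2 }]
```

The framework then maps each op to a single DOM mutation, which is where the performance and readability come from.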
This is a really weird website, I glanced over a bunch of different articles and all read like AI slop to me.
Indeed, a detection tool like GPT Zero is "highly confident" that 97% of this article is AI generated, while AI Detector returns "We are 100% confident that the text scanned is AI-generated".
Curious if this is an uncanny valley situation, because there aren't that many tells (dashes, etc.) in the text itself. Does it feel the same to you?
Didn’t look at it too closely, but the whole article as it stands could almost be copy-pasted straight from an LLM chat. Another comment pointing out that there’s some code that doesn’t do anything is another clue.
(Not saying it was, but if I asked an LLM to create and annotate an HTML manipulation PoC with code snippets, I’d get a very similar response.)
Edit: Pretty sure the account itself is only here to promote this page.
DOM manipulations can be simplified to just a few actions: remove, add, change.
The other types of manipulations and interactive features can be sprinkles of JavaScript instead of hundreds of kilobytes of the stuff.
HTMX, Hotwire/Turbo, LiveView are just so much saner to me.
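Those three primitives map directly onto the plain DOM API. A sketch with `document` injected so it stays self-contained; the id and text values are illustrative:

```javascript
// add / change / remove with nothing but the standard DOM API.
function demoPrimitives(doc) {
  const list = doc.getElementById('todos');

  const item = doc.createElement('li'); // add
  item.textContent = 'new entry';
  list.appendChild(item);

  list.firstChild.textContent = 'edited'; // change

  list.removeChild(list.lastChild); // remove
  return list;
}
```

Everything beyond these basics (batching, diffing, keyed lists) is where libraries start to earn their payload size.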
https://mirrors.nycbug.org/pub/The_Unix_Archive/Unix_Usenet/...
https://github.com/ericfortis/mockaton/blob/main/src/client/...
Results so far:
https://github.com/ericfortis/mockaton/blob/main/src/client/...
https://react.dev/reference/react/createElement
[0]: https://github.com/jorgebucaran/hyperapp/tree/main/packages/...
Dang, he has submitted that website to HN about 50 times.
Can an admin please take a look?