Part 9: In Conclusion
htmx gives us features that should have been included in HTML, namely: declarative controls for updating parts of a web page in place with targeted snippets of HTML.
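For example, a single button can declare that a server-rendered fragment should replace part of the current page. The endpoint and element id below are hypothetical placeholders, a minimal sketch rather than code from this course:

```html
<!-- Clicking the button issues GET /contacts/1 and swaps the returned
     HTML fragment into the element with id="detail". -->
<button hx-get="/contacts/1" hx-target="#detail" hx-swap="innerHTML">
  Show contact
</button>

<div id="detail"><!-- the server-rendered fragment lands here --></div>
```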
Notice that being able to do so blurs the distinction between web sites and web apps. In fact, it turns out that this distinction is largely artificial:
- there's no reason web sites cannot behave like web apps
- web apps are just web sites with lots of user interactions.
In other words, there is nothing magical about web applications that makes them radically distinct from ordinary web pages, or that necessitates building them with a dedicated tech stack. Web apps are simply interactive web sites!
As we have learned throughout this course, the mechanism that bridges the (artificial) gap between web pages and web apps is called transclusion — and was first implemented in Sketchpad in 1963, even before getting its name from Ted Nelson in 1980.
In reality, transclusion is just a natural extension of the link and form elements in HTML, and should be as ubiquitous as those two. Including (parts of) document B into document A should be as natural as navigating from A to B.
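As a minimal sketch of that symmetry (the /news URLs are hypothetical placeholders), compare navigation with transclusion:

```html
<!-- Navigation: replace the whole of document A with document B. -->
<a href="/news">Latest news</a>

<!-- Transclusion: pull a fragment of B into document A as it loads. -->
<div hx-get="/news/latest" hx-trigger="load"></div>
```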
Compensation mechanisms
Because browsers and the HTML standard don't implement hypermedia in its full form, developers have had to resort to compensatory mechanisms to achieve the "native" feel we know from mobile apps. Typically, this means implementing a "thick client" that uses some form of "reactive" UI.
In a thick client model, the rendering logic lives on the front end rather than on the server. In practice, this often means that the domain model (the business logic) must be duplicated on the client side, at least to some degree.
Reactive programming means that the UI is implemented as a directed graph in which the nodes (the UI elements) listen to each other for events. A state change in one node automatically causes its dependent (downstream) nodes to update their values.
A good example is an online spreadsheet: a client-side domain model renders the grid of cells to the view, and a dependency graph manages the calculations performed (the application state).
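To make that structure concrete, here is a toy dependency graph in plain HTML and JavaScript, not taken from any particular framework: a derived "cell" recomputes itself whenever either of its upstream cells fires a change event.

```html
<!-- A toy dependency graph: "sum" is downstream of the two inputs. -->
<input id="a" type="number" value="1">
<input id="b" type="number" value="2">
<output id="sum">3</output>

<script>
  const a = document.getElementById("a");
  const b = document.getElementById("b");
  const sum = document.getElementById("sum");

  // The downstream node recomputes whenever an upstream node changes.
  function recompute() {
    sum.textContent = Number(a.value) + Number(b.value);
  }
  a.addEventListener("input", recompute);
  b.addEventListener("input", recompute);
</script>
```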
At first glance, this looks like a fitting architecture for a web application. After all, can't we model the DOM tree as a directed graph? If we could also keep track of state changes in JavaScript, we could have fully responsive UIs with a native feel!
Not so fast. Reactive programming is a subcategory of a larger paradigm called Dataflow. Dataflow is not just a programming paradigm — it's a class of computer architecture, and an alternative to the von Neumann architecture our computers are based on.
In other words, dataflow is a foundational concept in computer science — and to implement it (well) typically means that one must re-think the entire application architecture from the ground up.
Within a web application, dataflow must co-exist with REST, which is a set of application-level principles for data transmission, resource architecture, and state change. So now we have two foundational concepts (with rather overlapping concerns) that we must harmonize!
This requires more than "full-stack JavaScript engineering", and most frameworks that attempt it end up doing neither reactive programming nor REST properly.
Dataflow is (usually) overkill
When applied to web applications, dataflow can also be defined as:
An architecture in which the communication between an interface element (cell) and its neighbors is more extensive than that cell's communication with the server.
In other words, data transfer happens mostly between cells. But most web apps don't work like that: roughly 80% of their interactions are simple CRUD operations, and these map onto the five core HTTP verbs (illustrated with htmx after this list):
- Create: POST
- Read: GET
- Update: PUT and PATCH
- Delete: DELETE
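In htmx, each of these verbs is available directly as an attribute, so a CRUD interaction stays a plain hypermedia exchange. The endpoints below are hypothetical placeholders:

```html
<!-- Create -->
<form hx-post="/contacts" hx-target="#list" hx-swap="beforeend">
  <input name="name">
  <button>Add</button>
</form>

<!-- Read -->
<button hx-get="/contacts/42" hx-target="#detail">View</button>

<!-- Update -->
<button hx-put="/contacts/42">Save</button>

<!-- Delete -->
<button hx-delete="/contacts/42" hx-confirm="Really delete?">Delete</button>
```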
This means that in the vast majority of cases, dataflow based on a thick-client architecture isn't relevant; we just need REST and (a functional) hypermedia.
Dataflow isn't inherently opposed to REST, just as it can co-exist with the von Neumann architecture. But to implement it well one must first make sure that:
- One is doing reactive programming, and not something else.
- One is doing REST, and not something else.
Once these two boxes are ticked, we can employ reactivity to augment our already-functional hypermedia-based app as needed, not to compensate for its deficiencies. One way to do this is to use Alpine.js in conjunction with htmx.
Other reactive libraries, such as Solid, could fill the same role; it will be interesting to see how these combinations mature.
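As a sketch of that division of labour (the /comments endpoint is a hypothetical placeholder): Alpine owns purely client-side state, such as showing and hiding a panel, while htmx owns the hypermedia exchange with the server.

```html
<!-- Alpine manages local UI state; htmx fetches hypermedia from the server. -->
<div x-data="{ open: false }">
  <!-- Toggling is pure client-side state; the fragment is fetched only once. -->
  <button @click="open = !open"
          hx-get="/comments"
          hx-target="#comments"
          hx-trigger="click once">
    Toggle comments
  </button>

  <!-- The fetched fragment is shown or hidden without another round trip. -->
  <div id="comments" x-show="open"></div>
</div>
```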
Overengineering leads to dead ends
What we don't need is an approach that essentially tries to duplicate the browser:
- event dispatching and bubbling
- virtual DOM
- files with a custom extension
- four transpilation steps (at least) to build those files
- managing client-side state
- writing custom logic for rendering HTML from the JSON response
- importing a host of dependencies that you have no control over
- breaking the web
- being unproductive, confused, and miserable!
This is over-engineering — and I can't imagine how many hours and dollars in lost productivity this has caused. This entire time, we have been treating the symptom of the problem, not its underlying cause: lack of a functional hypermedia.
It has taken about 30 years since the birth of hypertext for dynamic web page updates to catch up — and do so in a way that's harmonious with the design principles that the web as platform is built on.
Well, better late than never.
Get involved
You can learn more about htmx here: