The Journey to Isomorphic Rendering Performance
How SolidJS's unique take on SSR led it to being the fastest JavaScript renderer in the browser and on the server.

I'm the author of the SolidJS UI library, known for being one of the most performant libraries in the browser. I knew that I would need an isomorphic solution where the developer could run the same code in the browser and on the server. But when the time came to look into server rendering, I slowly realized I was dealing with a very different sort of problem.
Today I share my journey developing Solid's isomorphic rendering solution: how I arrived at an approach to server-side rendering that benchmarks so well it establishes Solid as a contender for the fastest library on both the client and the server.


Introduction to Server Side Rendering
After years of optimizing DOM operations in the browser, devising clever heuristics, and finding ways to leverage pre-compilation, I was sure that if I looked hard enough I would find the best way to do Server Side Rendering (SSR). That there would be some catalogue of secret tricks that, when combined, would produce the optimal result. I did find what I was looking for, but it wasn't what I expected.
To start, there are a lot of ways to render interactive JavaScript websites and applications. I mean a lot of different ways. There are so many permutations that the first thing to realize is that when someone says SSR they could mean dozens of different things.
There is on-demand rendering vs static site generation (SSG). There are buffered vs streamed responses. There is synchronous versus asynchronous rendering. There are Single Page Apps (SPA) vs Multi Page Apps (MPA). There is no hydration, full hydration, partial hydration and progressive hydration.
And while all are potentially good things on their own, not all approaches are complementary with each other.
Note: Understanding Hydration is essential to having a full discussion on SSR performance. Reading Rendering on the Web by Jason Miller and Addy Osmani is a strongly recommended foundation to the topics covered in this article.
SSR Architecture
Initially, I worried that I'd need to develop 16 different solutions to cover all the bases. But following the use cases I was able to narrow things down. From my research, two approaches surfaced as the frontrunners.
JAMstack
I will start here because it is by far the simplest on the surface coming from a SPA mentality. JAMstack stands for JavaScript + APIs + Markup. The idea is to statically serve the HTML and have the client manage the rest. While not limited to Single Page Apps, they are pretty common here: you can pre-render your static shell, host it on something like Netlify, and have JSON/GraphQL APIs hosted on some serverless technology like AWS or Cloudflare Workers handle all the data.
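To make the shape of this concrete, here is a minimal sketch of the pattern (the API endpoint and element id are hypothetical): the HTML shell is served statically, and the client fetches its data from an API after the page loads.
// Minimal JAMstack-style sketch: static shell plus client-side data fetching.
// The endpoint and element id below are made up for illustration.
document.addEventListener("DOMContentLoaded", async () => {
  const res = await fetch("https://api.example.com/products");
  const products = await res.json();
  document.querySelector("#app").innerHTML = products
    .map((p) => `<li>${p.name}</li>`)
    .join("");
});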

The choice to statically pre-render doesn't let you completely sidestep all the issues. You have fast First Contentful Paint (FCP), but often Largest Contentful Paint (LCP) and Time to Interactive (TTI) are delayed. This is because you still have to wait for the JavaScript to load the majority of dynamic content, and you still pay a steep price in hydration. It improves on the traditional SPA, but you still don't get to request dynamic data until the page has loaded in the browser.
When the focus is put solely on the initial render, it might be simple enough to say we need less JavaScript. However, SPA-like characteristics can make the app experience drastically smoother after the first load. There is a reason there are so many SPAs these days despite their poorer initial loading times. Once the JavaScript assets are cached, any slowdown is basically non-existent. It is difficult to beat that.
There has been some progress providing similar UX in MPAs with Portals and TurboLinks, which can drastically smooth page transitions. But that is only one aspect of an SPA's more fluid interactivity.
Islands
Multi-Page Apps/Sites are ones where the routing happens on the server. Each navigation leads to rendering new HTML pages. The key benefit of routing on the server is related to the nature of hydration.
If the top level of your page is mostly static you need very little JavaScript, and each dynamic part can act in isolation. Jason Miller, author of Preact, recently wrote an article dubbing this the "Islands Architecture". But it is not a new idea by any means. MarkoJS was built on this premise, using Partial Hydration in production at eBay scale for over half a decade.
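As a rough illustration (not Marko's actual mechanism; the data-island attribute and hydrate() helper are made up), an islands-style page ships mostly static HTML and only attaches JavaScript to the interactive widgets:
// Illustrative only: the page is server-rendered HTML, and only elements marked
// as islands get client-side code attached.
document.querySelectorAll("[data-island='cart']").forEach((el) => {
  // Only this widget's chunk is downloaded and executed;
  // the rest of the page stays as plain server-rendered HTML.
  import("./cart.js").then(({ hydrate }) => hydrate(el));
});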

Streaming the HTML response can improve loading dramatically. For example, MarkoJS flushes initial render synchronously with placeholders and then streams in script tags to insert the dynamic sections as they complete on the same response. Michael Rawlings details the approach in this article. I soon expect this space to be more crowded as many libraries are working on similar solutions.
Ultimately this approach represents the best initial render performance we can have for largely dynamic content. The amount of JavaScript and DOM to hydrate is just less. Data requests can start as soon as the server receives the initial request. Hard to argue with that for things like eCommerce.
JavaScript Server Render Performance
With those approaches in mind I set off to the next task of figuring out how to actually render on the server. It was not as simple as I would have thought.
DOM on the Server

Initially, I thought let's just bring the DOM with us. This will be easy. Solid JSX already creates DOM nodes. Support for Web Components. Pure isomorphic experience. Hell you could even use jQuery if you wanted to. I started with JSDOM and then tried lighter layers like basicHTML, but I was hard pressed to find a popular library out there slower in benchmarks.
I read about how to make rendering with the DOM on the server performant by "warming it up" and came across a similar topic for the VDOM called "blueprints". Blueprints is this idea of warming up the renderer by pre-constructing the tree and then just passing in new data and serializing the output.
However, a reactive library with a granular mentality was not really going to be able to leverage this. There is no single entry point to propagate data down. In granular libraries the tree is broken into many small independently updating nodes.
In the end the DOM on the server is a sort of Virtual DOM. A less proprietary one, but one that has idiosyncrasies designed for the browser. On the positive side it provides compatibility with the platform. But all indications suggested that a VDOM was still going to be better suited to the task than an emulated DOM. And I realized I really wanted nothing to do with either.
Reactivity on the Server

So I created new runtime methods specifically for SSR that didn't use DOM APIs and instead created strings directly. This worked as long as you turned off DOM interop points like events and refs. It was drastically more performant. It was similar to some other libraries like React or Preact but was not a standout.
And it made sense. Granular reactivity is only so performant in the browser because we do this dance to avoid unnecessary DOM operations. It is optimized for updates. The way we avoid initial costs in the browser is to have the compiler batch DOM node creation and minimize walking, which still has a real cost.
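Roughly speaking (a simplified sketch, not the compiler's exact output), the same JSX view compiles to template cloning for the browser and plain string concatenation for the server:
// Simplified illustration of the two targets; the real compiled output differs.
// Browser: node creation is batched into a cloneable template, so only the
// dynamic hole is touched per render.
const _tmpl$ = document.createElement("template");
_tmpl$.innerHTML = "<div>Hello <span></span></div>";
function renderClient(name) {
  const el = _tmpl$.content.firstChild.cloneNode(true);
  el.lastChild.textContent = name;
  return el;
}
// Server: no DOM APIs at all, the same view reduces to joining strings.
const escapeHTML = (s) => String(s).replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;"); // stand-in escaping helper
function renderServer(name) {
  return "<div>Hello <span>" + escapeHTML(name) + "</span></div>";
}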
Without having this overhead from the DOM we are just left with all the extra mechanisms. The problem is if you want to have asynchronous rendering on the server it doesn't make sense to eschew the change management system. How do you update anything?
This only gets more challenging with how freely change propagates with granular reactivity. One update to a signal can travel independently through the tree without any sort of top down constraints. So it is more than just identifying render boundaries like Suspense to fully settle things.
If we are going to update reactively we are still creating injection points into our string template. So we still have this Virtual DOM of sorts with static parts and dynamic inserts. And to serialize it we basically have to wait until the end to pull out all the values.
At this point I did not like where this was heading. We were just trading where the slowdown happened. This was a more than adequate solution for SSG and JAMstack approaches, but I was hoping for more.
Architecture Revisited
Re-framing the problem

Up to this point I wasn't really happy. We have the JAMstack on one side and this "Islands" architecture on the other. These are good solutions for many things, but I wasn't particularly content with these options. Why sacrifice one for the other? Why can't we be incredibly dynamic with good load times and seamless SPA-like UX?
I mean, the hidden benefit of JAMstack building ahead of time is that we don't have to be accountable for how badly our client-side library performs on the server. But what if I want to render on demand? What if we want to render in a Cloudflare Worker?
Marko's streaming seemed the most promising, but it looked like there would be a lot of complexity here with a system built for dynamic updates.
Why is JavaScript-heavy SSR so complicated?
Looking for Alternatives
I started thinking about how we used to progressively enhance our server rendered pages with libraries like jQuery and KnockoutJS. But this would never fly for a certain class of sites/applications.
If you have a sufficiently static page (and routing on the server) there is a use case for partial hydration and server-only rendered components. But as the complexity of those apps approaches the modern SPA, these techniques don't help them. And that is the area where I felt Solid could make the biggest impact.
It's no secret I'm biased here. I wasn't even interested in SSR or SSG with Solid initially. I had shown in benchmarks that I could give those methods a run for their money with purely client-side rendering, a small library, and thoughtful code splitting. See Solid's Realworld Demo comparison.
So what is the ideal approach for a library like this, where rendering on the server can provide tangible value?
Identifying Goals
What are the biggest weaknesses with SPA architecture? Arguably FCP is going to trail a bit depending on the heaviness of the library. But I actually think that it's LCP and consequently TTI that suffer more. With heavy rehydration TTI isn't actually going to differ that much between the approaches. But it will be impacted by how long it takes to load the main content.
In my experience it's the data loading that takes longer than the JavaScript on those subsequent loads. This is true of dynamic imports on initial load as well. You need to unblock data loading. React documentation refers to this as render-as-you-fetch. Don't wait for separate JS chunks to load before starting to load data. I heavily use this pattern in Solid regardless of SSR.
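Sketched in plain terms (the endpoint, chunk path, and renderPage call are illustrative), the pattern is to kick off the data request in parallel with the code-split chunk rather than waiting for the component to load and mount first:
// Illustrative render-as-you-fetch sketch: start both requests immediately...
const dataPromise = fetch("/api/profile").then((r) => r.json());
const componentPromise = import("./ProfilePage.js");
// ...and join them when both are ready, instead of letting the chunk load first
// and only then trigger the fetch from inside the component.
Promise.all([componentPromise, dataPromise]).then(([{ default: ProfilePage }, data]) => {
  renderPage(ProfilePage, data); // renderPage is a stand-in for the app's render call
});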


Looking at the timeline you can see that Solid has already loaded the API data before Svelte even starts its request. Then add in the time it takes to server render vs send the static mostly empty HTML page. This translates to almost double the time it takes to get to show the main content.
This is an improvement most SPAs can make today, and it has a profound impact on loading time. But how do we make it better? Well, we have to stream.
Streaming means that we will want to synchronously flush our renders. We will show placeholders until the content loads to get content up in the browser as quickly as possible and keep those quick FCP numbers.
Recent UI patterns that have been popularized in the browser help us here. With Suspense Components we can identify loading placeholders. With Resources, a special primitive designed to handle reads of possibly unavailable values, we know both when data is requested and completes, as well as where async values are read.
The use of special primitives here is very intentional to prevent "coloring" the development experience. Async Functions have the tendency of being invasive: once you introduce them, they need to flow all the way up. Generators can suffer similar issues. This was a large motivator behind React's approach of "throwing promises", and it comes naturally to reactive systems, which already rely on independent propagation of change.
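To give a feel for how these primitives combine (Solid's exact signatures have shifted across versions, so treat this as a sketch; fetchUser is a hypothetical fetcher):
// A sketch using Solid's Suspense and Resource primitives.
import { createResource, Suspense } from "solid-js";

const fetchUser = (id) => fetch(`/api/users/${id}`).then((r) => r.json()); // hypothetical fetcher

function Profile(props) {
  // The Resource knows when the request starts and completes,
  // and where the possibly-unavailable value is read.
  const [user] = createResource(() => fetchUser(props.id));
  return (
    <Suspense fallback={<p>Loading profile...</p>}>
      <h1>{user()?.name}</h1>
    </Suspense>
  );
}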
But we are still faced with the problem of propagating granular change throughout a view already shipped to the browser without the overhead of reactivity. We can't get rid of the pipeline. There is no other mechanism to update. Ironically the more granular the system the trickier this becomes.
The SSR SPA?
Well, that definitely stumped me for a bit. I started acknowledging that maybe there are some things granular reactivity doesn't excel at. This is not an easy problem, as all predictability goes out the window when you go async. And that's what made me recognize this was very similar to a previous problem I had with hydration.
Hydration without a VDOM makes it harder to gather nodes. There is no "template" to work off. We do everything in a single pass with JSX that executes inside out, and JSX's dynamic nature means it cannot be statically analyzed in this case. The solution was that I could use the fact that we render on the server as the first pass to encode the data we need right into the HTML string.
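To picture that (the attribute name and helper below are illustrative, not Solid's actual hydration code), the server stamps keys into the markup it renders, and the client pass claims those nodes instead of creating new ones:
// Illustrative sketch of hydration markers encoded into the server-rendered HTML,
// e.g. <div data-hk="0-1">...</div>. The data-hk attribute and claimElement helper
// are stand-ins for illustration.
function claimElement(key, createFallback) {
  const node = document.querySelector(`[data-hk="${key}"]`);
  // Reuse the server-rendered node when it exists; otherwise fall back to
  // creating it on the client.
  return node || createFallback();
}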
I needed to stop thinking of client and server as isolated problems. We already have the pipeline. We have a reactive graph ready and willing to render our application. It just isn't on the server.
The approach is to render everything synchronously on the server and when you hit a Suspense boundary continue executing to trigger fetching but immediately render a placeholder to get streamed to the client.
When the Resource load completes on the server, you write a script tag into the page so the corresponding Resource on the client can read it. Essentially a Promise initiates on the server and resolves in the client. The server acts as a set of distributed sources for the client's reactive graph.
What this accomplishes is data begins fetching as soon as possible on the server and the client doesn't have to wait to see anything. All non-static data is only shipped once to the client as we aren't sending it twice as data and markup.
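Conceptually the server side of that flow looks something like this (a sketch only; the handler, fetchProfile, and the _$RESOURCES global are illustrative, not Solid's actual wire format):
// Illustrative Express-style handler showing the streaming flow.
const fetchProfile = (id) => Promise.resolve({ id, name: "Ada" }); // stand-in data source
function handleRequest(req, res) {
  // Flush the shell synchronously with a placeholder where the Suspense boundary sits...
  res.write('<div id="profile"><span>Loading profile...</span></div>');
  // ...then when the server-side Resource resolves, stream a script tag on the same
  // response that hands the data to the matching client-side Resource.
  fetchProfile(req.params.id).then((data) => {
    res.write('<script>window._$RESOURCES["profile"].resolve(' + JSON.stringify(data) + ")</script>");
    res.end();
  });
}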
Most importantly for me that meant I could focus my SSR efforts with Solid on purely synchronous execution performance on the server and achieve an async isomorphic model.
Building The Solution
Revisiting Server Rendering Performance
So we don't need a reactive system on the server to maintain a consistent model between client and server. Having the compiler output different code for the server is straightforward for native elements. But user code has primitives in it.
The solution was to write a completely different version of the runtime and rewrite the import statements. This way the exact same code can render isomorphically. Reactive signals become simple value getters, and computations become IIFEs.
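A rough sketch of what that swapped-in server runtime can look like (simplified stand-ins, not Solid's actual server entry):
// On the server there is nothing to keep up to date, so the reactive primitives
// collapse to trivial implementations. These are illustrative, simplified versions.
function createSignal(value) {
  // No subscriptions: just a getter and setter over a plain value.
  return [() => value, (v) => (value = v)];
}
function createMemo(fn) {
  // A computation runs once and its result is cached behind a getter.
  const v = fn();
  return () => v;
}
function createEffect() {
  // Effects exist to keep the DOM up to date over time; on the server they can be no-ops.
}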
As I worked on this more I realized the secret to server performance wasn't that interesting at all. It is literally how fast you can combine strings. Nothing more.
I re-learned obvious things, like the fact that while Template Literals are fast, Tagged Template Literals are much slower. So if you are escaping holes in the template you are better off inlining the calls or, if using a function to merge, staying away from Tagged Templates.
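For example (illustrative; esc is a stand-in escaping helper and user.name stands in for any dynamic value), the tagged form routes everything through a tag function that merges the pieces, while the plain literal inlines the escape call directly into the hole:
const esc = (s) => String(s).replace(/</g, "&lt;").replace(/>/g, "&gt;"); // stand-in escaper
const user = { name: "<b>Ryan</b>" }; // sample dynamic value
// Tagged template: the tag function receives the string parts and values and
// has to merge them itself - this indirection benchmarks noticeably slower.
function html(strings, ...values) {
  let out = strings[0];
  for (let i = 0; i < values.length; i++) out += esc(values[i]) + strings[i + 1];
  return out;
}
const slow = html`<div>${user.name}</div>`;
// Plain template literal with the escape call inlined into the hole.
const fast = `<div>${esc(user.name)}</div>`;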
When dealing with lists, using a for loop and building up a string is much faster than a map operation, as map needs to create a new array. And regex replace operations are much, much slower than a quick regex match test followed by manual iterative string replacement.
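A sketch of both of those details (illustrative helpers, not Solid's actual code):
// Building a list with a for loop avoids the intermediate array a map would allocate.
function renderList(items) {
  let html = "";
  for (let i = 0; i < items.length; i++) html += "<li>" + escapeHTML(items[i]) + "</li>";
  return html;
}
// Escaping with a cheap regex test up front, then a manual scan, beats calling
// String.prototype.replace with a regex on every value.
const NEEDS_ESCAPE = /[&<"]/;
function escapeHTML(str) {
  if (!NEEDS_ESCAPE.test(str)) return str; // fast path: most strings need no work
  let out = "";
  for (let i = 0; i < str.length; i++) {
    const ch = str[i];
    if (ch === "&") out += "&amp;";
    else if (ch === "<") out += "&lt;";
    else if (ch === '"') out += "&quot;";
    else out += ch;
  }
  return out;
}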
Luckily the custom compiler + runtime allows for these sorts of things. In fact, the only limitation I hit was due to the nature of JSX allowing any JS expression. This meant I needed special wrapping of templates to prevent duplicate escaping as the templates combined. This overhead is considerable, but it was not enough to prevent Solid from topping server-side JS benchmarks (see the top of the article).
const doINeedToBeEscaped = "Yes";
// Needs to be escaped
const view = <div>{doINeedToBeEscaped}</div>;
const doIStillNeedToBeEscaped = <span>Static Text</span>;
// Doesn't need escaping as would be handled by child template
// Don't want to encode the <span> tag itself
const view2 = <div>{doIStillNeedToBeEscaped}</div>;
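One way to picture that wrapping (a sketch only; ssr() and escapeValue() are stand-ins, not Solid's actual runtime helpers): a rendered child template returns a marked result, so a parent template passes it through instead of escaping it again.
// Illustrative sketch of how duplicate escaping can be avoided as templates combine.
function escapeValue(value) {
  if (value && value.__isTemplate) return value.html; // already-rendered child: pass through untouched
  return String(value).replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}
function ssr(parts, ...holes) {
  let html = parts[0];
  for (let i = 0; i < holes.length; i++) html += escapeValue(holes[i]) + parts[i + 1];
  // Mark the result so a parent template won't encode this template's own tags.
  return { __isTemplate: true, html };
}
// <div>{doIStillNeedToBeEscaped}</div> would compile to something like:
// ssr(["<div>", "</div>"], doIStillNeedToBeEscaped)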
The takeaway is that server-side JS rendering still has a lot of room for improvement. However, without the DOM putting its heavy constraint on us, more than likely the bottleneck is going to be in user code.
In the browser rendering is so expensive that we do a great deal as framework writers to prevent unnecessary work. Here those same preventative measures have an actual measurable overhead. If the framework reduces to thousands of string concatenations, the majority of the cost is borne by the actual application logic.
Looking at the Performance Timeline
Time to see it in action. I made a simple cascading load example written in Solid to see the effect of different SSR/hydration techniques in a SPA. They all share the same source code for the components, use best practices like render-as-you-fetch, and automatically serialize data between client and server using Solid's Suspense and Resource APIs.
The example is a simple tab navigation, and we will be doing a page refresh on a Profile page which includes a main JavaScript bundle for Solid and one chunk for the Profile page. We simulate loading 2 chunks of data: general profile information (400ms load time) used to display the page, and some additional data needed to display other information about the user (800ms load time).
All the source can be found: https://github.com/ryansolid/solid/tree/master/packages/solid-ssr/examples.
I have split it up so that there is a shared folder containing all the application source, and the other folders represent the different applications, each taking its own approach.
- Async SSR (/async) - Includes the reactive system on the server and resolves everything on the server before sending the view + data over and fully hydrates on the client.
- Hybrid SSR (/ssr) - On demand rendering that ships the view synchronously but then leaves the client to do all additional data loading and rendering.
- Streaming (/stream) - The same as SSR except now the data is loaded on the server and streamed to the client as it finishes.
I'm not minifying or gzipping here, as these examples are for educational purposes, but all three are equivalent so that should not impact the comparison. Lighthouse reported suspiciously similar numbers for TTI so I will be looking just at the Chrome Timeline.
Async SSR

FCP - 882.3ms
LCP - 882.3ms
Last Event - 907ms
This is the most naive approach to SSR. This example is a bit exaggerated. You would probably lazy load the second request in a real application, and being on the server you might have quicker access to the data. But this is what can happen when you don't stream and try to render everything on the server.
Even though this example uses reactivity to asynchronously render, it is fairly representative of the common approach to isomorphic SSR: load the data and then synchronously render the completed view. This is a small page but we are paying the full cost here.
Hybrid SSR

FCP - 102.9ms
LCP - 502.8ms
Last Event - 901ms
In this example we render synchronously on the server without reactivity. And then the client handles all the async data loading. This allows the server to respond quickly without waiting.
However, the bulk of the fetching can't start until the client has loaded the JavaScript. We can see this quite clearly: the first data loads almost exactly 400ms after the FCP, which is when we see the LCP. Then, an additional 400ms later, we are done. Done at nearly the same time as the Async SSR example, mind you.
Now this example is often how JAMstack works in practice. While that is statically generated and this is server-rendered, all significant data loading happens on the client after the page has loaded. All the resources load pretty quickly here, but if they don't, this can be a pretty serious slowdown.
Streaming

FCP - 101.3ms
LCP - 434.6ms
Last Event - 825.0ms
This time we see the profile.html load take place over most of the timeline but it doesn't block the other resources loading. This is the stream of content. In the end we see the best LCP and load complete numbers.
You may have expected Async and Streaming to have the same total load time since they can leverage the early loading from the server. But with streaming the JavaScript/CSS beats the data to the client allowing it to render immediately when received. In a non-code splitting example the difference might not be as big but this exemplifies the benefits of not blocking.
Best of all this improves all aspects of a SPA. FCP is faster since the content is rendered on the server. LCP is faster since data fetching starts on the server sooner. And the whole timeline finishes sooner.
Conclusion
I started this without really knowing what to expect, and I was suspicious of the need for a "complex" isomorphic experience. I explored the state of the art and had a lot of people help along the way as I tried to wrap my head around it (https://github.com/ryansolid/solid/issues/109). Over the course of this investigation I joined the MarkoJS core team, leaders in this area, which greatly broadened my perspective.
I initially struggled to find something to fit the goal that seemed to be laid out before me. And it was only through changing the focus that I managed to both simplify the problem and come up with a solution that adds value to what Solid was already doing. Synchronous rendering in a granular reactive library, who would have thunk it?
I'm pretty happy with the result. Full non-blocking streaming of view and data. Progressive Hydration that works through code splitting. Unparalleled raw server render performance. Improved chrome timeline numbers across the board.
It might not solve every website's SSR need. But, I feel it shows an approach that doesn't alienate the already heavy JavaScript application. And does so in a consistent way that leverages existing client-side patterns for a truly isomorphic experience.
It is delightful to share the exact same code with the server, and to have it stream directly into the client's reactive system as if part of the same single system. And even more so when you realize this works automatically without user intervention. You could write a modern client side SPA and make it isomorphic without changing any component code.
What's next? I look forward to seeing how I can leverage this to take an existing client app and hoist it straight into a Cloudflare Worker. Partial Hydration also isn't completely off the table, even if it is not the primary objective for Solid.
On the Marko side we are working on a new way to express state/state compositions as part of the declarative template and a new granular client runtime. With these and compile-time analysis we can better isolate what needs to be shipped to the client at even a subcomponent level.
A framework author's work is never done.
References (in order):
SolidJS - https://github.com/ryansolid/solid
MarkoJS - https://markojs.com/
JS Framework Benchmark - https://github.com/krausest/js-framework-benchmark
Isomorphic UI Benchmark - https://github.com/marko-js/isomorphic-ui-benchmarks
Rendering on the Web by Jason Miller and Addy Osmani - https://developers.google.com/web/updates/2019/02/rendering-on-the-web
The Cost of Client-Side Rehydration by Addy Osmani - https://addyosmani.com/blog/rehydration/
Islands Architecture by Jason Miller - https://jasonformat.com/islands-architecture/
Async Fragments: Rediscovering Progressive HTML Rendering with Marko by Patrick Steele-Idem - https://tech.ebayinc.com/engineering/async-fragments-rediscovering-progressive-html-rendering-with-marko/
Maybe you don't need that SPA by Michael Rawlings - https://medium.com/@mlrawlings/maybe-you-dont-need-that-spa-f2c659bc7fec
Hands-on with Portals: seamless navigation on the web by Yusuke Utsunomiya - https://web.dev/hands-on-portals/
TurboLinks - https://github.com/turbolinks/turbolinks
JSDOM - https://github.com/jsdom/jsdom
basicHTML - https://github.com/WebReflection/basicHTML
Virtual DOM SSR Performance by Boris Kaul - https://medium.com/@localvoid/virtual-dom-ssr-performance-5c292d4961a0
The Fundamental Principles Behind MobX by Michel Weststrate - https://hackernoon.com/the-fundamental-principles-behind-mobx-7a725f71f3e8
The Double Diamond Process by Ari Tannenen - http://stopandfix.blogspot.com/2015/07/the-double-diamond-process.html
A Solid RealWorld Demo Comparison of JavaScript Framework Performance by Ryan Carniato - https://levelup.gitconnected.com/a-solid-realworld-demo-comparison-8c3363448fd8
Suspense for Data Fetching (Experimental) - https://reactjs.org/docs/concurrent-mode-suspense.html#approach-3-render-as-you-fetch-using-suspense