From its beginnings as a tool to enhance static webpages built from HTML and CSS, JavaScript has become a core building block of the web. 

JavaScript enables us to build engaging and responsive webpages and to deliver interactive experiences through a website — from editing documents to playing games and buying anything from groceries to cars. But all that sophistication comes with a cost. 

In this article, we’ll explore what happens when you load a page that relies on JavaScript and the implications that has for SEO.

What is JavaScript rendering?

Put simply, rendering is what your browser does when it navigates to a webpage. When you open a web browser like Chrome, Firefox, or Safari and click a link or type a URL, your browser asks the server hosting the website for the data needed to display the page in question. It then decodes that data and uses it to construct and display (in other words, render) the webpage.

The decoding process involves multiple steps, which result in the browser constructing two models: the Document Object Model (DOM), which defines the relationships between each of the HTML elements on the page, and the CSS Object Model (CSSOM), which defines how those elements are styled.

Having done this, the browser combines the DOM and CSSOM to determine how to lay out the content on the page. Once this stage is complete, the browser finally starts drawing the pixels to the display.

When webpages were built using mainly HTML and CSS, perhaps with a bit of basic JavaScript to implement Google Analytics tracking code or display a confirmation message, this rendering process was relatively quick and simple. 

The HTML file returned in response to the initial request contained most of what was needed to render the webpage, so the browser could immediately start decoding the content and building the page (subject to a few calls to fetch images as needed).

    <title>Hello World</title>
    <meta name="description" content="Hello World">
    <p>Hello World</p>
    <img src="example-image.jpg" alt="Here is some alt text">

However, the picture becomes a little more complicated when JavaScript is used to load and display the primary content on the webpage — including body text, images, videos, navigation, and more sophisticated interactive components. 

Instead of receiving an HTML file with all the content set out in divs and paragraphs, the browser receives a file referencing one or more scripts.

Because these scripts can change both the page content and styles, the browser has to fetch and execute them before it can construct the DOM and CSSOM, and, therefore, before it can determine the layout and start drawing pixels on the screen.

    <title>Hello World</title>
    <meta name="description" content="Hello World">
    <p>Hello World</p>
    <script>
      var addImg = document.createElement("img");
      addImg.src = "example-image.jpg";
      addImg.alt = "Here is some alt text";
      document.body.appendChild(addImg);
    </script>

Using scripts to construct the page has several advantages. Because you can pass values into the script each time it is run, you can tailor the response to user inputs and create an interactive experience. 

You can also update content dynamically (rather than requiring the user to reload the page) and provide functionality that was previously only possible using an installed application. 
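As a hypothetical sketch of that dynamic-update idea, the snippet below builds tailored markup from user input without a page reload. The `greetUser` and `escapeHtml` helpers are illustrative, not a standard API.

```javascript
// Escape user-supplied text so it can't inject markup into the page.
function escapeHtml(text) {
  return String(text)
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

// Build a tailored response from user input.
function greetUser(name) {
  return `<p>Welcome back, ${escapeHtml(name)}!</p>`;
}

// In the browser, this would be wired to an event, for example:
// input.addEventListener("input", (e) => {
//   document.querySelector("#greeting").innerHTML = greetUser(e.target.value);
// });
```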

Unfortunately, there are also downsides: parsing JavaScript requires additional functionality on the part of the browser, and fetching and executing scripts can add considerable time to the rendering process.

Why does rendering speed matter?

For your content to appear in search engine result pages, it must first be crawled and indexed by that search engine. The process starts with a search engine crawler discovering the URL and requesting the page content (just as your browser does when you click a link), then waiting for the response in the form of an HTML file.

Historically, one of the main issues with using JavaScript to load the primary content on a page was that search engine crawlers could not parse the JavaScript and would not see the content at all. 

Google updated its search crawler, Googlebot, in 2019 so that it always runs the latest version of Chrome and can parse modern JavaScript features. This doesn't fix everything (not all other search engine providers have followed suit, and neither have social media crawlers), but for Google the question is no longer whether its bot can read JavaScript-laden pages, but how it does so.

Although Googlebot can now parse JavaScript, it does not do so on the first pass of your web page. Instead, when the crawler encounters a page containing JavaScript that needs to be executed, it moves the page to a second queue so that it can be rendered before being processed. 

There are no guarantees on how long this second pass will take: it could be seconds, but it could be much longer. As a result, pages containing JavaScript can take longer to be indexed, and search result listings can be slower to reflect changes you make to those pages.

Speeding up the script execution and reducing the amount of JavaScript that must be rendered before the crawler can index a page will increase your content’s indexability.

Search engine crawlers aside, there’s another reason why slow JavaScript rendering can impact SEO. While you might not notice a delay if you’re viewing a website on your laptop over a wired connection, the same is not true when viewing a JavaScript-heavy site on a phone or tablet over a cellular connection or flaky Wi-Fi signal.

Slow and unresponsive pages are frustrating for users and result in higher bounce rates, damaging your search rankings.

What are the options for JavaScript rendering?

Fortunately, as the use of JavaScript has become increasingly widespread, various strategies have evolved to speed up page load times and make it easier for search engines to index content on JavaScript-heavy sites.

Client-side rendering

Client-side rendering is your default starting point if you're using a JavaScript framework or library such as React, Angular, or Vue to build your website.

As we’ve seen, the time taken to fetch and execute scripts can make pages slower to load and prevent search engine crawlers from reading the content on your site, harming your search rankings. That’s not to say that client-side rendering should never be used — if your page response varies with each request, then it may be the best approach.

Various strategies have been developed to improve performance on client-side rendered pages, such as keeping script size down by splitting code bundles, lazy-loading, and caching scripts that are re-used across multiple pages.
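As a minimal sketch of the lazy-loading-with-caching idea (the `lazyLoad` helper and `"./chart.js"` path are hypothetical; in the browser, the loader would typically be a dynamic `import()`, e.g. `() => import("./chart.js")`):

```javascript
// Cache in-flight and completed loads so each module is fetched once.
const moduleCache = new Map();

function lazyLoad(path, loader) {
  if (!moduleCache.has(path)) {
    // Cache the promise itself so concurrent calls share one load.
    moduleCache.set(path, loader());
  }
  return moduleCache.get(path);
}

// Usage sketch: only fetch the heavy charting code when it's needed.
// button.addEventListener("click", async () => {
//   const chart = await lazyLoad("./chart.js", () => import("./chart.js"));
//   chart.draw();
// });
```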

Using these techniques helps reduce load times for users accessing your site on mobile devices and increases the likelihood and speed with which search bots will index your content.

Server-side rendering

Server-side rendering (SSR) refers to how websites used to be served; when a request for a URL is made to the server, the HTML is generated on the server and then sent to the browser for display. Generating the response on the server means fewer and faster roundtrips to fetch resources (such as scripts and images) compared to client-side rendering but at the expense of slowing down the initial response.

Popular JavaScript frameworks include functionality to support SSR so that you can opt for server-side rendering if it suits your needs better. React has Next.js, Angular has Angular Universal, and Vue has Nuxt.js.

Static site generation

Static site generation, also known as pre-rendering, is an alternative to server-side rendering that works well for some use cases. Whereas server-side rendering requires the page to be generated on the fly when it is requested, static site generators produce the HTML for each URL ahead of time — usually as some kind of build step for the site — so that it’s ready to serve whenever it’s requested. 

This means you can use a CDN to cache content on edge servers, further reducing page load times. Gatsby and Jekyll are two popular static site generators, while Next.js combines static generation with server-side rendering.
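A static site generator's build step can be sketched as follows; the `posts` array, `renderPost` helper, and output paths are all illustrative, not any particular generator's API.

```javascript
// Content known at build time; a real site might read this from
// markdown files or a CMS.
const posts = [
  { slug: "hello-world", title: "Hello World", body: "First post" },
];

// Produce the finished HTML for one page ahead of time.
function renderPost(post) {
  return `<!DOCTYPE html>
<html>
  <head><title>${post.title}</title></head>
  <body><article>${post.body}</article></body>
</html>`;
}

// At build time, write each page to disk so it can be served (or cached
// on a CDN) with no per-request work:
// const fs = require("fs");
// for (const post of posts) {
//   fs.writeFileSync(`dist/${post.slug}.html`, renderPost(post));
// }
```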

Generating pages in advance works well for sites with a lot of static content, such as blog posts, product listings, articles, marketing pages, or help documentation.

The downside is that you can’t respond to “live” data in the request, so it isn’t suitable if the content is frequently updated or changes with each request. In these cases, you need to render the page dynamically on the client, the server, or a hybrid combination of the two.

Hybrid rendering options

There are advantages to both client-side and server-side rendering (and its cousin, static site generation). Rather than choosing one over the other, you can combine both techniques within a single page.

When the initial request for the page is made, some of the content is rendered on the server (or, in the case of SSG, the pre-rendered file is retrieved) and sent to the client, where any remaining JavaScript is then rendered. This process is known as (re)hydration and is useful where only some of the page content needs to be dynamic.
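One common ingredient of hydration is sketched below: the server embeds the state it used to render the page, and the client script reads that state back instead of refetching it before attaching interactive behavior. The `__STATE__` id and function names are illustrative conventions, not a framework API.

```javascript
// Server side: render the markup and embed the state that produced it.
function renderWithState(state) {
  return (
    `<div id="app"><p>${state.message}</p></div>\n` +
    `<script id="__STATE__" type="application/json">` +
    `${JSON.stringify(state)}</script>`
  );
}

// Client side: recover the embedded state from the delivered HTML so
// the JavaScript can take over ("hydrate") without another round trip.
function readEmbeddedState(html) {
  const match = html.match(
    /<script id="__STATE__"[^>]*>([\s\S]*?)<\/script>/
  );
  return match ? JSON.parse(match[1]) : null;
}
```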

There are multiple variations on this theme, and they need to be implemented carefully to avoid creating a disjointed experience or presenting users with an unresponsive page.

Dynamic rendering

If the previous alternatives to client-side rendering are not suitable for your use case and you’re concerned your site is not ranking well in search results because of client-side JavaScript, dynamic rendering might be the solution you need.

Dynamic rendering is a workaround to the SEO issues associated with client-side rendering and is recognized as a legitimate technique by search engines, including Google and Bing.

With this approach, you have two versions of each page: a statically rendered version that is served only to bots and the regular client-side rendered version for human users.

For each request, you first need to detect whether the request is from a real person or a crawler (for example, via the user agent) and then serve the relevant response.
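A crude sketch of that user-agent check is below. The bot pattern is illustrative and incomplete; production setups typically rely on maintained middleware or a prerendering service rather than a hand-rolled list.

```javascript
// Illustrative (incomplete) list of crawler user-agent substrings.
const BOT_PATTERN = /googlebot|bingbot|duckduckbot|baiduspider|yandex/i;

// Decide which version of the page to serve for a given request.
function isCrawler(userAgent) {
  return BOT_PATTERN.test(userAgent || "");
}

// Usage sketch in a request handler:
// if (isCrawler(req.headers["user-agent"])) {
//   res.end(prerenderedHtml);   // static version for bots
// } else {
//   res.end(clientSideShell);   // regular client-rendered version
// }
```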

When using dynamic rendering, it's essential that both pre-rendered and client-side rendered content are very similar. Returning substantially different content is considered cloaking, something you want to avoid at all costs.

Of course, this approach doesn’t speed up content delivery to real users. If performance is an issue, you will also want to use other techniques to accelerate time to interactivity for your client-side rendered pages.

Wrapping up

If you’re using JavaScript to create richer web experiences for your users, understanding how JavaScript is rendered in the browser, and how that affects page load times, user experience, and search engine crawlers, will help you improve your site’s performance and ensure that your pages rank higher in search results.

While no single approach solves all problems, you’re not limited to choosing just one strategy.

You might choose server-side rendering or static site generation for pages with little or no interactive content, such as product listings and blogs, keep client-side rendering for highly interactive pages, and add dynamic rendering to improve their crawlability.

Finally, consider that rendering strategies are just one element of technical SEO for JavaScript.

When optimizing your JavaScript-based site for users and search bots alike, don’t forget to generate metadata specific to each page, lazy load images, and ensure you’re returning the correct HTTP status codes.