JavaScript has become an important way to make web pages more interactive, adding dynamic elements, animations and visual effects that would be difficult or impossible to achieve using HTML and CSS alone.

It has been around for decades – it was reportedly created in just ten days in May 1995 by Netscape programmer Brendan Eich – and was originally intended as a way for HTML pages to interact with web applets written in the programming language Java.

The two – Java and JavaScript – are not the same thing, and when working on web pages it is generally JavaScript (or JS for short) that you’ll be using. In this guide, we’ll look in detail at JS and its implications for search engine optimisation (SEO).

Is JS Significant for SEO?

The short answer is yes. There are several different ways to use JavaScript to display content to end-users:

  • Server-side rendering, where pages built with JavaScript frameworks are pre-rendered on the server and delivered to the user as complete HTML.
  • Client-side rendering, which runs JavaScript in the user’s browser to build or change the content they see dynamically.

In either case, you should keep in mind that your content needs to be visible, crawlable and indexable by the search robots, which have not always been highly capable of parsing JS code and understanding its output.

Server-Side Rendering

Server-side rendering is generally the better option for SEO. It gives you direct control over how your JavaScript is interpreted by search robots: content is pre-rendered by the server, so the client receives complete HTML code along with any external resources it references, such as CSS stylesheets.
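
As a rough illustration, here is a minimal server-side rendering sketch. It assumes a Node.js server using the Express framework purely for the example (your stack may be ASP, PHP, Ruby or a JS framework with SSR support), and the route and product data are hypothetical.

    // Minimal server-side rendering sketch (Node.js + Express assumed for illustration).
    // The server builds the complete HTML before sending it, so crawlers receive
    // finished markup without having to execute any JavaScript themselves.
    const express = require('express');
    const app = express();

    app.get('/product/:id', (req, res) => {
      // In a real site this data would come from a database or API.
      const product = { name: 'Example product', description: 'Rendered on the server.' };

      res.send(`<!DOCTYPE html>
        <html>
          <head><title>${product.name}</title></head>
          <body>
            <h1>${product.name}</h1>
            <p>${product.description}</p>
          </body>
        </html>`);
    });

    app.listen(3000);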

The search engines are good at crawling content served in this way. You’ll find server-side rendering used on websites programmed in ASP, PHP, Ruby and many other languages. You might even have used it without realising if you’ve had a website built in a content management system (CMS) like Magento or WordPress.

Client-Side Rendering

Client-side rendering makes things more difficult for the search bots, which is usually bad news for SEO. The initial page the bot receives may be almost blank until JavaScript executed on the client side populates it with dynamically rendered content.
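
As a simplified sketch, a client-side rendered page often ships an almost empty HTML shell, such as a single <div id="app"></div>, and fills it with script along these lines (the endpoint and element names are hypothetical):

    // Client-side rendering sketch: the visible content only exists in the DOM
    // after this script has run, so a crawler that does not render JavaScript
    // sees little more than an empty <div id="app"></div>.
    fetch('/api/page-content')                      // hypothetical endpoint
      .then((response) => response.json())
      .then((data) => {
        document.getElementById('app').innerHTML = `
          <h1>${data.title}</h1>
          <p>${data.body}</p>`;
      });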

It’s not impossible to create crawlable websites in this way, and there are plenty of tools that emulate what the search robot would see; however, server-side rendering is generally held to be the ‘safer’ option.

What Does Googlebot See?

When Googlebot encounters a page written using JS, it crawls, renders and then indexes the content (subject to any meta tags or robots.txt rules that prevent it from accessing the page).

Client-side rendering is more resource-intensive, so Googlebot, working flat out to index as much of the web as it can reach, may take several days to fully render and index a JS-heavy website. It may also miss important SEO content, such as page titles and meta tags, if they are not rendered server-side.
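
For example, a title or meta description that only exists after client-side script has run, as in the hedged sketch below, depends entirely on that script being rendered in time; emitting the same tags in the server-rendered HTML avoids the risk.

    // Setting the page title and a meta description from client-side code.
    // If Googlebot's rendering of this script is delayed or fails, the tags
    // may never be seen, which is why server-rendered tags are safer.
    document.title = 'Example page title';

    const description = document.createElement('meta');
    description.name = 'description';
    description.content = 'Example meta description set by JavaScript.';
    document.head.appendChild(description);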

Static vs. Dynamic Content

In the context of a web page, static content includes any text, images and other objects that remain unchanged once they are loaded. Most parts of most ordinary pages are ‘static’ in that, once the HTML code has been rendered and displayed, the page remains the same until the user navigates elsewhere via a hyperlink or their browser’s ‘back’ button.

Dynamic content is any element that can be created or changed after the page has already loaded. That could be an image slideshow, a video or an animation, for example. JavaScript is a common way to achieve this, and the search robots, including Googlebot, will normally wait only 3-4 seconds for any JS to execute on page load before taking a snapshot of the resulting HTML to index.
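
As a simple illustration (the container element is hypothetical), the snippet below adds a slide to the page only after it has loaded, which is exactly the kind of content that relies on the robot rendering your JavaScript in time:

    // Dynamic content: this element is not in the initial HTML and only
    // appears once the page has loaded and the script has run.
    window.addEventListener('load', () => {
      const slide = document.createElement('div');
      slide.className = 'slide';
      slide.textContent = 'This slide was added by JavaScript after page load.';
      document.getElementById('slideshow').appendChild(slide);  // hypothetical container
    });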

Crawled vs. Indexed

The difference between a page being crawled vs. being indexed is quite subtle:

  • Crawled means Googlebot has ‘seen’ the page and taken a snapshot of its content.
  • Indexed means that content has been analysed and added to the search index, making it eligible to appear in search results.

For the best chance of appearing high in the search results, content should be easy to crawl and easy to index. JS makes content more opaque, so if you decide to use it, especially with client-side rendering, you should test thoroughly to ensure it is as lightweight and fast to load as possible.

What Version of JavaScript Does Googlebot Use?

Googlebot is regularly updated to ensure it supports recent JavaScript features when rendering content, so you can be fairly confident that it will have the basic capabilities needed to load your page.

It runs on the Chromium engine – the open-source software that powers web browsers including Google Chrome and (since 2020) Microsoft Edge. So viewing your website in a Chromium-based browser is a good way to get an instant preview of how your content might be seen by Googlebot – there are also plenty of tools that can give you a more exact impression of how Googlebot will render your page.

SEO Concerns in JavaScript

When deploying a website using JavaScript, it’s important to keep aspects of SEO firmly in mind. These range from the technical – ensuring the page renders within 3-4 seconds without errors – to content-based SEO issues like meta tags and navigation links.

Transparent Links

Search robots discover new pages by following hyperlinks from pages they have previously found, so it’s essential to make your navigation visible by using <a> anchor tags for your hyperlinks, with recognisable URL values in their href attributes.

It’s OK to use JavaScript to generate the URL, for example by executing a script once the page loads using a JS onload event, but make sure this is contained within an <a> element on the page so it’s apparent to the robots that it is a hyperlink.
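
Here is a hedged sketch of what that can look like in practice; the URL and the container element are hypothetical.

    // Generating a link with JavaScript in a way robots can still follow:
    // the URL may be built in script, but it ends up inside a real <a>
    // element with a normal href value.
    window.addEventListener('load', () => {
      const link = document.createElement('a');
      link.href = '/guides/javascript-seo/';                    // hypothetical URL
      link.textContent = 'JavaScript SEO guide';
      document.getElementById('main-nav').appendChild(link);    // hypothetical container
    });

    // By contrast, a clickable element with no <a> tag and no href, such as a
    // <span> with only an onclick handler, is much harder for search robots
    // to recognise as a hyperlink.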

This is one of the most important and basic things to implement on a JS website. Without clearly visible internal links, you risk publishing pages that are impossible for the search bots to find, which in turn means they will never be properly crawled or indexed, or appear in search results.

Accessible Content

As well as being discoverable, content should be accessible. Make sure it is rendered correctly by using one of the many ‘view as Googlebot’ tools available online – we’ll list some of them below.

Good, accessible content is a strong basis for SEO efforts in general, so adhere to SEO best practices: quality content, structured well, and not duplicated elsewhere online (you can and should use rel="canonical" tags in the page’s <head> to indicate that a page is the ‘master copy’ and any others found online are duplicates).
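
If the canonical tag cannot be written into the server-rendered HTML, it can be injected with JavaScript as in the hedged sketch below, although placing it directly in the initial HTML is generally the safer approach; the URL shown is hypothetical.

    // Injecting a canonical tag with JavaScript. A tag in the server-rendered
    // HTML is generally safer, as it does not depend on the script being rendered.
    const canonical = document.createElement('link');
    canonical.rel = 'canonical';
    canonical.href = 'https://www.example.com/guides/javascript-seo/';  // hypothetical URL
    document.head.appendChild(canonical);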

No Session Data

JavaScript can work with server-side session data to store information throughout a user’s visit until they close the browser tab or navigate away from the website; however, search robots typically load each page completely afresh, with no session data stored.

It’s worth remembering this and testing each page completely independently, to ensure there are no JavaScript errors if the page is loaded without a session identifier or any cached data or browser cookies.
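
A rough sketch of this kind of defensive handling is shown below; the cookie name is hypothetical, and the point is simply that the page should still render cleanly when no session exists at all.

    // Defensive handling of session state: assume there may be no session
    // cookie at all, which is typically the case when a search robot visits.
    function getSessionId() {
      const match = document.cookie.match(/(?:^|;\s*)sessionId=([^;]+)/);  // hypothetical cookie name
      return match ? match[1] : null;
    }

    const sessionId = getSessionId();
    if (sessionId) {
      // Personalise the page for a returning visitor.
    } else {
      // No session: fall back to the default content without throwing errors.
    }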

How to Improve SEO on JS Websites

We’ve already mentioned some fundamental ways to improve SEO on JS websites:

  • Use server-side rendering (or a server-side framework) to reduce client-side resource demand.
  • Improve page load speeds so final rendering occurs within 3-4 seconds.
  • Make sure content is visible and easily understandable – including navigation links.

For more specific and detailed improvements, it’s useful to know how your page renders to the search robots – and you should use emulator tools to check the output seen by both desktop and mobile users, since mobile has become a significant ranking factor in recent years.

Testing Content in the Browser

A quick way to test page content is to right-click on any page element and choose ‘Inspect’. This will bring up the Developer Tools panel and should go directly to that element in the HTML code. (You can also launch Developer Tools by pressing F12 in most modern browsers.)

If you can’t see the information you expect using the Inspect tool, it’s possible that the element is being displayed using client-side rendering, and that might mean the bots cannot see any associated SEO content such as ‘alt’ and ‘title’ attributes.
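
One quick check you can run in the DevTools Console is to list elements in the rendered DOM that are missing this kind of attribute, for example:

    // Run in the DevTools Console: list any images in the rendered DOM that
    // have no alt attribute, a common gap on client-rendered pages.
    const missingAlt = document.querySelectorAll('img:not([alt])');
    console.log(`${missingAlt.length} image(s) without alt text`, missingAlt);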

You can also navigate to the Sources tab in Developer Tools, which lists all of the resources required to load the page. Here you can identify any external JavaScript files that were executed – including any you were not already aware were being used.

Google Search Console

Google Search Console provides a more powerful way to interpret your page content as Googlebot would see it – the URL Inspection Tool. You can start this process by just pasting the page URL into the search box at the top of Search Console.

This is a good way to get a snapshot of your page as it appears to mobile devices, which should be a priority in any present-day SEO campaign. If content that renders correctly on desktop devices is not visible to mobile users, consider making the necessary changes to make it accessible across all platforms, operating systems, browsers and screen sizes using responsive web design techniques.

Other Testing Tools

Finally, here’s a list of some more Google testing tools and third-party search robot emulators to help you render your content as the bots would see it, and diagnose any problems that need to be put right for improved SEO.

Google Mobile-Friendly Test

Google’s Mobile-Friendly Test tool will give you quick results and score your page for accessibility on mobile devices – a fast way to identify any fundamental design flaws.

Google PageSpeed Insights

Another Google tool, PageSpeed Insights, can help you to accelerate your page loading times – crucial for any that are over 3-4 seconds.

Third-Party Tools

There are a huge number of third-party tools to emulate search robot rendering, compare raw code against the rendered output, and test small tweaks directly in the browser.

Examples include:

  • BuiltWith – A free tool to identify what framework a website is built on.
  • DeepCrawl – Crawls an entire website, ideal for mass testing of sitewide rendering.
  • Diffchecker – Looks for differences between original page code and rendered output.

By adopting best practice for SEO and JavaScript websites from the outset, you can reduce your reliance on testing tools and should find fewer issues to resolve when you use them.

Ultimately, JS websites are not a problem for SEO in the modern era – they just need a little intelligent design and forward planning to keep your content fast, transparent and easy to navigate.