Websites built using JavaScript frameworks have not always been considered the best option for SEO. But while ranking highly in the search results has become more and more important over the years, client-side rendered JavaScript sites no longer need to be a problem.

This article will look at some of the technical considerations when using JavaScript frameworks for SEO-driven websites, to make sure your pages rank well.

Googlebot and JavaScript

Googlebot has steadily improved in its ability to crawl and index JavaScript pages. It ‘sees’ pages just as they would load in a human user’s browser, but it does not support stateful features such as cookies and session data, which means each new page is loaded completely afresh, with no session or state retained.

The Google crawler robot is also capable of following JavaScript redirects and will include the redirected URL in its search index. In this sense, you can think of JavaScript redirects as similar to 301 redirects, the HTTP status code that indicates a page or resource has moved permanently to another location.
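
As a rough illustration, a client-side redirect might look something like the sketch below (the destination URL is a placeholder); where you have access to the server, a true 301 response is still the more explicit signal:

    <script>
      // Client-side JavaScript redirect that Googlebot can follow
      // (the destination URL is illustrative only)
      window.location.replace('https://www.example.com/new-page/');
    </script>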

How does Googlebot follow JavaScript links?

In general, Googlebot can follow links provided that they are contained within an <a> anchor tag, including in the following ways:

  • Inline JavaScript such as "javascript:window.location" contained within the href attribute
  • External functions, e.g. "javascript:followlink()", called from the href attribute
  • Functions triggered on the click of an <a> element, rather than in the href attribute
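
As a hedged sketch, those patterns might look something like this in practice (the URLs and the followlink() and loadPage() function names are illustrative only):

    <!-- Standard crawlable link: the href gives Googlebot a clear destination -->
    <a href="/products/widgets/">Widgets</a>

    <!-- Inline JavaScript in the href attribute -->
    <a href="javascript:window.location='/products/widgets/'">Widgets</a>

    <!-- External function called from the href attribute -->
    <a href="javascript:followlink('/products/widgets/')">Widgets</a>

    <!-- Function triggered on click, with the <a> tag acting as the signpost -->
    <a href="/products/widgets/" onclick="loadPage(event, '/products/widgets/')">Widgets</a>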

The important factor here is the use of an <a> tag to act as a signpost for Googlebot to recognise that there is a link, and to then determine how to follow that link and crawl the content it points to.

Once the link is followed, the target content should render relatively quickly – ideally within 3 seconds or less – to make sure Googlebot sees it, crawls it correctly and indexes it with the appropriate SEO value.

How to Make JavaScript Frameworks SEO Friendly

There are a few ‘best practice’ techniques for SEO friendly JavaScript websites, including:

Server-Side Rendering

Server-side rendering gives you good control over what Googlebot sees, as the rendering takes place on your side, rather than at the whim of the bot’s client-side capabilities.

This has benefits for SEO. By serving pre-rendered content to Googlebot, you can essentially optimise that content for SEO just as you would any static web page.
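
As a minimal sketch, assuming a Node.js/Express setup and a hypothetical renderAppToHtml() helper standing in for your framework's own server renderer, server-side rendering might look roughly like this:

    const express = require('express');
    const app = express();

    // Hypothetical stand-in for your framework's server renderer,
    // e.g. ReactDOMServer.renderToString() in a React application
    async function renderAppToHtml(url) {
      return `<h1>Rendered content for ${url}</h1>`;
    }

    app.get('*', async (req, res) => {
      const html = await renderAppToHtml(req.url);
      // Googlebot receives fully rendered HTML it can crawl and index immediately
      res.send(`<!DOCTYPE html><html><head><title>Pre-rendered page</title></head>` +
               `<body><div id="app">${html}</div></body></html>`);
    });

    app.listen(3000);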

Media Elements

If your page has multiple media elements, for example in a dynamically updated image gallery or slideshow, make sure the search robots can see the location of all the media files, and not just the first one that appears by default on the pre-rendered page.

JavaScript is somewhat limiting in this circumstance, but you can instead use CSS to update the image displayed. By doing so, it should be possible to provide Googlebot with the URIs of all the images, increasing your chances of getting them all included in Google Images search results.
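
One way to approach this, sketched below with purely illustrative class names and file paths, is to keep every image in the markup and let CSS control which one is visible, rather than swapping src values in JavaScript:

    <style>
      .gallery img { display: none; }          /* hide slides by default */
      .gallery img.active { display: block; }  /* show only the current slide */
    </style>

    <!-- All image URIs are present in the HTML, so Googlebot can discover each one -->
    <div class="gallery">
      <img class="active" src="/images/slide-1.jpg" alt="First slide">
      <img src="/images/slide-2.jpg" alt="Second slide">
      <img src="/images/slide-3.jpg" alt="Third slide">
    </div>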

Pagination

Paginated content can be problematic, but the best-practice approach here is relatively simple: make sure each index page has its own URL and is not just dropped dynamically into the original index page.

By creating true pages with distinct URLs, you give Googlebot addresses that it can resolve, and you increase the number of pages on your website that can be crawled and indexed. This applies across all navigation on your site and has natural benefits for accessibility too.
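
As a sketch, with the URL structure assumed purely for illustration, pagination links rendered as plain anchors give each index page a crawlable address, in contrast to a JavaScript-only ‘load more’ button:

    <!-- Each index page has its own distinct, crawlable URL -->
    <nav class="pagination">
      <a href="/blog/">1</a>
      <a href="/blog/page/2/">2</a>
      <a href="/blog/page/3/">3</a>
      <a href="/blog/page/2/">Next</a>
    </nav>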

Metadata

Be consistent about how you update metadata as the search bots move between pages when crawling your site. It’s good to ensure every page has at least the basic metadata added to it, such as:

  • <title> tags for the page title that appears in the browser tab or title bar
  • <meta name="description"> tags to provide search bots with a summary
  • <link rel="canonical"> tags for pages that may be duplicated elsewhere
  • <link rel="alternate" hreflang="en-gb"> tags for pages in multiple languages
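
Pulled together, the basic metadata for a page might look something like this (the URLs and text are placeholders):

    <head>
      <title>Example Page Title</title>
      <meta name="description" content="A short summary of the page for the search results.">
      <link rel="canonical" href="https://www.example.com/example-page/">
      <link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/example-page/">
    </head>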

The end justifies the means here. Choose a solution that you are happy with, and comfortable using, as long as it leads to complete and correct metadata on every page.

How to Test JavaScript Sites for SEO

There are a growing number of tools that not only render JavaScript pages to emulate search bot crawling, but can also allow you to directly compare the rendered page with the raw code.

Some examples include:

  • Botify
  • DeepCrawl
  • Screaming Frog

Render the page content first using your preferred tool to process the JavaScript just as one of the search robots would do, and then you are in a good position to start troubleshooting any problems with the SEO elements and content on the page.

Fetch as Google and URL Inspection Tool

In the past, Google’s own ‘Fetch as Google’ tool allowed you to see your web pages exactly as Googlebot would do. It has now been replaced by the URL Inspection Tool accessible via Google Search Console.

To use:

  • Paste the URL you want to test into the search box at the top of Google Search Console
  • Click ‘Test Live URL’ at the top-right of the result page
  • Click ‘View Tested Page’ on the main ‘URL Inspection’ panel to see your page code and a screenshot of how Googlebot sees your page

Even if you use the third-party tools mentioned above, it’s good practice to run pages through the URL Inspection Tool for a quick and easy impression of how Google sees your site, and any content it cannot see at all.

Turn Off JavaScript

Finally, a less technical way to test your site is simply to turn off JavaScript rendering in your browser, and then load your website. Any content that does not load with JavaScript turned off is likely to prove troublesome for SEO and your search rankings, and is a good place to start when making updates and fixing problems with your search presence.

You can deactivate JavaScript using your browser settings, or if you’re comfortable using the Developer Tools panel, just open it up, head to the ‘Network’ tab and choose ‘Block Request URL’ on the context menu of the resource you want to block.

Reload the page and see what’s changed – you might identify updates you can make to improve your SEO, or you might just find a resource that is not needed for the page to function correctly, which can then be removed to improve your page loading speed.

Conclusion

All SEO, from good old-fashioned meta tags to microformats and microdata markup, is about ensuring content is not only visible but optimised to be crawled and indexed by the search robots.

JavaScript frameworks are not inherently problematic for this – they just need a little forethought and some basic best practice principles to be applied. By doing so, you can ensure you claim your place at the top of the search results while providing a fast, logical and accessible website to your human visitors too.