It has been around for decades – it was reportedly created in just ten days in September 1995 by Netscape programmer Brendan Eich – and was originally conceived as a way for HTML pages to interact with web applets written in the programming language Java.
Is JS Significant for SEO?
Whichever way you use JS, keep in mind that your content needs to be visible, crawlable and indexable by the search robots, which have not always been highly capable of parsing JS code and understanding its output.
Search engines are good at crawling content served this way – rendered on the server and delivered as complete HTML. You’ll find server-side rendering used on websites programmed in ASP, PHP, Ruby and many other languages. You might even have used it without realising, if you’ve had a website built in a content management system (CMS) like Magento or WordPress.
It’s not impossible to create crawlable websites with client-side rendering, and there are plenty of tools that emulate what the search robot would see; however, server-side rendering is generally held to be the ‘safer’ option.
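To see why, here’s a simplified sketch (the product data, URLs and markup are illustrative, not taken from any real site) of what a search robot initially receives under each approach:

```html
<!-- Server-side rendering: the HTML arrives complete, so the crawler
     sees the content immediately -->
<div id="product">
  <h1>Blue Widget</h1>
  <p>In stock – £9.99</p>
</div>

<!-- Client-side rendering: the initial HTML is an empty shell, and the
     content only exists after this script has run -->
<div id="product"></div>
<script>
  fetch('/api/product/123')            // hypothetical API endpoint
    .then((res) => res.json())
    .then((data) => {
      document.getElementById('product').innerHTML =
        `<h1>${data.name}</h1><p>${data.stock} – ${data.price}</p>`;
    });
</script>
```

With the second pattern, the crawler has to execute the script before any product content exists at all – and that extra work is exactly what can delay or derail indexing.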
What Does Googlebot See?
When Googlebot encounters a page written using JS, it crawls, renders and then indexes the content (subject to any meta tags or robots.txt rules that prevent it from accessing the page).
Client-side rendering is more resource-intensive, so Googlebot, working flat out to index as much of the web as it can reach, may take several days to crawl a JS-heavy website. It may also miss important SEO content, such as page titles and meta tags, if they are not rendered server-side.
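As a hedged illustration (the titles and description below are hypothetical), a page title that only exists after a script has run is at risk of being missed, while one present in the raw HTML is picked up on the very first crawl:

```html
<!-- Safe: present in the raw HTML, so visible on the first crawl -->
<head>
  <title>Blue Widgets | Example Shop</title>
  <meta name="description" content="Hand-made blue widgets with free delivery.">
</head>

<!-- Risky: this value only exists once the JS bundle has run -->
<script>
  document.title = 'Blue Widgets | Example Shop';
</script>
```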
Static vs. Dynamic Content
In the context of a web page, static content includes any text, images and other objects that remain unchanged once they are loaded. Most parts of most ordinary pages are ‘static’ in that, once the HTML code has been rendered and displayed, the page remains the same until the user navigates elsewhere via a hyperlink or their browser’s ‘back’ button. Dynamic content, by contrast, can change after the page has loaded – typically rewritten by JS in response to user actions or fresh data – and it is this kind of content that search robots have historically found harder to see.
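As a small illustrative sketch (the element names and delivery data are invented for this example), the heading below is static, while the delivery estimate is dynamic because JS rewrites it after the page has loaded:

```html
<!-- Static: part of the initial HTML, unchanged after load -->
<h1>Delivery Options</h1>

<!-- Dynamic: this element changes after load, in response to the user -->
<p id="eta">Select your region to see a delivery estimate.</p>
<select id="region" onchange="updateEta(this.value)">
  <option value="uk">UK</option>
  <option value="eu">EU</option>
</select>
<script>
  // Hypothetical lookup; on a real page this might come from an API
  function updateEta(region) {
    const etas = { uk: '1-2 days', eu: '3-5 days' };
    document.getElementById('eta').textContent =
      'Estimated delivery: ' + etas[region];
  }
</script>
```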
Crawled vs. Indexed
The difference between a page being crawled vs. being indexed is quite subtle:
- Crawled means Googlebot has ‘seen’ the page and taken a snapshot of its content.
- Indexed means the content has been analysed and stored in the search engine’s index, making it eligible to appear in search results.
For the best chance to appear high in the search results, content should be easy to crawl and easy to index. JS makes content more opaque, and if you decide to use it – especially with client-side rendering – you should test thoroughly to ensure it is as lightweight and fast to load as possible.
Googlebot renders pages using the Chromium engine – the open-source software that powers web browsers including Google Chrome and (since 2020) Microsoft Edge. Viewing your website in a Chromium-based browser is therefore a good way to get an instant preview of how your content might be seen by Googlebot – and there are plenty of tools that can give you a more exact impression of how Googlebot will render your page.
Search robots discover new pages by following hyperlinks from pages they have already found, so it’s essential to make your navigation visible by using <a> anchor tags with real, recognisable URLs in their href attributes.
This is one of the most important and basic things to implement on a JS website. Without clearly visible internal links, you risk publishing pages that are impossible for the search bots to find, which in turn means they will never be properly crawled or indexed, or appear in search results.
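As a quick sketch – the URLs here are illustrative – the first link below is the kind search robots can reliably follow; the JS-only patterns after it may never be discovered:

```html
<!-- Crawlable: a real anchor tag with a recognisable URL -->
<a href="/products/blue-widget">Blue Widget</a>

<!-- Not reliably crawlable: no usable href, navigation handled by JS alone -->
<span onclick="location.href='/products/blue-widget'">Blue Widget</span>
<a href="#" onclick="goTo('blue-widget')">Blue Widget</a>
```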
As well as being discoverable, content should be accessible. Make sure it is rendered correctly by using one of the many ‘view as Googlebot’ tools available online – we’ll list some of them below.
Good accessible content is a strong basis for SEO efforts in general, so adhere to SEO best practices: quality content, structured well, and not duplicated elsewhere online (you can and should use rel="canonical" tags in the page’s <head> to indicate that a page is the ‘master copy’ and any others found online are duplicates).
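For reference, the canonical tag is a single line in the page’s <head> (the URL here is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/products/blue-widget">
```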
How to Improve SEO on JS Websites
We’ve already mentioned some fundamental ways to improve SEO on JS websites:
- Use server-side rendering (or a server-side framework) to reduce client-side resource demand.
- Improve page load speeds so final rendering occurs within 3-4 seconds (there’s a small example after this list).
- Make sure content is visible and easily understandable – including navigation links.
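On the second point, one common technique – a sketch rather than a complete performance strategy, and the script paths below are placeholders – is to stop JS files from blocking the initial render:

```html
<!-- defer downloads the script in parallel and runs it only after the
     HTML has been parsed, so it no longer blocks rendering -->
<script defer src="/js/app.js"></script>

<!-- async also downloads in parallel but runs as soon as it arrives;
     suited to scripts that don't depend on the DOM or on each other -->
<script async src="/js/analytics.js"></script>
```

Both attributes are standard HTML and widely supported; defer preserves execution order, which makes it the safer default for scripts that manipulate the page.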
For more specific and detailed improvements, it’s useful to know how your page renders to the search robots – and you should use emulator tools to check the output seen by both desktop and mobile users, since mobile has become a significant ranking factor in recent years.
Testing Content in the Browser
A quick way to test page content is to right-click on any page element and choose ‘Inspect’. This will bring up the Developer Tools panel and should go directly to that element in the HTML code. (You can also launch Developer Tools by pressing F12 in most modern browsers.)
If you can’t see the information you expect using the Inspect tool, it’s possible that the element is being displayed using client-side rendering, and that might mean the bots cannot see any associated SEO content such as ‘alt’ and ‘title’ attributes.
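One way to check, sketched below with a hypothetical selector, is to compare the rendered DOM (what Inspect shows you) with the raw HTML the server sent:

```js
// Run this in the Developer Tools console on the live page
const img = document.querySelector('img.hero');   // hypothetical selector
console.log(img ? img.getAttribute('alt') : 'not in the rendered DOM');

// Now fetch the raw HTML the server sent, before any JS ran.
// If the element is missing here but present above, it is being
// added by client-side rendering.
fetch(location.href)
  .then((res) => res.text())
  .then((html) => console.log(
    html.includes('class="hero"') ? 'present in raw HTML' : 'added client-side'
  ));
```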
Google Search Console
Google Search Console provides a more powerful way to interpret your page content as Googlebot would see it – the URL Inspection Tool. You can start this process by just pasting the page URL into the search box at the top of Search Console.
This is a good way to get a snapshot of your page as it appears to mobile devices, which should be a priority in any present-day SEO campaign. If content that renders correctly on desktop devices is not visible to mobile users, consider making the necessary changes to make it accessible across all platforms, operating systems, browsers and screen sizes using responsive web design techniques.
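A minimal starting point for responsive design – the breakpoint and class name below are illustrative – is the viewport meta tag combined with CSS media queries:

```html
<!-- Tell mobile browsers to use the device width rather than a
     zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Stack the sidebar under the main content on narrow screens */
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```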
Other Testing Tools
Finally, here’s a list of some more Google testing tools and third-party search robot emulators to help you render your content as the bots would see it, and diagnose any problems that need to be put right for improved SEO.
Google Mobile-Friendly Test
Google’s Mobile-Friendly Test tool will give you quick results and score your page for accessibility on mobile devices – a fast way to identify any fundamental design flaws.
Google PageSpeed Insights
Another Google tool, PageSpeed Insights, can help you to accelerate your page loading times – crucial for any that are over 3-4 seconds.
There are a huge number of third-party tools to emulate search robot rendering, compare raw code against the rendered output, and test small tweaks directly in the browser.
- BuiltWith – A free tool to identify what framework a website is built on.
- DeepCrawl – Crawls an entire website, ideal for mass testing of sitewide rendering.
- Diffchecker – Looks for differences between original page code and rendered output.
Ultimately, JS websites are not a problem for SEO in the modern era – they just need a little intelligent design and forward planning to keep your content fast, transparent and easy to navigate.