JavaScript (JS), initially named LiveScript, was developed by Brendan Eich in 1995. Put simply, JS is a scripting language that enables web developers to embed code within a website.

JS also allows you to perform tasks that are not possible with the traditional HTML markup language alone.

These are mostly interactive behaviours, such as displaying hamburger menus or zooming in and out on an image.

JS’ advantage lies in the fact that it is easy to learn, allows rapid prototyping and development, and makes a website more interactive, improving the user experience.

With this in mind, we have put together a list of common JS errors and advice on how to avoid them.

Don’t block JS in robots.txt

Historically, search engines were unable to crawl JS files. Therefore, these files were stored in separate directories and blocked via robots.txt, a text file created by website owners to instruct search engines on what content they may crawl.

Since then, however, search engines have evolved, and five years ago Google announced that it could crawl JavaScript and CSS.

This meant that webmasters no longer needed to block these files in robots.txt.

It’s important, therefore, that you don’t block JS if you want the page to be indexed by search engines.

You can check whether this is the case by logging into your Google Search Console and inspecting a URL.

If a lot of JS files across your site are blocked, Google may send you a message informing you that Googlebot cannot access them.

To fix the issue, conduct a site crawl to identify the blocked files and remove the corresponding rules from robots.txt.
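As a minimal sketch of what to look for, a rule like the first one below blocks Googlebot from your script files (the /js/ directory name is just an illustration); removing it, or explicitly allowing the directory, restores access:

```
# Problematic: hides all script files from crawlers
User-agent: *
Disallow: /js/

# Fixed: scripts are crawlable again
User-agent: *
Allow: /js/
```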

Preserve site speed by placing JS below the fold

For a browser to be able to display a web page, it must render its content, whether HTML, CSS, or JS.

This means that rendering JS takes a little more time, as the browser must first request the script file and wait until it has downloaded from the server before executing it.

JavaScript loads in order of appearance. Therefore, to increase your site speed, place scripts lower on the page and reserve the above-the-fold space for content that does not interfere with load speed.
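As a sketch of this idea (the file names here are hypothetical), a script can be placed just before the closing </body> tag, or loaded with the defer attribute so the browser downloads it in parallel and runs it only after the HTML has been parsed:

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Example page</title>
    <!-- defer: fetched in parallel, executed after HTML parsing -->
    <script src="/js/app.js" defer></script>
  </head>
  <body>
    <h1>Above-the-fold content renders without waiting</h1>
    <!-- Alternative: script placed at the bottom of the body -->
    <script src="/js/widgets.js"></script>
  </body>
</html>
```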

Slow site speed degrades user experience and slows down page crawling. In fact, if a page takes longer than five seconds to load, it might not be indexed by search engines.

Assess whether your JavaScript files are essential to the webpage, and whether they are worth making your users wait longer for the page to load.

Only inline small JS in the <head>

Continuing from the previous point, only small JS files should be inlined in the <head> of the web page. This is especially important when the <head> also includes other essential SEO elements such as canonical, hreflang, and robots (index/follow) tags.

To avoid issues, place the most crucial SEO tags at the top of the <head>, before any JavaScript files. To check whether you have any inlined <script> in the <head>, click “view page source”. Place <link> tags near the top of the <head> element so they are rendered before the JavaScript files, and for optimal performance, only inline small scripts.
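A sketch of this ordering might look as follows (the URLs are placeholders):

```html
<head>
  <!-- Crucial SEO tags first, before any scripts -->
  <link rel="canonical" href="https://example.com/page" />
  <link rel="alternate" hreflang="en" href="https://example.com/page" />
  <meta name="robots" content="index, follow" />

  <!-- Only a small inline script, placed after the SEO tags -->
  <script>
    // Hypothetical: a few lines of critical set-up only
    document.documentElement.classList.add("js-enabled");
  </script>
</head>
```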

Rethink JS impact on content rendering

If your website relies on JavaScript for content rendering, Google must process it using its Web Rendering Service before anything meaningful can be indexed.

DOM stands for Document Object Model: a platform- and language-neutral interface that represents the state of the web page in the browser.
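As a minimal sketch, this is what JS-driven changes to the DOM look like (the element ID is hypothetical); note that content created this way exists only after the script runs, not in the raw HTML:

```javascript
// Modifies an element that already exists in the HTML
const heading = document.querySelector("#page-title");
heading.textContent = "Updated via the DOM";

// Creates an element that exists only in the DOM
const note = document.createElement("p");
note.textContent = "This paragraph was injected by JavaScript.";
document.body.appendChild(note);
```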

Googlebot initially crawls the HTML content, and only then does it render the JavaScript. Depending on the crawl budget that Google has allocated to your website, this second step can take several days or even weeks.

Also, because this is a two-step crawling process, issues can arise when the two versions of the content do not match, for example when JS rewrites the content delivered in the traditional HTML.

You can disable JS in your browser to see if any of the content is affected by it, or check the rendered page using Google’s Mobile-Friendly Test.

If you detect crucial content being created by JS in the DOM, switch to creating those elements in HTML to improve crawlability, because if crawlability suffers, fewer pages will be discovered.
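As an illustration (the product copy is invented), the first version below exists only after the script executes, while the second ships the same content as plain HTML:

```html
<!-- Harder to index: content appears only after the script runs -->
<div id="product"></div>
<script>
  document.getElementById("product").innerHTML =
    "<h2>Blue Widget</h2><p>In stock</p>";
</script>

<!-- Easier to index: the same content delivered as static HTML -->
<div id="product">
  <h2>Blue Widget</h2>
  <p>In stock</p>
</div>
```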

Avoid JS redirects at all costs

Another element that can slow down site speed and disrupt user experience is the JS redirect. JS redirects are commonly used by developers, as Googlebot can understand them and treat them as standard redirects.

The issue lies in the fact that, as previously mentioned, JS is processed in the second round of website crawling, meaning that JS redirects may take days or weeks to be crawled and indexed.
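For reference, a JS redirect is typically a snippet like the one below (the target URL is a placeholder); where possible, a server-side 301 redirect is preferable, as Googlebot picks it up on the first crawl:

```html
<script>
  // Only takes effect once the page is rendered, not on the first crawl
  window.location.replace("https://example.com/new-page");
</script>
```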

It is important to note that all site speed tests, and consequently all user experience tests, are subject to the quality of the device and internet connection used.

Furthermore, results from different devices and locations are not directly comparable, as internet connection quality varies significantly between countries.

For example, a user in Australia accessing a website hosted in the USA will experience a considerably longer loading time than a user located in the USA.