JavaScript also allows tasks that are not possible with the traditional HTML markup language alone. These are mostly interactive behaviours, such as displaying hamburger menus or zooming in and out on an image. JS's advantage lies in the fact that it is easy to learn, allows rapid prototyping and development, and makes a website more interactive, improving the user experience. With this in mind, we have put together a list of tips for avoiding common errors that occur when using JS.
Don’t block JS in robots.txt
Historically, search engines were unable to render JS files, so webmasters often stored them in separate directories and blocked those directories via robots.txt, a text file created by website owners to instruct search engines on what content to crawl. Today, Googlebot can render JS, so this practice is obsolete.
It’s important, therefore, that you don’t block JS if you want the page to be indexed by search engines.
You can check whether this is the case by logging into your Google Search Console and inspecting a URL.
If many JS files across your site are blocked, Google may send a message in Search Console informing you that Googlebot cannot access your JS files.
Conduct a site crawl and remove the blocking directives from robots.txt to fix the issue.
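As a sketch, here is what a blocking rule and its fix might look like in robots.txt (the /js/ directory is hypothetical):

```txt
# Before (problematic): Googlebot cannot fetch any script under /js/
User-agent: *
Disallow: /js/

# After (fixed): the Disallow line is removed, or access is explicitly allowed
User-agent: *
Allow: /js/
```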
Preserve site speed by placing JS below the fold
For a browser to display a web page, it must process all of its resources, whether HTML, CSS, or JS.
Rendering JS takes a little more time, as the browser must first request the script file and wait for it to download from the server before executing it. Placing scripts near the end of the page, below the fold, lets the visible content render before the scripts are fetched.
Low site speed worsens the user experience and slows page crawling. In fact, if a page takes longer than five seconds to load, it might not be indexed by search engines.
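As an illustration (file names hypothetical), here is a render-blocking script in the <head> versus the same script moved to the end of the <body>, where it no longer delays the first paint:

```html
<!-- Render-blocking: the browser stops to fetch and run this before painting -->
<head>
  <script src="app.js"></script>
</head>

<!-- Better: visible content renders before the script loads -->
<body>
  <h1>Page content appears immediately</h1>
  <script src="app.js"></script>
</body>
```

A modern alternative is the `defer` or `async` attribute on the `<script>` tag, which lets the HTML parser continue while the file downloads.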
Only inline small JS in the <head>
Continuing the previous point, only small JS files should be inlined in the <head> of the web page. This is especially important when the <head> also contains essential SEO elements such as canonical tags, hreflang tags, and robots meta tags (index, follow): a <head> bloated with large scripts can delay the discovery of those tags.
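A sketch of what this looks like (URLs and file names hypothetical): a few lines of critical JS inlined directly in the <head> alongside the SEO tags, while larger files stay external:

```html
<head>
  <link rel="canonical" href="https://example.com/page" />
  <link rel="alternate" hreflang="en" href="https://example.com/page" />
  <meta name="robots" content="index, follow" />
  <!-- Small, critical snippet: acceptable to inline -->
  <script>document.documentElement.className = 'js';</script>
  <!-- Large library: keep external rather than bloating the <head> -->
  <script src="/js/main.js" defer></script>
</head>
```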
Rethink JS impact on content rendering
DOM stands for Document Object Model, a platform- and language-neutral interface that represents the state of the page in the browser.
Google processes JS in two waves: it first crawls the raw HTML and only renders the JS in a second pass. Because of this two-step crawling process, issues can arise when JS rewrites the original HTML content and the rendered page no longer matches what was first crawled.
You can disable JS to see if any of the content is affected by it or check it using Google’s Mobile-Friendly Test.
If you find that important content is generated by JS in the DOM, switch to creating those elements in plain HTML to improve crawlability, because if crawlability suffers, fewer pages will be discovered.
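For example (hypothetical markup), content injected into the DOM by JS is only visible after rendering, whereas the same content in plain HTML is crawlable on the first pass:

```html
<!-- Only crawlable after Google's second, rendering pass -->
<div id="products"></div>
<script>
  document.getElementById('products').innerHTML =
    '<h2>Our products</h2><p>Loaded by JavaScript.</p>';
</script>

<!-- Crawlable on the first pass: content is in the raw HTML -->
<div id="products">
  <h2>Our products</h2>
  <p>Present in the initial markup.</p>
</div>
```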
Avoid JS redirects at all costs
Another element that can slow down site speed and degrade the user experience is the use of JS redirects. Developers commonly use them because Googlebot can understand a JS redirect and treat it as a standard redirect.
The issue is that, as previously mentioned, JS is processed in the second round of crawling, meaning JS redirects may take days or weeks to get crawled and indexed.
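A JS redirect looks like the snippet below (URL hypothetical); by contrast, a server-side 301 redirect is communicated in the HTTP response itself, so it is picked up on the first crawl without any rendering:

```html
<!-- JS redirect: only discovered once Googlebot renders the page -->
<script>
  window.location.replace('https://example.com/new-page');
</script>
```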
It is important to note that all site speed tests, and consequently the user experience they measure, depend on the device and internet connection used.
Furthermore, results from different devices and locations cannot be directly compared: internet connections are significantly worse in some countries than in others.
For example, a user in Australia accessing a website hosted in the USA will experience a considerably longer loading time than a user located in the USA.