Otherwise known as guided navigation, faceted navigation uses product metadata that users can use to refine search queries and category listings.
Most enterprise and e-commerce sites use some form of faceted navigation so that users can quickly navigate the site to find the products they want.
Although this is very helpful for users, faceted navigation can cause a variety of issues for SEO, as these sites can generate tens of thousands of near-identical versions of pages.
This means that if faceted navigation isn’t implemented correctly, or doesn’t adhere to best practices, your site could be wasting eye-watering amounts of crawl budget, creating duplicate content, and causing index bloat.
What is faceted navigation?
Typically, you will find examples of faceted navigation within sidebars on e-commerce sites, where users can filter product listings to their specifications.
Put simply, facets are intelligent layers and act as extensions to a site’s primary categories.
A unique value is assigned to every selection made within faceted navigation, and these values are passed through URL parameters to change the content of a page.
Designed to narrow down items within a site’s listing page, filters dynamically change what is displayed on a page by appending parameters to the URL.
This results in the generation of similar dynamic URLs, which search engines will treat as duplicate pages if left to decide for themselves.
This can happen when all facets and filters are crawlable and indexable, which will waste crawl budget and dilute internal link equity.
As the number of parameters increases, so does the number of near-duplicate pages, which in turn limits each page's exposure within search.
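To see how quickly parameter combinations multiply, here is a short sketch using hypothetical facets (the facet names and values are illustrative, not from any real site). Even assuming each facet can only be unset or set to a single value, three modest facets already produce dozens of variants of one category page:

```javascript
// Sketch: how facet values multiply into parameterised URL variants.
// Facet names and values below are hypothetical examples.
const facets = {
  colour: ["red", "blue", "green"],
  size: ["s", "m", "l", "xl"],
  brand: ["acme", "zenith"],
};

// Each facet can be unset or set to one of its values, so the number of
// possible URL variants is the product of (values + 1) across facets.
const variants = Object.values(facets)
  .reduce((total, values) => total * (values.length + 1), 1);

console.log(variants); // 4 × 5 × 3 = 60 variants of a single category page
```

If facets allow multiple selections at once, the count grows far faster still, which is why crawl budget disappears so quickly on large catalogues.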
Cumulatively, this also means that your site risks keyword cannibalisation, where your own pages compete with each other within SERPs.
In the worst-case scenario, Google could end up discounting a page that you do want to rank in favour of one that you do not.
Avoiding issues with faceted navigation is, therefore, necessary for any e-commerce site that wants to do well in search.
Faceted navigation best practices and resolutions
There is a range of best practice solutions that you can implement before faceted navigation harms your SEO campaign.
If you’re working on a new website — use AJAX
When faceted navigation is built with AJAX, applying a filter updates the listing asynchronously rather than loading a new page, so no new URLs are generated for search engines to crawl.
When using this option, you’ll need to ensure that a crawl path is accessible to the pages and that search engines can access every necessary page.
The catch here is that you can only apply AJAX on a new site, so unless you’re undergoing redevelopment, this won’t be a solution that applies to you.
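The core idea can be sketched as follows. This is a minimal, self-contained illustration: it filters an in-memory product list, whereas a real implementation would fetch the filtered results from the server asynchronously (e.g. via `fetch()`). The point is that the page's URL never changes, so no crawlable duplicate is created:

```javascript
// Minimal sketch of AJAX-style filtering: the listing is updated in
// place and no new URL is created for search engines to crawl.
// A real implementation would request filtered results from the server
// (e.g. via fetch()) instead of filtering a local array.
const products = [
  { name: "Trainer", colour: "red", size: "m" },
  { name: "Boot", colour: "blue", size: "l" },
  { name: "Sandal", colour: "red", size: "s" },
];

function applyFilter(items, facet, value) {
  // The page's location (and therefore its URL) is left untouched;
  // only the rendered listing changes.
  return items.filter((item) => item[facet] === value);
}

const redProducts = applyFilter(products, "colour", "red");
console.log(redProducts.map((p) => p.name)); // [ 'Trainer', 'Sandal' ]
```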
Implement canonical tags
Canonicalisation is a great way of telling search engines that similar pages have a preferred version; that is, the version a search engine should return to users in response to a search query.
This also means that it is possible to consolidate link equity to preferred pages.
At the same time, however, it's important to note that crawl budget will still be wasted, and search engines can ignore canonical tags if they so wish.
When it comes to faceted navigation, therefore, many sites choose to use this solution in conjunction with another.
Google has written a best practice guide about defining a canonical page, which is worth taking the time out to read.
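As an illustration, a canonical tag on a filtered URL might look like the following, assuming a hypothetical category page at `https://example.com/shoes` (the URLs here are placeholders):

```html
<!-- Placed in the <head> of the filtered page,
     e.g. https://example.com/shoes?colour=red&size=m -->
<link rel="canonical" href="https://example.com/shoes" />
```

Each filtered variant points back to the unfiltered category page, which is the version you want consolidated in search results.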
Optimise your robots.txt
If you need to block URLs as a result of faceted navigation, then implementing a disallow for specific sections of a site can provide a fast and customisable solution.
The risk here is that a disallow is only a directive rather than an enforcement, which means that search engine spiders might choose to ignore it, although this is typically rare.
Set a custom parameter to designate all the filter combinations and facets that you need to block and add it to the end of each URL string.
Once you have done that, disallow all URLs that contain this parameter in your robots.txt file.
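For example, assuming a hypothetical `nocrawl=true` parameter appended to every facet URL you want blocked (the parameter name is an illustrative choice, not a standard), the robots.txt rule might look like this:

```
# robots.txt - block any URL carrying the custom blocking parameter
User-agent: *
Disallow: /*nocrawl=true
```

The `*` wildcard matches any sequence of characters, so any URL containing the parameter is disallowed regardless of which category or filter combination produced it.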
If you want Google to crawl your robots.txt file more quickly, you can use the robots.txt Tester tool to notify the search engine of your changes.
Combine disallow with noindex
Another possibility is to stop specific directories from being crawled using a disallow directive, which will help save some crawl budget.
That said, if the pages have external links or canonical tags pointing to them, they can still be indexed, so you might want to combine this with a noindex tag at page level.
Be aware, however, that since 1 September 2019, Google no longer supports noindex directives within the robots.txt file, so the noindex must be applied at page level.
A noindex tag alone does not stop your pages from being crawled, which means that your budget will still be wasted.
You can combine a page-level noindex with a disallow command to prevent pages from both appearing in the index and being crawled; note, though, that the disallow should only be added once the noindexed pages have been recrawled and dropped from the index, as a blocked crawler cannot see the noindex tag.
This solution is useful for when faceted navigation has gone wrong and you need to clean up duplicated parameter URLs from search engines.
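The page-level tag itself is a single line in the `<head>` of each facet page you want removed from the index:

```html
<!-- Placed in the <head> of a duplicated facet URL -->
<meta name="robots" content="noindex" />
```

Once these pages have dropped out of the index, the matching robots.txt disallow can be added to stop them from being crawled at all.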
Use Google Search Console
As an option of last resort, you could use Google Search Console to instruct Google on how your site should be crawled.
Using the URL Parameters Tool, you can indicate the purpose of each parameter on a page, as well as how Google should treat them.
This, however, does not fix the underlying issue, and it will only apply to Google, not to other search engines such as Bing, Yahoo, or DuckDuckGo.
Apply nofollow within internal links
If you need to curtail crawl wastage, there’s also the option to nofollow internal links to facets that you don’t want bots to crawl.
Again, this won’t solve the issue entirely, as duplicate content will still be indexed, and link equity will still be affected.
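In practice this means adding `rel="nofollow"` to the internal links that generate facet URLs (the URL below is a hypothetical example):

```html
<!-- Internal link to a facet combination we don't want bots to crawl -->
<a href="/shoes?colour=red&amp;brand=acme" rel="nofollow">Red Acme shoes</a>
```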
Whatever solution you choose, do not rely solely on one if it does not resolve all three problems: indexing issues, link equity dilution, and crawl waste.
As with any element of technical SEO, you will need to take great care in choosing the options that are the best for your site.