There can be many reasons why a website might suddenly lose traffic.

From changes in search engine algorithms through to simple tracking errors, the reasons for a traffic drop can range from simple to agonisingly complex.

At SALT.agency, we provide comprehensive traffic drop audits for companies that are experiencing a decline in traffic.

Before you contact us, however, it is worth ruling out a few simple issues and fixes that you can check yourself first.

Check for tracking errors

It’s all too easy to wipe tracking codes from a website, and this is a mistake that several of our clients have made.

If you find that Google Analytics is suddenly recording zero sessions, the tracking code may be broken or may have been removed from the website.

The good news is that fixing the issue is easy.

With the proper access, you should be able to fix the code yourself, and you can find out how to do so in this analytics help guide.
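
If you want a quick way to confirm that the tag is still in your page templates, a short script can fetch a page and look for the snippet. Below is a minimal sketch in Python, assuming the site uses gtag.js, that the requests library is installed, and with placeholder values for the page URL and measurement ID:

import requests  # assumed to be installed; any HTTP client will do

PAGE_URL = "https://www.example.com/"  # hypothetical page to check
MEASUREMENT_ID = "G-XXXXXXXXXX"        # placeholder for your own measurement ID

html = requests.get(PAGE_URL, timeout=10).text

# The standard gtag.js snippet loads a script from googletagmanager.com
# and references your measurement ID; if either is missing, the tag has
# probably been removed or altered.
if "googletagmanager.com/gtag/js" in html and MEASUREMENT_ID in html:
    print("Tracking snippet found in the page source.")
else:
    print("Tracking snippet appears to be missing - check your templates.")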

Do you have redirect errors?

Broken redirects can be the bane of any large website, and on Apache servers they tend to be added via a .htaccess file.

When a site is undergoing significant changes, it is common to implement 301 redirects, which tell search engines that a page has moved permanently.

Users will also be redirected to the new address.

If a 301 redirect is not implemented correctly, however, the redirect simply will not happen.

As a consequence of this, you could find that your site is losing traffic.

You can check through your redirects by using a web crawler such as Screaming Frog.
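
For a full audit a crawler is the right tool, but if you only have a handful of old URLs to verify, a short script can confirm that each one issues a 301 rather than a 404 or a redirect chain. A rough sketch, assuming the requests library is installed and using hypothetical placeholder URLs:

import requests  # assumed to be installed

# Hypothetical old URLs that should 301 to their new locations.
OLD_URLS = [
    "https://www.example.com/old-page",
    "https://www.example.com/old-category/old-post",
]

for url in OLD_URLS:
    # allow_redirects=False lets us inspect the first response directly.
    response = requests.get(url, allow_redirects=False, timeout=10)
    destination = response.headers.get("Location", "no Location header")
    print(f"{url} -> {response.status_code} -> {destination}")
    if response.status_code != 301:
        print("  Warning: expected a 301 permanent redirect here.")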

You could have incorrect robots.txt rules

Another surprisingly common problem is a website accidentally blocking search engines from crawling it.

This issue can sometimes happen when a site has undergone a migration and the developer has forgotten to update the robots.txt file.

You can check this by going to your site’s robots.txt file. It should read something like the following:

User-agent: *
Allow: /
Sitemap: http://www.example.com/sitemap.xml

If it says “Disallow: /” instead of “Allow: /”, you will need to remove the rule and resubmit the file through Google Search Console.
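
If you would rather not read the file by eye, Python’s standard library can parse it and tell you whether a given crawler is allowed in. A minimal sketch, using a placeholder domain:

from urllib import robotparser  # standard library, nothing to install

ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder domain

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt file

# Check whether Googlebot is permitted to crawl the homepage.
if parser.can_fetch("Googlebot", "https://www.example.com/"):
    print("Googlebot is allowed to crawl the homepage.")
else:
    print("Googlebot is blocked - check for a stray Disallow rule.")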

XML Sitemap changes

Only indexable URLs that return a 200 response should be listed in your sitemaps, so if there has been an unexpected change to your XML sitemap, you could see a decline in traffic.

Crawl every URL in the sitemap and ensure that each one returns a 200 OK response.

If, for some reason, the sitemap contains fewer URLs than there are on the site, you will need to regenerate it and resubmit it via Google Search Console.
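
Again, a crawler will do this for you, but for a small sitemap a short script is enough to count the listed URLs and flag anything that does not return a 200. A rough sketch, assuming the requests library is installed, a standard single sitemap rather than a sitemap index, and a placeholder location:

import requests                      # assumed to be installed
import xml.etree.ElementTree as ET   # standard library XML parser

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NAMESPACE = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Download the sitemap and pull out every <loc> entry.
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text for loc in root.iter(NAMESPACE + "loc")]
print(f"{len(urls)} URLs listed in the sitemap")

# Flag anything that does not return a straight 200 OK.
for url in urls:
    status = requests.get(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{status}  {url}")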

Do you have a manual action?

Should one of Google’s human reviewers find that your site is in breach of Google’s webmaster quality guidelines, your website could receive a manual action penalty.

It is easy to find out whether you have a manual action: if you do, you will have received a notification in the Manual Actions report within Google Search Console.

From there, expand the description panel within the report to see which pages are affected, alongside the type of action and an explanation of why you have received it.

You can learn more about how to fix a manual action in this Search Console Help guide.

Bear in mind that it can take up to two weeks for a reconsideration review to complete.

Your server could be overloaded

Servers that encounter large amounts of traffic can overload and crash.

Sites on shared servers are more likely to experience a crash, as a surge on another site can bring down your own.

Furthermore, hosting providers can take your site down if you exceed your plan’s bandwidth limit.

Lengthy periods of downtime can negatively affect your rankings and, of course, your traffic with them.

There are a variety of solutions to prevent server overload, including blocking unwanted traffic with firewalls and serving content from alternative sources via site caching.

The most effective method, however, is load balancing, which distributes incoming traffic across multiple servers.

Learn more about load balancing in this tutorial by Digital Ocean.
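
To illustrate the idea rather than provide production infrastructure, here is a toy round-robin sketch in Python: each incoming request is handed to the next server in the pool, so no single machine takes the whole load. The backend addresses are hypothetical.

from itertools import cycle

# Hypothetical pool of application servers behind a load balancer.
BACKENDS = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
next_backend = cycle(BACKENDS)  # round-robin: 1, 2, 3, 1, 2, 3, ...

# Each simulated request is routed to the next server in turn.
for request_id in range(6):
    print(f"request {request_id} -> {next(next_backend)}")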

Is your site suffering from cannibalisation?

For sites that have many pages covering the same topic, there is a chance that those pages are competing with each other in the search engine results pages.

This confusion means that the traffic each of those pages receives can be drastically diluted.

Furthermore, if Google realises that two pages are offering the same information, it could choose to deindex one of them. The issue here is that it could deindex the page that you need.

There are a variety of solutions for fixing a cannibalisation issue, such as consolidating competing pages or using canonical tags to point search engines at the preferred version.
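
If canonical tags are part of your fix, you can spot-check that competing pages declare the same preferred URL. A rough sketch, assuming the requests library is installed and using hypothetical page URLs; the simple pattern below expects rel to appear before href in the tag, so treat it as a quick check rather than a robust parser:

import re
import requests  # assumed to be installed

# Hypothetical pages that target the same topic.
PAGES = [
    "https://www.example.com/blue-widgets",
    "https://www.example.com/blue-widgets-guide",
]

# A canonical tag tells search engines which URL is the preferred version.
CANONICAL_PATTERN = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for url in PAGES:
    html = requests.get(url, timeout=10).text
    match = CANONICAL_PATTERN.search(html)
    canonical = match.group(1) if match else "none declared"
    print(f"{url} -> canonical: {canonical}")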

Has there been an algorithm update?

Search engine algorithm updates can have huge impacts on websites and their traffic.

A significant example of this was Google’s June 2019 Core Update, after which the Daily Mail lost 50% of its traffic over a period of 24 hours.

Google releases multiple updates every year and has started making announcements about when the changes are to roll out.

The best way to tell whether an algorithm update has affected you is to compare the date of your traffic drop against announcements from official Google sources.

The Google SearchLiaison account on Twitter is the one to follow for algorithm updates and other news about Google Search.

If you suspect that an update has hit you, try to find out as much information about it as possible to see how you can improve your site.

It’s important to remember that there’s no instant fix if your site has been affected by an algorithm update.

Has your site lost traffic?

If you’ve reached this article because your site has recently lost traffic, and you’re still unsure what the cause might be, get in touch with our technical SEO experts through our contact page.