Google Penguin 4.0: What does this mean for SEO?
Thousands of people have been waiting a very long time (in some cases with bated breath) for Google Penguin to run again, and at last there is cause for celebration: Google have confirmed via their Webmaster Central Blog that the update is now live.
Previously, this algorithm ran separately from the main ongoing core algorithm, with the last refresh coming way back in October 2014.
That was a long time ago for the many who suffered directly from the effects on their businesses, or indirectly via the thousands of SEO consultants who’ve worked on penalised websites with few results to show for important keywords.
Who will this update help?
- This update/refresh means that websites which have had the correct remedial work done to their backlink profiles will once again have the potential, at least, to rank for particular keywords without this algorithm automatically ‘pulling’ them down in the ranking positions.
- Now that Penguin is part of the core Google algorithm, it should be quicker to ‘fix’ algorithmic devaluations for manipulative links.
Google Penguin 4.0 will run continually
There have been numerous setbacks with making Penguin, the anti-link spam algorithm, part of the core algorithm.
This has been apparent in the way Google representatives have continually underestimated how long it would be until Penguin ran again.
In January there was some very encouraging news, as reported by Jennifer Slegg, that Penguin would shortly be running continually.
@mihaiaperghis yes, which also means we’ll stop talking about it (or I will anyway) @rustybrick
— Gary Illyes (@methode) January 19, 2016
This was followed by a lively debate on various social media platforms as to what being part of the core algorithm actually entails, as summarised by Barry Schwartz. We know that Google have been trying to make Penguin part of the core algorithm for a while now, and at long last it seems it has happened! This perhaps explains why it’s been well over a year since it last ran.
What’s the difference to previous Penguin updates & refreshes?
In a response, Gary Illyes used an analogy to the starter motor in cars:
…more convenient, but essentially nothing changed.
So apparently there is little change in the algorithm itself; it’s more that it will simply be less noticeable. This sparked a conversation among the technical SEO team at SALT, which I wanted to explore further with the help of some SEO friends around the world.
How will a continually running Google Penguin algorithm affect SEO?
As we all know, the links that point to a website dramatically affect its performance in Google’s organic search results. These links can have a positive and (with both Google Penguin and manual actions) a negative effect. In light of the news that Penguin will now be constantly evaluating backlink profiles within the main core algorithm, how long should a Penguin recovery take now that it’s live, and in future now that it is continually running? Should we change or modify how we run SEO campaigns to reflect this? Will we now see a reduction in link spam effectiveness, in both a positive and a negative sense? I reached out to a number of leading SEOs asking for their thoughts on the Penguin algorithm update, and below are two of the responses I received:
Penguin Update & Recovery
There are likely to be changes to how the algorithm evaluates links in order for it to run continuously.
Why do you think it has taken so long to make it part of the core algorithm?
Andrew Girdwood: I don’t know if it has taken a long time; we don’t really have much context to measure things by. I’d suggest that Google was in no rush. The Penguin updates we’ve had so far have not just been effective at changing the SERPs; they’ve changed how many SEOs ply their trade.
Gerry White: I think one of the biggest issues that Google have faced is trying to understand the shitty links: differentiating between poor quality and spam links, as well as the naturally earned link profiles that sites acquire. Some of the people involved in building links are quite adept at disguising low quality links, so it’s not so black and white.
How long would you expect a website to take to recover, based on previous updates/refreshes?
Andrew Girdwood: I think that depends on how often Google’s algorithm would have “liked to” return the site for a specific search result but was otherwise prevented from doing so by the penalty. Google wants to give searchers a good experience, and if people are looking for a particular site then Google wants to help them find it. This is why big brands seem to clear through penalties faster than others: it’s probably the cumulative effect of navigational searches adding up and giving the algorithm signals.
Gerry White: The greatest issue among some webmasters at the moment is believing that they have been hit by a penalty rather than looking at competitor activity or general link flux; a great example of this is when they have done something technically wrong and simply assumed it’s a Google update they’ve fallen foul of. If there is actual confirmation that you have been hit by an algorithm update, it’s important to understand what the backlink profile would look like with the bad links removed; often a high number of links are disavowed without any further link acquisition happening at the same time.
What reasons might there be if a website doesn’t recover, and what advice would you give someone who is disappointed?
Andrew Girdwood: Websites that have been hit often face nearly insurmountable challenges in earning back the algorithm’s trust. This is compounded if they’re chasing competitive keywords and few searchers are looking for the site specifically. Starting over, in some situations, is not a bad idea. Don’t expect it to get any easier to trick Google. If you’ve been caught multiple times before and keep on trying to trick the system, you will be caught again.
Gerry White: The biggest problem is that acquiring great links takes time: you have to have great content and a great presence, as well as a team who understand how to generate good quality coverage and who invest time into this. We’re increasingly finding that high quality sites (such as newspapers and high quality blogs) are now nofollowing their outbound links, so it’s harder than ever to build a high quality link profile. Patience is also essential, as the impact of these links isn’t immediate. Some backlink profiles are so bad that it’s sometimes better to create a second site without putting in a 301 redirect from the old site to the new one.
Link Auditing & Management
Truly understanding a link profile is an important part of SEO, and has been for many years.
Will understanding, monitoring, and taking the necessary precautions (auditing, link removal & disavow) with a link profile be more, or less important as a result of this change?
Andrew Girdwood: I’d hope link audits are a basic part of most sites’ SEO “hygiene” already. Whereabouts Penguin sits in the algorithm shouldn’t matter.
Gerry White: What I would say is don’t obsess over it; I’ve seen companies obsess over every single link, and this is time that could be spent on other opportunities. Using tools like Ahrefs, you can keep an eye on what’s going on and monitor the situation.
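As a concrete example of that light-touch monitoring, here is a minimal sketch that diffs two backlink exports to surface newly acquired referring domains for manual review. It assumes you periodically export backlinks to CSV from a tool such as Ahrefs; the file names and the “Referring Domain” column header are illustrative assumptions, so adjust them to match your own exports.

```python
# Minimal sketch: diff two backlink CSV exports (e.g. from Ahrefs) to surface
# newly acquired referring domains worth a manual look. The file names and the
# "Referring Domain" column header are assumptions; adjust to your own export.
import csv

def referring_domains(path):
    """Return the set of referring domains listed in a backlink export."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Referring Domain"].strip().lower() for row in csv.DictReader(f)}

last_month = referring_domains("backlinks_2016-08.csv")
this_month = referring_domains("backlinks_2016-09.csv")

# Domains appearing only in the newer export: review these by hand to decide
# whether each link looks natural or warrants removal/disavowal.
for domain in sorted(this_month - last_month):
    print(domain)
```

Anything a script like this flags still needs the human relevancy judgement discussed below; it simply narrows the review down to what has changed.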
How often would you suggest someone audit a link profile now to ensure the disavow file accurately reflects the sites which they do not want to be associated with, and would this differ depending on the size of the website?
Andrew Girdwood: They don’t have to happen every week, but they should certainly be in the calendar for at least every quarter. In more competitive industries I’d recommend every month. Stick to this and you’ll be okay. Checking your Search Console, however, should be a daily pleasure. If there are signs of a problem, investigate.
Gerry White: I would advise setting some time aside to look over the site’s link profile, especially the newly acquired links; you can tell pretty quickly whether the new links look fine or whether there are any issues.
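For reference, if an audit does turn up links you want to distance a site from, the disavow file Google accepts through Search Console is just a plain text list: one URL or domain per line, a “domain:” prefix to cover an entire site, and “#” for comments. The entries below are hypothetical examples:

```
# Hypothetical disavow file entries
# Disavow one specific spammy page
http://spam.example.com/paid-links.html
# Disavow every link from an entire domain
domain:link-network.example.net
```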
Moving forward, how important will making the correct decision about whether a link is ‘good’ or ‘bad’ be, now that the algorithm fighting link spam runs continually?
Andrew Girdwood: This way madness lies. Don’t cheat the system; that way you’ll have clarity on any dodgy links that appear. There’s no need to start disavowing over a handful – you only whisk out that disavow file if you see a proper effort to hit you with negative SEO. That’ll be crystal clear.
Gerry White: You can generally make better decisions based on gut feel: look at a link and how it’s been built, and you can determine whether or not it’s spammy. This is better than focusing on metrics such as Domain Authority. DA can easily be manipulated by people who sell links; a site may not be worthy of its 40+ DA in Google’s eyes, and may not be as relevant a link to acquire. If you start focusing more on the metrics of a link, you lose sight of its relevancy. You need to go back to basics and ask: would a user navigate to that page, and would they then be likely to follow that link?
Blackhat & Negative SEO
Whilst (I’d like to think) the majority of SEO consultants add value to the internet by making websites better and through creative marketing, there is unfortunately another, darker side to SEO.
Will the practice of link spamming a website to artificially make it rank highly be less attractive/rewarding for blackhat SEOs?
Andrew Girdwood: You’d have to ask the black hats. I guess if there’s fear that negative SEO is effective and common then black hats will use that to prey on site owners, poor SMEs and cautious brands.
Gerry White: Blackhat techniques will continue to work; the people involved in them are clever and will adapt and evolve their techniques too. It’s a fight that I don’t think Google will ever actually win, but it’s also a challenging fight from a black hat perspective. In terms of the negative stuff, Google is aware of what’s going on: we’ve seen a number of small agencies send a lot of low quality links to a site, then call the site owner up and say “we’ve got all these links, do you want them?” or “your current agency has built all these bad links, do you want us to remove them?”. We’ve seen this pattern a number of times, so we’re sure that Google are becoming wise to it. Negative SEO will have a much bigger impact on websites with smaller, weaker link profiles.
Will Google Penguin being part of the core algorithm lead to less spamdexed websites in search results?
Andrew Girdwood: The easier it is for Google to update and fine-tune their spam-fighting efforts, the fewer spamdexed sites will rank. The jury is out as to whether Penguin being part of the core will make it easier or harder for Google’s team to make updates.
Gerry White: Yes, a smaller website can be damaged a lot more quickly. That said, it should also be able to recover a lot more quickly; the issue is that smaller companies and smaller websites don’t always have the ability to monitor their backlink profile and visibility in the same way that larger companies do.
Unfortunately, there have been many people/companies offering negative SEO services (via link spam) in recent years. How do you think this continually rolling Penguin algorithm will affect this practice?
Andrew Girdwood: I think negative SEO needs fear in order to sell. If there are high profile examples of sites being hit by negative SEO, examples that create that fear, then negative SEO strategies will continue to sell.
Gerry White: This is going to increase the number of negative SEO instances, because the impact will be much more visible. One thing Penguin has been known for is being very slow in its update frequency; now, however, people will get much quicker feedback on whether or not what they are doing is working.
Thank you to Andrew Girdwood (@AndrewGirdwood) and Gerry White (@dergal) from Usable Content for contributing to this article.