At the end of the first VirtuaCon virtual conference, SALT’s Dan Taylor hosted a question and answer session with Google’s John Mueller.

The session tackled questions around how Google perceives site structure in the age of context vectors and semantic information architecture, as well as whether or not Google will roll out algorithm updates during the global COVID-19 crisis.

Below is a transcribed section of the Q&A panel; you can also watch the full video at the end of this article.


This is a discussion point that comes up a lot, especially when working with international companies: the location of your hosting. Given that we live in a world of CDNs and edge networks, how much of an impact does hosting location have on Google’s overall view of a website and on whether or not it’s relevant for international markets?

JM: I’d say it has very little impact. If it’s a case where we don’t know a lot about the site, then the hosting location can help give us a guess, but when we know more about the website we have a lot more than just the hosting location – we have the geo-location settings in Google Search Console and other signals to judge whether or not a site is relevant for the market.
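As a rough illustration of those “other signals”, the sketch below pulls two market-relevance hints that can be read straight from a page – the hreflang alternates declared in the markup and the top-level domain. It is an assumption for illustration only (example.com is a placeholder), not a statement of which inputs Google actually weighs.

```python
# Illustrative only: two signals a crawler could read directly from a page
# when judging market relevance. The exact signals Google weighs are not
# published; hreflang and the TLD are assumptions used for this sketch.
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen


class HreflangParser(HTMLParser):
    """Collect <link rel="alternate" hreflang="..."> annotations."""

    def __init__(self):
        super().__init__()
        self.alternates = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "alternate" and "hreflang" in attrs:
            self.alternates[attrs["hreflang"]] = attrs.get("href")


def market_signals(url):
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = HreflangParser()
    parser.feed(html)
    return {
        "tld": urlparse(url).hostname.rsplit(".", 1)[-1],  # e.g. "de" for a ccTLD
        "hreflang": parser.alternates,
    }


if __name__ == "__main__":
    # Placeholder URL; swap in a real page to inspect its declared alternates.
    print(market_signals("https://example.com/"))
```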

Good search results are also secure search results. Google has put a strong emphasis on HTTPS and, through the Chrome browser, has been deprecating older TLS connections. Is it too much of a stretch to think that in the future Google may introduce a form of passive scanning to the crawl process for things like the OWASP Top 10 and other common issues?

JM: I think that’s kind of tricky. One thing we do is watch out for CMS versions, especially if we are aware of larger security issues affecting specific CMS versions. That’s something we can identify through the headers of the pages, and we will flag it through Search Console – that’s fairly easy.

More intrusive scanning, such as looking for XSS issues, is trickier, as we’re then almost attacking a website to see if the vulnerability is there. I know this is something the security team would love to do more on, but I imagine it’s kind of a legal nightmare to navigate, and it may be something that people have to opt in to – and if they are opting in, then they’re likely aware of these issues anyway, so it’s kind of a tricky situation.
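To make the CMS point concrete, here is a minimal sketch of the kind of passive fingerprinting described above – reading a version string from the page’s response headers or its generator meta tag rather than probing for vulnerabilities. The specific headers Google inspects are not documented, so treat the fields below as illustrative assumptions.

```python
# A minimal sketch of passive CMS fingerprinting via headers and markup.
# How Google actually identifies CMS versions is not documented; the
# "generator" meta tag and X-Powered-By / Server headers are illustrative.
import re
from urllib.request import urlopen


def cms_fingerprint(url):
    response = urlopen(url, timeout=10)
    html = response.read().decode("utf-8", errors="replace")

    # e.g. <meta name="generator" content="WordPress 5.3.2">
    generator = re.search(
        r'<meta[^>]+name=["\']generator["\'][^>]+content=["\']([^"\']+)["\']',
        html,
        re.IGNORECASE,
    )
    return {
        "generator": generator.group(1) if generator else None,
        "x_powered_by": response.headers.get("X-Powered-By"),
        "server": response.headers.get("Server"),
    }


if __name__ == "__main__":
    # Placeholder URL for illustration.
    print(cms_fingerprint("https://example.com/"))
```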

Follow-up question: So if Google can detect out-of-date CMS versions and understands they may have vulnerabilities, does this affect rankings in any way?

JM: Nope.

One of the emerging trends of the past few years – spurred on by concepts such as EAT and the introduction of BERT – is that of context vectors and creating stronger information architecture on a website, whether it’s through URL structure or internal linking. When planning out a URL structure, does Google find URLs easier to understand if they’ve been expanded to include additional, “relevant” subfolders versus a simpler structure, e.g.:

  1. website.com/blog/article/ = standard
  2. website.com/blog/category/article/ = additional subfolder

JM: They’re both pretty much the same. I think it’s good to focus on generating clean URLs that are understandable, but especially on mobile phones, where most people are browsing, they tend not to see or look at the URLs anyway. So most of the time the people who see the URLs are those running the website, which is kind of a weird situation, I guess.

If you have clean URLs, we can pick those clean URLs as canonicals and show them in search; they’re much easier for you to track, and it’s a lot easier to double-check when things go wrong. With that in mind, I would focus on clean URLs, but I wouldn’t expect the URLs themselves to drive any specific ranking effect.
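On that point about canonicals, the short sketch below shows one way a site owner might double-check which URL a page declares for itself via its rel="canonical" link. This is purely illustrative – Google’s canonical selection weighs many signals beyond this tag – and the example URL is the hypothetical one from the question above.

```python
# Illustration only: read the rel="canonical" a page declares for itself.
# Google weighs many signals beyond this tag when choosing a canonical; this
# simply helps confirm that clean URLs are declared consistently.
from html.parser import HTMLParser
from urllib.request import urlopen


class CanonicalParser(HTMLParser):
    """Capture the href of the first <link rel="canonical"> tag."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")


def declared_canonical(url):
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical


if __name__ == "__main__":
    # Hypothetical URL from the question above; swap in a real page to test.
    print(declared_canonical("https://website.com/blog/category/article/"))
```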

You can watch the full interview with John Mueller here, and the full VirtuaCon conference here.