The Blog

  • New Approaches To Interaction Latency

Over the past two years, the lines between technical SEO as a discrete process and wider marketing practices and functions have become increasingly blurred. Core Web Vitals is one of the more prominent examples, sitting at the intersection of technical SEO (page speed) and user experience. You can read more about Core Web […]

  • How to perform actionable SEO Competitor Analysis

With daily keyword ranking changes and ever-evolving search engine algorithms, it’s difficult to maintain a constant overview of your business’s digital presence against competitors. Add to that the world events of the past few years, which have driven demand for greater digital exposure across all industries, and now is the perfect time to understand […]

  • An introduction to video SEO

SEO for video is often disregarded. With more focus on content SEO, such as keyword research and internal linking, video optimization can sometimes be put on the back burner. With the amount of time and money spent on creating a video, only for it not to rank or drive the traffic you would hope for, it’s evident […]

  • What you need to know about search intent

    Search intent – sometimes called ‘user intent’ – is about understanding not only what your audience is searching for when they find your website, but also why they are conducting that search. It’s a powerful way to learn more about how people find your website and to identify reasons why your SEO content might not […]

  • The Ultimate Guide to Gatsby & SEO

    In this article, we’ll take you through our favourite tips, tricks and best practices for making your Gatsby sites as search engine friendly as possible. Overview If you’re already familiar with Gatsby, static site generators and JAMstack, feel free to skip ahead to the next section. If you’re new to these latest trends in web […]

  • How to avoid crawl depth issues

For websites to be indexed within the results pages of search engines, search engine web crawlers (often called “spiders” or “spiderbots”) must first explore their pages. These crawlers provide essential information to search engines so that the engines can supply users with the most useful and accurate results. In order for crawlers to efficiently […]

  • How to manage third-party script performance

Third-party scripts include scripts hosted on third-party domains as well as those bundled into your app via third-party packages. Common examples include analytics and performance-tracking libraries such as Google Analytics, advertising tags, tag management scripts, and Facebook and Twitter buttons. These scripts are often referred to as “third-party” to […]

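One common mitigation the post above touches on is loading third-party scripts without blocking the HTML parser. A minimal sketch (the script URLs are placeholders, not real endpoints):

```html
<!-- "async" downloads the script in parallel and executes it as soon as it
     arrives; "defer" downloads in parallel but waits until the document has
     been parsed. Both keep third-party code off the critical rendering path.
     The src URLs below are placeholder values. -->
<script async src="https://cdn.example.com/analytics.js"></script>
<script defer src="https://cdn.example.com/social-widget.js"></script>
```

Which attribute to choose depends on whether the script needs the DOM: order-independent trackers suit `async`, while widgets that touch page elements are safer with `defer`.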
  • What is Structured Data?

    Structured data is a way for website owners to tell search engines like Google what the content on their website means. It adds context to content. The way we discover websites has gone through several changes over the years. To begin with, to visit a website, you just typed it into your address bar. Soon […]

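To illustrate the idea in the teaser above: structured data is most commonly added as a JSON-LD block in the page’s `<head>`, using schema.org vocabulary. A minimal sketch (the headline, author, and date are example values):

```html
<!-- JSON-LD structured data telling search engines this page is an Article.
     All values here are illustrative placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What is Structured Data?",
  "author": { "@type": "Person", "name": "Example Author" },
  "datePublished": "2022-01-01"
}
</script>
```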
  • Guide to Setting Up Google Analytics 4 Property and Custom Conversion Events

Google Analytics 4 (GA4) is a powerful platform for those looking to gain additional insight into their website traffic, user profiles and behaviour, as well as specific actions that users take whilst interacting with a website. GA4 moves from traditional website sessions to an events-based approach, enabling website managers to tailor their reporting and […]

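As a sketch of the events-based approach described above, a custom event can be sent to GA4 via the gtag.js snippet (the measurement ID, event name, and parameter are placeholder values):

```html
<!-- Standard gtag.js bootstrap; G-XXXXXXX is a placeholder measurement ID. -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag() { dataLayer.push(arguments); }
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXX');

  // Example custom event, e.g. fired when a visitor signs up to a newsletter.
  // The event name and parameter are hypothetical examples.
  gtag('event', 'newsletter_signup', { method: 'footer_form' });
</script>
```

Custom events like this can then be marked as conversions in the GA4 interface.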
  • What you need to know about duplicate content

    Duplicate content is a major cause for concern for webmasters and online marketers. If a page is flagged as duplicate content, you run the risk of it vanishing from the search results entirely. However, this does not always happen. In some cases, you might find your page ranks lower in Google Search results than a […]

  • How to use Unavailable_After tags to increase crawl efficiency

Working with large websites (1 million+ pages), or websites with large amounts of time-sensitive content, can sometimes be tricky when it comes to ensuring new content is crawled (and processed) so that users can discover it as quickly as possible. This is where the unavailable_after meta robots tag can come in handy. Unavailable_after tags provide […]

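For reference, the tag described in the post above is a single meta robots directive placed in the page’s `<head>`. A minimal sketch (the expiry date is an example value):

```html
<!-- Asks Google's crawler to stop showing this page in search results after
     the given date/time. Google accepts widely used date formats; the date
     below is an illustrative placeholder. -->
<meta name="robots" content="unavailable_after: 25-Aug-2025 15:00:00 GMT">
```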