
Radically Better Results

Scott McLay

It’s a competitive world out there and it’s important you grab any advantage in your quest for ecommerce perfection. We’ll take you from the basics to the finer technical details and the data insight that should drive your ecommerce strategy.  

Warning! This is a long guide with lots of detail and practical examples. To help you navigate, here are the main areas we cover:

  1. Keyword research

  2. Forecasting

  3. Technical SEO

  4. Data-driven insights

  5. Content optimisation

  6. Reporting

1. Keyword research for your ecommerce site

Keyword research is one of the most important pieces of data you have at your disposal, and it should be at the heart of your on-page SEO strategy for ecommerce.

A search engine is a database of intentions. By using search data to interpret search intent, you can provide insight which will help you not only market existing products but develop new products.

Consider your keyword research as reliable, honest and accurate market research data that will help shape how you talk about products on your site.

How you approach your keyword research is vital to its success. It needs to go beyond the basic keyword and search volume and take many other factors into account, with understanding the intent of each keyword being one of the most valuable outputs from your keyword research.

Keyword research, if done properly, is a manually intensive process. It involves many steps, and one of the most important is categorising your keywords properly. This will help you later in the process when prioritising which categories and pages to optimise first.

Take the DC Thomson shop, for example. They sell multiple product types, from gifts and homeware to magazine and newspaper subscriptions. They know which of these categories drive the most revenue, so they can prioritise optimising category and product pages accordingly. However, they can also use keyword research to understand the opportunity for each category, based on the total average search volume for that category and the total traffic opportunity if they were to achieve the best possible rankings for each keyword within it. Take a few steps back, and look at how you get to the final opportunity for each category.

As with all keyword research, it starts with a tool – or three or four. Everyone has their favourite; we always start with Google's Keyword Planner in Google AdWords. It has two very helpful and simple functions. First, a tool to help you find new keywords:

Enter relevant keywords, your own URL, and competitors' or other relevant companies' URLs to find associated keywords:

Nice and easy and using the DC Thomson Shop homepage as the URL, it provides some helpful information to get started.

Secondly, the tool helps you to get search volumes, historic search metrics and forecasts for how they might perform in the future.

It provides valuable information on seasonality which also helps inform your SEO strategy so you can target certain keywords ahead of an increase in searches.

There are multiple keyword research tools available, and it's important to cast your net far and wide when looking for relevant keywords with good opportunity. It pays to be thorough at the start.

Considering there are around 2 trillion searches every year and 16% of searches each day are completely new and unique, your keyword research document should be an ever-growing, evolving document.

Once you've gathered your consumer-intent keywords, their monthly search volumes (for seasonality) and their competition scores – a low / medium / high measure of how difficult each keyword is, which helps you judge how valuable it is – and they're all in your spreadsheet, it's time to start the real work. This manual but rewarding stage helps you determine the potential opportunity for your keywords. We have many steps in our process; a few of the key ones are outlined below:


Manually review each keyword and categorise it accordingly. Categories should match your sitemap; if you don't have a sitemap yet, the categorisation stage can help determine it. For example, in the earlier DC Thomson example, their keywords could fit into the following categories: magazines, newspapers, gift ideas, homeware, clothing & accs, etc. You can then categorise them further into sub-categories such as food and drink, subscriptions, seasonal gifts etc.


Manually mapping a keyword to an existing page on your site helps you easily and quickly optimise page by page. Mapping is the process of determining which page is most relevant to which keyword (or vice versa). It highlights the potential opportunity not just at a category level but at a URL level. It can also help you determine if you have any gaps: if there is no existing page you can map the keyword to, then you should create one.

Remember to check:

  1. Click-through rate

  2. Current rank

  3. Ranking page

We use the AWR (Advanced Web Ranking) tool to track current rankings and ranking pages. This helps you understand how the site is currently performing in terms of rankings and visibility, and sets a benchmark for your results.

Take your keyword research further...

1) Calculate the "glass ceiling" rank using:

  • Current rank

  • Keyword difficulty

2) Calculate realistic traffic forecasts using:

  • Average click-through rates

  • Search volumes

3) Prioritise the pages and sections of your site to focus on.
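The steps above can be sketched in a few lines of Python. All keywords, volumes, difficulty labels and per-position CTRs below are invented placeholders, and the glass-ceiling heuristic is deliberately crude; it only illustrates the shape of the calculation, not a production model.

```python
# Assumed average CTR by ranking position (illustrative figures only).
AVG_CTR_BY_RANK = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def glass_ceiling(current_rank, difficulty):
    """Crude heuristic: harder keywords cap out at a worse best-case rank."""
    best_achievable = {"low": 1, "medium": 2, "high": 3}[difficulty]
    # If we already rank better than the heuristic ceiling, keep the current rank.
    return min(current_rank, best_achievable)

def traffic_opportunity(volume, ceiling_rank):
    """Estimated monthly traffic if the keyword reached its glass-ceiling rank."""
    return volume * AVG_CTR_BY_RANK.get(ceiling_rank, 0.02)

keywords = [
    {"keyword": "gift ideas", "volume": 12000, "rank": 8, "difficulty": "high"},
    {"keyword": "magazine subscriptions", "volume": 5400, "rank": 4, "difficulty": "medium"},
]
for kw in keywords:
    kw["opportunity"] = traffic_opportunity(
        kw["volume"], glass_ceiling(kw["rank"], kw["difficulty"]))

# Prioritise pages by estimated monthly traffic opportunity, biggest first.
keywords.sort(key=lambda kw: kw["opportunity"], reverse=True)
print([(kw["keyword"], round(kw["opportunity"])) for kw in keywords])
```

The sort at the end is the prioritisation step: the pages and sections mapped to the highest-opportunity keywords get optimised first.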

2. Forecasting

SEO is unlike many other marketing channels: there is no easy way to estimate the impact your budget will have. With TV advertising you can see viewer estimates before a slot has been paid for; with PPC you know the price you are paying per click (the more you pay, the more visible you will be); but with SEO there is very little information available.

While we have an idea of what keywords we want to target and their search volumes, there is no easy way to predict rankings and the resulting traffic as so many factors make up the search algorithm.

At Yard, we use five key data points to help predict monthly traffic levels for clients.

Data for forecasting

  • Historic traffic – last four years’ data from analytics if possible

  • Rankings – current rankings and potential glass ceiling for each

  • Brand / non-brand split – all available data from Google Search Console

  • Last 12 months’ keyword volumes – data from Google Keyword Planner

  • Google Trends – top level trends data for each product category

Bringing the data together

The best place to start is to try and predict what the glass ceiling (maximum achievable) rankings are. To do this use a combination of the following data:

  • Current ranking

  • Keyword difficulty

  • Search volumes

  • Marketplace analysis

The marketplace analysis should reveal who is currently performing well and give you an idea of what is achievable at a product and category level.

The next step is to calculate search volumes for the year ahead by blending keyword volumes with Google Trends data. Using all available data, the average rate of change can be calculated for each month, which is then applied to the previous 12 months' search volume data. This provides insight into what the year ahead might look like.

Now that both potential search volumes for the year ahead and glass ceiling have been forecasted, the keyword traffic opportunity can be calculated.

We take average non-brand CTR (from Google Search Console) and potential search volume to determine monthly non-brand traffic. We then factor in the proposed strategy and the work that will be delivered over the course of the year to create a projected improvement curve. The last step is to calculate brand and miscellaneous keywords traffic to provide a complete picture.
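As a rough sketch of that blend, with invented monthly volumes, an assumed +3% trend-derived rate of change and an assumed 4% average non-brand CTR:

```python
# Last 12 months of search volume for one keyword group (invented data).
last_12_months = [900, 850, 800, 950, 1000, 1100, 1200, 1150, 1000, 1300, 1800, 1500]

trend_rate = 0.03        # assumed year-on-year change derived from Google Trends
avg_non_brand_ctr = 0.04 # assumed average non-brand CTR from Google Search Console

# Apply the trend-derived rate of change to each month's volume...
forecast_volumes = [round(v * (1 + trend_rate)) for v in last_12_months]
# ...then convert forecast volume into forecast non-brand traffic.
forecast_traffic = [round(v * avg_non_brand_ctr) for v in forecast_volumes]

print(forecast_volumes[0], forecast_traffic[0])  # 927 37
```

A real forecast would then layer the projected improvement curve and the brand/miscellaneous traffic on top of this baseline, as described above.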

3. Technical SEO

Technical SEO is the first port of call for any SEO activity: campaigns won't get far without getting the basics right and ensuring best practices are being followed. After all, you wouldn't build a house without strong foundations.

The great thing about technical SEO is that it doesn't just impact organic traffic; it also improves the experience for users across all channels through minimising website errors and increasing website speed. When it comes to ecommerce websites, there are eight key areas that we put a lot of focus on:

  • Fixing spider traps 

  • Improving URL structures 

  • Implementing structured data

  • Adding social markup 

  • Creating a plan to deal with discontinued / out of stock products 

  • Monitoring broken links 

  • Restructuring XML sitemaps 

  • Improving website speed 

Spider traps

A spider trap is a section of a website where a web crawler gets stuck in a loop, resulting in poor crawl efficiency and causing important pages, such as products and categories, to be crawled less often. These are usually made up of duplicated or "junk" pages that can be accessed via multiple URLs.

 While this issue is not limited to ecommerce websites, it tends to be more prevalent for online retailers due to the technologies used to build their sites. 

 The two main causes of this are: 

  • Malformed links 

  • Faceted navigation 

Malformed links

 Malformed links can cause an infinite number of pages to be generated, all displaying the same content. This issue mostly occurs when the website meets the following criteria: 

  • URL rewrite rules are used to turn a query string URL into a “friendly” URL

    i.e.: /products.php?id=3423 gets rewritten to /category/3423/product-name

  • The rewrite rules are configured to ignore parts of the URL string, in this case after the product ID

    i.e.:/category/3423/product-name and /category/3423/random-name would display the same content 

If these two conditions are met and there is a malformed link on the page, i.e.:

  • A relative link with no leading slash:

    i.e.: <a href="resources/product-name"> instead of <a href="/resources/product-name">

  • An absolute link missing its http:// or https:// prefix:

    i.e.: <a href="www.example.com/product-name"> instead of <a href="https://www.example.com/product-name"> (example.com used here as a placeholder domain)

then an infinite number of pages can be generated by clicking on the links, i.e.:

Relative link with no leading slash:

  • /category/3423/product-name 

  • /category/3423/resources/product-name 

  • /category/3423/resources/product-name/resources/product-name 

  • /category/3423/resources/product-name/resources/product-name/resources/product-name

Missing http / https (the browser treats the protocol-less link as relative, so the domain is appended to the current path):

  • /category/3423/product-name 

  • /category/3423/www.example.com/product-name 

  • /category/3423/www.example.com/product-name/www.example.com/product-name

These issues can usually be resolved by simply fixing the malformed links, although in the long term it would be advisable to force all links to use absolute URLs and to double-check link markup for missing http:// or https:// prefixes.
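A minimal scan for the two malformed-link patterns described above (relative hrefs with no leading slash, and "absolute" hrefs missing their protocol) can be sketched with Python's standard library; real audits would feed in crawled HTML rather than the inline sample used here.

```python
from html.parser import HTMLParser

class MalformedLinkFinder(HTMLParser):
    # Anything starting with one of these prefixes is considered well-formed.
    SAFE_PREFIXES = ("http://", "https://", "/", "#", "mailto:", "tel:")

    def __init__(self):
        super().__init__()
        self.suspects = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        # Flag relative links with no leading slash and protocol-less
        # "absolute" links such as www.example.com/page.
        if href and not href.startswith(self.SAFE_PREFIXES):
            self.suspects.append(href)

finder = MalformedLinkFinder()
finder.feed('<a href="resources/product-name">bad relative</a>'
            '<a href="/resources/product-name">fine</a>'
            '<a href="www.example.com/page">missing protocol</a>')
print(finder.suspects)  # ['resources/product-name', 'www.example.com/page']
```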

Faceted navigation 

Faceted navigation is a navigation element that lets users filter or sort results or products on a webpage by specific attributes such as colour, size etc. 

While faceted navigation can be useful for users, by helping them find what they are looking for quickly, it can also be problematic when it comes to crawling. Each selectable option can generate a new URL on the website by adding parameters such as colour or size. For example, the following pages (placeholder URLs) would look very similar to search engines: 

  • /dresses?colour=red 

  • /dresses?colour=red&size=12 

  • /dresses?size=12&colour=red 

Having just 5 unique filtering options could generate as many as 3,125 different URLs per category page, depending on how your faceted navigation is set up. Stopping search engines from crawling these additional pages is relatively simple, only requiring two changes:

  1. Use nofollow on all facet links 

  2. Deploy meta robots or x-robots noindex, nofollow on all additional pages 
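To see where a figure like 3,125 comes from, note that each filter multiplies the URL count. A quick sketch, with placeholder facet names and values:

```python
# Five filters, each of which can be left unset or set to one of four
# values, give 5 * 5 * 5 * 5 * 5 = 3,125 parameter combinations.
facets = {
    "colour": ["red", "blue", "green", "black"],
    "size": ["8", "10", "12", "14"],
    "brand": ["a", "b", "c", "d"],
    "price": ["0-10", "10-25", "25-50", "50+"],
    "rating": ["1+", "2+", "3+", "4+"],
}

combinations = 1
for values in facets.values():
    combinations *= len(values) + 1  # +1 for the filter being left unset

print(combinations)  # 3125
```

Every one of those combinations is a crawlable URL showing near-identical content, which is why the nofollow/noindex changes above matter.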

URL structures

URLs are usually the first thing that both users and search engines see prior to loading a page, and they serve as the building blocks of a website's hierarchy. While by default a CMS may use query strings / parameters to serve pages (i.e. /products.php?id=3423), this is usually overwritten during the development stage to produce friendly URLs (i.e. /category/product-name).

The URL structure should give key information to both users and search engines regarding what the page is about and where in the hierarchy it sits. Ideally the URL structure should follow the hierarchy set out within the main navigation whilst keeping naming conventions as short as possible and including keywords relevant to the content. 

For example, a jewellery website could be set up with a structure such as /rings/ for a category page and /rings/engagement-rings/ for a sub-category page (illustrative URLs). 

There are two common approaches when it comes to optimising product URL structures: 

  1. Products defaulting to the root of the domain, i.e. /product-name 

  2. Products sitting within category folders, i.e. /rings/engagement-rings/product-name

Root URLs are traditionally easier to implement and provide shorter URLs. Category-level product URLs can help boost SEO efforts by grouping content into relevant silos, leading to two major benefits:

  1. Increased topical authority due to relevant content being grouped together 

  2. External links added to a page can benefit all other pages within that silo 

This can lead to obtaining more organic traffic on a smaller budget.  

While there is no right or wrong option, there are advantages and disadvantages for each: 

Root URLs

Pros: 

  • Most ecommerce content management systems will do this by default 

  • Less chance of product page duplication via improper configuration 

  • Shorter URLs 

Cons: 

  • Additional work may be required to implement meaningful breadcrumbs 

  • Lack of content silos can lead to lower topical authority 

  • Difficult to obtain performance data of products at a category level 

Category URLs

Pros: 

  • Easy to obtain performance data of products at a category level 

  • Breadcrumb trail can be implemented based on URL structure 

  • All content is contained in silos, helping to increase topical authority 

Cons: 

  • Additional development work / plugins required to achieve this structure 

  • Can generate multiple pages for each product if they are in more than one category (which can be resolved via assigning primary URLs or canonical tags) 

  • Slightly longer URLs 

Structured data

Structured data is a standardised format for classifying entities on a webpage, helping search engines to better understand content and websites. Adding structured data to pages can also help enhance the appearance of search results – making them more likely to be clicked on. 

While structured data has been around for many years, many ecommerce websites have not implemented it and are missing out on additional organic traffic via rankings they already have. There are five types of markup which are recommended for ecommerce sites, these are: 

  • Organisation 

  • Website 

  • Breadcrumbs 

  • Site Navigation 

  • Product 

While there are different ways to implement schema markup, we've found it easier to deploy via JSON-LD in most cases as it requires less modification of HTML elements on the page, although deployment will depend on the type of markup being implemented and how the website has been built, i.e. the CMS used.

Organisation markup

Organisation markup helps to generate brand signals for search engines as well as enhancing Google’s knowledge graph and search engine results page (SERP) snippets. This markup should only be added to one page on your website, usually a page where there is detailed information about the company, such as the about page. 

Implementing organisation markup is relatively simple as it only requires basic information that is unlikely to change and can be hardcoded into the page via JSON-LD. The recommended markup is: 

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "legalName": "Business Name",
  "url": "Website URL",
  "contactPoint": [{
    "@type": "ContactPoint",
    "telephone": "Telephone Number",
    "contactType": "Phone Number Type i.e. customer service"
  }],
  "logo": "Logo URL",
  "sameAs": [
    "Facebook URL",
    "Twitter URL",
    "Youtube URL",
    "Wikipedia URL"
  ]
}
</script>

Alternative implementation methods and additional parameters can be found in Google's structured data documentation.

Website markup

This markup can help generate the Sitelinks Search Box for brand related searches, letting users search for information or products on your website directly from Google’s search results. This helps users get to the content they want quicker, providing a better overall user experience. 

This markup should only be added to the homepage and as with the organisation markup, it can be hardcoded into the HTML head as it is unlikely to change often. The JSON-LD required to implement this is: 

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "Business Name",
  "url": "Website URL",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "URL of Search Page i.e. /search?q={search_term}",
    "query-input": "required name=search_term"
  }
}
</script>

Alternative implementation methods and additional parameters can be found in Google's structured data documentation.

Breadcrumb markup

The breadcrumb markup allows for breadcrumb rich snippets to show on your listings within the SERPs, giving slightly more information to users.  

This markup is best added via microdata, which requires small changes to the HTML markup used to insert traditional breadcrumbs: 

<ol itemscope itemtype="https://schema.org/BreadcrumbList">
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="Homepage Link"><span itemprop="name">Page Name</span></a>
    <meta itemprop="position" content="1" />
  </li>
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="First Directory URL"><span itemprop="name">Page Name</span></a>
    <meta itemprop="position" content="2" />
  </li>
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="Second Directory URL"><span itemprop="name">Page Name</span></a>
    <meta itemprop="position" content="3" />
  </li>
</ol>

Additional parameters can be found in the schema.org BreadcrumbList documentation.

Site navigation markup

This markup can help influence the organic sitelinks shown for your website within the SERPs and can also help search engines understand your website's structure slightly better.

Like the breadcrumb markup, this is best implemented via microdata, making small additions to the HTML markup generated for the main website navigation element: 

<ul itemscope itemtype="https://schema.org/SiteNavigationElement">
  <li itemprop="name"><a itemprop="url" href="Page URL">Page Name</a></li>
  <li itemprop="name"><a itemprop="url" href="Page URL">Page Name</a></li>
  <li itemprop="name"><a itemprop="url" href="Page URL">Page Name</a></li>
</ul>

Additional parameters can be found in the schema.org SiteNavigationElement documentation.

Product markup 

Product markup can improve your organic SERP snippet by adding additional details such as price, availability and review ratings – giving users information they need before clicking through to your website. 

Adding product schema to a website via JSON-LD is relatively easy, as all the information required should already exist within the database. The recommended basic markup is:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Product Name",
  "image": ["Product Image URL"],
  "description": "Product Description",
  "brand": {
    "@type": "Thing",
    "name": "Brand Name"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "Rating",
    "reviewCount": "Number of Reviews"
  },
  "offers": {
    "@type": "Offer",
    "priceCurrency": "Currency i.e. GBP",
    "price": "Price i.e. 99.99",
    "availability": "https://schema.org/InStock"
  }
}
</script>

Alternative implementation methods and additional parameters can be found in Google's product structured data documentation.

Social tags

Social markup such as Open Graph and Twitter Cards helps define how the snippet should look when a page is shared on social platforms such as Facebook and Twitter. While this does not provide direct SEO value, it can improve clickthrough rates from social media and in turn drive additional traffic.

Open graph metadata

Open Graph markup is relatively easy to implement, requiring basic information about the content to be inserted into meta tags within the HTML head. All this information should already exist within your website’s database, so it should only require simple database calls to implement the basics: 

<meta property="og:title" content="Page Name" />
<meta property="og:type" content="Content Type" />
<meta property="og:url" content="Page URL" />
<meta property="og:description" content="Short description of content" />
<meta property="og:image" content="Image URL" />

A full list of all available Open Graph properties can be found on the Open Graph protocol website (ogp.me).

Twitter card metadata 

As with Open Graph, Twitter Cards are implemented via meta tags within the HTML head. The basic markup is: 

<meta name="twitter:card" content="summary" />
<meta name="twitter:site" content="Twitter Handle" />
<meta name="twitter:title" content="Page Title" />
<meta name="twitter:description" content="Short description of content" />
<meta name="twitter:image" content="Image URL" />

Additional information for implementing Twitter Cards can be found in Twitter's developer documentation.

Dealing with discontinued and out of stock products

Products come and go all the time, and dealing with these pages correctly can make all the difference when it comes to SEO. Ecommerce websites deal with this problem in a variety of ways, such as removing the page completely or redirecting to another URL, but very few handle the situation correctly.

When a product becomes discontinued or out of stock, redirecting or removing the page can often result in losing any organic rankings and resulting traffic due to the relevant content ceasing to exist. To get around this we suggest opening your analytics package and researching how valuable your product was. 

If the product was valuable then you can leave the page live with an Out of Stock or Discontinued message, keeping the rest of the content unchanged but adding in a list of alternative products to ensure you are retaining as much of the value as possible. If the product was not performing via organic search, then it is suggested that the URL is redirected to the most relevant category page in order to retain any residual value i.e. authority from external links. 

Monitoring broken links

Many webpages are removed from the web each day and if a website is not kept up to date then links can easily become broken – which can provide a poor user experience and a dead end for web crawlers. While a small number of broken links within a website is often not an issue, it can send a signal to search engines that the content is not up to date. With all else being equal, this could be the factor that leads to another website gaining preference within the rankings. 

It's important to keep a close eye on all internal and external links, ensuring that any which become broken are fixed in a timely manner. The best way to do this is to perform an audit on a monthly or bi-monthly basis, using a desktop-based web crawler such as Screaming Frog to crawl the website; alternatively, cloud-based tools such as DeepCrawl can continually monitor your website.

When resolving broken link issues there are three actions that you can take: 

  1. Remove the link 

  2. Find the new URL of the content being linked to 

  3. Link to an alternative source 

While outright removing the link would be the easiest and quickest option, the link was placed there for a reason, usually improving overall UX, so in most cases updating the link is the best option to take.
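The triage step after a crawl can be sketched as follows. The page/link data is invented, and the tuple format is an assumed crawler export rather than any specific tool's output:

```python
# Each tuple: (page containing the link, link target, HTTP status of target).
crawl_results = [
    ("/blog/post-1", "/products/widget", 200),
    ("/blog/post-1", "/old-category/", 404),
    ("/about", "https://partner.example.com/report", 410),
]

# Anything with a 4xx/5xx status needs one of the three actions:
# remove the link, update it to the content's new URL, or point it
# at an alternative source.
broken = [row for row in crawl_results if row[2] >= 400]

for page, link, status in broken:
    print(f"{page} -> {link} returned {status}")
```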

XML sitemap setup

An XML sitemap is a list of all the pages that exist on a website; it is used by search engine crawlers as a definitive list of what they should crawl and index. While it is a useful tool, many websites automatically generate this file, and the process can lead to unwanted URLs or the file becoming too large for search engines to process.

Sitemaps can be built manually, using tools such as Screaming Frog either to build the sitemap or to get a list of all pages on the website before adding the relevant markup. While this process can alleviate some of the challenges of automated sitemaps, manually building sitemaps takes time and new pages need to be added by hand, although there should be no negative impact if new pages are missing for a short period, allowing manually built sitemaps to be produced on a bi-monthly basis.

One key advantage of manual sitemaps is that they can be broken down into smaller files. For ecommerce websites, you could have one sitemap containing key pages and category URLs and another containing all product URLs, ensuring file sizes are kept as low as possible.
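A minimal sketch of generating those split sitemaps with Python's standard library; all URLs are placeholders:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string for the given list of page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# One file for key/category pages, one for products, keeping each small.
category_sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/rings/",
])
product_sitemap = build_sitemap([
    "https://www.example.com/rings/gold-ring",
])
print(category_sitemap)
```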


Website speed

Technology has advanced significantly, and so have users' expectations of instant answers to their questions. It has been widely reported that 47% of users expect a website to load in less than two seconds and 40% may leave if loading takes more than three seconds.

Ecommerce platforms such as Magento have always been known for slower loading times, usually due to the additional processing power they require versus traditional CMSs such as WordPress or Umbraco. To keep an ecommerce website's page load as low as possible, it is recommended that the following areas are investigated:

  • Hosting Platform 

  • Number of HTTP requests 

  • Overall page weight 

  • File Compression 

  • HTTP Caching 

Hosting platform 

One of the key elements of website load time is your tech stack: if the technical infrastructure isn't up to scratch, the website will be slower to give an initial response, increasing the time taken for the first byte of information to be sent to the end user. While there are many hosting options available, shared hosting and in some cases virtual private servers just don't cut it.

You should monitor the resource usage of your hardware, such as CPU, RAM and network usage during peak hours to see how your current setup is performing. If you’re frequently hitting 100% usage in any area, then it’s a good idea to upgrade to make sure there’s no impact to your customers. 

In addition to hardware, moving from HTTP/1.1 to HTTP/2 can also give your infrastructure a boost by reducing overall latency and accelerating content download times. A full overview of the differences is given in the HTTP/2 specification (RFC 7540).

Number of HTTP requests

Browsers are limited in the number of simultaneous HTTP connections that can be made to a single domain, which limits the number of images, CSS and JS files that can be downloaded at the same time. Once the limit has been reached, the browser must wait for the current downloads to complete before moving on to the rest of the resources required to make your website work.

Since websites have become more interactive over the years, the number of additional resources, mainly JavaScript, has increased. Some ecommerce websites use 20+ different JavaScript files to add additional functionality not present within standard HTML. 

The best solution is to combine all similar resources together; if you have 4 CSS files and 10 JS files, you can most likely reduce this to one of each. In addition, all images could be served from a CDN (content delivery network), adding a second domain into the mix and thus doubling the number of download lanes, so that images, CSS and JS download in parallel, resulting in faster website load times.

Overall page weight

Page weight refers to the combined file size of your website in megabytes: the larger it is, the longer the user needs to wait for it to download. Since today's websites require much more code, mainly due to their increasing complexity, the overall weight of websites has increased.

The rise of mobile browsing, where download speeds are much slower, has resulted in a need for websites to shed some of that weight. The two best ways to do this are: 

  • Image Optimisation 

  • Code Minimisation 

Image optimisation

While it may be convenient to upload high-resolution images to your website, in most cases it is not required and can result in a huge increase in overall page weight. In most cases, images can be optimised to reduce their file size using image editors, with two simple steps:

  1. Reduce the height and width of the file to a similar size to what will be shown on the website 

  2. Reduce the quality of the image as far as you can before the loss becomes noticeable to the human eye. 

In most cases these two steps can result in a 70 to 80% reduction in image file sizes, which will decrease the overall load time of the website.

Code Minimisation 

Code is usually written in an easy-to-read format, making use of tabbed spacing and white space between different elements. The issue is that this white space adds to the overall weight of the code, and removing it can result in a reduction of at least 10% across HTML, CSS and JavaScript resources without impacting functionality.
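As a toy illustration of where that saving comes from (a real minifier such as cssnano or terser does much more; this only strips whitespace):

```python
import re

def minify_markup(source):
    """Strip whitespace between tags, then collapse remaining runs."""
    source = re.sub(r">\s+<", "><", source)  # whitespace between tags
    source = re.sub(r"\s+", " ", source)     # collapse remaining runs
    return source.strip()

html = """
<ul>
    <li>One</li>
    <li>Two</li>
</ul>
"""
minified = minify_markup(html)
print(len(html), "->", len(minified))  # the whitespace-only saving
```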

File compression

As the name suggests, file compression reduces the size of files (usually by up to 70%) so that they can be transferred to the end user much faster.
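A quick way to see that kind of saving for yourself is to compress a repetitive HTML-like string with Python's standard library:

```python
import gzip

# Repetitive markup compresses extremely well, much like real page HTML.
payload = ("<div class='product'><span class='price'>9.99</span></div>" * 200).encode()
compressed = gzip.compress(payload)

saving = 1 - len(compressed) / len(payload)
print(f"{len(payload)} bytes -> {len(compressed)} bytes ({saving:.0%} smaller)")
```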

Checking if your website has Gzip compression enabled is pretty simple: enter your website address into one of the many online Gzip test tools and it will tell you whether compression is turned on. Enabling compression can be slightly harder; while some CMSs have a simple enable option, or plugins / modules which add this functionality, others require server configuration changes.

Apache servers

The following needs to be added to the .htaccess file: 

AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/vnd.ms-fontobject
AddOutputFilterByType DEFLATE application/x-font
AddOutputFilterByType DEFLATE application/x-font-opentype
AddOutputFilterByType DEFLATE application/x-font-otf
AddOutputFilterByType DEFLATE application/x-font-truetype
AddOutputFilterByType DEFLATE application/x-font-ttf
AddOutputFilterByType DEFLATE application/x-javascript
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE font/opentype
AddOutputFilterByType DEFLATE font/otf
AddOutputFilterByType DEFLATE font/ttf
AddOutputFilterByType DEFLATE image/svg+xml
AddOutputFilterByType DEFLATE image/x-icon
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/javascript
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/xml
# Only required for older browsers
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
Header append Vary User-Agent

IIS servers

  1. Open IIS Manager and navigate to the level you want to manage. For information about opening IIS Manager, see Open IIS Manager (IIS 7). For information about navigating to locations in the UI, see Navigation in IIS Manager (IIS 7).

  2. In Features View, double-click Compression.

  3. On the Compression page, select the box next to Enable dynamic content compression.

  4. Click Apply in the Actions pane. 

HTTP caching

HTTP caching is essentially storing a website's resources (CSS, images etc.) locally on a user's machine on their first visit, so that they can be reused for each subsequent page load without the browser having to download them again. As there is no need to download the resources again, every visit after the initial page load is faster, giving users a better overall experience as they browse the website.

One issue with caching is that things on the web change all the time, such as small tweaks to CSS or functionality. If users have already cached these files, they won't see those changes. To get around this, Last-Modified headers, ETags or Expires headers can be used to ensure users always receive the most up-to-date version of your website.
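To make the mechanism concrete, here's a minimal sketch of content-based ETag validation (the function names are illustrative, not from any particular framework): the server hashes the resource, and a client whose cached copy still matches gets a 304 Not Modified instead of a full download.

```python
import hashlib

def etag_for(body: bytes) -> str:
    """A content-based ETag: it changes whenever the resource changes."""
    return '"%s"' % hashlib.md5(body).hexdigest()

def should_send_304(if_none_match: str, current_body: bytes) -> bool:
    """True when the client's cached copy is still current (send 304 Not Modified)."""
    return if_none_match == etag_for(current_body)

css_v1 = b"body { color: #333; }"
css_v2 = b"body { color: #000; }"  # a small CSS tweak

cached_tag = etag_for(css_v1)  # what the browser stored on first visit
print(should_send_304(cached_tag, css_v1))  # True  -> 304, cache is reused
print(should_send_304(cached_tag, css_v2))  # False -> 200, updated file is sent
```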

A full guide to HTTP caching can be found here.

Phew! Congratulations if you made it this far. Let us know if you’d like to pick our technical team’s brains. 

4. Data-driven insights

To start creating more insightful, data-driven content, you first need to understand the basics. Below we describe three SEO data insights, from simple to more complex, which you can use to drive your strategy and results.

By using the underlying data combined with some essential data science skills, and by accepting “if you can’t measure it, you can’t improve it”, you’ll always be one step ahead of the game.

Understand channel traffic

It’s essential you have a solid understanding of where your traffic is coming from in order to understand how your customers are finding you. Are they finding you organically through search engines, are you paying for traffic with PPC or maybe it’s through social media? 

Most analytics solutions, such as Google Analytics, have an inbuilt channel dimension which breaks down visitors to your website based on where they come from – organic search, social media, direct, PPC brand, PPC generic and more. Google Analytics also allows you to see what portion of your SEO-related visitors came from Google, Bing and Yahoo. The insights gained can be used to adjust marketing spend.
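As a simple illustration with made-up session data, channel share is just a count-and-normalise exercise once each visit is tagged with its channel:

```python
from collections import Counter

# Hypothetical session log: one entry per visit, tagged with its channel.
sessions = (["Organic Search"] * 420 + ["Direct"] * 180 +
            ["PPC Generic"] * 160 + ["PPC Brand"] * 150 + ["Social"] * 90)

counts = Counter(sessions)
total = sum(counts.values())
for channel, n in counts.most_common():
    print(f"{channel:<15}{n:>5}  {n / total:6.1%}")
```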

Go beyond keyword rankings – find audience intent using Google Search Console

Undeniably, keyword rankings are a solid measure of SEO. It’s to be expected that the higher you rank for a keyword, the higher the traffic. However, reporting on keywords alone devalues the marketer’s role and doesn’t show the full picture of why SEO is so important to your company.  

Keyword data can tell us a lot about what our audiences are looking for. However, what should drive the content is user intent. 

Understanding intent will enable you to create more valuable content for your audience.

Make use of Google’s free Search Console product, which provides data and analytics to help improve your site’s performance in Google search.

A few tips on how to get the best insights from Google Search Console’s data:

Click-through rate
Pay attention to your click-through rate (CTR) on the queries and pages you already rank for. If you are getting visibility but no one is clicking through to your page or article, spend a few minutes looking at how you can optimise your title, meta description or URL to entice people to click through.

If you see that one of your posts is ranking really well for a specific query but has a low time on page or high bounce rate, that could be an indicator that your content doesn’t match up with your audience’s intent. If you can optimise that post to align with their search intent, you’ll improve the likelihood of them sticking around.

Topic authority
Look for pages with high impressions and low CTR that rank on the second page to identify additional content topics that will increase your topical authority on the subject matter.

Export up to 1,000 of the queries you are showing up for within search. This data from Google Search Console will give you impressions, clicks, CTR and position. After exporting, categorise your queries by topic/category to understand the groupings of keywords your brand is showing up for. This will show you the areas where you need to devote more time to earn more impressions and clicks in organic search, and the topics where you can increase your click-through rate (typically where you generate a lot of impressions but few clicks).
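Once the manual categorisation is done, rolling the export up to topic level is straightforward. A rough sketch with invented figures (the query-to-topic mapping is the manual step described above):

```python
from collections import defaultdict

# A few rows as exported from Search Console: query, impressions, clicks.
rows = [
    ("engagement rings",       12000, 360),
    ("yellow diamond ring",     3100,  40),
    ("mens watches",            9000, 450),
    ("rolex watches uk",        5400,  60),
]

# The manual step: map each query to the topic/category you assigned it.
topic_of = {
    "engagement rings":    "rings",
    "yellow diamond ring": "rings",
    "mens watches":        "watches",
    "rolex watches uk":    "watches",
}

totals = defaultdict(lambda: {"impressions": 0, "clicks": 0})
for query, impressions, clicks in rows:
    t = totals[topic_of[query]]
    t["impressions"] += impressions
    t["clicks"] += clicks

for topic, t in totals.items():
    ctr = t["clicks"] / t["impressions"]
    print(f"{topic}: {t['impressions']} impressions, {t['clicks']} clicks, CTR {ctr:.1%}")
```

Topics with lots of impressions but a weak CTR are your title and description rewrite candidates; topics with few impressions are where new content is needed.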

Don’t rely on a last touch attribution model to evaluate SEO

A customer can have many interactions with a brand, via multiple channels, before they purchase. You may know which channel or landing page the user converted on, but what about all the social posts, ad impressions, content and site pages that contributed towards the decision to buy?

All these interactions with your brand should change how you allocate resources so you can maximise ROI. This complex buyer’s journey makes the job of allocating budget or resource extremely difficult. A more sophisticated way of measuring which channels are creating sales opportunities is required. That’s why marketers are relying on attribution modelling. 

Attribution modelling is a set of rules for assigning credit to touch points in the conversion path. There are many different attribution models out there; depending on the product and the length of your buying cycle, one might make more sense than another.

Most businesses fall into the trap of using the most common attribution model: the ‘Last Touch’ model. The last touch model assigns all credit to the most recent touch point in a buyer’s journey, and hence to the channel through which the conversion occurred.

This model can be bad news for effective SEO evaluation. Natural Search is generally one of the first touch points in a user’s journey – it falls within the acquisition phase. Once the customer is brand aware, they are likely to revisit the website via a channel such as Direct or PPC brand, which tends to be given the most credit in a last touch model.

The above image shows a more sophisticated multi-touch attribution model, which takes into consideration all touch points, sale and non-sale, in a user’s journey. Using a multi-channel attribution model will give you a true understanding of how much impact SEO has on conversions, and how much marketing resource should be spent there.
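The difference between models is easy to see in code. This sketch (our own simplified rules, not any analytics product’s implementation) splits one unit of conversion credit across a sample journey:

```python
def attribute(journey, model="last_touch"):
    """Split one unit of conversion credit across a journey's touch points."""
    if model == "last_touch":
        return {journey[-1]: 1.0}
    if model == "first_touch":
        return {journey[0]: 1.0}
    if model == "linear":
        share = 1.0 / len(journey)
        credit = {}
        for channel in journey:
            credit[channel] = credit.get(channel, 0.0) + share
        return credit
    raise ValueError(f"unknown model: {model}")

journey = ["Natural Search", "Social", "Direct", "PPC Brand"]
print(attribute(journey, "last_touch"))  # all credit to PPC Brand
print(attribute(journey, "linear"))      # 0.25 to each touch point
```

Under last touch, the Natural Search visit that started the journey gets nothing; under a linear model it receives an equal share, which is usually a fairer reflection of SEO’s role.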

Always keep on top of the content you push live

Last, but by no means least, always keep on top of the content you push live to your site. Build your own custom dashboards or keep analysing the pages you change in your analytics tool. Regardless of whether the results are good or bad, it’s better to be aware of how your content is doing, so that you are ready to make changes where necessary.

Activity Superstore have kept on top of their new content after we built them their very own dashboarding tool using R/Shiny combined with their analytics data. The graph below shows entries to a content-changed page, ‘/for-couples’, from the ‘Natural Search’ channel. The red line represents the date the content was pushed live, which enables Activity Superstore to recognise and understand the impact of their new content, pre and post the live date.

We also added metrics such as Revenue, Orders and Conversion Rate, at both a visit and a visitor attribution level, which they can apply to content-changed pages to get deeper insight. By analysing ‘visitor-level’ attribution, you escape the problems of a last-touch ‘visit-level’ attribution model and give credit to pages that may not have been in the user’s sale visit, but in a previous visit leading up to the sale.

Having the ability to know which content works, and which doesn’t, allows you to have complete control over your content strategies. Stop producing content that doesn’t work, learn from your mistakes, and keep creating better and better content. 

Data should always be at the heart of your SEO and content strategy; however, it’s open to interpretation, which is why marketers need to be data-informed. Digging into why something failed or took off is more important than tossing out a failed tactic or doubling down on a successful one. Without this analysis and insight, you could be making rash decisions that don’t produce the results you’re looking for.

5. Content optimisation

Having a dedicated content optimisation plan is the next step in delivering growth for your ecommerce site. There are multiple ways you can deliver this: employ the services of an SEO agency (hi) who specialises in ecommerce strategies (hi again), or, if you’re a smaller business with a lower budget, recruit a dedicated member of staff to create and implement your SEO strategy or work with a talented freelancer.

 At a top level, your approach to optimising your site should look something like this: 

The driving force is your ever evolving keyword research document. Target priority categories and products first, or categories/pages with the most opportunity. Consider how much resource you have at your disposal. How many pages can you realistically research, write, edit and publish on a monthly basis internally?   

If you’re outsourcing it, your agency or freelancer will determine this for you. We think a collaborative approach is best. We like to create the strategy and the schedule and work with the in-house team to deliver the content. It’s vitally important that you’re always up-skilling your staff.  

Once you’ve confirmed your resource and how many pages you can optimise each month, create a schedule. You’ll find a whole bunch of tools that will help you create your content schedule; they’ll all promise to be the most collaborative, intuitive and life-changing tool you’ll ever use. We prefer to keep it simple by using the humble spreadsheet.

Put simply, SEO is anything that increases the performance of a particular website or webpage within the search engine’s natural results (i.e. not paid advertising).

There needs to be a balance between SEO and usability, however more often than not, these can go hand-in-hand if done correctly. After all, the search engines’ aim is to produce the best results for users. 

SEO is an ever evolving practice. Google’s algorithm and the online environment as a whole changes daily. By using existing best practices and addressing the core principles behind the search engines we can negate the effect of future changes to a certain degree. It is, however, inevitable that we need to change and adapt to whatever the environment demands.

Remember that you are writing content for humans, not search engines, so if you’re reading your copy and you feel you’ve overused any word, then you should address it. 

As a general rule, your most prominent keyword(s) should be used in:

  • URL 

  • Page title 

  • Meta description 

  • H1 tag 

  • Body copy – generally as close to the top of the copy as possible and where required naturally in the rest of the copy.

For additional insight into understanding what people are searching for, a neat tool is Answer the Public: 

Use variations and semantic keywords – variation is key to success, so you should use not only the exact keywords, but also variations and semantic alternatives to build a more natural page and attract long-tail traffic.

Avoid repetition of keywords, as overuse can be considered ‘keyword stuffing’. However, if you consider the term ‘engagement ring’, there are few natural variations you can use, so it is acceptable to use the term numerous times on the page.
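There’s no official stuffing threshold, but a rough density check can flag copy worth re-reading. A sketch (the per-100-words measure is just a convention, not a Google metric):

```python
import re

def keyword_density(copy: str, keyword: str) -> float:
    """Occurrences of a (possibly multi-word) keyword per 100 words of copy."""
    words = re.findall(r"[a-z']+", copy.lower())
    hits = len(re.findall(re.escape(keyword.lower()), copy.lower()))
    return 100 * hits / max(len(words), 1)

copy = ("Our engagement ring collection has an engagement ring for every "
        "budget. Browse diamond and sapphire engagement rings online.")
print(round(keyword_density(copy, "engagement ring"), 1))  # 16.7
```

A high number isn’t automatically a problem, but it is a prompt to read the copy aloud and check it still sounds natural.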

For the purpose of this guide, we’re using luxury jewellery retailer ROX as an example. 


When editing a URL at page level, it is only the last part which can be defined, the rest of the URL is made up of the domain and parent folder levels.

For example:


Parent level:  engagement-rings/ 

Page:  yellow-diamond

How does this affect SEO?

Although URLs do not hold a great deal of weighting in SEO, it is best practice to use keywords where possible.

Always use lowercase characters only, and use hyphens (dashes), not underscores, to separate words.

Keep URL strings as short as possible. The page section of a URL on its own may not seem long, however if a website has several levels of folders/pages in its hierarchy, URLs can easily become very long.
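These rules are easy to enforce programmatically when slugs are generated from product or category names. A minimal sketch:

```python
import re

def slugify(text: str) -> str:
    """Lowercase, hyphen-separated URL slug, per the rules above."""
    text = text.lower()
    text = re.sub(r"[^a-z0-9]+", "-", text)  # runs of non-alphanumerics become one hyphen
    return text.strip("-")

print(slugify("Yellow Diamond Engagement Rings"))  # yellow-diamond-engagement-rings
```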

Important – Before changing any URL, the implications need to be considered. A page may have links pointing to it and built-up authority with search engines. Changing the URL could lose these if the correct measures are not taken. These effects may not be isolated to just one page: if the page has child pages (i.e. if it forms the ‘parent level’ of the URL for other pages), then changing its URL will in turn change theirs.

Page title

The page title is displayed in the top bar of a user’s browser:

The title is also used if the page appears in search results as the title of the page’s listing:

How does the page title affect SEO?

A page title should be no more than 70 characters in length, as this is the most Google (usually) displays in search results – the rest of the characters are cut off, making them effectively pointless. However, bear in mind that the longer a page title is, the more words there are for authority to be distributed between. For this same reason, try to avoid using superfluous words or characters (unless they form part of keywords).

The authority assigned by Google to words in a page title is weighted from left to right (i.e. the first words in a page title are given the most value). For this reason, it is important to structure the page title with priority keywords as close to the start as possible.

Given that the title is also used in search results, usability must also be considered. The page title needs to work alongside the meta description in search results to entice searchers to click through to the webpage.  

Consider what keywords the page may appear for and what the searcher may be looking for when using them. Google takes into account click-through rates as an indication of relevance and this can also have a knock-on effect for SEO.

A tool to check the length of your page title and to give you a visual representation can be found here:

Meta description

The meta description is used to convey a summary of each page in the search results. You should consider your meta description as your advert to promote what is on the page. You’re competing against all other results on Google and other search engines, so make sure your description is relevant, covers appropriate search terms and is engaging. You want to entice the user to click through to your page.

This is how the meta description looks in search: 

A meta description should be no more than 156 characters in length, as this is the most Google (usually) displays in search results. However, there are occasions when Google will truncate descriptions further.
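Both character limits are easy to audit in bulk before pages go live. A minimal checker using the limits quoted in this guide (Google actually truncates by pixel width in practice, so treat these as guidance rather than hard rules):

```python
TITLE_LIMIT = 70   # characters Google usually displays for a page title
META_LIMIT = 156   # characters Google usually displays for a meta description

def check_snippet(title: str, description: str):
    """Flag titles and descriptions likely to be truncated in search results."""
    issues = []
    if len(title) > TITLE_LIMIT:
        issues.append(f"title is {len(title) - TITLE_LIMIT} characters over")
    if len(description) > META_LIMIT:
        issues.append(f"description is {len(description) - META_LIMIT} characters over")
    return issues or ["ok"]

print(check_snippet(
    "Yellow Diamond Engagement Rings | ROX",
    "Browse our collection of yellow diamond engagement rings. "
    "Free UK delivery and 0% finance available.",
))  # ['ok']
```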

The meta description has a negligible direct effect on rankings. However the importance of meta descriptions in relation to the search engines should not be underestimated, as detailed below. 

The meta description must work alongside the page title and any rich snippet information that is displayed as a result of schema mark-up to encourage searchers to click through to the page. As with the page title, consider what keywords the page may appear for and what the searcher may be looking for when using them.

As previously stated, keywords are not required in the meta description to directly affect rankings. However, if the keyword used in a search appears in the meta description, it will be displayed in bold in the search results. For this reason, keywords can be used to attract searchers’ attention.

For example, the following search:

Highlights the following meta description: 

The meta description should always be seen as an opportunity to entice click-throughs by selling what the page is offering. It’s worth taking the time to write both the page titles and meta descriptions, making them as unique and interesting as possible.

H1 heading

The H1 heading / H1 tag is the main heading for any webpage.  

Note that the H1 can sometimes be referred to as the page title. This should not be confused with the true ‘page title’ (see previous section). To distinguish between the two; the H1 heading can be seen on-page, whereas the page title is only used in the browser and by search engines.

How and where an H1 heading appears visually can vary from page to page as this is controlled by the styling of a site or page.

There should only be one instance of an H1 heading on any one page.

How does this affect SEO?

As with the page title, search engines use H1 headings as an indication of the page theme by looking for keywords within the heading. Therefore, it’s important that you use your most important keyword(s) in the H1 heading. 

Body copy

Your online audience tends to rapidly skim visible content for relevance.

Use primary keywords within the first paragraph; this gives an early indication to search engines of what the page content is about. Ensure that primary keywords are distributed throughout the document.

There is no ‘magic number’ of keywords a page should have. If content starts to sound unnatural and forced, then this is a sign that it has likely been over optimised. This can be just as damaging as not optimising if Google thinks the page is trying to artificially manipulate search results. 

Typically, online users have low attention spans and scan pages for points of interest. Solid blocks of text do not usually perform well online; similarly, lengthy text-heavy pages struggle to keep readers’ attention.

It’s important to break up content within the page and highlight key areas to draw users in and provide easy to digest snippets. 

Sub headings using H2 tags can help users navigate to their search query quickly, bullet points and lists can help break down a page, whereas bold and italic styling of keywords/phrases can provide a visual draw to readers. 

If there are several sub sections/topics within a page, a good approach is to concentrate on each of the sub-themes individually. These can then all go together to strengthen the overall page.

Category pages

If you were to wander round a department store, you’d expect similar products to be kept together i.e. a floor devoted to electronics perhaps, and within that, you’d expect to find the TV section, the computers section and a section for white goods etc.  

The same should apply to your website. The products you sell should be grouped in a logical way that will help users find what they’re looking for. There may be several ways in which you can group your products: by the type or need they fulfil, the brand, the colour, the material or any of a myriad of other factors.

The Information Architecture (IA) of your site should reflect these categories, with pages for each product grouping. However, one of the advantages you have with your site over a physical store is that products can be displayed in multiple relevant categories. This means you can create category pages that match how your potential customers search for those products. 

Before we go any further, it’s probably worth a note on naming these pages. Within a site that is likely to have several layers between the homepage and a product page, the term category page can be substituted for a variety of other terms; hub-page, brand page, sub-category page, [category name] page, [category name] homepage…the list could go on. 

For clarity, this section is referring to all those pages using a green box in this example of a fashion retailer.

There is a distinction to be made between that first layer of category pages and the layer(s) below in this example. That first level, Men’s, Women’s and Kids’ should be mapped to quite broad keywords i.e. men’s designer clothes. From such a search, it would be logical to assume that there is not a specific product type that the user is yet looking to purchase. Therefore, these pages don’t necessarily need to feature all the products that fit in to that category. They should, however, direct users to the category pages in the level below that. As a result, more copy can feature on these pages, which can help them be relevant to a broad range of searches. 

This is where the keyword mapping and optimisation described above becomes crucial. Match the content of the page to the intent of the user. Chances are, if someone is searching for men’s jeans, they are in the research phase, seeing what’s out there and what styles or brands they like the look of. 

Having gathered that information, the next search they may perform might be for men’s straight fit jeans. It’s better for you if they don’t leave your site to perform a search and click straight through to your relevant page.  

Equally, if the research has been done on a competitor’s site or other marketing campaigns have been effective, you still need your lower-level category pages to capture as much traffic as possible from organic results, as the relevant searches are made by people who have clear intent to make a purchase.

It’s on these lower level pages where we see a bit more of the interplay between SEO and other disciplines. Copy should be optimised around relevant search terms, but it should also be used to help push users towards certain products or ranges.  

There shouldn’t be too much copy at the top of the page, a couple of hundred words at most of carefully selected, expertly crafted copy. You don’t want to make the searcher have to scroll to see your products when they are looking to buy. 

Where sub-categories are referenced in the copy (see, that naming convention again), the mention of the sub-category should be used as anchor text for a link through to the relevant page. This helps users locate it and encourages the search engine spiders to crawl that page by following the link. They will also use that wording as a summary of the contents of the page and therefore understand its relevance to associated searches.

Refining results on category pages

Once on a category page, if a user likes what they see, they may want to refine the products they are viewing; clicking through to a sub-category page is one way they may do that. If your site has filters that allow them to refine the products displayed, applying these filters will add additional dynamic elements to the URL string. That’s fine – it’s better that a user is easily able to find what they’re looking to purchase. If you’ve skipped ahead, return to the Faceted Navigation chapter of the Technical SEO section of this piece to learn about handling that effectively.

There is, of course, the chance that the user visiting your site has plenty of time on their hands and is happy to work through a sequence of pages, if you’ve got more products than will fit on one page. Here we’re dealing with what’s known as pagination, where content or product offerings are split over a series of numbered pages.

Product selection and order

If you have your keyword mapping right and a site structure that reflects the behaviour and search patterns of the user, the order in which the products are displayed on your category page shouldn’t have too much impact on how well the pages rank. However, it may well impact how well they convert. 

It’s also wise to ensure the keyword choices or mapping and product offering match. Don’t be tempted to chuck all products that may have a vague connection to the category on to the page. If it’s too hard for the user to find what they are looking for they’ll leave your site and a high bounce rate from the search results can harm your rankings.  

Equally, don’t be tempted to go after keywords that aren’t highly relevant just because they have a high search volume.   

While your objective is likely to be selling as much as you can through your site to organic visitors, the whole CRO and online merchandising piece is much broader.

If you’re lucky enough to work with a merchandising team, collaborate with them as much as you can. Work together to drive maximum traffic and let them work on the order products are displayed in a way that generates the highest sales. 

If you don’t have that advantage, mining your analytics to understand customer journeys and which landing pages result in the purchase of which products – combined with your ranking data, and even stock levels and profit margins – can help you decide the best order for the products on your category page to maximise sales.

Product pages


Do:

  • Focus on product-specific keywords. Those that are relevant to the whole range rather than a single product should be the focus of category pages. 

  • Match the amount of copy to the level of information required for a user to make the purchase decision. Less involved purchases require less information, more technical or high price products are likely to require more information.  

  • Use references in the copy to the category the product belongs in as anchor text for a link back to the category page. 

  • Make use of all the available schema markup that can be applied at product level. This will help search engines understand the precise details of the product, increase the chances of attracting relevant traffic and improve the information displayed for your listing in the search results. If you are also running a Google Shopping campaign, much of this information is required for your shopping feed; depending on how that is generated, applying the markup could be more straightforward than it first appears.
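As a hedged illustration of that markup, here’s schema.org Product/Offer JSON-LD generated for a made-up product (the field values are invented; the @type names are standard schema.org vocabulary):

```python
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Yellow Diamond Engagement Ring",  # invented example product
    "sku": "ROX-YD-001",
    "brand": {"@type": "Brand", "name": "ROX"},
    "offers": {
        "@type": "Offer",
        "price": "2995.00",
        "priceCurrency": "GBP",
        "availability": "https://schema.org/InStock",
    },
}

# Embed in the page head or body as a JSON-LD script block.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(product, indent=2)
print(snippet)
```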


Don’t:

  • Waste your crawl budget by allowing the search engines to crawl additional widgets that are embedded in your product pages to help sales. Use the robots.txt file to exclude these from any crawls. 

  • Get trigger happy with applying redirects when a product is out of stock. If a 302 (temporary) redirect is in place for a while, the search engines will assume it’s permanent and stop crawling and indexing that page. As a result, if a product comes back in stock, the product page won’t rank. It’s better to remove all links to the page and not refer to it in the sitemap if a product is going to be out of stock for a while. 

  • Make it hard for a user to actually add the product to the basket or make the purchase. Work with CRO and UX experts to make this process as slick as possible. 

  • Over-promise and under-deliver. Sure, you should write with enthusiasm about the product, put across the benefits in a way that is meaningful to your target audience and effectively convey other messages that may help sales, i.e. next day delivery. However, false claims about a product will result in a poor customer experience and can even run the risk of legal infringement.

Now that you’ve created an ecommerce experience that is going to deliver above forecasted results, it’s key that you report on your success – and failures. Make sure you’re reporting with accurate and reliable data and that you’re highlighting the importance of SEO against all marketing channels. 

6. Reporting

There is no one-size-fits-all solution for reporting. At Yard we tailor our reporting to each client’s requirements, although every ecommerce client we’ve worked with has asked for the same three things: 

  • Number of sales 

  • Revenue 

  • Revenue split by product type 

This has allowed us to build a template that can be used as a starting point, giving the client a full overview of how their products are performing via organic search. 

Organic visibility

Reporting on individual rankings can provide very little value to clients and causes them to fixate on just one or two keywords with large search volumes instead of looking at the bigger picture. Organic visibility is our proprietary metric that calculates how visible a website is and enables us to benchmark clients’ websites against competitors to gain insight into what is working and where.

While many other tools offer similar metrics, Yard only score the non-brand keywords relevant to the SEO campaigns to ensure performance gains can be clearly reported. 

In addition to overall visibility, we can calculate organic market share at a keyword-group level, giving stakeholders a better understanding of marketplace performance for products and services.