How We Grew Visibility by 40% Through Solving the Ecommerce Duplicate Pages Problem
06 February 2023
SEO is a living organism. It evolves with every new update introduced by Google. That’s why, having an open mind, keeping your hand on the pulse, and seeking new ways to deal with those developments definitely works to your website’s advantage. However, what if the reports from analytical tools show that what you’re doing actually harms your online business? Keep reading to find out how to battle this problem!



Sadly, it’s getting increasingly common for e-commerce store owners to unintentionally damage their website ranking. It’s worth realizing that a change meant to improve one thing may have a disastrous effect on others. Therefore, all modifications should be discussed with an SEO specialist before they are introduced to the website for good. Otherwise, you may face some dire consequences.

There’s a reason I’m bringing this issue up. Now that everybody talks about UX and its importance for users, some people forget about the effect UX changes can have on SEO.

That’s why I think it’s time to realize that improving your website UX without taking expert advice may cause tons of problems, such as duplicate pages. Using one of my clients’ cases, I’ll explain why consulting an SEO specialist helps you steer clear of problems like duplicate content in e-commerce.

What you will learn:

  • Why duplicate pages are a big problem
  • How to check your website for duplicate content
  • Criteria for selecting one page from its duplicates + how to make use of the latter
  • Benefits of doing all of the above

A Few Words to Begin With

It seems that nowadays the process of improving website visibility in Search comes down to building synergy among four elements: SEO, business, marketing, and UX. Indeed, if you want to note a significant boost to a website’s ranking position, you need to focus on mastering and combining search engine optimization with user experience.

Why should SEO and UX go hand in hand?

Optimizing a website for its main keywords to make it appear at the top of the SERPs for those phrases is one thing. However, if the website’s design and UX flounder, there is a slim chance users will be eager to stay and explore the website for longer. In other words, even the best SEO strategy may not bring you the desired results if your website is hard to navigate and confuses users.

This is especially true for e-commerce stores offering a large number of items grouped into many categories.

Why is it a bad idea?

Because when a user finds it hard to locate a particular product, they are quick to leave the ill-organized website.

One of my clients was well aware of that, so she made a couple of important UX-improving changes to her website. Surprisingly, instead of boosting visibility, those modifications caused serious harm to the website’s performance.

What went wrong?

The client intended to improve navigation on her website by expanding the category section. In doing so, she aimed to help customers find the right products more easily.

What she didn’t take into consideration, though, was that such a practice would lead to the problem of duplicate pages.

Effect? I think I’ll put it in the following way:

the more duplicate (or near-duplicate) pages registered within one domain = the more severe the beating the website’s visibility takes

Sadly, it’s that serious.

My advice to you is: Carry on reading to learn how to avoid being in such deep trouble. I’m going to show you how to identify the duplicate pages problem and grapple with it.

Here is a story of how I succeeded in reversing this downward trend and restoring my client’s online visibility.


Duplicate Pages – Why Should You Care?

To make sure we’re on the same page, let’s briefly discuss the issue of duplicate pages.

Known also as duplicate content, duplicate pages describe the situation when the same piece of content is published on at least two pages. Those pages, of course, have different URLs, but that is basically the only thing that makes them different.

Now, think about Google bots. When a user types a query, the bots start looking for the most relevant content to respond with. But what are the bots supposed to do – or which page should they pick – if they find identical pieces of writing on multiple pages? Which page should the bot display on the user’s screen if the presented content is exactly the same?

Google doesn’t have time for choosing, picking, and comparing, so it often punishes duplicate pages by lowering their ranks. Now imagine how much a website suffers when the number of duplicate pages is large.

Oh, and there is yet another downside of duplicate content to mention. Namely, once Google bots lower the ranks and devalue the duplicate pages pushing them down the SERP, your rivals’ websites are automatically given higher positions.

What I intend to say is that by having multiple duplicate pages registered within one domain, you basically sideline your website and your business. You give up your top places on SERP, making room for your closest competitors.

And this is exactly why you don’t want to have the same content published on multiple pages.
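If you want to quickly gauge how similar two pages really are before digging into tools, a plain text-similarity ratio is a decent first check. Here is a minimal sketch using only Python’s standard library – the product descriptions below are invented for illustration; in practice you’d compare the visible text of your real pages:

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a 0.0-1.0 similarity ratio between two pieces of page text."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical product-page texts: A and B are clones, C is genuinely unique.
page_a = "Handmade leather wallet. Full-grain leather, six card slots, free shipping."
page_b = "Handmade leather wallet. Full-grain leather, six card slots, free shipping."
page_c = "Canvas tote bag with reinforced straps and an inner zip pocket."

print(similarity(page_a, page_b))  # identical copy -> 1.0
print(similarity(page_a, page_c))  # distinct content scores much lower
```

A ratio close to 1.0 is the red flag: two URLs serving essentially the same text are exactly the duplicate-page situation described above.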

Okay, okay but how will I know if the problem of duplicate pages is the factor that pushes my website down the SERP?

Well, you need to learn…

How to Check for Duplicate Pages

First, assess your website’s performance. Truth be told, I realized there is something wrong with my client’s website after taking a look at the visibility score and positions of the main keywords – I noticed those two metrics falling a long way down.

To stop the slide and help my client’s website get back on the right track, I had to find the source of the drops.

To do so, I went through three steps:

Duplicate Pages Check STEP 1: Keyword Position Analysis

Since the visibility and positions of keywords – even the main ones – decreased considerably, I reached out to SERPRobot. This tool allowed me to check which pages ranked for which keywords.

To run the same check yourself, you need a rank-tracking tool that has already been collecting data on your keywords for some time. For me, the tool of choice is SERPRobot, so that’s what I’ll use as an example here. However, feel free to go for an alternative solution if you prefer. Naturally, the longer the tool has been gathering information on keyword positions, the more extensive, and hence more accurate, data you will have.

That being said, my second piece of advice is to start using a rank-tracking tool as soon as possible, unless you have already done that, of course. If you decide to use SERPRobot, here is how to get started with tracking the performance of your keywords.

1. Open your project and type the keyword into the search bar – it’s marked below with the green arrow:

serp robot keyword search

2. After selecting the keywords whose positions you want to check, choose View full keyword details.

keyword details in serp robot

This report provides detailed information on the keywords’ positions and the pages ranking for them.

Take a look at the screenshot below. It shows that page A (green color) and page B (blue color) targeted the same keyword.

keyword report in serp robot

That is an obvious sign that something is wrong. When a page is well-optimized, it should be the only one on your site to rank for a particular keyword. In other words, a given keyword shouldn’t surface any page other than the one it’s optimized for.

Duplicate Pages Check STEP 2: Manual Verification

I examined those two pages closely. It turned out that both of them were almost 100% identical. Here is what I realized.

The client had created a few groups of products. The thing was that some of these groups fell into more than one category. Therefore, to help customers find the right products easily, the client copied the product pages and added them to every category they fit into.

Those pages contained the same set of elements: product descriptions, titles, and meta tags, to name just three. The only thing that wasn’t duplicated was the URL.

I also noticed that some pages were better optimized than their “clones”. To illustrate, page A was well-optimized: it featured category descriptions, correct headings, and meta descriptions, whereas its copy lacked those basic SEO elements. Page B showed the available products correctly, but its title wasn’t customized – it was automatically copied from page A’s category name. So was the title tag.

Basically, there was no unique content published on page B.

Since duplicate content was the main culprit behind the drops in visibility and keyword positions for page A, I knew I had to examine the remaining pages, too. Only in this way was I able to check whether their rankings also suffered due to duplicate content and, if so, fix it.
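The manual check I did here can also be scripted: pull out each page’s title and meta description and compare them across suspected clones. Below is a minimal sketch using Python’s standard-library HTML parser. The sample markup is invented for illustration; in practice you would fetch the live pages first:

```python
from html.parser import HTMLParser

class SEOElementParser(HTMLParser):
    """Collect the <title> text and the meta description from raw HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def seo_elements(html: str) -> dict:
    """Return the SEO-relevant elements found in an HTML document."""
    parser = SEOElementParser()
    parser.feed(html)
    return {"title": parser.title.strip(),
            "meta_description": parser.meta_description}

# Hypothetical markup: page B is a clone missing its meta description.
page_a = ('<html><head><title>Leather Wallets</title>'
          '<meta name="description" content="Shop wallets"></head></html>')
page_b = '<html><head><title>Leather Wallets</title></head></html>'

a, b = seo_elements(page_a), seo_elements(page_b)
print(a["title"] == b["title"])     # duplicated title across two URLs
print(b["meta_description"] == "")  # basic SEO element missing on the clone
```

A duplicated title combined with a missing meta description is exactly the page A / page B pattern described above.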

Duplicate Pages Check STEP 3: Identifying the Scale of the Problem

Finally, I had to find out how serious the problem was. To this end, I went through the same process described above, again using SERPRobot.

Moreover, I spent some time inspecting category pages separately. This way I took a close look at each page and checked them for duplicate content issues.

Wondering what the results of the investigation were? My findings revealed more duplicate pages. Like a lot!

As I learned later (and already mentioned above), the client intended to improve UX and help her customers find the sought-after products quicker by placing one product under a few categories.

Despite sounding like a pretty good idea, registering several identical pages within one domain doesn’t make much sense from the SEO perspective. It confuses Google bots, which results in a dramatic drop in search visibility.

On the other hand, the menu the client worked out made a lot of sense from the UX perspective. Since product Y has a few qualities that belong to several categories, it seems logical to add the Y product page to all the categories it falls into. This clearly improves website usability.

To be honest, I really liked my client’s idea of the expanded menu. That’s why I didn’t want to take a shortcut and simply delete the duplicate pages to get rid of the duplicate content. Even though that would benefit search engine optimization, it would significantly harm the user experience. Realizing that, I decided to work towards an optimal solution that would be an ideal compromise between SEO and UX.


How to Solve the Issue of the Duplicate Pages

To achieve both aims at once – to benefit SEO without harming UX – I decided to pick one page as a kind of keystone that would hold the structure of duplicate pages together. Later, together with the client, we set up redirects to lead users from the duplicate pages to the assigned keystone, which I’ll refer to from now on as the “Mother” page.

In this clever way, we could improve SEO and keep the user-friendly menu structure designed by the client.

It’s also worth noting that I didn’t single the “Mother” pages out at random. I checked the metrics of each page in Google Search Console to select the ones showing the highest traffic and the largest number of impressions. Naturally, the pages with lower ranks and worse results were earmarked for redirection.

Once I gathered all the data on the better- and worse-performing pages, I sent them to the client who swiftly set up the recommended redirects and linked the “Mother” pages in the menu.

This is how the problem of duplicate pages was successfully solved.

In case you’d like to learn more details on this process, I’d be more than happy to tell you…

Criteria for Selecting the Mother Page from Its Duplicates

The tool of our choice is Google Search Console. I think it’s perfect for this task simply because

  • it’s free of charge, and
  • it shows you all the data you need to make the right decisions.

Of course, you’re able to obtain the data only if your website has already been verified in Google Search Console, e.g., via the verification code added to your CMS. Without that – and I hate to break it to you – you won’t have the full picture of your website’s health.

Let’s assume, however, that you verified your website some time ago.

STEP 1 To collect the list of duplicate pages’ URLs, go through the three steps I showed you in the Duplicate Pages Check section.

STEP 2 Open Google Search Console and click Search results.

search results in gsc

STEP 3 Select the time range. I suggest analyzing a longer period, if possible 12-16 months. Only in this way are you able to notice alarming falls in visibility and traffic. Do this even if some of your pages were added recently.

STEP 4 Click New to add a new parameter.

STEP 5 Choose Page and enter the URL you want to inspect.

url inspection in google search console

STEP 6 If you feel like comparing two pages, simply click Compare.

compare urls in google search console

After entering the URLs and choosing Apply, Google Search Console shows you a graph with data on impressions and clicks, like in the screenshot below:

graph of compared urls in google search console

Note: The graph shows the data generated by two blog posts published on our blog. It isn’t related to the client’s case described in this article.

The bad news is that the Compare function in GSC allows you to compare no more than two pages at a time. Therefore, if your investigation revealed multiple duplicate pages, you simply need to check each pair separately to compare the metrics.

In this case, after selecting Page, enter the URL address, collect the data on impressions and clicks, and repeat the process for all the duplicate pages.

page inspection in google search console
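If you export the page-level performance report from GSC as a CSV, the pairwise clicking can be replaced with a small script: for each group of duplicates, keep the URL with the most clicks, falling back to impressions on a tie. A sketch with made-up metrics (the URLs and numbers are hypothetical, not from the client’s site):

```python
def pick_mother_page(duplicates: dict) -> str:
    """duplicates maps URL -> (clicks, impressions); return the best performer.

    Tuples compare element by element, so ties on clicks fall back to
    impressions automatically.
    """
    return max(duplicates, key=lambda url: duplicates[url])

# Hypothetical metrics pulled from a GSC performance export.
group = {
    "/category-a/leather-wallet": (120, 4300),
    "/category-b/leather-wallet": (15, 900),
    "/gifts/leather-wallet": (15, 1200),
}

mother = pick_mother_page(group)
redirect_from = [url for url in group if url != mother]
print(mother)          # the "Mother" page to keep and link in the menu
print(redirect_from)   # the pages to redirect to the "Mother" page
```

The winner becomes the “Mother” page; every other URL in the group goes on the redirect list.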

Resolve the Duplicate Pages Problem: Redirects

You’ve just measured the performance of the duplicate pages. Congrats!

Now, you know which pages should become your “Mother” pages, so you can move on and set up redirects.

Here, however, I can’t give you one reliable method that works for everyone. Why? Because setting up redirects depends on two things:

  • the CMS you currently use and
  • the structure of your website.

Sometimes installing plugins may do the trick, other times there is no other way around it but introducing changes to the code.

My suggestion is to leave this to your web administrator. That specialist knows which solution will work best for you and your e-commerce store.
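Just to give you a feel for what such a change looks like, on an Apache server the redirects can often be expressed in the .htaccess file. This is only an illustrative sketch with hypothetical paths – your CMS or server setup may call for a different mechanism entirely, which is exactly why a web administrator should handle it:

```apache
# 301 (permanent) redirects from the duplicate pages to the chosen "Mother" page.
# The paths are hypothetical -- replace them with your own duplicate/Mother URLs.
Redirect 301 /category-b/leather-wallet /category-a/leather-wallet
Redirect 301 /gifts/leather-wallet /category-a/leather-wallet
```

A 301 tells search engines the move is permanent, so over time the duplicates’ signals are consolidated on the “Mother” page.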

Remember to Optimize the “Mother” Page

You probably remember me saying that some of the duplicate pages lacked basic SEO-improving elements like meta descriptions or category descriptions.

For a page to be displayed high in Search, it must go through the search engine optimization process. Therefore, if you notice that some of your “Mother” pages lack SEO-important elements, make sure they are added.

The Results of Solving the E-commerce Duplicate Pages Problem

Finding the source of the duplicate pages problem in June and implementing the redirects the next month stopped the client’s website visibility from decreasing. The fix we put in place with the client over the summer helped restore the lost visibility, as shown in the screenshot below:

solving e-commerce duplicate pages problem results

I should also mention one more thing, because it’s not a coincidence that the client’s website noted such a dramatic drop in visibility in May. On May 25th, Google announced a Core Algorithm Update. The client’s website had multiple duplicate pages before Google released the update, which clearly caused the sudden ranking drop.


As Franklin D. Roosevelt once said:

It is common sense to take a method and try it. If it fails, admit it frankly and try another.

To me, as an SEO specialist, this saying is uncannily accurate. Trying new things and adopting new methods – even by using trial and error – is better than doing nothing. And I wholeheartedly encourage you to test and try new solutions, especially in doing search engine optimization.

Sometimes, however, slip-ups happen. You introduce changes to the website with the intention of boosting visibility and CTR, yet the result turns out to be far from what you expected. Whether we like it or not, none of us is all-knowing. That’s why consulting an SEO specialist may sometimes be your best option. It may be a metaphorical lifeline thrown to your drowning online business.

Here’s another thing I’d like to address. You may also miss the news of a recent Google algorithm update that messes up your website’s ranking. Unless the source of the rank drops is identified soon enough, you may find it hard to stop the downward trend from harming your online business. A seemingly trivial piece of news, like a new algorithm update release, may cost you big time: dramatic drops in visibility, organic traffic, and sales.

To avoid that, think of staying in touch with a trusted SEO specialist. Such a professional will keep an eye on your website, often reacting even before some nasty issue comes up. And if something bad happens to your website, an experienced SEO specialist will find the source of the problem quickly and fix it.

Kinga Ochojska

SEO Specialist

At Delante since October 2018. Once an avid student of editing, now – a happy graduate. She knows well where to insert a comma and where to eat deliciously in Krakow (and Warsaw!). Originally from Silesia, now in exile in the capital, a lover of sleep, sage chops and Friends.




Why are duplicate pages bad for your website?

Duplicate pages confuse Google bots. They don’t know which page to choose since the content published on both pages is (almost) identical. That’s why the bots sometimes pick page A, other times they choose page B, which later distorts the results in analytical tools. In the worst case scenario, however, Google bots punish duplicate pages by lowering their ranks. Once Google bots lower the ranks and devalue the duplicate pages pushing them down the SERP, your rivals’ websites are automatically given higher positions.

Why should SEO and UX go hand in hand?

Optimizing a website for its main keywords to make it appear at the top of the SERPs for those phrases is one thing. However, if there are issues with the website’s design and UX, there is a slim chance users will be eager to stay and explore it further. Sadly, even the best SEO strategy may not improve conversions if your website is hard to navigate and confuses users.

Can I make changes to the website without consulting an SEO specialist?

Sadly, it’s getting increasingly common for e-commerce store owners to unintentionally damage their website ranking. What is supposed to improve one thing may have a disastrous effect on others. Therefore, all modifications should be discussed with an SEO specialist before introducing them to the website for good.
