Planning to introduce a top-notch and effective SEO campaign? If so, you need to consider what activities you will introduce on-page and off-page. But is that all? Of course not!
If you want your website to achieve great success in the search results and make Google love it, you need to remember about technical SEO as well.
What is technical SEO and which elements does it cover? You will learn all this from my exhaustive guide. So let’s dive in and discover more about the technical elements of websites!
Table of contents
- What is Technical SEO? Definition
- Technical SEO Elements
What Is Technical SEO? Definition
Technical SEO is the process of introducing changes to websites that are intended to comply with the technical requirements of search engines (mainly Google). Its main goal is to increase visibility in search results and organic traffic. Technical requirements are constantly changing and become more sophisticated over time, which is why this process should be ongoing. The technical SEO process gives the best results when it is supported by off-page and on-page activities.
If you would like to learn more about off-page SEO, its elements, and the most effective techniques, I’ve prepared a comprehensive guide. Check this out!
Technical SEO Elements
If you want to make technical changes to the site, these are the elements you need to consider:
XML Sitemap
A sitemap is a file containing a list of your website's pages. It is a sort of roadmap thanks to which search engine robots can crawl your website more effectively and find all the pages much more easily.
Although this file is not obligatory and your site can function without it, Google has officially confirmed that the XML sitemap is one of its most important sources for discovering URLs, so you can benefit significantly by creating one.
What Should a Correct XML Sitemap Look Like?
If you want Google to interpret your sitemap correctly, remember the following things:
- include only canonical URLs,
- use UTF-8 encoding – the file should contain only ASCII characters,
- use hreflang in order to indicate different language versions,
- use consistent URLs as Google will index pages in exactly the same form as listed,
- use various extensions if you want to include other types of content in your sitemap, for example, video, news or images.
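Putting these rules together, a minimal XML sitemap is just a UTF-8 text file in the standard sitemaps.org format. Here is a sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per canonical page -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/technical-seo-guide/</loc>
  </url>
</urlset>
```

Only the `<loc>` element is required for each entry; `<lastmod>` is optional but helps crawlers prioritize recently updated pages.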
Where to Find / Submit Your Sitemap?
If you are wondering if your website has already implemented XML Sitemap, the best way is to check it in your Google Search Console account in the Sitemaps section:
If you already have a sitemap created, you will see it on a list in the “Submitted sitemaps” panel:
If you would like to learn more about sitemaps, check our article: sitemap for SEO
Redirects
Redirects make it possible for a web page to be reached under more than one URL. They are introduced for several reasons:
- to shorten URLs,
- after removing a given page to avoid broken links and 404 error,
- to take users to one main page version (e.g. from http://www to https),
- when merging two or even more sites.
Redirects have a huge impact on SEO, as thanks to them, you will not lose valuable traffic and backlinks which, if they point to a 404 page, wouldn’t be counted by Google. Remember also that, as I mentioned earlier, users should always be redirected to one main version of the website. This means that even if they enter, for example, http://www.delante.co, the web browser will open https://delante.co (below I explain why HTTPS is a better choice than HTTP). There are many SEO tools that you can use to check this – personally, I recommend httpstatus. Correctly introduced redirects should look like this:
Keep in mind that there are several types of redirects, 2 of which have the greatest impact on SEO: 301 redirects and 302 redirects. How do they differ, and which solution will be better for your site? Generally speaking, a 301 redirect is permanent and a 302 is temporary. For users, it makes practically no difference; however, it looks different to search engine crawlers, which treat 302 redirects with caution because they signal that the change won't stay for long.
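For illustration, here is how the redirect to one main version (HTTPS, non-www) described above could be set up on an Apache server in the .htaccess file – a sketch, with example.com standing in for your domain:

```apacheconf
# Send both http:// and www. variants to https://example.com with a permanent 301
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]
```

On nginx, the same effect is achieved with a `return 301 https://example.com$request_uri;` directive in the server block.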
If you would like to learn more about redirects, check our article: Why to use redirects and what are their types?
Page Speed
Page speed is one of the first things to improve when you start optimizing your website. Users simply hate websites that are too slow, and Google knows that – that's why it penalizes websites that take too long to load.
If you would like to improve page speed, here are the things you need to keep in mind:
- resize images on your site – if they are too heavy, they slow down the site,
- reduce server response time – the amount of time your server needs to respond (often measured as TTFB, time to first byte),
- upgrade hosting,
- introduce correct 301 redirects – and shorten redirect chains that are too long.
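For instance, much of the image-related slowdown can be addressed directly in the markup – a sketch with a hypothetical file:

```html
<!-- Explicit width/height prevent layout shifts while the image loads;
     loading="lazy" defers offscreen images until they are needed;
     a compressed format such as WebP keeps the file size down -->
<img src="/images/product-photo.webp" alt="Product photo"
     width="800" height="600" loading="lazy">
```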
Check out the Google PageSpeed Insights tool, which will tell you in detail which elements of your website should be improved in order to make it load faster. This is the official tool introduced by Google, so you can be sure that all the elements indicated there are in fact taken into account by this search engine.
If you would like to learn more about page speed, check our article: Website speed importance in SEO
Duplicate Content
Have you ever heard of Panda? This Google algorithm was introduced in 2011 to penalize websites with an excessive level of content duplication and reward those with high-quality, unique, and user-friendly content.
Keep in mind that there are two types of content duplication: internal (within your site) and external (pieces of content that appear on other websites).
If you want to check the level of internal duplication, there are several tools that will come in handy: Ahrefs, Screaming Frog, or Siteliner. I personally recommend the last one, which will quickly generate a report and indicate which elements of your pages are duplicated:
And here are the most frequent reasons for a too high level of internal duplication:
- you didn’t indicate one main page address (I tell more about it in the Redirects section),
- your category or product descriptions are practically the same,
- you didn’t introduce language versions correctly or didn’t translate all the elements,
- you didn’t prepare unique title or meta description tags.
If you would like to learn more about duplicate content, check our article: Content duplication. How to deal with this issue?
Crawling and Indexing
The ideal situation is when Google can correctly and easily index all your pages. This way, your website can appear in organic search results for more keywords and, consequently, attract more traffic. Unfortunately, Google sometimes runs into problems when trying to index your website.
First of all, you should identify these issues – it’s best to do it with Google Search Console in the Index Coverage report (although Screaming Frog and Ahrefs may also be helpful):
The most relevant issues you need to fix in the first place (and monitor on an ongoing basis as well!) include:
- 404 error – a page has been removed,
- 500 error – the internal server error,
- noindex – a page has the noindex directive in its source code, which prevents it from being indexed,
- redirect error.
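The noindex issue from the list above is usually caused by a single meta tag in the page's `<head>`:

```html
<!-- Tells search engines not to add this page to their index -->
<meta name="robots" content="noindex">
```

If the page should be indexed, simply remove the tag. Note that Google must be able to crawl the page to see this directive – a page blocked in robots.txt may still end up indexed despite the tag.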
If you would like to learn more about crawling, check our article: Google crawling. How does it work?
HTTPS
If you haven't switched from HTTP to HTTPS yet, it's high time to do it. The HTTPS protocol encrypts the transmitted data, giving users assurance that their online activities and transactions are secure. HTTPS has been the standard for years – currently, about 95% of global traffic across Google is encrypted. Take a closer look at this chart:
As you can see, encrypted traffic grows year after year, and the trend is continuing. Google has stated that it aims for encryption to cover 100% of traffic, and HTTPS is one of the Google ranking factors.
If you would like to learn more about SSL, check our article: SSL certificate installation – the most common errors
Mobile Friendliness
Currently, more than half of global traffic comes from mobile devices. What does that mean? Simply put, you need to optimize your website for mobile. Without this, you lose a really valuable source of traffic and, what's more, Google will rate your website unfavorably. Remember that mobile-friendliness is also one of the Google ranking factors!
To start, I advise you to carry out a test that will tell you if your website is mobile-friendly and, if not, which elements require optimization. Use, for example, the Google Mobile-Friendly Test, which will generate a clear and comprehensive report:
If you want to optimize your website for mobile devices, here are the things you need to pay special attention to:
- website speed – often pages load much slower on mobiles than on desktops, you need to take care of the right page loading speed (PageSpeed Insights will show you two separate reports – for mobiles and desktops);
- responsive design – every page should easily adapt to different screen sizes on every type of device: desktop, tablet, and mobile;
- user experience – make your users’ visit to the mobile version of your website completely enjoyable, so remember about the right font size, buttons, and clickable elements;
- separate URLs – if you serve distinct URLs for mobile and desktop versions, link them correctly so Google can relate them to each other.
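The responsive design point above starts with a single line in the `<head>` – without it, mobile browsers render the page at desktop width and scale it down:

```html
<!-- Makes the page width follow the device's screen width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```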
If you would like to learn more about mobile SEO, check our article: Mobile SEO. A Complete Guide for Mobile Devices SEO
Structured Data
Structured data are code elements that tell search engines what your pages contain. Simply put, they help crawlers better understand your content. But what can you really gain from implementing them? Two words: rich snippets! You can see them when you enter a query in Google and the search engine shows results containing not only the title and meta description but also additional useful information.
Look at this example:
Thanks to structured data, Google has also shown additional information about this recipe, in this case: rating, calories, preparation time.
At Schema.org, you will find a complete list of structured data that you may implement. The most widely used tags include books, movies, places, local businesses, products, and… so much more! Moreover, Google also introduced its own tool to test structured data tags – use it to find out if you have implemented these tags correctly.
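For the recipe example above, the rich snippet data could be supplied as a JSON-LD block in the page's source – the values here are purely illustrative:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Classic Pancakes",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "ratingCount": "120"
  },
  "nutrition": {
    "@type": "NutritionInformation",
    "calories": "210 calories"
  },
  "totalTime": "PT30M"
}
</script>
```

Note that totalTime uses the ISO 8601 duration format – PT30M means 30 minutes.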
If you would like to learn more about structured data, check our article: Structured Data in SEO: What You Need To Know
Breadcrumbs
Breadcrumbs are a great solution that shows users their location on your site – visitors can easily see where they currently are and how to return to the previous page quickly. This element, usually placed at the top of the page, helps to convey the site architecture:
Breadcrumbs are also very important for SEO. Why is that? They support internal linking, make the analysis of the web architecture much easier, and, consequently, support effective page indexation. If you introduce them correctly, the path will also be visible in the search results, which may additionally encourage users to visit your site.
Implementing breadcrumbs is not complicated, and can bring really great results.
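A breadcrumb trail can also be exposed to search engines with Schema.org's BreadcrumbList type – a sketch with hypothetical pages:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog",
      "item": "https://example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO" }
  ]
}
</script>
```

The last element describes the current page, so its item URL can be omitted.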
If you would like to learn more about breadcrumbs, check our article: Breadcrumb navigation – what is it and why is it important in SEO?
Canonical Tags
The canonical tag is added to the <head> section of a page to indicate which URL contains the original content and which ones are copies. As I mentioned above, internal duplication is extremely negative for Google, and rel=canonical is a great way to get rid of this problem, at least to some extent. So if you have the same content on several pages, consider adding a canonical attribute – then only the page indicated by the canonical link should be indexed by Google robots.
When to implement a canonical URL? Mostly, when:
- you have two different versions of one page (e.g. http and https) and you want to indicate which of them is the main one,
- you can sort products or filter them within one category,
- you implement pagination,
- your product descriptions are duplicated and differ only in details.
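For example, a filtered version of a category page can point back to the main category with a single line in its `<head>` – the URLs below are hypothetical:

```html
<!-- On https://example.com/shoes/?sort=price&color=black -->
<link rel="canonical" href="https://example.com/shoes/">
```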
If you would like to learn more about canonical URLs, check our article: rel = canonical. What are canonical URLs for?
Hreflangs
Does your website have two or more language versions? In that case, hreflangs are an absolute must-have – yet it turns out that most multilingual websites haven't even implemented them!
So, what are hreflangs and why should you care about them? They inform search engines that your website has several language versions and that your visitors come from different countries. Thanks to them, users will see the language version suitable for them in the search results, which reduces the bounce rate. What's more, your pages will not be interpreted as duplicates – a common problem for websites written in one language but targeted at two different countries, such as the United States and the United Kingdom.
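Taking the US/UK example above, each language version lists all the alternatives in its `<head>` – the domains and paths here are placeholders:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/us/">
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/">
<!-- Fallback for visitors who match neither version -->
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```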
If you would like to learn more about hreflangs, check our article: Hreflangs. How to tag the language versions of your website?
Robots.txt
Robots.txt is a simple file that makes your communication with Google crawlers much more effective. Crawlers should not have access to all your pages – which is particularly important in the case of online stores – so it is worth creating a list indicating which elements should not be crawled. And that is exactly what a robots.txt file is for.
And which specific elements should not be crawled or appear in the Google index? Above all: user panels, shopping carts, order procedures, and internal search result pages. Just to name a few – the list of pages excluded from crawling depends on the type of your website!
There are many great ways to generate this file for your website – check in your CMS as most of them have such an option built-in.
Here are the most relevant things to remember about robots.txt:
- each site can have just one robots.txt file,
- this file should be called “robots.txt”, don’t give it any other non-standard name,
- the file should have a UTF-8 encoding (with ASCII characters only),
- two of the most important directives in such a file are: disallow (the selected element should not be crawled) and allow (it can be crawled).
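Putting it together, a robots.txt for a small online store might look like this – the paths are examples only:

```
# robots.txt – placed in the root directory of the domain
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /account/
Disallow: /search/
Allow: /search/help/

Sitemap: https://example.com/sitemap.xml
```

The Sitemap line is optional but widely used, and Allow lets you carve out an exception inside a disallowed directory.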
If you would like to learn more about robots.txt, check our article: what is a robots.txt file and how to use it properly?
User-Friendly URLs
Your website should be as user-friendly as possible. This applies to many elements, and one of them is URLs. So what should your site's URLs look like so that you can say with confidence that they are user-friendly?
Remember the following rules when creating URLs:
- they should have an appropriate structure, in the following order: domain name (delante.co), parent directory (e.g. category name), and product name;
- separate individual words with hyphens (e.g. technical-seo-complete-guide) instead of underscores (“_”);
- implement the SSL certificate (https://);
- URLs should not be too long (115 characters or less is the recommended length);
- each page must have a unique URL.
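Applying these rules, the difference between a friendly and an unfriendly URL is easy to see (both addresses are made up):

```text
Friendly:   https://example.com/shoes/running-shoes-model-x
Unfriendly: https://example.com/index.php?id=372&cat=7&sid=af93k
```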
If you would like to learn more about URL structure, check our article: URL structure – how to build a user and robot-friendly link?
As you can see, there are many elements that contribute to your site's performance in the search results. If you want to obtain the best possible results, you need to consider all of them and constantly monitor the effects of the changes you implement. Importantly, Google is constantly updating its algorithms, so you have to adapt your technical SEO activities to the search engine's current requirements.
Need help with technical SEO? Our SEO experts are ready to help you! They will conduct a professional SEO audit which will identify which technical elements of your website require optimization. Technical SEO supported by both on-page and off-page activities will guarantee success!