5 Fundamental Google Search Console Functions

23 December 2020
Are you planning an SEO campaign aimed at increasing your visibility in the search results and your organic traffic? If so, Google Search Console is an absolute must-have. SEMrush, Ahrefs, and other commercial SEO tools are great, I'm not going to argue with that, but remember that none of them can replace this free tool offered by Google, which shows you how Google itself sees your website. In this article, I present the most fundamental Google Search Console functions that can take your website to a completely new level. Let's discover them!



The Best Functions to Make the Most Out Of Your Search Console

1. Sitemaps

Not sure what a sitemap is? In a nutshell, an XML sitemap is a file that provides information about the content of your website (pages, videos, files) and helps search engine crawlers read your website correctly. Without this file, reaching all of your content is much more complicated for the crawlers. A sitemap isn't obligatory, but you can really gain a lot from having one.
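
If you've never seen one, the sketch below shows how a minimal sitemap following the sitemaps.org protocol can be generated. It's only an illustration: the page URLs are hypothetical placeholders, and real sitemaps are usually produced by your CMS or an SEO plugin.

```python
# A minimal sitemap generator following the sitemaps.org protocol.
# The page URLs below are hypothetical placeholders.
import xml.etree.ElementTree as ET
from datetime import date

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(page_urls):
    """Return an ElementTree with one <url> entry per page."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page_url in page_urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page_url
        # <lastmod> tells crawlers when the page last changed.
        ET.SubElement(url, "lastmod").text = date.today().isoformat()
    return ET.ElementTree(urlset)

pages = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/contact/",
]
build_sitemap(pages).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```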

How to submit an XML sitemap?

1. From your Search Console dashboard, choose the Sitemaps option (you will find it in the Index section).
2. Add the name of your sitemap and click the Submit button.
3. If your sitemap has been created correctly, you will receive a message saying "Sitemap submitted successfully", and the new sitemap will appear in the panel (this panel is also a great place to check whether your website already has a sitemap).

Submitting a sitemap in Google Search Console is extremely easy; preparing the file itself is a bit more challenging. If you would like to learn more about the importance of sitemaps in SEO, I've prepared a guide that explains everything you need to know!
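
If you manage many sites, sitemaps can also be submitted programmatically. The sketch below is one possible approach using the Google API client for Python and the Webmasters API; it assumes you have already set up OAuth credentials for your property, and the site and sitemap URLs are hypothetical placeholders.

```python
# A hedged sketch: submitting a sitemap through the Webmasters API (v3).
# Assumes OAuth credentials are already configured for your property;
# the site and sitemap URLs are hypothetical placeholders.
from googleapiclient.discovery import build

def submit_sitemap(credentials, site_url, sitemap_url):
    service = build("webmasters", "v3", credentials=credentials)
    # Asks Search Console to fetch and process the sitemap file.
    service.sitemaps().submit(siteUrl=site_url, feedpath=sitemap_url).execute()

# Example call (with real credentials in place):
# submit_sitemap(creds, "https://www.example.com/", "https://www.example.com/sitemap.xml")
```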

2. Index Coverage Report

Google Search Console shows all indexed pages and indicates whether any errors were found when indexing your website. You should monitor them on a daily basis, as too many errors (such as 404 or 500 errors) can result in a sudden drop in valuable web traffic. And that's not what you want, right? So where will you find the Index Coverage Report, which lists all the issues connected with the indexation of your website? In the Coverage option (it is available in the Index section of the menu). Once you choose the option, you will get a full report of errors. At first sight, this report may seem a bit complicated, so I will explain how to read it to obtain all the necessary information. On the summary page, you will see a report with all pages that Google attempted to crawl. The status is grouped into 4 main categories:
  • error - the page is not indexed; fix these problems as a priority,
  • warning - the page is indexed, however, some issues were encountered when indexing it,
  • excluded - the page is intentionally not indexed, for example by using the noindex directive,
  • valid - everything is fine; the page is indexed correctly.
These are the possible errors that can be found by Google (a simple way to spot-check some of them yourself is sketched after the list):
  • server error (5xx) - an internal server error; it appears when, for some reason, it was impossible to process a given request,
  • redirect error - one of the redirect errors appeared, for example a redirect chain that is too long, an empty URL in the redirect chain, or a redirect loop,
  • submitted URL not found (404) - the page doesn't exist anymore,
  • submitted URL seems to be a Soft 404 - the server sends a 200 OK status code, however, Google believes that the page doesn't exist,
  • submitted URL marked ‘noindex’ - the page has the noindex directive in a meta tag or HTTP header,
  • submitted URL returns unauthorized request (401) - Google can't access the page because the request lacks valid authentication credentials,
  • submitted URL has crawl issue - Google was unable to crawl the page,
  • submitted URL blocked by robots.txt - the page couldn't be crawled because the robots.txt file blocks it.
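
Before the coverage report updates, you can spot-check suspicious URLs yourself. Below is a minimal sketch using the Python requests library; the URLs are hypothetical placeholders, and it only approximates a few of the checks listed above:

```python
# Minimal spot-check for a few common index coverage problems.
# The URLs are hypothetical placeholders; run it against your own pages.
import requests

URLS = [
    "https://www.example.com/",
    "https://www.example.com/old-page/",
]

for url in URLS:
    response = requests.get(url, allow_redirects=True, timeout=10)
    issues = []
    if response.status_code == 404:
        issues.append("not found (404)")
    elif response.status_code >= 500:
        issues.append(f"server error ({response.status_code})")
    elif response.status_code == 401:
        issues.append("unauthorized request (401)")
    # A noindex directive can hide in the X-Robots-Tag HTTP header.
    if "noindex" in response.headers.get("X-Robots-Tag", ""):
        issues.append("noindex via HTTP header")
    # response.history holds every redirect hop that was followed.
    if len(response.history) > 5:
        issues.append("suspiciously long redirect chain")
    print(url, "->", "; ".join(issues) if issues else "OK")
```
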
It is also worth paying attention to the primary crawler value shown in Google Search Console (one of two user agent types: smartphone or desktop), which tells you whether Google crawls your website the way a mobile user or a desktop user would see it.
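
You can approximate the difference yourself by requesting a page with each crawler's user-agent string. The sketch below uses illustrative user-agent values (consult Google's crawler documentation for the current ones) and a placeholder URL:

```python
# Fetching a page as the desktop vs. smartphone crawler would request it.
# The user-agent strings are illustrative and may be outdated; consult
# Google's crawler documentation for the current values.
import requests

USER_AGENTS = {
    "desktop": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "smartphone": (
        "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile "
        "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    ),
}

for device, user_agent in USER_AGENTS.items():
    response = requests.get(
        "https://www.example.com/",  # hypothetical placeholder URL
        headers={"User-Agent": user_agent},
        timeout=10,
    )
    # Differences in size or status can hint at device-specific serving.
    print(device, response.status_code, len(response.content), "bytes")
```
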
Author
Mateusz Calik

CEO

Managing Partner, has been building Delante since 2014. Responsible for international SEO strategies. He has a strong analytical approach to online marketing backed by more than 12 years of experience. Previously associated with the IT industry, as well as the automotive, tobacco, and financial markets. Has experience in creating scaled processes based on agile methodologies.
