Have you ever wondered what influences your position in the Google rankings? Google is the unquestionable leader among search engines, and for many years the market has been analyzed and tested to check which factors affect the rankings. Although it's very difficult to estimate the exact number of elements that influence the position of your website in search results, it's more than certain that there are plenty of them. It's also worth noting that Google regularly modifies its algorithms, which makes it impossible to pin down a definitive list of ranking factors.
Ranking factors are straightforward signals that help website owners determine what to take care of and what action to take in order to meet Google's requirements and, consequently, improve their position in search results. As mentioned in the introduction, the number of factors is abundant and they concern different aspects of a website; nevertheless, they can be grouped as follows:
- factors directly related to the website and its elements;
- factors related to various offsite activities.
It's worth remembering that top positions in search results can be achieved when you prepare your site properly, in accordance with the factors that Google pays special attention to. In today's entry we'll do our best to outline the most important factors from the groups mentioned above.
Ranking factors are elements that influence the position of a website in search results.
Factors directly related to the website and its elements
Unique, valuable and up-to-date content
When creating texts for a website, ensure that they're of high quality and contain a sufficient number of keywords. Interestingly, Google ranks sites higher when their content stays focused on their subject matter. This notion is called Topical Authority, and its main concept is to promote websites that are beneficial to users and help them comprehend various areas of knowledge. Google also puts great emphasis on the context of content. One of the benefits of contextual search is that it helps the user get faster and more accurate results based on the query. For instance, if you search for the phrase "Batman" during a stay in Turkey, you'll probably be shown information about the city of that name.
Spelling and grammatical correctness are equally important – disregarding these aspects may deter your potential customers. When you decide to write on a particular issue, make the effort to provide valid sources and try to create unique and relevant content. To increase the value of the content, enrich it with diverse multimedia elements such as pictures, videos or animations, and remember to add alternative text for every photo on the website. Using the heading tags H1–H6 improves the clarity of the text and helps the search engine robots determine which content on the page the author considers more important. It's a good idea to introduce keywords in your headings; however, everything should be done in moderation and sound natural.
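As an illustration, a minimal sketch of a well-structured article page might look like this (the topic, file name and alt text are hypothetical placeholders):

```html
<!-- One H1 per page, descriptive subheadings, alt text on every image -->
<article>
  <h1>Complete Guide to Winter Tyres</h1>
  <h2>When to Switch to Winter Tyres</h2>
  <p>...</p>
  <img src="tyre-tread.jpg" alt="Close-up of a winter tyre tread pattern">
  <h2>How to Store Summer Tyres</h2>
  <h3>Temperature and Humidity</h3>
  <p>...</p>
</article>
```

The heading levels form a logical outline, which is exactly what both readers and search engine robots rely on.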
Properly selected keywords
When users look for information in a search engine, they generally type in the phrases they're interested in and expect the best, most accurate answers to their questions. In such situations, Google's algorithms search their database of indexed sites to find the websites whose content best responds to the user's query. That's why including the right keywords in your content is so vital. Bear in mind that the keywords and the content itself should contain phrases that correspond to the subject matter of a given website. To highlight the most valuable parts and phrases on your website, you can use the bold `<strong>` and italic `<em>` tags, which carry real weight from the search engine's point of view.
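For example, a keyword phrase can be emphasized like this (the phrases themselves are, of course, only placeholders):

```html
<!-- <strong> marks strong importance, <em> marks stress emphasis -->
<p>Our shop offers <strong>handmade leather shoes</strong> crafted from
<em>full-grain Italian leather</em>.</p>
```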
Optimized meta tags

Meta tags are elements placed in the `<head>` section of the website. They're also called HTML tags and can be found in the source code. While some of these tags are visible to users, others can be seen only by search engine robots. Many of them have a noticeable impact on SEO, so it pays to optimize them properly. Among the meta tags that should be optimized for SEO, we can distinguish:
- Meta description
- Meta Keywords
- Original source
- Meta robots
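Put together, an optimized `<head>` section could look roughly like this (the title, description text and keywords below are placeholders):

```html
<head>
  <title>Winter Tyres – Online Shop | Example.com</title>
  <!-- Meta description: often shown as the snippet in search results -->
  <meta name="description" content="Browse our range of winter tyres with free delivery and a 2-year warranty.">
  <!-- Meta robots: tells crawlers whether to index the page and follow its links -->
  <meta name="robots" content="index, follow">
  <!-- Meta keywords: largely ignored by Google today, but still one of the classic meta tags -->
  <meta name="keywords" content="winter tyres, tyre shop">
</head>
```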
No duplicate content
Original and unique content is something that Google pays special attention to (the Google Panda algorithm). Duplicate content occurs when identical or only slightly modified text appears on different websites. However, duplicate content within a given page is also unwelcome to Google robots. Fortunately, there are numerous methods to eliminate duplicate content – among others, you can create original and unique texts and descriptions, use 301 redirects, introduce distinctive title headings on each subpage, or add canonical tags informing search engine robots which URLs are the original links.
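A canonical tag is placed in the `<head>` of the duplicate subpage and points robots to the original URL. A sketch, with hypothetical addresses:

```html
<!-- Placed on https://example.com/shoes?sort=price (a filtered duplicate):
     tells robots that https://example.com/shoes is the original version -->
<link rel="canonical" href="https://example.com/shoes">
```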
Internal linking

Internal linking is about creating links to particular subpages within a given website. This practice improves the position of keywords, makes the navigation easier for users and facilitates correct indexing of all subpages by Google robots. There are many types of internal linking and among them we can distinguish:
- website navigation menu – its design will matter most to users; on the other hand, factors such as appropriately chosen words and the proper use of anchors will be crucial for SEO.
- contextual linking – it’s about using the context of a given article in order to place a suitable link in its content. For example, when writing text on a given topic, you can encourage the users to read a related post in which they’ll find some supplemental information.
- linking with the use of related entries lists – this technique is frequently used on subpages with various articles. The link which refers to an older publication is usually placed at the end of the article and it’s supposed to encourage the users to check the entry which might prove to be of interest to them.
- upselling and linking similar products – this type of linking is usually encountered in online shops and has two main advantages. First of all, it can increase your sales by redirecting customers to the products they may choose to buy and second of all, it positively influences the profile and structure of internal links.
It's crucial to ensure the correct structure of URLs when creating links within a given website. User-friendly URLs are short addresses without special characters or capital letters, and with the appropriate use of hyphens ("-"). Make sure that you use 301 redirects whenever they're required.
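On an Apache server, for instance, a 301 redirect from an old address to its new, user-friendly counterpart can be sketched in `.htaccess` like this (both paths are hypothetical):

```apache
# Permanently redirect an old, unfriendly address to the new readable one
Redirect 301 /old-page.html /winter-tyres
```

Other servers (e.g. nginx) achieve the same with their own redirect directives; the important part is the 301 status code, which passes the old URL's value to the new one.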
The number of indexed subpages
Indexing takes place at the very moment when Google robots visit your website. They analyze the code of subpages and their content. On that basis, the robots determine whether a subpage is worth being ranked in the search results and, if it is, which position will be appropriate for it. At this point we need to explain a very important term: Crawl Budget, the indexing budget that determines the number of subpages Googlebot can index during one visit to the website. Crawl Budget consists of two indicators:
- Crawl Rate Limit – limit of indexation rate;
- Crawl Demand – frequency of indexation.
The combination of these two parameters indicates what Crawl Budget will be allocated to your website. Ideally, both of these factors are at a similarly high level, which can be achieved e.g. through proper technical optimization of the website. Remember that subpages which haven't been indexed by Google robots can't be displayed in the search results, so it's very important to take care of correct indexing of all subpages. If you want to check how many subpages of a given website Googlebot has indexed, simply enter "site:website address" in the search engine. What if your website has hundreds of subpages and you see only a few after typing this command? Well, it's a clear message – you should focus on getting the missing subpages indexed.
Website loading speed
In all probability, websites that take a long time to load deter users, which, in turn, increases the bounce rate. In such cases Google gets a clear signal that the website may not be user-friendly. Website loading speed should be efficient. It's recommended either to scale and compress large images or to use modern photo formats (e.g. WebP). You should also avoid heavy scripts that increase the loading time. Instead, use static file caching and, if possible, server-side caching.
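For example, a modern image format can be served with a fallback for older browsers, and off-screen images can be loaded lazily (file names and dimensions below are placeholders):

```html
<picture>
  <!-- Browsers that support WebP get the smaller file -->
  <source srcset="hero.webp" type="image/webp">
  <!-- Older browsers fall back to JPEG; lazy loading defers off-screen images -->
  <img src="hero.jpg" alt="Storefront of the shop" loading="lazy" width="800" height="450">
</picture>
```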
Responsive web design
It's advisable to write the site code in a way which makes it possible for the page to adapt to the size of the device on which it's displayed. Responsive web design has a positive impact on Google's algorithms – in March 2018 the search engine began rolling out the so-called Mobile First Index, a list of websites adapted for mobile devices. The condition that needs to be fulfilled in order to be placed on the list is to provide the same content in the mobile and desktop versions of your website. Google pays particular attention to responsive websites, so when users search from a mobile device, Google will rank websites suited to these devices higher in the search results. In the modern world, full of technology and mobile devices, taking care of responsive web design is a real necessity.
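The foundation of a responsive page is the viewport meta tag combined with CSS media queries. A minimal sketch (the class name and breakpoint are hypothetical):

```html
<!-- Tell mobile browsers to render at the device width instead of a zoomed-out desktop view -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* On screens narrower than 600px, stack the sidebar under the content */
  @media (max-width: 600px) {
    .sidebar { width: 100%; float: none; }
  }
</style>
```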
SSL certificate

The security of Internet users is crucial for Google. For this simple reason, websites with a valid, correctly implemented SSL certificate are rewarded by the search engine. The certificate encrypts and authenticates the connection between the browser and the web server. To prove its importance: in August 2017 Google, via Search Console, sent warnings to the owners of shops and websites without the certificate. An SSL certificate increases the credibility of and confidence in your website. For people making purchases in an e-shop and entering their personal data there, it has become almost a requirement. Conversely, the lack of an SSL certificate has a negative impact on the bounce rate.
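Once the certificate is installed, all plain-HTTP traffic should be redirected to HTTPS. On an Apache server this can be sketched in `.htaccess` as follows:

```apache
# Send every plain-HTTP request to its HTTPS equivalent with a 301 redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```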
XML site map

A site map is an XML document listing the addresses of all subpages of a given website. Because it makes it easier for robots to navigate the website, it's created mainly to make indexing in search engines more effective. A site map is particularly useful for portals with hundreds or thousands of subpages, but there's nothing to stop you from creating an XML site map for a smaller website as well.
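A minimal sitemap.xml listing two hypothetical subpages looks like this (the `<lastmod>` dates are optional hints for crawlers):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/winter-tyres</loc>
    <lastmod>2020-01-10</lastmod>
  </url>
</urlset>
```

The file is usually placed in the site root and can additionally be submitted to Google via Search Console.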
Factors related to various offsite activities
Valuable external links
Incoming links affect the position of a website in the results returned by the search engine. While evaluating a website, Google checks the average value of the links that lead to it and the quality of the domains from which these links originate. You can check link parameters in tools such as Ahrefs or Majestic. Users love to share their favorite articles with others, and this is one of the reasons why it's essential to provide them with catchy and relevant content. Moreover, external links are a clear signal to the search engine that the content on a given website is user-friendly and valuable. Remember to keep your links up to date – there shouldn't be any that generate a 404 error or are outdated. An appropriate link profile is also extremely important – maintain a healthy balance between plain-URL, anchor-text, dofollow and nofollow links. Being active on social media also matters for SEO, so focus on this aspect too – try to regularly publish posts and photos with links to your site.
The most important rules of proper link-building have been described in one of our previous entries.
Domain authority and its age
The age of a domain is still considered to be one of the Google ranking factors. As a rule of thumb, older domains with a longer, positive history are ranked higher than younger ones. Here, it's vital to mention the still unconfirmed theory of the Google Sandbox, according to which new domains are stuck in the eponymous sandbox, which makes it very difficult to "enter" the Google rankings. It's believed that this practice aims to prevent situations where website owners try to artificially improve the position of a brand-new website in the search results by means of over-optimization, spam links and other unfair SEO activities. It's assumed that a website stays in the Google Sandbox for 1–6 months, depending on the actions taken and keywords used.
Later on, once the domain has been active for a long time and has many valuable external links and a positive history, it'll be appreciated by Google. Nevertheless, the age of a domain can also negatively affect SEO. A domain that was once hit by a Google filter, or one that previously belonged to the adult industry, may see its current position in the search results suffer. Thus, it's essential to check the history of a domain before buying it!
Another Google ranking factor is Domain Authority, the indicator created by Moz. It’s calculated mainly on the basis of links that lead to the website. You can find more information on this matter in our previous article: Domain Authority and its Trust Level – What does It Affect?
If you want to be ranked higher in the search results and build domain authority, focus on naturally obtained high quality links.
As a matter of fact, Google ranking factors and the various speculations about them are so numerous that it's impossible to give an exact number and discuss all of them in one entry. But why speculations? Well, Google still doesn't directly confirm the authenticity of these factors. That's why we've decided to explain and outline the areas which, in our opinion, are the most pertinent. Bear in mind that Google regularly updates even the most basic parameters. Stay up to date with industry news, analyze trends and be quick to react – all of this will certainly have a positive influence on the position of your website in the organic search results.