What Is a Crawler? – Definition

A crawler (also known as a spider or spiderbot) is an Internet bot used by search engines. Its function is to collect information about a website's architecture and content in order to index it and, if it's deemed valuable, display it to users in the search results.

What are Web Crawlers for?

The main functions of crawlers are:

  • to validate the site code,
  • to review the website’s content,
  • to collect additional information about the site relevant to users,
  • to check for content updates available on the Internet.

The Best-Known Crawler

The most recognized crawler is Googlebot. It is also worth noting that crawling a page doesn't always mean it will be indexed and therefore displayed in the search results.
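One reason a URL may never even be crawled is the site's robots.txt file, which well-behaved bots such as Googlebot consult before fetching anything. A minimal check is possible with Python's standard `urllib.robotparser`; the `rules` string and `example.com` URLs below are hypothetical, since a real crawler would download the file from the site's `/robots.txt`.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
rules = """
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

allowed = parser.can_fetch("Googlebot", "https://example.com/blog/")   # True
blocked = parser.can_fetch("Googlebot", "https://example.com/admin/")  # False
```

Note that robots.txt only governs crawling; a page that is crawled can still be kept out of the index, for example via a `noindex` robots meta tag, which is why crawling and indexing remain separate steps.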

Delante is an online marketing agency specializing in generating website traffic from search engines and paid ads (SEO/SEM). Over 80% of our clients come from the rapidly growing e-commerce industry.


Daily SEO Checklist - 12 Tasks to Perform on a Daily Basis

To be successful, SEO activities have to be performed regularly. By keeping your finger on the pulse and refining your strategy every day, you can gain a competitive edge over your market rivals and sleep soundly knowing that your website complies with Google's requirements. Want to discover our daily SEO checklist? Keep reading!
