Crawler

What Is a Crawler? – Definition

A crawler (also known as a spider or spiderbot) is an Internet bot used by search engines. Its function is to collect information about a website's architecture and content so that the search engine can index it and, if the content is deemed valuable, display it to users in the search results.
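
To make the definition more concrete, here is a minimal sketch of how a crawler works, written in Python with only the standard library: it fetches a page, extracts the links it finds, and queues them for further visits. The seed URL and page limit are placeholders, and a real crawler such as Googlebot additionally respects robots.txt, crawl rates, and many other signals that this sketch leaves out.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href values of all <a> tags found on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl starting from seed_url; returns the set of pages visited."""
    queue = deque([seed_url])
    visited = set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that cannot be fetched
        visited.add(url)
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            # Resolve relative links and drop #fragments before queueing.
            absolute, _ = urldefrag(urljoin(url, href))
            if absolute.startswith("http") and absolute not in visited:
                queue.append(absolute)
    return visited

if __name__ == "__main__":
    print(crawl("https://example.com", max_pages=5))  # placeholder seed URL
```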

What are Web Crawlers for?

The main functions of crawlers are:

  • to validate the site's code,
  • to review the website's content,
  • to collect additional information about the site that is relevant to users,
  • to check the Internet for content updates (a sketch of such a check follows this list).
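
The last item, checking for content updates, is often handled with HTTP conditional requests: the crawler remembers the ETag and Last-Modified headers from its previous visit and asks the server whether anything has changed since then. A minimal Python sketch follows; the function name has_changed and its parameters are illustrative, not part of any particular crawler's API.

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def has_changed(url, last_etag=None, last_modified=None):
    """Conditional GET: ask the server whether the page changed since the last crawl."""
    request = Request(url)
    if last_etag:
        request.add_header("If-None-Match", last_etag)
    if last_modified:
        request.add_header("If-Modified-Since", last_modified)
    try:
        with urlopen(request, timeout=5) as response:
            # 200 OK: the content is new or changed; keep the validators for next time.
            return True, response.headers.get("ETag"), response.headers.get("Last-Modified")
    except HTTPError as err:
        if err.code == 304:  # 304 Not Modified: nothing new to index
            return False, last_etag, last_modified
        raise
```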

Googlebot and Indexing

The best-known crawler is Googlebot. It is also worth noting that crawling a page doesn't always mean that it will be indexed and, consequently, displayed in the search results.
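
The distinction between crawling and indexing shows up in two separate mechanisms: robots.txt controls whether a bot may fetch a page at all, while a robots meta tag (or an X-Robots-Tag header) can tell the search engine not to index a page it has already crawled. The sketch below, again in Python with only the standard library, illustrates both checks; the user-agent string and the regex-based meta-tag lookup are simplifications, and real indexing decisions depend on far more signals.

```python
import re
from urllib.parse import urlparse
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser

def can_crawl(url, user_agent="Googlebot"):
    """Check robots.txt: is the crawler allowed to fetch this URL at all?"""
    parts = urlparse(url)
    robots = RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    robots.read()
    return robots.can_fetch(user_agent, url)

def allows_indexing(html):
    """A crawled page can still opt out of the index with
    <meta name="robots" content="noindex"> (simplified attribute-order check)."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is None

if __name__ == "__main__":
    url = "https://example.com/"  # placeholder URL
    if can_crawl(url):
        with urlopen(url, timeout=5) as response:
            page = response.read().decode("utf-8", errors="replace")
        print("indexable" if allows_indexing(page) else "crawled, but excluded from the index")
```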

