
Friday, January 20, 2017

Crawl Rate

When Googlebot crawls a site, there is a limit on the number of simultaneous connections it can use and a set length of time it must wait between fetches. This is called the "crawl rate limit", and every site's limit is unique.

The crawl rate limit is determined by two factors.
The first is crawl health: if the site responds quickly, Googlebot can use more connections. If the site begins to slow down because of the crawling, Googlebot will use fewer connections so it doesn't degrade the user experience.

The second is the limit set in Search Console: site owners can manually set a crawl rate limit in the Site Settings section.
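To make the two factors more concrete, here is a minimal sketch of how a polite crawler might combine them: it adapts its connection count and delay based on how the server responds (crawl health), while always respecting a manually configured cap (analogous to the Search Console setting). The class name, thresholds, and back-off logic are illustrative assumptions, not Googlebot's actual implementation.

```python
import time


class CrawlRateLimiter:
    """Hypothetical adaptive crawl-rate limiter (illustrative only)."""

    def __init__(self, max_connections=10, min_connections=1, manual_cap=None):
        # manual_cap mimics a site owner lowering the rate via a settings page.
        self.max_connections = max_connections
        self.min_connections = min_connections
        self.manual_cap = manual_cap
        self.connections = min_connections
        self.delay = 1.0  # seconds to wait between fetches

    def record_response(self, response_time, ok):
        """Adjust the limit based on 'crawl health' after each fetch."""
        if ok and response_time < 0.5:
            # Site is healthy and fast: allow more parallel connections.
            self.connections = min(self.connections + 1, self.max_connections)
            self.delay = max(self.delay * 0.9, 0.1)
        else:
            # Site is slow or returning errors: back off so crawling
            # doesn't degrade the experience for real users.
            self.connections = max(self.connections - 1, self.min_connections)
            self.delay = min(self.delay * 2.0, 30.0)

    def effective_connections(self):
        """The manual cap always wins, like a limit set by the site owner."""
        if self.manual_cap is not None:
            return min(self.connections, self.manual_cap)
        return self.connections

    def wait(self):
        """Pause between fetches according to the current delay."""
        time.sleep(self.delay)
```

In use, the crawler would call record_response after each fetch and check effective_connections before opening new ones, so a fast, healthy site is crawled more aggressively while a struggling one is automatically given breathing room.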
