
Google bots crawl rate


Hello. According to Google, we have reached the maximum crawl rate on our servers, and the excessive crawling is affecting the performance of our websites. Is there a way to limit the crawl rate? Ideally without causing Google to remove products from AdWords campaigns because it cannot reach the product landing pages.

asked 2 years ago · 276 views
2 Answers
EXPERT
answered 2 years ago

You can use robots.txt.

Here is a sample file for you:

User-agent: *
Disallow: /private/
Disallow: /temp/
Disallow: /test/

# Note: Googlebot does not honor the Crawl-delay directive; Google manages
# its own crawl rate. Other crawlers, such as Bingbot, do respect it.
User-agent: Googlebot
Allow: /
Disallow: /example-subdirectory/
Crawl-delay: 10

User-agent: Bingbot
Allow: /
Disallow: /example-subdirectory/
Crawl-delay: 10

# Sitemap link
Sitemap: https://www.yoursite.com/sitemap.xml
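Before deploying a robots.txt like the one above, it can help to sanity-check that the rules block what you intend and keep the product landing pages reachable. Here is a minimal sketch using Python's standard-library `urllib.robotparser`; the domain and paths are placeholders, not from the original question. One caveat: Python's parser applies rules in file order (first match wins), while Google applies the most specific rule, so keep more specific `Disallow` lines before a broad `Allow: /` when testing this way.

```python
# Sanity-check robots.txt rules with the standard library.
# Domain and paths are hypothetical examples.
from urllib import robotparser

SAMPLE_ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /example-subdirectory/
Crawl-delay: 10
"""

rp = robotparser.RobotFileParser()
rp.parse(SAMPLE_ROBOTS_TXT.splitlines())

# Product landing pages remain fetchable for Googlebot:
print(rp.can_fetch("Googlebot", "https://www.yoursite.com/products/item-1"))         # True
# The disallowed subdirectory is blocked:
print(rp.can_fetch("Googlebot", "https://www.yoursite.com/example-subdirectory/x"))  # False
# The declared crawl delay for this user agent:
print(rp.crawl_delay("Googlebot"))                                                   # 10
```

If the first check ever prints False, a rule is blocking your landing pages, which is exactly the situation that would get products dropped from ad campaigns.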

answered 2 years ago

