2 Answers
Hi, the official documentation on exactly this topic is here: https://developers.google.com/search/docs/crawling-indexing/reduce-crawl-rate
You can use robots.txt. Here is a sample file:
User-agent: *
Disallow: /private/
Disallow: /temp/
Disallow: /test/

# Note: Googlebot ignores the Crawl-delay directive.
# To slow Google's crawling, use the method in the doc linked above.
User-agent: Googlebot
Allow: /
Disallow: /example-subdirectory/

User-agent: Bingbot
Allow: /
Disallow: /example-subdirectory/
Crawl-delay: 10

# Sitemap link
Sitemap: https://www.yoursite.com/sitemap.xml
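If you want to sanity-check your rules before deploying, Python's standard-library `urllib.robotparser` can parse a robots.txt string and answer allow/deny and crawl-delay questions. A minimal sketch (the bot names and paths below are just the sample's placeholders, not real site paths):

```python
from urllib import robotparser

# A trimmed copy of the sample robots.txt above
SAMPLE = """\
User-agent: *
Disallow: /private/
Disallow: /temp/
Disallow: /test/

User-agent: Bingbot
Disallow: /example-subdirectory/
Crawl-delay: 10
"""

rp = robotparser.RobotFileParser()
rp.parse(SAMPLE.splitlines())

# Bingbot is blocked from the subdirectory but allowed elsewhere,
# and its declared crawl delay is readable:
print(rp.can_fetch("Bingbot", "/example-subdirectory/page.html"))  # False
print(rp.can_fetch("Bingbot", "/index.html"))                      # True
print(rp.crawl_delay("Bingbot"))                                   # 10

# Any other bot falls through to the * rules:
print(rp.can_fetch("SomeOtherBot", "/private/data.html"))          # False
```

Keep in mind this only checks what the file says; whether a given crawler obeys `Crawl-delay` is up to that crawler.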
answered 2 years ago