Search engines use "crawler robots" to index your web pages and their content. In some cases, the requests from these robots can be overwhelming for your website. Depending upon how your website is built/optimised and the resources of the hosting plan you've purchased, a flood of requests could cause the website to temporarily exceed its available CPU/memory resources, or could cause the website to slow down significantly for normal web visitors.

Many search engines, including Google, Bing and Yandex, support the Robots Exclusion Protocol (REP), which was standardised as RFC 9309 in 2022. One part of this protocol is that crawler robots will look for a file named robots.txt and follow any instructions within that file. Any website owner or web developer can easily create the file and place it in the main website directory (usually public_html). More detailed information about the robots.txt file can be found at The Web Robots Pages (robotstxt.org).
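
For example, a minimal robots.txt consists of one or more groups, each starting with a User-agent line followed by the rules for that crawler (the /private/ path below is purely illustrative):

```
# Rules for all crawlers
User-agent: *
# Keep crawlers out of an (illustrative) private area
Disallow: /private/
```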

In the robots.txt you can add one simple rule to instruct all crawler robots to slow down. Note that a Crawl-delay line must sit inside a group that begins with a User-agent line:

User-agent: *
Crawl-delay: 1

Important: Not all bots support or respect the Crawl-delay directive. Support varies by crawler:

  • Some bots ignore Crawl-delay entirely.

  • Some bots require crawl-rate changes to be configured in their console (e.g., Google Search Console), rather than (or in addition to) robots.txt.

  • Some bots follow robots.txt only partially, and malicious/spam crawlers may ignore it completely.

If a crawler is causing performance issues and does not respect robots.txt, you will need to mitigate it at the server/CDN level (for example by blocking the user agent or IP range, applying rate-limits, or using a WAF). Checking your access logs (or AWStats) is the quickest way to confirm whether robots.txt directives are being followed.
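
To check your access logs for heavy crawler traffic, you can count requests per bot user agent. The sketch below uses hypothetical inline log lines in the common "combined" format; in practice you would read your real access log file (its location depends on your hosting setup, e.g. ~/access-logs/ on many cPanel hosts):

```python
import re
from collections import Counter

# Hypothetical sample lines in Combined Log Format; replace with lines
# read from your real access log.
log_lines = [
    '1.2.3.4 - - [01/Jan/2026:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
    '5.6.7.8 - - [01/Jan/2026:00:00:02 +0000] "GET /a HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)"',
    '5.6.7.8 - - [01/Jan/2026:00:00:03 +0000] "GET /b HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)"',
]

# In the combined format, the user agent is the last quoted field.
ua_pattern = re.compile(r'"([^"]*)"$')

bots = Counter()
for line in log_lines:
    m = ua_pattern.search(line)
    if m and re.search(r'bot|crawl|spider', m.group(1), re.I):
        bots[m.group(1)] += 1

# Print the busiest crawlers first.
for ua, count in bots.most_common():
    print(count, ua)
```

If a bot still appears at a high request rate after you have added a Crawl-delay for it, that is good evidence it is not honouring your robots.txt.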

Google Bot

The Googlebot ignores the "Crawl-delay" directive. Instead, it's recommended to register your website with Google Search Console, where you can adjust the crawl rate and other settings.

For more information, please read the official documentation: Change Googlebot crawl rate - Search Console Help

Bing Bot

You can instruct the Bing bot to crawl your website more slowly as follows:

User-agent: bingbot
Crawl-delay: 1

The Bing crawling robot accepts values of 1 (slow), 5 (very slow) and 10 (extremely slow). The value represents a delay in seconds: it tells Bingbot to fetch at most one page within each window of that many seconds, not a number of pages per second.

Alternatively, a website owner can register for Bing's Webmaster Tools and manage their website's crawl rate, here: Bing Webmaster Tools

Yandex

This search engine can crawl websites quite aggressively and is often responsible for causing website downtime. Thankfully, you can set a delay value in seconds, so that it will pause (for example) 2, 4 or 8 seconds between each request:

User-agent: Yandex
Crawl-delay: 4

Most Yandex users are in Russia, so if your website does not have an audience in Russia, you could consider blocking the robot altogether, like this:

User-agent: Yandex
Disallow: /
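
One way to sanity-check a robots.txt before uploading it is Python's standard-library parser. The inline robots.txt body below is an example in the spirit of the snippets above (the /private/ path is illustrative); note the parser models only the standard directives, and individual crawlers may interpret them differently:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt body: a Yandex-specific group plus a catch-all group.
robots_txt = """\
User-agent: Yandex
Crawl-delay: 4
Disallow: /private/

User-agent: *
Crawl-delay: 1
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Delay a compliant Yandex crawler would apply.
print(rp.crawl_delay("Yandex"))

# Whether Yandex may fetch a path under the disallowed prefix.
print(rp.can_fetch("Yandex", "/private/page"))

# Other bots fall through to the catch-all group, which has no Disallow.
print(rp.can_fetch("SomeOtherBot", "/private/page"))
```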

Meta crawlers (meta-externalagent / meta-webindexer)

Some websites may see unusually aggressive crawling from Meta-related user agents such as meta-externalagent and meta-webindexer. If these crawlers are consuming resources or bandwidth, you can try limiting them via robots.txt.

Note: The robots.txt file is an advisory standard. Well-behaved crawlers will follow it, but malicious or poorly behaved bots may ignore it. If the bot ignores robots.txt, you may need to block it at the web server level (see the .htaccess example below).

Option A: Slow down the crawl rate (if the bot respects Crawl-delay)

User-agent: meta-externalagent
Crawl-delay: 10

User-agent: meta-webindexer
Crawl-delay: 10

Option B: Block these user agents completely in robots.txt

User-agent: meta-externalagent
Disallow: /

User-agent: meta-webindexer
Disallow: /

Option C: Block at the server level (.htaccess)

If you need an immediate server-side block, you can add the following rules to your .htaccess file (usually in public_html):

<IfModule mod_rewrite.c>
RewriteEngine On
# Match either Meta crawler in the User-Agent header (case-insensitive)
RewriteCond %{HTTP_USER_AGENT} (meta-externalagent|meta-webindexer) [NC]
# Respond with 403 Forbidden and stop processing further rules
RewriteRule ^.* - [F,L]
</IfModule>

DISCLAIMER: The scripts provided in our knowledgebase are for informational purposes only. We do not provide any warranty or support. It is essential to review and modify the scripts to fit your site's specific needs. There may be unforeseen outcomes when adding new script code to an existing website. You should discuss it with your website manager and seek advice from an experienced website developer if you are unsure.

Updated by SP on 23/02/2026
