Unwanted “Crawl-delay: 10” line added to my robots.txt

I’ve noticed that when I request my robots.txt, the response includes a Crawl-delay: 10 line. I double-checked the file itself, and it doesn’t contain that line, so why does it appear? Could it be strange behaviour from some plugin?

Best Answer

If you’re using WordPress as the CMS for your site, you can bypass your hosting server’s rules by removing your physical robots.txt file and instead modifying the virtual one that WordPress generates. All it takes is a filter added to your theme’s functions.php file.

Here’s the code snippet:

//* Append directives to the virtual robots.txt
add_filter( 'robots_txt', 'robots_mod', 10, 2 );
function robots_mod( $output, $public ) {
    // $output holds the directives WordPress has generated so far;
    // $public is '1' when the site is visible to search engines.
    $output .= "Disallow: /wp-content/plugins/\n";
    $output .= "Sitemap: http://www.example.com/sitemap_index.xml\n";
    return $output;
}
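For reference, on a default public WordPress install the served virtual file would then look roughly like this (the exact default lines vary by WordPress version):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-content/plugins/
Sitemap: http://www.example.com/sitemap_index.xml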

All you have to do is replace the directives appended to $output with your own.
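And if the Crawl-delay line turns out to come from a plugin hooking the same robots_txt filter (rather than from the server), a sketch along these lines, hooked at a late priority, should strip it from the virtual file; the pattern and the priority of 99 here are assumptions, so adjust them as needed:

add_filter( 'robots_txt', function ( $output, $public ) {
    // Remove any Crawl-delay line another plugin may have injected
    // earlier in the filter chain (priority 99 runs after the default 10).
    return preg_replace( '/^crawl-delay:.*\n?/mi', '', $output );
}, 99, 2 );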
