Help:Reduce server load with a robots.txt file
Semantic extension(s): -/-
Further extension(s): -/-
Keyword(s): performance · robots
Description:
This tip explains how to reduce the load that crawlers put on your wiki by adding a "robots.txt" file to your wiki's document root. Note that this example assumes you are using short URLs, set up with the following configuration settings in your wiki's "LocalSettings.php" file:
$wgLanguageCode = 'en';
$wgScriptPath = '/w';
$wgArticlePath = "/wiki/$1";
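With these settings, regular page views and MediaWiki's script entry points live under two separate path prefixes, which is what the rules below rely on. For illustration (the host name is a placeholder):
https://example.org/wiki/Main_Page (page view via $wgArticlePath)
https://example.org/w/index.php?title=Main_Page&action=history (script URL via $wgScriptPath)
https://example.org/w/load.php?modules=site.styles (ResourceLoader request)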
These are the contents of the "robots.txt" file:
#
# robots.txt
#
# Block the "wget" download tool completely
User-agent: wget
Disallow: /
#
# All other crawlers: allow ResourceLoader requests ("load.php"), but keep
# them out of the script directory and out of the MediaWiki: and Special:
# namespaces (listed with both the plain and the URL-encoded colon)
User-agent: *
Allow: /w/load.php?
Disallow: /w/
Disallow: /wiki/MediaWiki:
Disallow: /wiki/MediaWiki%3A
Disallow: /wiki/Special:
Disallow: /wiki/Special%3A
#
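Once the file is in place in the document root, you can sanity-check the rules with Python's standard "urllib.robotparser" module. This is only a quick illustration, not part of the wiki setup; "example.org" is a placeholder for your wiki's host name:
from urllib.robotparser import RobotFileParser

# Load the robots.txt that is served from the wiki's document root
rp = RobotFileParser("https://example.org/robots.txt")
rp.read()

# Special pages should be off limits for generic crawlers
print(rp.can_fetch("*", "https://example.org/wiki/Special:RecentChanges"))       # False

# ResourceLoader requests remain allowed
print(rp.can_fetch("*", "https://example.org/w/load.php?modules=site.styles"))   # True

# wget is blocked from everything
print(rp.can_fetch("wget", "https://example.org/wiki/Main_Page"))                # False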
See also
- Help page on robots.txt on MediaWiki.org