Stop AI crawlers

== Defenses in MediaWiki ==

* [[mediawikiwiki:Extension:Lockdown|Lockdown extension]] - designed primarily for other purposes in the category of "user rights", but it '''is useful''' for disallowing anonymous reads of "heavy" pages. For example, you can block whole swaths of URLs, such as an entire namespace or all special pages - although each special page must be listed individually in your configuration (see the example below this list). It is just not designed for complex filtering.
* [[mediawikiwiki:Extension:StopForumSpam|StopForumSpam extension]] - as the name suggests, suitable for preventing write access (edits), not reads/views.
* [[mediawikiwiki:Extension:AbuseFilter|AbuseFilter extension]] - suitable for setting rules about content editing, such as preventing links to specific domains (see the rule sketch below), but not for filtering traffic.
* [[mediawikiwiki:Extension:CrawlerProtection|CrawlerProtection extension]] - by MyWikis' Jeffrey Wang. There is currently [https://github.com/mywikis/CrawlerProtection/pull/10 a PR by vedmaka] that would allow denying anonymous access to a configurable list of special pages.
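The following is a minimal sketch of a Lockdown configuration for <code>LocalSettings.php</code>. The selection of special pages here is illustrative, not exhaustive: since Lockdown has no wildcard for special pages, a real setup must list every special page you want to protect by its canonical name.

<syntaxhighlight lang="php">
# LocalSettings.php - a sketch, assuming Extension:Lockdown is installed
wfLoadExtension( 'Lockdown' );

# Illustrative selection of "heavy" special pages that crawlers tend to
# hammer; extend this list to cover every special page on your wiki.
$crawlerHeavySpecialPages = [
	'Export',
	'Whatlinkshere',
	'Recentchangeslinked',
	'Log',
	'Contributions',
	'Listfiles',
	'Allpages',
];
foreach ( $crawlerHeavySpecialPages as $page ) {
	# Only logged-in users (the 'user' group) may access the page;
	# anonymous requests are denied.
	$wgSpecialPageLockdown[$page] = [ 'user' ];
}
</syntaxhighlight>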
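As a sketch of the AbuseFilter side, a rule along these lines (entered via Special:AbuseFilter, not <code>LocalSettings.php</code>) would match edits that add links to two hypothetical spam domains; whether matching edits are warned about, tagged, or disallowed is configured on the filter itself. <code>added_links</code> holds the external links added by the edit, and <code>irlike</code> is a case-insensitive regular-expression match.

<pre>
/* Hypothetical domains - substitute the ones you actually want to block. */
added_links irlike "(spam-domain-one|spam-domain-two)\.example"
</pre>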


== Problematic pages in MediaWiki ==