Stop AI crawlers

* [[mediawikiwiki:Extension:StopForumSpam|StopForumSpam extension]] - as the name suggests, suitable for preventing write access (not reads/views).
* [[mediawikiwiki:Extension:AbuseFilter|AbuseFilter extension]] - suitable for setting rules about content editing, such as preventing links to specific domains, but not for limiting traffic.
* [[mediawikiwiki:Extension:CrawlerProtection|CrawlerProtection extension]] - by MyWikis' Jeffrey Wang. Currently has [https://github.com/mywikis/CrawlerProtection/pull/10 a PR by vedmaka] that would allow denying anonymous access to a configurable list of special pages. I submitted [https://github.com/mywikis/CrawlerProtection/pull/12 another PR] incorporating the suggestions therein.


== Problematic pages in MediaWiki ==
# WithoutInterwiki

Assuming '''CrawlerProtection''' is updated with [https://github.com/mywikis/CrawlerProtection/pull/12 my PR], you can just configure that extension with the list above.
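As a rough sketch of what that could look like in <code>LocalSettings.php</code>, assuming the PR exposes a setting along the lines of <code>$wgCrawlerProtectionSpecialPages</code> (a hypothetical variable name; check the merged PR for the actual one):

<syntaxhighlight lang="php">
// LocalSettings.php: hypothetical sketch, assuming
// https://github.com/mywikis/CrawlerProtection/pull/12 is merged.
wfLoadExtension( 'CrawlerProtection' );

// Deny anonymous (logged-out) access to the special pages listed above.
// The variable name below is a guess, not the extension's documented API.
$wgCrawlerProtectionSpecialPages = [
	'WithoutInterwiki',
	// ...plus the rest of the list above
];
</syntaxhighlight>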
{{Collapsible
|visible_text=You can achieve this restriction using the [[mediawikiwiki:Extension:Lockdown|Lockdown extension]] with this configuration: