Rule file under Linux: .htaccess (manually create a .htaccess file in the site root directory)
<IfModule mod_rewrite.c>
RewriteEngine On
#Block spider
RewriteCond %{HTTP_USER_AGENT} "SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]
RewriteRule !(^robots\.txt$) - [F]
</IfModule>
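Once the file is saved, any request whose User-Agent matches a name in the list receives a 403 Forbidden response (the [F] flag), while robots.txt stays reachable because the RewriteRule excludes it. Note that curl, Wget, Python, and perl are themselves on the list, so command-line test requests will also be blocked unless you supply a browser-like User-Agent.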
Rule file under Windows Server 2008, 2012, or later: web.config (manually create a web.config file in the site root directory)
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Block spider">
          <match url="(^robots\.txt$)" ignoreCase="false" negate="true" />
          <conditions>
            <add input="{HTTP_USER_AGENT}" pattern="SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" ignoreCase="true" />
          </conditions>
          <action type="AbortRequest" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
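Note that the <rewrite> section is handled by the IIS URL Rewrite Module, which must be installed for this web.config to load; also, action type="AbortRequest" drops the HTTP connection outright rather than returning an error page.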
Note: the line containing "{HTTP_USER_AGENT}" lists the names of the unknown spiders to block; add further names as needed, separated by "|".
The rules above block a default set of unknown spiders; to block others, add their names following the same pattern (see the sketch after the list below). For reference, the names of the major spiders:
Google spider: googlebot
Baidu spider: baiduspider
Baidu mobile spider: baiduboxapp
Yahoo spider: slurp
Alexa spider: ia_archiver
MSN spider: msnbot
Bing spider: bingbot
AltaVista spider: scooter
Lycos spider: lycos_spider_(t-rex)
AllTheWeb spider: fast-webcrawler
Inktomi spider: slurp
Youdao spider: YodaoBot and OutfoxBot
热土 spider: Adminrtspider
Sogou spider: sogou spider
SOSO spider: sosospider
360 Search spider: 360spider
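For example, to also block Youdao's spiders, append their User-Agent names to the alternation with "|" separators. A minimal sketch of the Apache variant (the existing name list is shortened here for illustration; YodaoBot and OutfoxBot are the names being added):

<IfModule mod_rewrite.c>
RewriteEngine On
# Existing names shortened for illustration; append new ones with "|"
RewriteCond %{HTTP_USER_AGENT} "SemrushBot|MJ12bot|ZmEu|YodaoBot|OutfoxBot" [NC]
RewriteRule !(^robots\.txt$) - [F]
</IfModule>

The same "|"-separated list goes into the pattern attribute of the web.config rule.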