Sogou web spider still hits our website even after blocking it

Our website was getting many hits from "Sogou web spider", so we thought of blocking it using .htaccess rules. We created the rules below:

RewriteCond %{HTTP_USER_AGENT} Sogou [NC]
RewriteRule ^.*$ - [L]

However, we are still getting hits from Sogou. I would like to know what changes I should make to this rule to block Sogou.

Thank you,

Recommended answer

As @faa mentioned, you're not actually blocking anything: a RewriteRule with "-" as the substitution and only the [L] flag leaves the request untouched and merely stops further rule processing, so Sogou's requests are still served normally. Use a rule that returns an error status instead:

RewriteEngine On
# Match any request whose User-Agent header contains "Sogou" (case-insensitive)
RewriteCond %{HTTP_USER_AGENT} Sogou [NC]
# Answer with 403 Forbidden; for non-3xx status codes mod_rewrite ignores
# the substitution string, so "-" (no substitution) is used as the target
RewriteRule ^.*$ - [R=403]

Make sure you've got RewriteEngine On and the [R=403].
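
To verify the block, you can spoof the spider's User-Agent from the command line and check for a 403 response. A minimal sketch using curl, where https://example.com/ is a placeholder for your own domain and the User-Agent string is only an approximation of what Sogou actually sends:

# -A sets the User-Agent header; -I sends a HEAD request so only
# the response headers are printed (expect "403 Forbidden" here)
curl -A "Sogou web spider/4.0" -I https://example.com/

# An ordinary browser User-Agent should still get 200 OK
curl -A "Mozilla/5.0" -I https://example.com/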

You may still see hits from them in your access logs, but with the combination of not sending any data and a 403 Forbidden header, you should see the hits die off eventually. Even if they continue to crawl your site, it should no longer generate much extra load on your server.
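
As a complementary measure, you could also ask the crawler to stay away via robots.txt. This is only a sketch under the assumption that Sogou's spider honors robots.txt (which is not guaranteed) and that "Sogou web spider" is the token it matches on, taken from the User-Agent seen in the logs:

# robots.txt at the site root: ask this crawler to skip the whole site
User-agent: Sogou web spider
Disallow: /

Unlike the 403 rule, robots.txt relies on the crawler's cooperation, so keep the mod_rewrite block in place as the actual enforcement.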
