I set a 'deny from' in my htaccess to block certain spam bots from parsing my site. While using the code below, I noticed in my log file that I'm getting a lot of 'client denied by server configuration' and it's cluttering up the log files when the bot starts its scan. Any thoughts?
Thanks, Steve
<Files *>
order allow,deny
allow from all
deny from 123.45.67.8
</Files>
I ended up going with the following:
RewriteEngine On
# Match the bot's address exactly (dots must be escaped in the regex)
RewriteCond %{REMOTE_ADDR} ^123\.45\.67\.8$
RewriteRule .* - [F,L]
Take a look at the conditional logging described here; I think it will provide everything you need:
http://httpd.apache.org/docs/2.2/logs.html
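For the access log, conditional logging usually looks like the sketch below: mark matching requests with an environment variable via mod_setenvif, then exclude them from CustomLog. (The IP and log path are examples; note this filters the access log, not the "client denied" messages in the error log.)

```apache
# Tag requests from the blocked address with a "dontlog" marker
SetEnvIf Remote_Addr "123\.45\.67\.8" dontlog

# Write the access log only for requests that lack the marker
CustomLog /var/log/apache2/access.log combined env=!dontlog
```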
Also, if you can identify that the various bots always come from specific IP addresses, you can block them in your hosts.allow/hosts.deny files by IP address, or automatically with something like BlockHosts or possibly mod_evasive. That way Apache never sees the requests, so it has nothing to log.
-sean
UPDATE: Are you identifying the IP addresses manually and then adding them to your htaccess? That sounds painful. If you really want to do it that way, I would suggest blocking the IP addresses at the firewall with a drop rule, or, as above, in hosts.allow/hosts.deny.
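A firewall drop rule along these lines keeps the traffic from ever reaching Apache (a sketch using iptables; the address is the example IP from the question, and the command needs root):

```shell
# Silently drop all packets from the offending address
iptables -A INPUT -s 123.45.67.8 -j DROP
```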
SPURIOUS BROKEN RECORD UPDATE: Take a look at BlockHosts; it can block IP addresses based on their 'behavior' and will eliminate the need for you to manually prune them out every day.
You can have Apache send the log output to a program (i.e., a script) instead of a file. Perhaps implement a script that just emits a periodic summary and writes the rest to a log file?
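Piped logging is configured by giving CustomLog a pipe instead of a filename (a sketch; the script path here is hypothetical):

```apache
# Start the script at server startup and pipe each access-log line to its stdin
CustomLog "|/usr/local/bin/logsummary" combined
```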