I have a situation where I have to remove my robots.txt file because I don't want any robot crawlers to get the links.
I also want the links to remain accessible to users, and I don't want them to be cached by the search engines.
I also cannot add any user authentication, for various reasons.
So I am thinking about using mod_rewrite to stop search engine crawlers from crawling the links while still allowing everyone else to access them.
The logic I am trying to implement is: write a condition to check whether the incoming user agent is a search engine and, if so, redirect them to a 401.
The only problem is that I don't know how to implement it. :(
Can somebody help me with it?
Thanks in advance.
Regards,
I may be misunderstanding you, but I think
User-agent: *
Disallow: /
in robots.txt will do just what you want: keep every crawler out while leaving the website open to normal users.
Or do you specifically need to remove robots.txt from the web server (and if so, for what reason)?
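If robots.txt really isn't an option, the mod_rewrite logic described in the question could be sketched roughly like this in an .htaccess file. The user-agent list here is illustrative, not exhaustive, and a 403 Forbidden is returned instead of a 401, since a 401 response is expected to carry a WWW-Authenticate challenge:

```apache
# Sketch only: deny requests whose User-Agent matches common crawlers.
# Bot names below are examples; extend the list as needed.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|slurp|duckduckbot|baiduspider) [NC]
# [F] sends 403 Forbidden; [L] stops further rewrite processing.
RewriteRule ^ - [F,L]
```

Keep in mind this only deters well-behaved crawlers that send honest User-Agent strings; anything can spoof a browser UA. To also prevent caching by engines that do fetch the pages, you could additionally send an `X-Robots-Tag: noindex` response header.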