How do I use a different robots.txt for HTTPS requests than the one that is used for HTTP connections in IIS 7?
Thanks.
There are a few options here, depending on how custom you need this to be. The most flexible approach would be to write a custom handler, map robots.txt requests to it, and decide internally which content to serve.
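A minimal sketch of that handler-mapping approach in web.config, assuming a hypothetical MyApp.RobotsHandler class that checks whether the request came in over HTTPS and writes the appropriate robots.txt content:

<system.webServer>
  <handlers>
    <!-- MyApp.RobotsHandler is a hypothetical IHttpHandler; replace with your own type -->
    <add name="RobotsTxt" path="robots.txt" verb="GET" type="MyApp.RobotsHandler" resourceType="Unspecified" />
  </handlers>
</system.webServer>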
However, for most needs, try the URL Rewrite module: http://www.iis.net/download/urlrewrite
Off the top of my head (so it may not be exactly right), it's something like:
<rule name="https_robots" stopProcessing="true">
  <match url="^robots\.txt$" />
  <conditions>
    <add input="{HTTPS}" pattern="^ON$" ignoreCase="true" />
  </conditions>
  <action type="Redirect" redirectType="Found" url="https://{HTTP_HOST}/robots-https.txt" />
</rule>
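For completeness (an untested sketch): URL Rewrite rules like the one above go inside the rewrite section of the site's web.config:

<system.webServer>
  <rewrite>
    <rules>
      <!-- rule from above goes here -->
    </rules>
  </rewrite>
</system.webServer>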
I had exactly the same problem:
I am setting up the https version now and want to look at it and debug it without a rush, meanwhile fending off Google from crawling https while keeping http "business as usual". Here's the piece of code I used in the config file:
<rule name="https_robots" stopProcessing="true">
  <match url="^robots\.txt$" />
  <conditions>
    <add input="{HTTPS}" pattern="^ON$" />
  </conditions>
  <action type="Rewrite" url="robots_https.txt" />
</rule>
A few notes:
Basically, I've created a file "robots_https.txt" containing a "disallow all" instruction (see the example after these notes).
I used Rewrite, not Redirect. I'm not sure a crawler would actually follow the redirect; I really doubt it. With Rewrite you can't go wrong.
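For reference, a "disallow all" robots file (standard robots.txt syntax) is just:

User-agent: *
Disallow: /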
Hope that helps!