Any robots.txt revisions applied inside VS2010 save the file with a BOM, which causes Google to reject it with a 'Syntax not understood' error. There is a related question on this, but the "Save With Encoding" option isn't available for text files; even if it were, there should be a solution that just works with Ctrl+S rather than having to go the advanced route just to keep the BOM out.
I can't believe I'm the only person experiencing this problem; surely there's a solution?
Looks like an easy fix for this is to have a blank line or a comment as the first line of the file: http://www.12titans.net/p/robots-txt-bom.aspx
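For illustration, here is a minimal sketch (outside the IDE) that combines both ideas: it strips a UTF-8 BOM if one is present and prepends a comment as the first line, per the workaround linked above. The file path and the comment text are assumptions, not anything VS2010 or Google requires.

```python
# Sketch: detect and strip a UTF-8 BOM from robots.txt, and add a comment as
# the first line as an extra safeguard. Path and comment are placeholders.
import codecs

path = "robots.txt"  # assumed location; adjust for your project layout

with open(path, "rb") as f:
    data = f.read()

if data.startswith(codecs.BOM_UTF8):
    # Drop the three BOM bytes (EF BB BF) and prepend a harmless comment line.
    data = b"# robots.txt\r\n" + data[len(codecs.BOM_UTF8):]
    with open(path, "wb") as f:
        f.write(data)
```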
VS adds the BOM to UTF-8 files, but why should your robots.txt file be encoded as UTF-8 in the first place?
Since it contains URLs, it should be plain ASCII or ISO-8859-1. If your site URLs contain non-ASCII characters, you have to URL-encode them appropriately anyway. Saving the file as plain ANSI avoids the BOM hassle altogether.
Also see robots.txt; What encoding?
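As a quick sketch of the URL-encoding point above (the path is hypothetical), a non-ASCII path can be percent-encoded before it goes into a Disallow rule, so the file itself stays pure ASCII:

```python
# Sketch: percent-encode a non-ASCII path so the robots.txt line is ASCII-only.
# The path below is a made-up example.
from urllib.parse import quote

path = "/café/menu"              # hypothetical non-ASCII URL path
encoded = quote(path, safe="/")  # keep the path separators intact

print("Disallow: " + encoded)    # -> Disallow: /caf%C3%A9/menu
```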