Edit: I learned that my error was unrelated to the robots file. Disregard.
I just learned the hard way that Google blocks access to the Maps API if you have a restrictive robots.txt file. I recently created a robots file with "Disallow: /". Now my site can no longer use Maps. Rats.
I removed the robots file, but I still cannot use Maps. I also tried creating a completely permissive file ("Disallow: " with an empty value), and that has not solved the issue either.
Can anyone tell me the next step? If at all possible, I'd prefer that the site not show up in Google, since it's a staging site. But I also don't know how long it will be before Google rescans for a new robots file.
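For reference, here is what the two files I tried look like (the directive is spelled "Disallow", with one "s"; a misspelled directive is simply ignored by crawlers):

```
# Blocks all compliant crawlers from the entire site
User-agent: *
Disallow: /

# Fully permissive: an empty Disallow value allows everything
User-agent: *
Disallow:
```

Note that robots.txt only controls crawling, not what tools like the Maps API will accept, so this is shown just to rule out a syntax problem.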
I don't think this is your problem. I'm successfully running Google maps on an internal development server which Google can't crawl.
Are you getting an error message?
As for rescanning the robots.txt file, you can use Google Webmaster Tools to request a rescan.