The site is based on MVC 3 + Razor, and right now it has no DNS name, just a public IP. Because we don't understand whether or how Google crawls IP-only sites, we're getting a headache: we cannot find any search results for our public IP in Google. Someone insists this is because of MVC 3, which cannot be indexed by mainstream search engines. Frankly, that sounds like a big joke to me; how could Google handle AJAX sites but fail to crawl MVC web sites? I cannot believe this.
Now I want to convince the team that MVC 3 has nothing to do with the issue we're seeing, and to resolve it in a proper way.
I also read that if we let Google index the public-IP site now, and later point a domain name at it, Google will treat the same content under the domain as a duplicate and rank it lower than the site it crawled first, the public IP.
BTW, I tried Microsoft Search Server against our MVC site, and it crawls it without problems.
Are there any articles that can help us to:
- make the public-IP site crawlable by Google (can robots.txt help us out?)
- convince the team that MVC 3 is not the root cause that pushed us into this situation?
Thanks.
"Someone insists that this is because of MVC 3, which cannot be indexed by main stream search engine. Frankly, that sounds a big joke to me..."
Frankly, you're right! It is a big joke! To ensure that Google crawls your web site you need to do one of two things:
- Have links to your web site (even if it is only the public IP) from other popular web sites, OR
- Tell Google to crawl your web site yourself.
Option #1 will cause your web site to be crawled eventually (at some undetermined point in time), while Option #2 will cause your web site to be scheduled for crawling in the very near future.
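For Option #2, one concrete way to tell Google about the site is to submit a sitemap through Google Webmaster Tools and reference it from robots.txt. A minimal sketch, assuming a hypothetical public IP of 203.0.113.10 (replace it with your own address):

```
# robots.txt served from http://203.0.113.10/robots.txt
# An empty Disallow allows all crawlers to fetch everything,
# and the Sitemap line points them at the list of URLs to index.
User-agent: *
Disallow:

Sitemap: http://203.0.113.10/sitemap.xml
```

The sitemap itself is a plain XML file listing your page URLs; you can also submit its address directly in Google Webmaster Tools instead of (or in addition to) the robots.txt reference.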
MVC 3 has almost nothing to do with how your web site is crawled. Suppose you have a static HTML page inside your MVC 3 project:
<html>
<head></head>
<body>
Hello! This is my ENTIRE WEB SITE! And Google will see it the same way regardless if I have an MVC project or a static HTML page!
</body>
</html>
"I also read that if we let Google index the public-IP site now, and later point a domain name at it, Google will treat the same content under the domain as a duplicate and rank it lower than the site it crawled first, the public IP."
There is no problem with this. When the time comes, you can respond with HTTP 301 Moved Permanently for every request that addresses the IP. Set the Location response header to the new address, and Google will recognize that you've moved the site. Google will then transfer the collected data to the new location (although it may adjust a few things along the way). This is the approach Google itself recommends: http://www.google.com/support/webmasters/bin/answer.py?answer=83105
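As a sketch of that 301 setup on IIS 7+, assuming the URL Rewrite module is installed and using the hypothetical IP 203.0.113.10 and domain example.com (substitute your real values):

```
<!-- web.config fragment: 301-redirect requests addressed to the raw IP -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Redirect IP to domain" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <!-- Match only requests whose Host header is the bare IP -->
          <add input="{HTTP_HOST}" pattern="^203\.0\.113\.10$" />
        </conditions>
        <!-- "Permanent" issues a 301; the Location header is set
             automatically from the action url -->
        <action type="Redirect" url="http://example.com/{R:1}"
                redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

With this rule in place, every URL under the IP maps one-to-one to the same path under the domain, which is exactly the signal Google needs to carry the indexed data over.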
Hi, I just want to give some input from my own experience. When dealing with Google's crawler (the bot fetch feature in Google Webmaster Tools), make sure that the entry points of the site (the index/default page, or in MVC the Home controller, and even Global.asax and any base controller) never return null values. If they do, the Googlebot will report errors instead of fetching the pages.
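To illustrate that point, here is a minimal MVC 3 controller sketch (the controller and view names are hypothetical); the key is that an action should always return a concrete ActionResult, never null:

```
using System.Web.Mvc;

public class HomeController : Controller
{
    // Bad: returning null gives the crawler an empty response or an
    // error instead of a page.
    //
    // public ActionResult Index() { return null; }

    // Good: always return a concrete ActionResult.
    public ActionResult Index()
    {
        return View();  // renders Views/Home/Index.cshtml
    }
}
```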