Google Webmaster Tools keeps showing this version of my domain name:
mysite.co.uk/?cat=
I have disallowed every URL containing ? in robots.txt, and the URL above shows up under Crawl errors as "URL restricted by robots.txt".
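For reference, the robots.txt rule blocking these URLs is along these lines (simplified sketch; my actual file may differ slightly):

User-agent: *
# Block any URL that contains a query string (Google honours the * wildcard)
Disallow: /*?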
I simply don't understand why that is happening when I have the following in my .htaccess file:
# If the query string is exactly "cat=", redirect to the bare domain with no query string
RewriteCond %{QUERY_STRING} ^cat=$ [NC]
RewriteRule ^(.*)$ http://mysite.co.uk/? [R=301,L]
I think the above is supposed to 301 redirect mysite.co.uk/?cat= to mysite.co.uk/, and indeed if I click the first URL it actually goes to the second one in the browser.
I have a couple more similar issues with ? URLs, and I would be really happy if somebody could tell me how to properly 301 redirect them, so that Google knows they have been permanently removed and stops crawling them.
The appropriate HTTP response status code to indicate that a resource has been permanently removed would be 410 Gone:
The requested resource is no longer available at the server and no forwarding address is known. This condition is expected to be considered permanent. Clients with link editing capabilities SHOULD delete references to the Request-URI after user approval. […]
Sending this response status code will make search engines remove the requested URI from their index. The URI might still get crawled, though, if there are still links pointing to it.
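With mod_rewrite, a 410 can be sent using the G (gone) flag. A minimal sketch, assuming the same ^cat=$ query string condition used in your .htaccess above:

RewriteEngine On
# Match requests whose query string is exactly "cat="
RewriteCond %{QUERY_STRING} ^cat=$ [NC]
# "-" means no substitution; the [G] flag forces a 410 Gone response
RewriteRule ^ - [G]

Keep in mind that Googlebot can only see the 301 or the 410 if it is allowed to fetch the URL; as long as robots.txt disallows every URL with ? in it, those URLs will keep showing up as "restricted by robots.txt" instead of being dropped from the index.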