I'm building a website with an image gallery that loads images via AJAX, and I have a system for navigating through photos using the arrow keys and so on. To make the URLs easy to share, I change the hash in the address bar, and on the page I check the hash with JavaScript and redirect to the appropriate location if needed (much like Facebook does). The system works, but I can't figure out how to make it fetcher/crawler friendly. For example, a user may copy the address http://mysite.com/photos#photo/123, where 123 is the photo ID. A normal browser WILL redirect to http://mysite.com/photo/123 and display the page without any problem, but I want this to work when a visitor pastes the address into Facebook too (as a link on their wall, etc.). What is the best practice for doing this? Does Facebook have any "knowledge" of handling hashtags outside its own scope? I currently have no way to test it, and I doubt the crawler parses and executes JavaScript to reach the right page.
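The client-side redirect the question describes can be sketched as a small pure function plus a one-line hookup (the function name and the `#photo/<id>` URL pattern here are illustrative, taken from the example URL above, not from the actual site's code):

```javascript
// Map a location hash like "#photo/123" to the canonical path "/photo/123".
// Returns null when the hash does not match, meaning no redirect is needed.
function hashToPath(hash) {
  const m = /^#photo\/([0-9]+)$/.exec(hash);
  return m ? "/photo/" + m[1] : null;
}

// In the browser, run this on page load:
//   const target = hashToPath(window.location.hash);
//   if (target) window.location.replace(target);
```

Note that this only helps real browsers: a crawler that does not execute JavaScript never runs this code, which is exactly the problem being asked about.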
If you (or your web hosting provider) are running an Apache HTTP server, this can be accomplished with URL rewrites, either in your httpd.conf or on a per-directory basis with .htaccess files (the most common way, particularly in a shared hosting environment where you have limited control over Apache's configuration).
Try putting something like this in a .htaccess file in your base directory. (Note: this is off the top of my head; use it only as a starting point.)
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^facebookexternalhit/[0-9]+(\.[0-9]+)*
RewriteRule ^photos#photo/([0-9]+)$ /photo/$1 [L,R=301]

Be aware, though, that the fragment (everything after #) is never sent to the server, so a rule matching #photo/... cannot actually fire as written; the client has to turn such links into real URLs before a server-side rewrite can see the photo ID.
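Because the fragment never reaches the server, one workable variant is the `_escaped_fragment_` convention from Google's AJAX crawling scheme: you change your client-side URLs to the `#!photo/123` form, and crawlers that support the scheme request `/photos?_escaped_fragment_=photo/123` instead, which the server can rewrite. This is a sketch under that assumption, and whether Facebook's crawler honors the scheme would need to be verified:

```apache
RewriteEngine on
# A supporting crawler requests /photos?_escaped_fragment_=photo/123
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=photo/([0-9]+)$
# %1 is the capture from the RewriteCond; the trailing ? drops the query string
RewriteRule ^photos$ /photo/%1? [L,R=301]
```

The safest fallback, if the crawler supports neither JavaScript nor the scheme, is to make /photo/123 itself serve the full page markup so the scraper always has something to read.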