I want to know how to keep track of the links on my webpages that lead to external sites. Sometimes an external site simply vanishes, and you never find out that it happened.
I am thinking about making an Excel sheet that holds all the links and their locations on the website. Then, every month, I would manually go over all the links and check whether each target site still exists (which I find a pretty stupid idea :/ ).
Does any of you have a good system for maintaining external links?
Use Xenu.
Xenu's Link Sleuth (TM) checks Web sites for broken links. Link verification is done on "normal" links, images, frames, plug-ins, backgrounds, local image maps, style sheets, scripts and Java applets. It displays a continuously updated list of URLs which you can sort by different criteria. A report can be produced at any time.
If you're on Mac, have a look at Integrity.
Xenu is generally agreed to be the best link checker. Run it regularly; you could even set up a cron job (or a Windows Task Scheduler task) to run it on a schedule and email you the results.
The W3C also has a link checker at http://validator.w3.org/checklink (and you should run their CSS & HTML checkers on your entire site, too).
It depends on how your site is coded. Mine is all PHP, and I once considered checking all the links on a page each time it is loaded (or every n-th load). If you do this, you could either parse the page and check each link yourself, or keep it simple and invoke the W3C link checker and parse its output; if any errors turn up, send yourself an email.
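The parse-and-check approach above can be sketched in a few lines. This is a minimal illustration, not a production crawler: it uses a simple regex to pull out absolute `http(s)` hrefs (a real site should use an HTML parser) and a `HEAD` request to test each target; the function names and the `link-checker` user agent are made up for this example.

```python
import re
import urllib.request
from urllib.error import HTTPError, URLError

def extract_links(html):
    """Pull absolute http(s) href values out of raw HTML.

    A regex is good enough for a sketch; for real pages prefer an
    HTML parser such as html.parser or BeautifulSoup.
    """
    return re.findall(r'href=["\'](https?://[^"\']+)["\']', html)

def link_is_alive(url, timeout=10):
    """Return True if the URL answers a HEAD request without an error status."""
    req = urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": "link-checker"}
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout):
            return True
    except (HTTPError, URLError, OSError):
        return False

def broken_links(html):
    """List every external link in the page that no longer responds."""
    return [url for url in extract_links(html) if not link_is_alive(url)]
```

From here it is a small step to loop over your site's pages, collect the output of `broken_links`, and email yourself the result, exactly as described above. Note that some servers reject `HEAD` requests, so a fallback to `GET` may be needed in practice.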