Repair Broken Links
Repair broken links if possible. Delete them if not.
<a href="http://deadsite.example.com/">Dotcom, Inc.</a>
<a href="http://www.example.com/reorganized/site">Learn More</a>
<a href="http://www.example.com/new/location">Learn More</a>
Dead links annoy users and waste their time. In the worst case, they can be offensive. Many unscrupulous
spammers make a habit of buying up the abandoned domain names of failed companies and replacing their
pages with ads for subprime mortgages, get-rich-quick schemes, and outright pornography. Many web sites
point to porn without even knowing it.
Dead links also hurt search engine placement, both for your site and for the sites you link to.
Checking links is fairly easy to automate, so many tools will do it for you. Some are built into
authoring tools and usually work on a single page. Others are stand-alone programs that run on your
computer. Still others are web-based services. For a quick check of a single page, I'll either use what's built into my
editor or hop over to the web-based checker at http://validator.w3.org/checklink . Googling for "online link
checker" will turn up many similar tools.
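The core of such a tool is simple enough to sketch. The following is a minimal, hypothetical single-page checker in Python's standard library (the class and function names are my own, not part of any of the tools mentioned here): it extracts every `href` from a page and can then issue a HEAD request per link to see whether the target still resolves. Real checkers such as Linklint also follow redirects and crawl recursively.

```python
# A minimal sketch of a single-page link checker (standard library only).
# Names like LinkExtractor and check_link are illustrative, not from any real tool.
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

class LinkExtractor(HTMLParser):
    """Collects the href attribute of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all <a href="..."> targets found in an HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def check_link(url, timeout=10):
    """Return (url, status): an HTTP status code or an error string."""
    try:
        # HEAD avoids downloading the whole page; some servers reject it,
        # in which case a more robust checker would fall back to GET.
        response = urlopen(Request(url, method="HEAD"), timeout=timeout)
        return url, response.status
    except HTTPError as e:
        return url, e.code          # e.g. 404 for a dead link
    except URLError as e:
        return url, str(e.reason)   # DNS failure, connection refused, etc.

if __name__ == "__main__":
    page = ('<p><a href="http://www.example.com/">Example</a> '
            '<a href="http://deadsite.example.com/">Dead</a></p>')
    for link in extract_links(page):
        print(link)   # a live run would call check_link(link) on each
```

Anything other than a 200-series status from `check_link` marks a link as broken or redirected, which is essentially what the site-wide tools below log for every page they crawl.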
For more automated testing of an entire site, on Windows I use Xenu Link Sleuth,
http://home.snafu.de/tilman/xenulink.html ; and on UNIX I use Linklint, http://www.linklint.org/ . Once again, these are
just two choices among many. Each can scan a site remotely or locally and attempt to follow each link
it finds. If a link can't be followed, or if it is redirected, an error message is logged. For example, here's some
output from checking one of my sites with Linklint:
$ ./linklint -http -host www.cafeaulait.org -doc results /@
Checking links via http://www.cafeaulait.org
that match: /@
1 seed: /
checking robots.txt for www.cafeaulait.org