Earlier this week I wrote about linkrot.

That post missed a similar annoying trend. Today’s link check of my site turned up four 403 errors; there are a few every week.

Over the last year or so I’ve been removing those links when I find them, because I fear that lots of 403s might make my site look suspect to search engines. (I’d be interested to hear whether that fear is real or imaginary.)

The pages are still there; you can click through to every one of them in a browser, but something tells the bot that access is forbidden.
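
If you want to test whether a 403 is aimed at bots rather than people, a quick way is to request the page twice, once with a default client User-Agent and once pretending to be a browser. This is only a rough sketch, not my actual link checker, and the URL is a made-up example:

```python
# Sketch: recheck a URL with a browser-like User-Agent to see whether
# the 403 only affects bots. Assumes the requests library is installed.
import requests

BROWSER_UA = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36"
)

def recheck(url: str) -> None:
    """Fetch the URL with a default UA and a browser-like UA.
    Different status codes suggest bot-blocking rather than a dead page."""
    plain = requests.get(url, timeout=10)
    browserish = requests.get(url, timeout=10, headers={"User-Agent": BROWSER_UA})
    print(f"{url}: default UA -> {plain.status_code}, "
          f"browser UA -> {browserish.status_code}")

if __name__ == "__main__":
    recheck("https://example.com/some-linked-page")  # hypothetical URL
```

If the default request gets a 403 but the browser-like one gets a 200, the page is fine and the other site is simply blocking automated visitors.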

Often a 403 means there’s a poorly maintained site at the other end. It’s not linkrot as such, but the growing number of 403s is a sign that the web is deteriorating.

Have you seen a rising number of 403s on your outgoing links, and if so, how do you deal with them?