Mathias Schindler wrote:
Brion Vibber wrote:
I'm inclined to agree. It looks legit from a
markup perspective (i.e., it
shouldn't cause HTML validators to bitch at us; rel _is_ a defined
attribute for <a>, and the set of values is open-ended.)
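For concreteness, the markup in question is just an extra attribute on the anchor tag. A minimal helper that emits such a link might look like this (hypothetical illustration, not MediaWiki's actual linker code):

```python
def render_external_link(href: str, text: str, nofollow: bool = True) -> str:
    """Render an external link, optionally tagged rel="nofollow".

    rel is a defined attribute for <a>, so this stays valid HTML.
    """
    rel = ' rel="nofollow"' if nofollow else ''
    return f'<a href="{href}"{rel}>{text}</a>'
```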
Of course it won't prevent wikispam, but with luck it will discourage it
by reducing its value.
Would this tag apply to all external links, even the "yes, this is a
good link" links? In the long run we would hurt not just spammers but
also the good guys, by no longer linking to them in a way that search
engines consider relevant.
Or is this just PageRankParanoia?
Mathias
The best proposal I've seen so far is to apply it to all recently-modified
pages (with 'recent' determined per project by how fast you think the
editors will get it cleaned up - probably a day or so is reasonable). That
assumes spam will get cleaned up before the timeout expires, and it
prevents the link from being visible to a robot in the interim. But links
that survive for a while still become rankable. And it's fairly simple,
unlike tracking per-link whether or not each has been verified.
It would also penalize valid pages that just change frequently (like, say,
the front page), but I would think most such pages are really not good
sources of PageRank correlation anyway.