We want nofollow to be a successful and useful initiative, one which in at least some small way makes the Internet a better place to live. Nofollow will work so long as most search engines find the information carried by the nofollow attribute to be useful on average.
If nofollows are sprinkled all over the web indiscriminately, then search engines which choose to respect them will be at a competitive disadvantage to those which choose to ignore them.
Therefore, what we can do to help ensure the success of the nofollow initiative is to try to make sure that the nofollow attribute encodes useful information for search engines. That is to say, links which are deemed *good* by *humans* should tend not to carry the nofollow attribute, and links which are *suspicious* should tend to carry it.
Therefore, it seems best to me that nofollow be on by default in the software distribution (since most small wikis are victimized by spam) and that it be turned off on all the busy Wikipedia sites.
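As a rough illustration of that default -- a minimal sketch in Python rather than actual MediaWiki code, with an invented flag and function name -- the rendering decision might look like:

    from html import escape

    # Hypothetical site-wide switch: shipped on by default, since most
    # small wikis are victimized by spam; a busy, well-patrolled wiki
    # would turn it off.
    NOFOLLOW_BY_DEFAULT = True

    def render_external_link(url: str, text: str) -> str:
        """Render an external link, adding rel="nofollow" when the
        site-wide default treats external links as suspect."""
        rel = ' rel="nofollow"' if NOFOLLOW_BY_DEFAULT else ""
        return '<a href="%s"%s>%s</a>' % (escape(url, quote=True), rel, escape(text))

    # With the shipped default:
    #   render_external_link("http://example.com/", "Example")
    #   -> '<a href="http://example.com/" rel="nofollow">Example</a>'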
Wikipedia provides an enormously high-value set of hints to search engines to help them find sites that don't suck. We should not dilute that signal by labelling all of our high-value links as suspect when they really aren't; doing so reduces the value of the nofollow attribute for search engines.
--Jimbo
On Mon, 7 Mar 2005 11:54:12 -0800, Jimmy (Jimbo) Wales jwales@wikia.com wrote:
Therefore, it seems best to me that nofollow be on by default in the software distribution (since most small wikis are victimized by spam) and that it be turned off on all the busy Wikipedia sites.
I didn't take part in the original vote, since it was a very scattered discussion and a simple "nofollow v. no nofollow" choice was hard to make; but the above sounds like a fair short-term compromise.
It would also be great for editors to have the option of forcing "nofollow" for a link (and of setting other attributes, like "title").
Wikipedia provides an enormously high-value set of hints to search engines to help them find sites that don't suck.
In the longer term, we should start treating external links, and other bibliography entries and references, as first-class citizens -- with their own discussion pages, watchlist entries, and attributes.
A reference is a lot more than just another sentence in a 10k article. It forms part of the core of a good encyclopedia -- doubly so for one which prides itself on not hand-picking its editors. References should have histories, should become more precise over time (which edition of that book? on what date was that website visited [and where's the permalink to it via the Internet Archive]? what do other people say about this reference? is this advertising? zealotry? spam?), should give off more of a signal when added, modified, or deleted. When a link changes (as most of the web does), all uses of that reference should change as well.
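To make the shape of such a first-class reference concrete, here's a minimal sketch -- a Python data record whose field names are purely illustrative, not a proposed schema:

    from dataclasses import dataclass, field

    @dataclass
    class Reference:
        """A reference treated as a first-class object, not just a pasted URL."""
        url: str
        title: str = ""
        edition: str = ""       # which edition of that book?
        accessed: str = ""      # on what date was the website visited?
        archive_url: str = ""   # permalink via the Internet Archive
        notes: str = ""         # what do other people say about this reference?
        history: list = field(default_factory=list)  # prior revisions of this entry

When the underlying link changes, fixing the single Reference record would update every article that cites it, instead of each copy rotting separately.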
Perhaps we could design a system where there is a 'right' way to insert or call an external URL, so that a URL that is just pasted in would get rel="nofollow", but one which is called correctly (and has its own discussion space and history somewhere) would not.
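A minimal sketch of that distinction (the registry and function below are hypothetical, just to make the idea concrete):

    # Hypothetical registry of URLs that were inserted "the right way" --
    # each entry has its own discussion space and history somewhere.
    REGISTERED_REFERENCES = {"http://example.org/good-source"}

    def link_rel(url: str) -> str:
        """Pasted-in URLs get rel="nofollow"; properly registered
        references earn a followed link."""
        return "" if url in REGISTERED_REFERENCES else "nofollow"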
I will write something about this at [[m:References]], once the wiki is back up...
I stumbled across this way to reduce comment spam:
http://www.gadgetopia.com/2005/03/08/PreventingSpamByKeystrokeCounting.html
http://overstated.net/projects/mt-keystrokes/
It's not perfect, of course - but it might be worth investigating.
paul
admin Yellowikis wrote:
I stumbled across this way to reduce comment spam:
http://www.gadgetopia.com/2005/03/08/PreventingSpamByKeystrokeCounting.html
http://overstated.net/projects/mt-keystrokes/
It's not perfect, of course - but it might be worth investigating.
It's a neat idea, but there are several serious problems with it, particularly so for a wiki environment.
Most seriously, it requires client cooperation -- a spambot can simply send the expected count along with the request. It can therefore only be effective against humans doing manual cut-n-paste of spam text... or against real, genuine humans cutting and pasting legitimate text.
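A sketch of how cheap that forgery is -- the form field name and endpoint below are assumptions about how such a plugin might be wired up, not its actual interface:

    import urllib.parse
    import urllib.request

    spam = "Buy cheap pills at http://spam.example/ now!"

    # A spambot never runs the page's JavaScript; it just fills in the
    # counter field with a plausible value, and the check passes.
    form = urllib.parse.urlencode({
        "comment": spam,
        "keystrokes": str(len(spam)),  # forged count (field name is hypothetical)
    }).encode()

    # Hypothetical comment endpoint, shown only to make the forgery concrete.
    urllib.request.urlopen("http://blog.example/post-comment", data=form)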
It requires JavaScript, so either it cuts out submissions from humans with JavaScript disabled or unavailable, or it must accept submissions without the number, making it trivially defeatable by humans...
For wikis, we send out the entire current page text and take back the entire text with modifications. Determining the number of changed characters is not trivial; think again of cut-n-paste, rearrangement of sections, deletion of sections, etc. This makes it much harder to get meaningful information out of such a comparison.
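A quick illustration, using Python's difflib, of why "number of changed characters" is slippery for whole-page edits:

    import difflib

    old = "== Intro ==\nSome text.\n\n== History ==\nMore text.\n"
    # The editor only swapped the two sections -- no new characters typed.
    new = "== History ==\nMore text.\n\n== Intro ==\nSome text.\n"

    sm = difflib.SequenceMatcher(None, old, new)
    changed = sum(max(i2 - i1, j2 - j1)
                  for tag, i1, i2, j1, j2 in sm.get_opcodes()
                  if tag != "equal")
    print(changed)  # a large count, even though nothing was typed by hand

A pure rearrangement looks like a huge edit to a character-level diff, while a spammer appending a single URL looks tiny -- the opposite of the signal you'd want.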
-- brion vibber (brion @ pobox.com)