To aggregate some of the arguments and counter-arguments, I posted
https://www.mediawiki.org/wiki/The_dofollow_FAQ and
https://www.mediawiki.org/wiki/Manual:Costs_and_benefits_of_using_nofollow.
It does seem, from my googling of what the owners of smaller wikis have
to say about it, that nofollow is less popular outside of WMF among those
wiki owners who have taken the time to analyze the issue. On the other
hand, it could be that people who were happy with the default simply
didn't feel as much need to voice their opinions, since they had already
gotten their way and didn't have to take any measures to override it.
I do think the implications of changing how nofollow is applied are very
different on, say, Wikipedia than they would be on a small or even
medium-sized wiki where the average user watches RecentChanges instead of a
watchlist. In a small town, you can leave your doors unlocked and get away
with it because you don't have as much traffic coming through and the
neighbors would notice and care about (out of curiosity, if nothing else)
the presence of anyone who seemed out of place. It's the same way on these
small wikis: it's rare that anyone comes along to try to subtly add a spam
link, and when they do, it's noticed. Likewise, if someone starts marking
spammy edits as patrolled, that gets noticed.
Spambots are not yet able to be subtle, and getting accustomed to the
norms of a wiki and becoming fluent enough in the local language to fit in
requires skilled labor that is more expensive than what is needed to
simply pass a CAPTCHA. So, I think that putting dofollow on patrolled
external links would be okay, especially on smaller wikis, as the patrol
would stop the spambots from getting a pagerank boost and the labor costs
would deter the subtler spammers. Even on Wikipedia, those fighting spam
can take advantage of the same economies of scale as those adding spam,
such as using pattern recognition across the entire wiki to catch people,
or blacklisting individual spammers and taking measures to keep them out.
(On the smaller wikis, a person caught spamming can just move on to
another wiki, but if you're caught spamming on Wikipedia, there isn't
another site of Wikipedia's size and scope to go to.)
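To make the proposal concrete, here is a minimal sketch of the rule I have in mind. This is illustrative Python, not MediaWiki's actual PHP code; the function name and the config flag are hypothetical. The idea is simply that external links carry rel="nofollow" until the revision that introduced them has been marked as patrolled.

```python
# Hypothetical sketch of the proposed rule. The names here are
# illustrative only and do not correspond to MediaWiki's real API.

DOFOLLOW_PATROLLED = True  # proposed opt-in setting; False keeps nofollow on everything


def external_link_html(url: str, text: str, revision_patrolled: bool) -> str:
    """Render an external link, omitting rel="nofollow" only when the
    feature is enabled and the containing revision has been patrolled."""
    nofollow = not (DOFOLLOW_PATROLLED and revision_patrolled)
    rel = ' rel="nofollow"' if nofollow else ""
    return f'<a{rel} href="{url}">{text}</a>'


# An unpatrolled edit keeps nofollow; a patrolled one earns dofollow.
print(external_link_html("https://example.com", "Example", revision_patrolled=False))
print(external_link_html("https://example.com", "Example", revision_patrolled=True))
```

With the flag off, every link gets nofollow, which is the current default behavior; the sketch is just meant to show how little logic the feature actually requires.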
To say that patrolling wouldn't do enough to keep spam out is basically to
say, at least to some extent, that patrolling is not a very effective
system and that the wiki way doesn't work very well. If Google agrees, they
can stop giving wikis in general, or certain wikis, such influence over
pagerank. The spammers have market incentives to become more sophisticated,
but so does Google, since their earnings depend on keeping their search
results relevant and useful, so that people don't switch to competitors
that do a better job.
The question of what the default configuration should be, or what
configuration should be used on WMF sites, can be addressed in other bugs
besides this one. It doesn't take much coding to change a default setting
from "true" to "false". For now, I would just like to implement the
feature and make it available for those wikis that want to use it. So, is
there support for putting this in core as an optional feature, and is
there anyone who will do the code review if I write this?