Brion VIBBER wrote:
> Toby Bartels wrote:
>> Can Randompage weight the articles that it picks by their length? This would make *any* stub less likely to be chosen, arguably without giving a distorted picture of what Wikipedia is like.
> The Ram-bot city entries average somewhere around 2130 bytes each (standard deviation 113), which is *larger* than the English Wikipedia-wide average for article-space non-redirect pages (about 1900 bytes, standard deviation a whopping 3028).
I knew that they weren't short, but I was hoping that they weren't *long*. I guess that they are.
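For concreteness, here's a rough sketch of the sort of weighting I had in mind. It's Python with a made-up page list; the real feature would presumably read sizes out of the database, so treat every name and number below as invented:

    import random

    # Hypothetical page list: (title, size in bytes).  All invented.
    pages = [
        ("Autun",                 2100),  # a Ram-bot-style city entry
        ("Some three-line stub",   150),
        ("A long survey article", 9000),
    ]

    def random_page_weighted(pages):
        """Pick a page with probability proportional to its byte length."""
        titles  = [title for title, size in pages]
        weights = [size for title, size in pages]
        return random.choices(titles, weights=weights, k=1)[0]

    print(random_page_weighted(pages))

Note that nothing is ever excluded under such a scheme; the stub still comes up, just about fourteen times less often than the city entry.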
> As I recall, the median article size is smaller than the average; if you could cut out the Ram-bot cities by size, you'd cut out most of the rest of Wikipedia with them.
(I assume that by "average" you mean <arithmetic mean>, then?) But weighting by size cuts nothing out at all: every article keeps a nonzero chance of being chosen, merely in proportion to its length. Since the Ram-bot entries are *larger* than the mean, the conclusion is not that most of Wikipedia would be cut out, but rather that the Ram-bot entries would not be cut out; if anything, they'd turn up *more* often.
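To make the difference concrete, here's a toy comparison in Python; the size distribution is invented (not Brion's actual data), just skewed the same way, with many small stubs and a few huge articles:

    import random
    from statistics import mean, median

    # Invented sizes with a long right tail: 70 stubs, 20 Ram-bot-sized
    # entries, 10 huge articles.
    sizes = [250] * 70 + [2130] * 20 + [15000] * 10

    m = mean(sizes)
    print(m, median(sizes))  # mean 2101 vs median 250: mean > median

    # A *cutoff* at the mean drops everything below it -- most of the
    # corpus, since the median sits far below the mean -- while the
    # above-mean Ram-bot-sized entries all survive.
    kept = [s for s in sizes if s >= m]
    print(len(kept), "of", len(sizes), "pages survive")  # 30 of 100

    # A *weighting* drops nothing: every page keeps a nonzero chance,
    # proportional to its size, so Ram-bot-sized entries become more
    # likely, not less.
    print(random.choices(sizes, weights=sizes, k=1)[0])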
In light of other responses, there seem to be two purposes to Randompage. One is to give visitors an idea of what Wikipedia is like, and if Ram-bot entries are both among the most common and the most substantial of Wikipedia articles, then it's only fair that they show up often. But another is to give contributors ideas for new things to work on, and Ram-bot entries are generally useless for this purpose. Perhaps we need two versions of Randompage? (I shouldn't really talk, though, since I rarely use Randompage anyway.)
-- Toby