I couldn't help but notice:
* Five articles were promoted to featured status this week
* Four articles were delisted this week
* Twelve lists were promoted to featured status this week
* Eight lists were delisted this week
What a lot of churn. So the net change was merely +1 FA and +4 FL (and
also three topics and three images).
Is it always this bad?
In a message dated 8/23/2009 4:53:57 AM Pacific Daylight Time,
> The search for "bees" and "flowers" suggests "pollination". I do not see
> anything mindless about that. That is a human association>>
You're not understanding me. An article discussing bees and mentioning
that they pollinate flowers IS a human association. I didn't say it wasn't.
However, the meta-network of *all* such associations, to the nth degree of
relatedness, is not something a human can encompass in one bite. That's one
point. What I was stating is that this meta-network itself is created by a
computer algorithm, which ITSELF has no mind. It has no idea what the terms mean,
or refer to, or imply. It only knows that they are associated in some way.
It creates this meta-network and ranks the associations in a mindless way,
i.e. without comprehension. That's what I meant.
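For illustration only, here is a minimal sketch in Python (with a made-up
corpus) of the kind of mindless association-building I mean: the program
counts which terms co-occur and ranks the pairs, with no notion of what
any term refers to.

# Minimal sketch: rank term associations purely by co-occurrence counts.
# The algorithm never "understands" the terms; it only tallies them.
from collections import Counter
from itertools import combinations

documents = [
    {"bees", "flowers", "pollination"},
    {"bees", "honey", "hives"},
    {"flowers", "pollination", "gardens"},
]  # hypothetical corpus

pair_counts = Counter()
for terms in documents:
    for pair in combinations(sorted(terms), 2):
        pair_counts[pair] += 1

# "bees" and "flowers" end up linked to "pollination" without the program
# comprehending anything about bees, flowers, or pollination.
for (a, b), n in pair_counts.most_common(3):
    print(a, "<->", b, ":", n)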
2009/8/31 Brion Vibber <brion(a)wikimedia.org>:
> On 8/31/09 7:35 AM, Michael Peel wrote:
> We've been planning to get a test setup together since conversations at
> the Berlin developer meetup in April, but actual implementation of it is
> pending coordination with Luca and his team.
> My understanding is that work has proceeded pretty well on setting it up
> to be able to fetch page history data more cleanly internally, which was
> a prerequisite, so we're hoping to get that going this fall.
To add to what Brion said: The author of the Wired story, Hadley
Leggett, scheduled a call with me earlier this month, but she missed
the call. I didn't have time to follow up with her after that, and she
filed the story without it. This is why there's no WMF quote in the story.
The gist of it is that:
We're very interested in WikiTrust, primarily for two reasons:
- it allows us to create blamemaps for history pages, so that you can
quickly see who added a specific piece of text. This is very
interesting for anyone who's ever tried to navigate a long version
history to find out who added something (a toy sketch follows below).
- it potentially allows us to come up with an algorithmic "best recent
revision" guess. This is very useful for offline exports.
The trust coloring is clearly the most controversial part of the
technology. However, it's also integral to it, and we think it could
be valuable. If we do integrate it, it would likely be initially as a
user preference. (And of course no view of the article would have it
toggled on by default.) There may also be additional community ...
Any integration is contingent on the readiness of the technology. It
seems to have matured over the last couple of years, and we're
planning to meet with Luca soon to review the current state of things.
There's no fixed deployment roadmap yet, and the deployment of
FlaggedRevs is our #1 priority.
Deputy Director, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
*"Starting this fall, you’ll have a new reason to trust the information you
find on Wikipedia: An optional feature called “WikiTrust” will color code
every word of the encyclopedia based on the reliability of its author and
the length of time it has persisted on the page.*
*More than 60 million people visit the free, open-access encyclopedia each
month, searching for knowledge on 12 million pages in 260 languages. But
despite its popularity,
* has long suffered criticism from those who say it’s not reliable. Because
anyone with an internet connection can contribute, the site is subject to
vandalism, bias and misinformation. And edits are anonymous, so there’s no
easy way to separate credible information from fake content created by
*Now, researchers from the **Wiki Lab* <http://trust.cse.ucsc.edu/>* at the
University of California, Santa Cruz have created a system to help users
know when to trust Wikipedia—and when to reach for that dusty Encyclopedia
Britannica on the shelf. Called
*, the program assigns a color code to newly edited text using an algorithm
that calculates author reputation from the lifespan of their past
contributions. It’s based on a simple concept: The longer information
persists on the page, the more accurate it’s likely to be.*
*Text from questionable sources starts out with a bright orange background,
while text from trusted authors gets a lighter shade. As more people view
and edit the new text, it gradually gains more “trust” and turns from orange
More in story
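As a rough illustration of the coloring described above (not UCSC's actual
reputation formula), a trust score between 0 and 1 could be mapped to a
background color that fades from bright orange to white:

# Rough illustration only; WikiTrust's real math is different.
# trust = 0.0 -> bright orange background; trust = 1.0 -> white.
def trust_to_rgb(trust: float) -> str:
    t = max(0.0, min(1.0, trust))
    # Linearly interpolate from orange (255, 165, 0) to white (255, 255, 255).
    g = round(165 + t * (255 - 165))
    b = round(t * 255)
    return "#ff%02x%02x" % (g, b)

for t in (0.0, 0.25, 0.5, 1.0):
    print(t, trust_to_rgb(t))  # e.g. 0.0 -> #ffa500, 1.0 -> #ffffff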
Sorry if this is a duplicate thread but I haven't seen anything about
reaching this milestone.
The Christian Science Monitor reports:
"Wikipedia, the upstart social experiment that trusts the online mob to
steward world knowledge, has hit a major milestone.
The English volume of the Web encyclopedia reached its 3 millionth article.
That massive number of whos, whats, wheres, and whens culminated with a
profile on Norwegian soap opera actress Beate Eriksen.
In the less than 24 hours since she marked the 3 millionth entry, more than
1,000 new articles have already flooded in."
It concludes with info about the disagreement between inclusionists and deletionists:
"Both see the other ruining Wikipedia, either by defeating the point of an
open encyclopedia, or by expanding its “pages” until the site dies from
Which side do you come down on? More the merrier? Or quality over quantity?
Let us know below, or join the conversation by following us on ..."
The current <ref>...</ref>...<references/> system produces nice
references, but it is flawed--all the text contained in a given
reference appears in the text that the reference is linked from. For example:
It was a sunny day on Wednesday<ref>David Smith. ''History of Wednesdays.''
History Magazine, 2019.</ref>. The next day, Thursday, was cloudy.
== References and notes ==
(That's a very simple example, too. References start to become a lot
larger once they start to include other information and/or are
produced via a template.)
One way I could conceive of correcting the problem is to have a
reference tag that provides only a _link_ to the note via a label and
another type of reference tag that actually _defines_ and _displays_
the note. For example:
It was a sunny day on Wednesday<ref id="smith"/>. The next day, Thursday,
was cloudy.
== References and notes ==
<reference id="smith">David Smith. ''History of Wednesdays.'' History
This makes the raw wikitext easier to read, since the text of the
actual reference is in the _references_ section instead of in the
page's primary content.
I think this could work ...
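As a proof of concept, a renderer for this scheme could collect the
<reference> definitions, number them, and substitute numbered markers for
the <ref id=.../> tags. A rough Python sketch, assuming exactly the
hypothetical tag syntax above (this is not an existing MediaWiki
extension):

# Sketch: resolve <ref id="x"/> links against <reference id="x">...</reference>
# definitions, numbering notes in the order they are defined.
import re

def resolve_refs(wikitext):
    defs = {}   # reference id -> footnote number
    notes = []  # rendered notes, in definition order

    def collect(m):
        ref_id, text = m.group(1), m.group(2)
        if ref_id not in defs:
            defs[ref_id] = len(defs) + 1
            notes.append("%d. %s" % (defs[ref_id], text))
        return ""  # definitions are re-emitted as a numbered list below

    body = re.sub(r'<reference id="([^"]+)">(.*?)</reference>',
                  collect, wikitext, flags=re.S)
    body = re.sub(r'<ref id="([^"]+)"\s*/>',
                  lambda m: "[%d]" % defs[m.group(1)], body)
    return body.rstrip() + "\n" + "\n".join(notes)

Fed the example above, this prints the sentence with "[1]" in place of the
link tag and the Smith note listed under the references heading.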
Well-sourced junk that reads like it belongs on Simple En.wiki:
'''Adaptation''' is one of the basic phenomena of
biology.<ref>Williams, George C. 1966. ''Adaptation and natural
selection: a critique of some current evolutionary thought''.
Princeton. "Evolutionary adaptation is a phenomenon of pervasive
importance in biology." p5</ref> It is the process whereby an organism
becomes better suited to its [[habitat]].<ref>The ''Oxford Dictionary
of Science'' defines ''adaptation'' as "Any change in the structure or
functioning of an organism that makes it better suited to its
environment".</ref> Also, the term ''adaptation'' may refer to a
characteristic which is especially important for an organism's
survival.<ref>Both uses of the term 'adaptation' are recognized by
King R.C. Stansfield W.D. and Mulligan P. 2006. ''A dictionary of
genetics''. Oxford, 7th ed.</ref> For example, the adaptation of
horses' teeth to the grinding of grass, or their ability to run fast
and escape predators. Such adaptations are produced in a variable
population by the better suited forms reproducing more successfully,
that is, by [[natural selection]].
The above will be changed, obviously. Note also that the large inline
<ref>s make editing difficult, which in turn lets nonsense writing
persist. If we can't come up with some better technical means of
separation (all ref tags under their own invisible section, maybe),
then at least carriage returns, breaking the <ref> contents onto their
own lines as sketched below, would work well enough: the page still
shows up the same in view mode, but the text actually becomes readable
in edit mode.
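For instance, the first sourced sentence above could be entered like this
(one variant of the carriage-return idea; since a single line break
renders as a space, the page should look essentially the same in view
mode):

'''Adaptation''' is one of the basic phenomena of biology.<ref>
Williams, George C. 1966. ''Adaptation and natural selection: a
critique of some current evolutionary thought''. Princeton. p5
</ref> It is the process whereby an organism becomes better suited
to its [[habitat]].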
Anyway, working on something unsourced like:
In [[biology]], '''adaptation''' is an observed ''effect'' of the
process of [[evolution]], wherein canonical [[organism]]s
(species) appear to [[change]] over time to survive more efficiently
within their [[habitat]]. The concept of adaptation was developed
before the theory of evolution; Lamarck had made some
groundbreaking observations which inspired Darwin. "Adaptation" in
reality does not refer to changes within individual organisms, but to
the canonical form of the species, i.e. changes brought about by a
process of [[natural selection]]. "Adaptation" in the context of
biology is thus largely a colloquialism for natural selection.
Sources available upon request.
In a message dated 8/31/2009 11:47:04 AM Pacific Daylight Time,
> - WikiTrust might be described as "a way to see how long an edit
> has lasted and how much trust it seems to have"; in most users' hands
> it'll be "colored red/blue so it's right/wrong."
> - People won't think, they'll assume and rely.>>
Interesting to see this by virtue of repetition in our mirrors.
And our pseudo-mirrors who *don't* even state that they mirrored us.
Then after a phrase has been cut from our version due to lack of source,
it's put back in citing a past mirror who hasn't removed it....
Unsourced statement one has "high trust" because it's been there for two
years, without a source. When a source is found contradicting it, will there
be a big fight because "100 editors have passed on this and haven't
reverted it"? ... Shades of past warfare.