Something like the plan in [[User:David Gerard/1.0]] would use an article rating system (picture a "Rate this page" tab at the top next to "Article", "Edit", etc.) to get a rough idea of what is of decent quality to pull for a distribution.
I assume you're talking about rating per article, and not per revision (as the latter would be all but impossible). In that case, it'd be quite a rough idea indeed.
Personally I think we should just use something like featured article status for our rating system. At the point where the article becomes "featured" (or whatever we decide to call it), that's where we'd make our branch. Obviously we'd have to tweak the featured article process to be more closely aligned with a print version (featured articles shouldn't have non-free images in them), but something similar to the featured article process would work much better than article ratings, in my opinion.
Any branching and polishing would be left as late as possible. Think of the Mozilla process, where the alpha, beta and final are branched from the nightlies, slightly polished for a few days (or weeks) and then released. This would mean minimal disruption to, and diversion of resources from, the live Wikipedia.
If the Mozilla process does this, then presumably its programmers are not supposed to introduce brand new features during the alpha and beta stages of development. I find that rather hard to believe, but maybe Mozilla is a small enough project that it can do such a thing.
Wikipedia isn't. In fact, Wikipedia is still growing exponentially. New information is being added faster than it can be fact-checked and proofread. The only way I can see of avoiding a fork and coming out with a respectable encyclopedia would be to freeze all new development for a really long time, and that is obviously unacceptable.
Again, I don't know much about Mozilla development, but any major development project on the scale of Wikipedia is going to have branches off the main trunk which are maintained for long periods of time. This causes some wasted time, since things need to be merged and backported, but it's the fastest and most efficient way to produce a high-quality product. I'm afraid that "slightly polished for a few days (or weeks)" isn't going to cut it.
Diversion of resources isn't such a big problem as long as the branch is maintained properly. New information goes into the main trunk. "Bug fixes" (typos, inaccuracies, small wording changes, POV removal) are implemented in the branch, and merged into the main trunk if applicable. Really important new information (the twin towers are destroyed, the leader of a world superpower is assassinated, etc.) might get backported, i.e. added into the main trunk and then merged into the branch; but this should be the exception, not the rule, and should only be done while the branch is still active.
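To make the workflow concrete, here is a toy Python sketch of the trunk/branch/backport idea in miniature. The class and method names are invented for illustration; this isn't real MediaWiki or Mozilla code, just the division of labour described above.

class Article:
    def __init__(self, text):
        self.trunk = text    # the live wiki: keeps taking new material
        self.branch = text   # the frozen "1.0" copy: gets fixes only

    def add_new_content(self, text):
        # Ordinary new information goes into the main trunk only.
        self.trunk += "\n" + text

    def apply_fix(self, old, new):
        # "Bug fixes" (typos, inaccuracies, POV) land on the branch first,
        # then get merged back into the trunk if still applicable there.
        self.branch = self.branch.replace(old, new)
        if old in self.trunk:
            self.trunk = self.trunk.replace(old, new)

    def backport(self, text):
        # The exceptional case: really important new information is added
        # to the trunk and then copied into the still-active branch.
        self.add_new_content(text)
        self.branch += "\n" + text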
Of course, maybe our only disagreement here is over how long it's going to take to get from the point of the fork to the point where the branch is no longer maintained. In my opinion a few weeks isn't going to be anywhere near enough time to fix all the inaccuracies.
Anthony
Anthony DiPierro (anthonydipierro@hotmail.com) [050807 03:36]:
Something like the plan in [[User:David Gerard/1.0]] would use an article rating system (picture a "Rate this page" tab at the top next to "Article", "Edit", etc.) to get a rough idea of what is of decent quality to pull for a distribution.
I assume you're talking about rating per article, and not per revision (as the latter would be all but impossible). In that case, it'd be quite a rough idea indeed.
Nope, per article version. See [[m:Article validation feature]]. (I think "validation" is a misnomer here myself - that was Magnus' name for it, since he wrote the feature.)
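To spell out what per-version rating means, here is a minimal Python sketch: each rating is attached to a specific revision, so later edits don't inherit it. The names and numbers are illustrative only, not the actual schema of the validation feature.

from collections import defaultdict

# ratings[(page_title, revision_id)] -> list of scores from individual raters
ratings = defaultdict(list)

def rate(page_title, revision_id, score):
    ratings[(page_title, revision_id)].append(score)

def average_rating(page_title, revision_id):
    scores = ratings[(page_title, revision_id)]
    return sum(scores) / len(scores) if scores else None

rate("Phlogiston theory", 874211, 4)   # a reader rates one specific revision
rate("Phlogiston theory", 874211, 5)
print(average_rating("Phlogiston theory", 874211))  # 4.5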
Any branching and polishing would be left as late as possible. Think of the Mozilla process, where the alpha, beta and final are branched from the nightlies, slightly polished for a few days (or weeks) and then released. This would mean minimal disruption to, and diversion of resources from, the live Wikipedia.
If the Mozilla process does this, then presumably its programmers are not supposed to introduce brand new features during the alpha and beta stages of development. I find that rather hard to believe, but maybe Mozilla is a small enough project that it can do such a thing.
It's comparable to OpenOffice or KDE in compilation time. Depends if you call that "small".
Again, I don't know much about Mozilla development, but any major development project on the scale of Wikipedia is going to have branches off the main trunk which are maintained for long periods of time. This causes some wasted time, since things need to be merged and backported, but it's the fastest and most efficient way to produce a high-quality product. I'm afraid that "slightly polished for a few days (or weeks)" isn't going to cut it.
The terms I was thinking in were to pull all suitably-rated articles that the distributor is interested in, then polish that lot. Note that this is useful to article quality drives in general - anyone could do this at any time to show up areas that they'd want in an encyclopedia but that need work.
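A hedged sketch of what "pull all suitably-rated articles" might look like, continuing the toy rating data above: for each title the distributor wants, take the best revision that clears a quality threshold, and report the titles that don't clear it as the areas needing work. The threshold value and data shapes are assumptions for illustration only.

def pick_for_distribution(avg_ratings, wanted_titles, threshold=3.5):
    """avg_ratings: dict mapping (title, revision_id) -> average score.
    wanted_titles: set of page titles the distributor wants to include."""
    best = {}
    for (title, rev_id), score in avg_ratings.items():
        if title in wanted_titles and score >= threshold:
            if title not in best or score > best[title][1]:
                best[title] = (rev_id, score)
    # Wanted titles with no revision above the threshold are exactly the
    # articles that need an improvement drive before a release.
    missing = wanted_titles - set(best)
    return best, missing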
Of course, maybe our only disagreement here is over how long it's going to take to get from the point of the fork to the point where the branch is no longer maintained. In my opinion a few weeks isn't going to be anywhere near enough time to fix all the inaccuracies.
The presumption is that the sufficiently highly rated stuff will be of good quality anyway. If it isn't, it's material for an article improvement drive.
- d.