Hi,
hopefully this notice will prevent complaints about this change.
Someone just checked in a software change that will allow sysops to
customise the names of months and weekdays via pages in the MediaWiki
namespace, just like most other things in the user interface. When this
change goes live on the site, you will see the month and weekday names
change to English (I believe). Please do not panic when that happens.
Please create pages called [[MediaWiki:January]] etc. and
[[MediaWiki:Sunday]] etc. in your local Wikipedia and translate the
month/weekday names appropriately.
In case you're not sure about the correct spellings, I have included
them below.
Greetings,
Timwi
Weekday names:
[[MediaWiki:Sunday]]
[[MediaWiki:Monday]]
[[MediaWiki:Tuesday]]
[[MediaWiki:Wednesday]]
[[MediaWiki:Thursday]]
[[MediaWiki:Friday]]
[[MediaWiki:Saturday]]
Month names:
[[MediaWiki:January]]
[[MediaWiki:February]]
[[MediaWiki:March]]
[[MediaWiki:April]]
[[MediaWiki:May]]
[[MediaWiki:June]]
[[MediaWiki:July]]
[[MediaWiki:August]]
[[MediaWiki:September]]
[[MediaWiki:October]]
[[MediaWiki:November]]
[[MediaWiki:December]]
Abbreviated month names:
[[MediaWiki:Jan]]
[[MediaWiki:Feb]]
[[MediaWiki:Mar]]
[[MediaWiki:Apr]]
[[MediaWiki:May]]
[[MediaWiki:Jun]]
[[MediaWiki:Jul]]
[[MediaWiki:Aug]]
[[MediaWiki:Sep]]
[[MediaWiki:Oct]]
[[MediaWiki:Nov]]
[[MediaWiki:Dec]]
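To make the mechanism concrete, here is a rough sketch in Python (the real MediaWiki code is PHP; all names and values here are illustrative assumptions) of a message lookup that falls back to the English defaults when no local [[MediaWiki:...]] page exists:

```python
# Illustrative sketch, not MediaWiki's actual implementation.
# Built-in English defaults, one entry per month/weekday/abbreviation.
DEFAULT_MESSAGES = {
    "January": "January",
    "February": "February",
    "Sunday": "Sunday",
    # ... and so on for all the names listed above
}

# Pages a sysop has created in the MediaWiki namespace of a local wiki
# (example values for a German-language wiki, not real page contents).
local_overrides = {
    "January": "Januar",
    "Sunday": "Sonntag",
}

def get_message(key):
    """Return the local [[MediaWiki:<key>]] text if it exists,
    otherwise fall back to the English default."""
    return local_overrides.get(key, DEFAULT_MESSAGES[key])
```

So a wiki that has created [[MediaWiki:January]] sees its own translation, while every name without a local page silently stays English.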
Since Jimbo announced the "move to 1.0" some months ago, not much has
happened. Meanwhile, the tone of the press concerning Wikipedia seems to
be slowly drifting from "interesting, lots of potential" to "source of
unreliable information". There's the "single printed volume" deal
waiting in the wings, and Mandrake Linux would be interested in a
Wikipedia-on-DVD.
Since there seems to be long-standing consensus that Wikipedia needs
some form of review (as an additional option, *not* as a replacement for
the wiki way!), I would very much like the discussion about the "how" to
start again, and to reach a conclusion this time, for a change :-)
It seems to me that most of you would agree to a method similar to this:
* A (logged-in) user can approve a single version of an article.
* At least two approvals are needed for the "wikipedia seal of approval" :-)
Now, the rapid change of Wikipedia articles raises this problem:
* Does the second approval need to be for the *same version* as the
first, or can it be for a later one, which then gets the "seal of approval"?
Also, given the different goals of an approval system, should there be
* one approval only
* one approval for "web version", one for "CD-ROM version", one for
"printed version (single volume)", one for "printed version (30
volumes)", etc.
Also, should there be
* yes/no approval(s)
* or rather a rating (0-9 or something)
For the record, my opinion on these:
* approvals need *not* be on the same version, but a "disapproval"
could invalidate any prior approvals
* one approval (we can sort out bot articles/extremely long ones later)
* no rating (either it is a good one or not).
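The approval rules discussed above, including the "disapproval invalidates prior approvals" idea, can be modelled roughly like this (a hypothetical Python sketch, not existing MediaWiki code; the class and method names are invented for illustration):

```python
class ArticleApprovals:
    """Sketch of the proposed rules: logged-in users approve a specific
    version; approvals may target different versions; a disapproval
    invalidates all earlier approvals; two approvals earn the 'seal'."""

    REQUIRED = 2  # "at least two approvals" per the proposal

    def __init__(self):
        self.approvals = []  # list of (user, version) pairs

    def approve(self, user, version):
        self.approvals.append((user, version))

    def disapprove(self, user, version):
        # One possible reading: a disapproval wipes out prior approvals.
        self.approvals.clear()

    def has_seal(self):
        # Count distinct users, so one user approving twice doesn't count.
        return len({user for user, _ in self.approvals}) >= self.REQUIRED
```

Note that counting distinct users is itself a design choice this sketch assumes; it sidesteps the "one person, many approvals" case but not the sock-puppet account problem raised later in the thread.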
Magnus
Brian wrote:
"To anyone interested: TomeRaider 3 beta is out. It has native image, html,
and category support. I recently used user:Erik Zachte's conversion script
and the text-only database for Wikipedia is weighing in at ~300MB, so including
pictures in the future may only be feasible for the wealthier of us (not
me!)"
Mat Ripley says that the new features have been designed with Wikipedia in
mind specifically. Wikipedia will become 'the flag ship' (his words) of this
popular document reader for handhelds. If it isn't already: I've had well
over 20 thousand page views on my support page for Wikipedia for TomeRaider.
My current thoughts are to include png/gif images only. These are
relatively small and mainly consist of maps, flags and diagrams. Some jpegs
would be nice to have in thumbnail size, but I can't think of a simple
criterion for which ones to include. I might add a whitelist opt-in feature for
people who want to run the script themselves (many do, for regular updates).
Erik Zachte
To anyone interested: TomeRaider 3 beta is out. It has native image,
html, and category support.
I recently used user:Erik Zachte's conversion script and the text-only
database for Wikipedia is weighing in at ~300MB, so including pictures in
the future may only be feasible for the wealthier of us (not me!) :)
They are also touting improved compression so that may help with this.
Also - any speculation as to how small (filesize-wise) we could get the
pictures considered necessary, defaulting the width to, say, 100px?
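As a back-of-envelope check on that question, here is a small estimate, assuming 100x75-pixel thumbnails and roughly 0.25 bytes per pixel after compression (both figures are assumptions for illustration, not measurements):

```python
def estimate_total_mb(n_images, width=100, height=75, bytes_per_pixel=0.25):
    """Rough size estimate for a set of thumbnails.

    bytes_per_pixel is an assumed average after PNG/JPEG compression;
    real values vary a lot with image content.
    """
    per_image_bytes = width * height * bytes_per_pixel
    return n_images * per_image_bytes / (1024 * 1024)
```

Under these assumptions, 100,000 thumbnails would add on the order of 180 MB, i.e. a large fraction of the ~300 MB text database, which suggests why a png/gif-only or whitelist policy is attractive.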
I personally found my way to Wikipedia when looking for an
encyclopedia for my PDA, so this could be an important feature to
think about.
/Brian (Alterego)
http://en.wikipedia.org/wiki/Wikipedia:TomeRaider_database
http://members.ams.chello.nl/epzachte/Wikipedia
http://www.tomeraider.com/tomeraider3.html
"Take all that to heart. Still, I stress that Wikipedia is clearly one
of the Internet's top five information tools. No other free online
resource -- none -- can give you such a quick and useful briefing on
practically any subject you can think of.
Andy Ihnatko writes on technical and computer issues for the Sun-Times. "
http://www.suntimes.com/output/worktech/cst-fin-andy20.html
Good morning, all -
I am new to the Wikipedia mailing list, but have spent the last few
days researching an idea, and it was suggested to me that I might
receive some further insight or assistance by bringing it to all of
you:
I have been involved with technology for several years now, and have
come to believe very strongly in the potential power of developing web
interfaces for cell phone technology. I have spent time, both personal
and professional, researching this market. Understanding that more and
more cell phones come equipped with the ability to access the
Internet, and that cell phones are now in service even in some of the
remote corners of the world, it occurs to me that Wikipedia could
possibly be the ultimate extension of this technology, and would
require very little effort on our side or the side of the cell phone
user to integrate.
Simply stated, by creating a cell phone web interface for the
Wikipedia database, instant access to a dynamic encyclopedia could be
given to all cell phone users worldwide. From what I can tell of the
site, the infrastructure to undertake such a project is all there.
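A very crude sketch of the kind of markup-stripping such a gateway might do (hypothetical Python; the regexes handle only links, bold/italics and headings, and are nothing like a real wikitext parser):

```python
import re

def simplify_for_phone(wikitext, max_chars=500):
    """Reduce wiki markup to plain text and truncate to a size a
    small-screen, low-bandwidth phone browser could handle.
    Illustrative only; real wikitext needs a proper parser."""
    # [[target|label]] -> label, [[target]] -> target
    text = re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]*)\]\]", r"\1", wikitext)
    # strip bold/italic quote runs
    text = re.sub(r"'{2,}", "", text)
    # == Heading == -> Heading
    text = re.sub(r"=+\s*([^=]+?)\s*=+", r"\1", text)
    return text[:max_chars]
```

The point is only that the server-side work is modest: the wiki already stores plain-ish text, so a phone interface is mostly a matter of trimming and reformatting output.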
I do not know if this is a project that you are already considering,
but if not, I would be happy to lead this initiative, and am very
interested in recruiting others who are interested in becoming
involved in the project.
I would like to hear the ideas or suggestions of all. I have already
constructed ideas for Phases 1 and 2 of the project, and would be more
than happy to share them if there is any further interest.
Thank you all for your time
- Alex Hottenstein
The lingua franca of Meta is English; when you divide it up, people who read Meta will not necessarily read relevant posts. This will undermine the usefulness of Meta. At present little is translated into other languages.
The only practical approach would be to have people indicate that an article is in language XX, and to have a user preference that highlights those articles for people interested in that language.
Thanks,
GerardM
Erik Moeller wrote:
> Daniel-
>
>>>How about a setup like this:
>>>de.wikimedia.org
>>>en.wikimedia.org
>>>fr.wikimedia.org
I am *very* strongly opposed to doing that.
Meta is the only place where we can really meet, and find information
that someone else left. Yes, there is work to do to make it easier to
navigate between languages, but separating us is not the solution. That
is the "easy" solution, but will be very bad in the long term.
Currently, meta is slowly becoming more multilingual, in particular
thanks to German-speaking people. We must keep it that way, and rather
try to move toward a solution for navigating between languages (a sort
of internal language linking) and a language interface that can be
changed in preferences.
> Having multiple languages in one wiki doesn't help people to come
> together.
Yes, it does. Much more than any "supposedly" multilingual mailing list.
> In fact, in my experience, it does the exact opposite.
In my experience, it does bring people together, provided that you
welcome the interaction.
> Parcipation on Meta by people from languages like Chinese or Japanese is
> minimal.
It was also minimal a few months ago among Germans. It is becoming less so.
Plus, there are Japanese and Chinese people currently over there. We
have Tomos, Suisui, Britty etc...
> I'm afraid Meta is perceived as an extension of the English
> language Wikipedia.
Perhaps if you were editing over there in German, it would lessen that
feeling?
> You can't eliminate the language barrier by throwing all languages into
> one big pot. That only means that the most popular common one - English -
> will dominate and small pockets of non-English discussions will form. This
> is what has happened on the multilingual mailing lists and it is what will
> continue to happen on Meta if we stay on the current path.
This is what is happening on the multilingual mailing lists, because
each time someone DARES to put a word in a language other than English,
he is severely told that of course he should write in English, because
really, no one can understand him otherwise.
I regularly write in French on purpose on wikipedia-l, and it never
fails. Each time, I am told it is bad and that if I want to be
understood, I should really avoid writing in French.
This is a comment I never get on meta. The argument that what happens on
the mailing lists will necessarily happen on meta is moot.
> The reality is that because of the language barrier, there *are* different
> communities. Because of national barriers, there *are* different Wikimedia
> interests. And there's no reason why an interesting global policy
> discussion shouldn't be started by people who speak no English whatsoever,
> and then be translated into the main languages if there is a vote.
There is no reason... but this is just too heavy, so it won't happen.
> While I would prefer it if all languages of a project were handled with a
> single database and codebase, this requires quite substantial changes to
> the current code, and is unlikely to happen anytime soon. And if it
> happens, we can port all the existing wikis over to that new system.
No, thank you.
But I
> think we should strive for a consistent approach.
>
> Regards,
>
> Erik
On Friday 16 July 2004 20:24, Magnus Manske wrote:
> Meanwhile, the tone of the press concerning wikipedia seems to
> be slowly drifting from "interesting, lots of potential" to "source of
> unreliable information".
[...]
> It seems to me that most of you would agree to a method similar to this:
> * A (logged-in) user can approve a single version of an article.
> * At least two approvals are needed for the "wikipedia seal of approval"
> :-)
Before discussing an approval method we should agree on what the approval is
about. Is it about spelling/grammar correctness, factual accuracy,
license problems, neutrality, ... or all of the above?
What we need for Wikipedia is a complete review in terms of quality similar to
the traditional peer review system. I write "similar" since we already know
that a traditional peer review system will not work for Wikipedia--too little
incentive, too much work for a single person; basically it comes down to the
same reasons why Nupedia failed.
A complete review of an article needs all the things described in the
beginning:
1. style - this includes grammar, good writing, image captions, ...
2. legal - in particular, check the status of images
3. completeness - does the article contain everything "important" about the
subject?
4. fact checking - validate every piece of information in the article
For competing against Britannica, nos. 1, 3 and 4 are important. No. 2 is
important for avoiding a legal mess a la SCO.
A simple yes/no approval mechanism is IMHO not the right way to achieve
our goals. I don't think that anyone will fact-check an article of more
than 30,000 characters; this would literally take days of work if done
properly. I even seriously doubt that everyone would read the whole
article before approving. Erik Möller once wrote:
"My experience on FAC [featured articles candidates] indicates that many
people have not even read the articles they support (no surprise, many of
them are 40,000 characters and longer)."
Not to mention other problems:
* everyone can create an arbitrary number of accounts
* we do not know anything about the reviewer's expertise in the field
* approving an article again and again after every (small) edit will become a
pain
The last point IMHO is important. Every review method should be chosen in such
a way that the additional work after several smaller changes have occurred is
small. In particular, one (trusted) user should be able to transfer the
results from a "stable" version to a more recent version.
I am confident that we can solve no. 1 and no. 2 and perhaps no. 3 within the
existing wiki framework.
[[Wikipedia:Featured article candidates]] is already a good base for no. 1 and
it seems to work reasonably well. And btw, I have never heard any complaints
in the press that the grammar and/or writing style of the articles is bad.
In order to avoid legal problems in the long run, in particular after having
printed 1.0, we should at least have someone who is familiar with law and
GFDL who looks through the article. In most cases it probably would be
sufficient to have a closer look at the images. Here it would be helpful to
have a list of trusted "experts" that could help, similar to
[[Wikipedia:Association_of_Members'_Advocates]].
No. 3 can become more difficult if the topic is very specialized. E.g. for the
article [[quantum field theory]] only a small number of our users have the
knowledge to judge whether everything important is included.
No. 4 IMHO is the most difficult one and, concerning the "reliability" aspect,
the most important one.
Checking every single statement can't be done by one user (it is too much
work). Thus a simple yes/no for the whole article cannot be the solution. The
hard point IMHO is the following:
* we either check the article in a very short time frame. Then we need an
"expert committee" which can validate the material. This has also been
discussed many times, most recently at
[[meta:Article_validation#Proposition_of_validation_by_a_committee]]. Reading
the proposition reminds me of Nupedia, which is why I don't think this will
work.
* the other way is that many people independently validate information over a
longer time frame. The unsolved problem with this approach is how the system
remembers which parts have already been validated and which haven't.
(The discussion page won't be sufficient: if we know that A checked sections
1 and 2 of version 125, B checked sections 2 and 3 of version 134, and C
checked sections 1 and 4 of version 145, it is hard to find out whether
really everything was validated, because of the edits in between. And this is
only an example with three reviewers where everyone has checked whole
paragraphs.)
The edits between reviews make this hard. I have some ideas I'm playing around
with to overcome the problem: allow users to validate single sentences, and
thereby allow the underlying system to tell, for every single sentence, how
often it has been validated. (Unfortunately it is not that easy, but that is
another story.)
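One way such per-sentence bookkeeping could work is to key validation counts on a hash of each sentence's text, so a validation survives edits elsewhere in the article. The following is an illustrative Python sketch under that assumption, not an actual design (the naive sentence splitter in particular is a stand-in):

```python
import hashlib
import re
from collections import Counter

def sentences(text):
    """Naive sentence splitter, for illustration only."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

class SentenceValidation:
    """Count validations per sentence, keyed by a hash of the sentence,
    so a validation carries over to any later version in which that
    sentence is unchanged."""

    def __init__(self):
        self.counts = Counter()

    def _key(self, sentence):
        return hashlib.sha1(sentence.encode("utf-8")).hexdigest()

    def validate(self, sentence):
        self.counts[self._key(sentence)] += 1

    def times_validated(self, sentence):
        return self.counts[self._key(sentence)]
```

An edited sentence hashes to a new key and so starts over at zero validations, which is exactly the conservative behaviour one would want; the open problems (rewordings that preserve meaning, moved sentences, context changes around an unchanged sentence) are of course not solved by this sketch.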
O.k., just my 2 cents on why I think that a simple yes/no unfortunately is not
sufficient for an equivalent of peer review. It's late, so I apologize for any
extra grammar / spelling mistakes ;-)
best regards,
Marco
Hi Members,
You rarely hear from me on this list, but... I was the Chief Copy Editor for Nupedia. Since part of the validation process is copy editing, and I have seen few comments on this subject, I added my first thoughts to Meta's page on article validation. Please see the Copy Edit section (6.6) of the Article Validation page at:
http://meta.wikimedia.org/wiki/Article_validation#Copyediting
I think this is important, because I see copy editing as different from other types of article validation. I would appreciate feedback, either to this list or to the article validation page.
Thank you,
As Ever,
Ruth Ifcher