A very warm congratulations, Erik Möller!
I could not resist doing some stats: one year, one month and one week
have passed since you resigned from your previous official position,
stating "My small role in the history of Wikimedia is largely over."
For once you displayed a remarkable lack of vision :-)
Erik Zachte
My deepest congratulations to Erik; may he be a successful and wise
Board member. Also congratulations to each and every one of the
candidates, and great thanks to the election officials, everyone who
helped translate and make the election successful, and everyone who
voted.

Best of luck, Erik!

Flcelloguy
Since there has been some talk about candidates' statements and translating them into multiple languages, I thought that I would toss my $0.02 into the fray.
I recently finished writing a book (due for release in 2007). I was commissioned to write 25,000 words. It is a small book, but it is a complete book nonetheless. Based on the categorization below, it would qualify as a novella:
SuperPlus novel: 100,000+ words
Plus novel: 70,001-99,999 words
Novel: 45,001-69,999 words
Category: 30,000-45,000 words
Novella: 15,000-29,999 words
Quickie: up to 15,000 words
(see http://www.ellorascave.com/about/length.htm)
According to Gmaxwell, the largest candidate statement is approximately 20,000 words, i.e., a novella. We have 17 candidates in this election. If each one wrote a statement of that length, we would have 340,000 words to read, i.e., statements on the scale of "War and Peace" or several standard Dickens novels. We are talking "A la recherche du temps perdu."
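To make the arithmetic explicit, here is a trivial sketch of the reading load implied above (candidate count and statement size are the figures quoted in the message, assuming every statement matched the largest):

```python
# Rough reading-load estimate for the election statements, assuming
# every candidate wrote a statement as long as the largest one.
candidates = 17
words_per_statement = 20_000  # approximate size of the largest statement

total_words = candidates * words_per_statement
print(total_words)  # 340000
```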
I suggest that the time of Wikimedians can be better spent than reading statements of this length by, for example, researching articles. I would suggest that the time of our translators could be better spent working on making high quality content available in all languages.
In the first election, I set a limit of 500 words on each candidate's statement. Despite the complaints back then, there was a reason for that.
Danny
Hi,
Recently there was a call for help on foundation-l concerning a
copyright question about Soviet works in Wikimedia Commons. Prior to
that, several people had highlighted, in various threads on this list,
several technical problems in Wikimedia Commons that have led to a
huge backlog of tagged copyvios.
Since any of the currently 701 Wikimedia Foundation wikis can embed
files from Wikimedia Commons, Commons contributors and administrators
need to take care when modifying or deleting files. Up to now,
MediaWiki only provides the embedding of files, not the necessary
bidirectional cross-project maintenance assistance. In order to
overcome these severe technical flaws (each one challenging the
Wikimedia Commons project as a whole), passionate "Commonists" simply
wrote the missing tools themselves. One essential tool is a file
delinker bot that needs to run in every Wikimedia wiki, and it is
unreasonable to await approval by 701 different wiki communities. They
all use Wikimedia Commons, and Wikimedia Commons now needs to interact
efficiently with them in return. Therefore the administrators of
Wikimedia Commons request to be exempted from local bot policies in
order to run this bot.
This is necessary if Commons is ever to clear its backlog of literally
thousands of images, and ever to speak with any confidence about the
integrity of our content. Furthermore, it is really important to clear
our huge backlog of copyvios in Commons efficiently, as this copyvio
backlog is a risk for *all* Wikimedia projects.
So please have a look at the request page and support it. It is really
important for the success of all Wikimedia projects:
http://meta.wikimedia.org/wiki/Requests_for_permissions/CommonsDelinker
Please also contact your local (Wikipedia, Wikisource, Wikinews, ...)
communities so that they are prepared and know what this is about.
Communities that are open-minded towards this service will be able to
take part as soon as possible in the beta test phase of the bot
running at large scale.
If you are interested in how this bot will work in detail, have a look at:
http://meta.wikimedia.org/wiki/User:CommonsDelinker
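For readers curious what "delinking" means in practice, here is a minimal, hypothetical sketch of the core operation: removing embeds of a deleted file from a page's wikitext. The function name and the regex approach are my own illustration, not the actual CommonsDelinker code; a production bot would use a real wikitext parser and the MediaWiki API rather than a regular expression.

```python
import re

def delink_file(wikitext, filename):
    """Remove embeds of a deleted Commons file from page wikitext.

    A sketch only: captions containing nested [[links]] would need a
    real wikitext parser, which a production bot would use instead.
    """
    # Normalize spaces to underscores, then allow either form in the
    # match, as MediaWiki treats them interchangeably in file names.
    name = re.escape(filename.replace(" ", "_")).replace("_", "[ _]")
    # Match [[Image:Name|...]] style embeds ("Image:" was the
    # namespace prefix in 2006; "File:" is accepted for completeness).
    pattern = r"\[\[(?:Image|File):" + name + r"(?:\|[^\[\]]*)?\]\]\n?"
    return re.sub(pattern, "", wikitext, flags=re.IGNORECASE)
```

A wiki-wide bot would apply this to every page that embeds the deleted file, which is why it needs to run (and be approved) on each of the 701 wikis.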
Cheers, Arnomane
My 2-year-old son, Alexej Martin Merkey, was admitted to Primary
Children's Hospital in Salt Lake City this afternoon in serious
condition, due to our family (and others in Utah) getting exposed to
this virulent strain of E. coli from California on contaminated
spinach. We had some of this crap in our house and Alexej ate some of
it, as did my wife and I.
They have 14 cases of children aged 2 and under here at present, and
one of them (from Idaho) has died.
Alexej's photo is here http://meta.wikimedia.org/wiki/User:Jmerkey if
folks don't know who he is.
Alexej's platelet count was at 83, which is comparatively high (most
of the other children are below 20 due to the damage caused by the
toxins from this strain), and they feel he will pull through, but his
kidneys were having problems late this evening and they told me they
might have to put him on dialysis over the weekend. He was not feeling
too well, and it's always rough for a 2-year-old to have to go through
something like this. They tell me the enterotoxins from this virulent
strain of E. coli cause a lot of damage and the next two days will be
bad for him.
Both my wife and I appear to have been infected as well, but since we
are adults, our systems dealt with it more effectively.
Bottom line: DON'T BUY OR EAT ANY SPINACH FROM CALIFORNIA FOR NOW.
All my love. Hopefully, he will be home soon.
Jeff
I am reminded of a wonderful commercial on American television.
A group of business people are sitting around a table and one of them begins
to choke on his food. They then spend the next few minutes in a casual
discussion about the Heimlich manoeuvre, until someone from another table comes up
and actually does it.
Danny
Returning from holidays and browsing through the various threads, I
was triggered by a few buzzwords in the threads on the board
elections: sock puppets and the (language) origin of the votes. As I'm
currently working on technical means to automatically detect open
proxies, that raised the following question for me.
Does the auditing process for vote validation take care of sock
puppets? I guess the answer is yes, but what about the case where the
sock puppeteer is smart enough to use open proxy IPs? I presume the
standard CheckUser tools are inadequate for that.
The reason I post this (rather critical) question is the
aforementioned work on open proxy detection. With just a list of IPs,
I could easily scan those IPs to determine whether each is an open
proxy. A detection is a definite positive; if nothing is detected, the
likelihood that the IP is OK is pretty high, as the program also takes
into account the daylight periods in the respective time zones of the
IPs.
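As an illustration of the kind of check described above, here is a minimal sketch of probing a single IP and port for an open HTTP proxy: connect, send a request for an absolute third-party URL, and see whether the host relays it. The names, ports, and probe target are my own assumptions, not the poster's actual program; real open-proxy scanners test many ports and other protocols (SOCKS, HTTP CONNECT) as well.

```python
import socket

# Ports where open HTTP proxies are commonly found (an assumption;
# real scanners probe a much longer list and other protocols too).
COMMON_PROXY_PORTS = (80, 3128, 8080, 8000)

def build_probe_request(host="www.example.com"):
    """HTTP/1.0 request using an absolute URL: a proxy is expected to
    fetch the third-party URL on our behalf."""
    return (f"GET http://{host}/ HTTP/1.0\r\n"
            f"Host: {host}\r\n\r\n").encode("ascii")

def is_open_proxy(ip, port, timeout=5.0):
    """Return True if ip:port answers our relayed request like an open
    HTTP proxy; False on refusal, timeout, or a non-HTTP reply."""
    try:
        with socket.create_connection((ip, port), timeout=timeout) as s:
            s.sendall(build_probe_request())
            reply = s.recv(64)
    except OSError:  # connection refused, unreachable, or timed out
        return False
    return reply.startswith(b"HTTP/")
```

Note that this is only a heuristic: an ordinary web server may also answer such a request, so real tools additionally confirm that the relayed content actually came from the probe target.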
An even more easily obtainable by-product of such an exercise would be
a frequency distribution of votes per country, which might be more
useful for understanding the origin of votes than language alone.
Rgds Ronald Beelaard
This didn't get to the foundation list yesterday. There has been
followup on wiki-research-l.
Kevin Gamble knows of one programmer who currently hopes to work on
this starting in October; he notes the project could use more
interested people.
http://mail.wikimedia.org/pipermail/wiki-research-l/2006-September/000215.h…
Best,
SJ
---------- Forwarded message ----------
From: SJ <2.718281828(a)gmail.com>
Date: Sep 21, 2006 9:56 PM
Subject: Reflection and research: User surveys (2 of 3)
To: Research into Wikimedia content and communities
<wiki-research-l(a)wikimedia.org>, Wikipedia general list
<wikipedia-l(a)wikimedia.org>, wikiresearch-l(a)wikipedia.org,
wiki-research(a)wikisym.org
A number of Wikipedians have advocated for a user survey for a very
long time. Erik Zachte has been the most vocal and persistent.
Realizing a user survey was one of the top items on the agenda of the
Wikimedia research network before it stopped holding meetings. It has
been the subject of long debates and conversations on IRC, has come up
this summer in Special Projects Committee discussions, and has been
the bugbear of dozens of student and professorial research projects.
Today I attended the wrapup discussion of a three-day conference on
open content and public broadcasting, with gathered luminaries from
WGBH, PBS, the Hewlett Foundation, the Corporation for Public
Broadcasting, Yale law school, the Federation of American Scientists,
and so forth. Of great relevance to them : good information on the
demographics of Wikipedians, segmented by activity in various areas of
the community and the projects. Are we dominated by people with no
full-time jobs and no children? The question was not posed to me, but
I could not have answered with certainty.
No one outside of the projects can run a survey that ties reliably to
user login authentication. The important sociology and technology
projects going on every month, the talks given by Wikipedians and
Wikimedians every week, and the literally thousands of third parties
making decisions about communities and creativity all wish they could
be informed by the results of such surveys.
With a brief discussion about preserving privacy in aggregate data,
randomizing test and control samples, and a tweak to allow web forms
on pages that are aware of your Wikipedia userid, we could have a
simple projects-wide survey completed within a month. Let's make this
a priority and make such a thing happen -- then figure out how to
optimize future iterations.
The latest discussions on meta are here:
http://meta.wikimedia.org/wiki/General_User_Survey
I recall other pages on en:wp and other language wp's that are not
currently linked from there; if you were part of one of those efforts,
please add a link to your work.
SJ
--
ps - while looking for the link to the user survey on meta, I ran
across this: a poll applet that seems to be working as of last month.
Nice.
http://meta.wikimedia.org/wiki/Poll
(thread 2 in a 3-thread microseries. see also
http://mail.wikimedia.org/pipermail/foundation-l/2006-September/010247.html
)
--
++SJ
In a message dated 9/21/2006 7:02:39 PM Eastern Daylight Time,
2.718281828(a)gmail.com writes:
Not while on mailing lists, we're not; certainly not on this one.
But consider Wikipedia: the vast majority of pages are not in the main
namespace. Is this in accordance with, or in opposition to, the
natural order of what we are supposed to be busy doing?
++SJ
If you mean the talk pages, those are intended to work out the nuances of
articles. In effect, they are an essential, integral part of article building.
If you mean policy pages, they are a small percentage of our 1.3 million
article pages. Other than that, I have no idea what you are talking about. Do you?
Danny
Hi,
According to
http://meta.wikimedia.org/wiki/Wikimania_2007/Bids/Judging#Vote
"A general meeting (to discuss bids and ask/reply to last-minute
questions) will be held after the final deadline for completed bids.
The jury will be as above, plus the new Board member elected in
September to replace Angela. This meeting will be public and will
happen with bidding teams and whoever is interested in freenode
channel #wikimania on September 23rd at 15.00 UTC. The final vote will
be held by the jury afterwards."
Given that the result of the election will likely not be announced
before Monday, should this Wikimania meeting be postponed?
--
Peace & Love,
Erik