I don't know whether the Board wants community input on this or not,
but I suspect there will be community members who would like to give
their input anyway.
From the "Board meeting planned in October" thread:
On 9/10/07, Florence Devouard <anthere(a)anthere.org> wrote:
>
> During the board meeting, there should be discussions over whether to
> expand the board to 9, or keep it for now at 7. A couple of names are
> currently floating around.
> There may be a change in the terms of the appointed members.
Based on the board expansion resolution of December last year [1], I
would have expected that the Board would be expanded to 9 in July next
year, with three more elected seats to be up for election at that
time.
--
[1] http://wikimediafoundation.org/wiki/Resolution:Board_expansion
--
Stephen Bain
stephen.bain(a)gmail.com
On 10/09/2007, Mike Godwin <mgodwin(a)wikimedia.org> wrote:
>
> David, for some reason foundation-l doesn't want to accept this
> response from my mgodwin(a)wikimedia.org address.
>
> ----------------
> This is a great question, precisely because there is a big division
> among copyright theorists on what the answer is.
>
> "Copyright absolutists" like to class "fair use" as merely a defense
> against an infringement claim, because doing so makes it seem narrow
> and exceptional.
>
> Free-speech theorists prefer to class "fair use" as a right that
> derives directly from the First Amendment (or equivalent guarantees
> in other national constitutions or under international treaties).
>
> Personally, I fall into the second camp. It should be noted, however,
> that "fair use" is built into American copyright statutes as,
> technically, a "defense." Constitutional lawyers like me tend to
> believe this doesn't really answer the philosophical question -- it's
> just a structural choice.
>
> --Mike Godwin
> General Counsel
> Wikimedia Foundation
>
Hello.
As a few people may know, the Green Screen of Death multilingual error
message on Wikimedia is my baby. Since it's now two years old, and since
several people had dissed its JavaScript and so on, I recently started
re-coding it and adding several more languages.
I have spoken with so many kind, friendly Wikipedians who speak other
languages, and they have been incredibly helpful to me by providing
translations and putting up with my incessant popping into their IRC
channels to ask silly questions in English.
Anyway, after almost a couple of months, I finally have all the
translations together. I would now like people who are fluent in these
languages to proof-read the messages, and mark them in the table on
this page as checked:
http://meta.wikimedia.org/wiki/Multilingual_error_messages/Draft
Additionally, some of the languages there need some links to be inserted.
Thanks for your time. Any help that people can give me in this by
doing this proof-reading will be rewarded with warm thank-yous and
WikiLove.
~Mark Ryan
I am not a US lawyer, nor is [[Pamela Jones]] of [[Groklaw]]. But
here's some food for thought:
http://www.groklaw.net/article.php?story=20070907195435565
Despite my personal fondness for slash-and-burning fair abuse on en:wp
and taking away children's eyecandy, I remain a big fan of fair use,
because quotation is a necessary part of being able to talk about
something. [[Golan v. Gonzales]] (that's a red link. Could someone
please write the article?) is the US 10th Circuit Court of Appeals
saying it is too.
So what's Wikipedia and Wikimedia's duty to exercise that right in the
pursuit of educational value?
- d.
Dear all,
--------
INFORMING YOU: GENERAL VENUES
Last time I updated you on this list, there was a request that the
updates also be added on the wiki, for later reference.
I heard you :-)
You may find this here:
*http://wikimediafoundation.org/wiki/Meetings
*http://wikimediafoundation.org/wiki/Messages
On each of the pages you may find here, I also try to add relevant
links.
I also started a general clean-up of meta, related to Wikimedia pages :-)
Allow me the opportunity to remind you of the resolutions page, where we
gather our decisions:
*http://wikimediafoundation.org/wiki/Resolutions
Regarding our last board meetings
The board meeting (June, Amsterdam) is reported here:
http://wikimediafoundation.org/wiki/Meetings/June_1-3%2C_2007
The board meeting (August, Taipei) does not have any report. It lasted
two hours and was essentially an update from Sue.
The advisory board meeting notes may be found here:
http://advisory.wikimedia.org/wiki/Meeting_August_2007/Notes
-------
BOARD MEETING IN OCTOBER
Things are moving at lightspeed since Sue's arrival. I am not going to
comment on everything, but only on a few topics related to board activity.
A board meeting is planned for early October, to take place in our office
in Florida. The agenda is not fully defined yet, but a few big topics
are already known.
First topic: Board membership.
Three members are ending their terms in December: Jimmy, Jan-Bart and
Michael. The first is willing to stay; the second still has to confirm,
but seems willing as well. However, Michael has explicitly said he
would expect to be replaced both as board member and as treasurer.
During the board meeting, there should be discussions over whether to
expand the board to 9, or keep it for now at 7. A couple of names are
currently floating around.
There may be a change in the terms of the appointed members.
October is also the moment to elect our officers (chair, vice-chair,
treasurer and secretary).
I'd also like to see a discussion over the roles and responsibilities
of the officers.
However, my hottest topic on board membership is the treasurer. We are
now a rather big organization, and we really cannot afford not to have
professional oversight on financial matters. As such, my intent is that
Michael be replaced by someone with the financial skills to ensure the
financial oversight of the organization. I am currently working with
Mona on this, for a proper advertisement of a clearly defined position.
I will oppose naming as treasurer someone without professional
expertise. I expect you will hear about the position in the next few weeks.
Second topic: budget
Staff and Mona are currently working on a budget draft. An operating
budget should be ready for board approval at the October meeting.
Third topic: advisory board
Names have been floating around for new members. I'd like us to also
reflect back a little on the advisory board meeting in Taipei.
All those above should lead to well defined resolutions.
Another part of the board meeting should be operational reporting,
including (but not restricted to),
* audit (currently ongoing)
* staff (in particular new staff)
* fundraising operations (in the hands of Sabine)
* trademarks situation and more generally legal assessment (from Mike)
There are various other topics, which are currently more at the
discussion level, including
* communication strategy
* biz dev strategy
* branding strategy
* fundraising strategy (a meeting is planned with a professional)
* chapter strategy
The board meeting will last two days (6 and 7) and will be followed by a
board+staff meeting (8).
The next updates you can expect are likely to be on Wikimania 2008, the
audit and treasurer issues :-)
Florence
Links
*http://lists.wikimedia.org/pipermail/foundation-l/2007-July/031751.html
(audit)
I think we really need to have some automation in dealing with these.
Granted, there are notable exceptions on some wikis, such as the Chinese
wikis, but for every other wiki there needs to be a more centralized block
on open proxies.
How likely is an average person to edit a wiki from an open proxy? How
likely is a spam bot or some destructive vandal script to perform the task?
I think a solution could be like the spam list: a centralized meta page for
open proxy blocks.
- White Cat
I think we have a serious problem with this. When the interwiki bot issue
was last discussed, there was only a handful of wikis. I think it is time to
bring some attention to this.
http://meta.wikimedia.org/wiki/Special:SiteMatrix displays quite a large
number of wikis (I was told this is around 700). Wikipedia alone has 253
language editions according to
http://meta.wikimedia.org/wiki/List_of_Wikipedias
I was told only 60 of these 700-ish wikis have an actual local bot policy,
of which most are just translations or mistranslations of en.wiki's.
Why is this a problem? Well, if a user decides to operate an interwiki bot
on all wikis, he or she (or it?) would have to make about 700 requests on
the individual wikis. Aside from those 60, most of these wikis do not even
have a bot request page, IIRC. Those 700 individual requests would have to
be listed on [[m:Requests for bot status]]. A steward would have to process
all 700, minus the wikis with active bureaucrats. That's just one person.
As we are a growing community, now imagine just 10 people who seek to
operate such interwiki bots: that's a workload of 7,000 requests. Wikimedia
is a growing community, and there are far more than 700 languages on earth
- 7,000 according to
http://en.wikipedia.org/wiki/Natural_language#Native_language_learning -
so that's ultimately 7,000 * (number of sister projects) wikis per
individual bot. With the calculation of ten bots, that's 70,000 requests.
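The back-of-the-envelope arithmetic above can be sketched in a few lines. All the figures (700 wikis, 10 operators, ~7,000 languages) are the rough estimates from this mail, not official counts, and the sister-project count is my own assumption:

```python
# Back-of-the-envelope workload estimate for the "one nomination per wiki"
# model of interwiki bot flag requests. All counts are the rough estimates
# from the mail above, not official figures.

wikis = 700          # approximate number of Wikimedia wikis today
operators = 10       # imagined number of interwiki bot operators

requests_today = wikis * operators
print(requests_today)            # 7000

# If every natural language (~7,000) eventually got a wiki in each
# sister project, the per-bot request count alone would become:
languages = 7000
sister_projects = 9              # assumption: rough number of project families
per_bot = languages * sister_projects
print(per_bot)                   # 63000
```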
There are a couple of CPU-demanding but mindless bot tasks. All these tasks
are handled by the same code. Tasks that come to my mind are:
* Commons delinking
* Double redirect fixes
* Interwiki linking
* Perhaps even anti-spam bots
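Double redirect fixing, the second item above, is a good illustration of how mindless these tasks are. Here is a minimal sketch of the core logic in plain Python (not actual pywikipediabot code; the page titles are invented examples):

```python
# Resolve double redirects: if page A redirects to B and B redirects to C,
# retarget A directly at C. Redirects are modeled here as a simple dict
# mapping page title -> redirect target (None means a real article).

def fix_double_redirects(redirects):
    """Return a copy of `redirects` with every chain collapsed to one hop."""
    fixed = dict(redirects)
    for page, target in redirects.items():
        seen = {page}
        # Follow the chain until we reach a real article or detect a loop.
        while target in redirects and redirects[target] is not None:
            if target in seen:        # redirect loop: leave it for a human
                break
            seen.add(target)
            target = redirects[target]
        else:
            fixed[page] = target
    return fixed

# Hypothetical example: "Colour" -> "Color" -> "Color (vision)"
pages = {"Colour": "Color", "Color": "Color (vision)", "Color (vision)": None}
print(fix_double_redirects(pages)["Colour"])  # Color (vision)
```

A real bot would of course read and write live wiki pages, but the decision logic it repeats on every wiki is no deeper than this.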
Currently we already have people, such as MediaWiki developers, who make
bot-like alterations to individual wikis without even considering the
opinions of the local wikis. I do not believe anyone finds this problematic.
Also, we elect stewards from a central location; we do not ask the opinion
of individual wikis. The actions a steward has access to are vast, but the
permission they have is quite limited. So centralized decisions aren't a
new concept. If MediaWiki is a very large family, we should be able to make
certain decisions family-wide.
I think the process for bots operating across wikis should be fundamentally
simplified. Asking every wiki for permission may seem like the nice thing
to do, but it is a serious waste of time: for the bot operator, for the
stewards, and actually for the local communities as well. There is no real
reason to repetitively approve "different" bots operating the same code.
My suggestion for a solution to the problem is as follows:
A foundation/meta bot policy should be drafted, prompting a centralized bot
request for a number of very specific tasks (not everything). All of these
need to be mindless activities such as interwiki linking or double redirect
fixing. The foundation would not be interfering with "local" affairs, but
instead regulating inter-wiki affairs. All policies on wikis with a bot
policy should be compatible, or should be made compatible, with this
foundation policy. Bot requests of this nature would be processed on meta
alone, saving everyone time. Fundamentally, the idea is "one nomination
per bot" rather than "one nomination per wiki".
If a bot breaks, it can simply be blocked. Otherwise, the community should
not have any problem with it. How much supervision do interwiki bots really
need, anyway?
Perhaps an interface update is necessary allowing stewards to grant bot
flags in bulk rather than individually if this hasn't been implemented
already.
- White Cat
For those who don't read my blog - just forwarding to various lists in
English and if needed, please forward it to the various project lists.
Thank you!!!
Sabine
-------- Original-Nachricht --------
Well, dealing with the Fundraiser 2007, I am trying to involve the whole
of the communities. But that seems easier than it really is ... one
thinks: oh well, there are the village pumps and you just go around them
... or you go through the mailing lists (but not all projects have one)
... or in the worst case you use the various chat rooms ... well no, it
does not really work ... a really well-structured communication in this
specific moment is not possible - and in some way we should think about
a solution.
Village pumps:
I am getting to them step by step - there is no all-comprehensive page -
but even if there were, there is one huge problem: they are structured
very differently from one project to the other. Often I posted "somewhere"
without even understanding if it was the right place ... some have extra
pages for messages written in a language different from theirs (but then
again: there you don't reach the maximum number of people who maybe would
help). Some have different sections for different themes ... but again:
there you don't reach all potential people, just the part of them that
goes to the "general" village pump page. Uhmmmmmm ...... not sure how to
sort this out ... and no: I would be against a page for special
"foundation information", since again it would be read by only a part of
the users/editors, and these would probably be more or less the same ones
who read foundation-l ....
Now I asked for a bot that can help us to add new sections to specific
pages within the pywikipediabot framework - this would at least make one
part of it easier, not having to go around all projects, but still the
"how to communicate effectively" problem remains ....
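At its core, the section-adding bot mentioned above just appends a new wikitext section to each village pump page. A minimal sketch of that single step in plain Python (the page text, heading and message are invented examples; a real bot would do this through the pywikipediabot framework against live pages):

```python
# Append a new level-2 wikitext section (== Heading ==) with a message
# to the bottom of an existing village pump page.

def add_section(page_text, heading, message):
    """Return page_text with a new == heading == section appended."""
    section = "\n== " + heading + " ==\n" + message + "\n"
    # Normalize trailing newlines so we append exactly one blank gap.
    return page_text.rstrip("\n") + "\n" + section

old = "== Older topic ==\nSome earlier discussion.\n"
new = add_section(old, "Fundraiser 2007",
                  "Please check the fundraiser translations. ~~~~")
print(new)
```

The hard part Sabine describes is not this string handling, but knowing *which* page on each of the 700-odd projects is the right one to append to.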
For now, until we maybe get a better solution, I would like to ask
people from the various projects to check the page where I list the
village pumps
<http://meta.wikimedia.org/wiki/Fundraising_2007/Village_Pumps> and
change the links I have there to the page where they want to have the
messages added - for all projects please - this will help all of us to
live an easier wiki-life. During the next days I will then make another
round around the various Village Pumps asking people to correct the link
on the page above if necessary.
I am sorry, but for now I don't see a different way to get this under way.
And please: all links to all projects are needed - also the smallest
ones ... they all have the same relevance.
Thanks!!!
--
Posted By Sabine Cretella to words & more
<http://sabinecretella.blogspot.com/2007/09/fundraiser-2007-communication-wi…>
at 9/08/2007 02:35:00 PM
To Thomas Dalton:
I think SUL is a distant dream at this point. Even when it is available, we
will have problems with how rules are set at this point. How compatible are
the local wiki policies with each other?
I am En-N, Tr-4 and Ja-1. I do not understand local policies aside from
those languages (and I understand Japanese with great difficulty; I do not
get it all that much). Most interwiki bot operators know just a single
language.
All local bot policies should be advertised/linked from a page on meta,
preferably with an English translation. After all, I can only follow rules I
can find and understand. It has to be both. It would be beyond silly if we
were going to require interwiki bot operators to know every language out
there.
All bot policies on all local wikis should be compatible with each other.
There should be a meta bot standard for a few specific tasks, such as
interwiki linking, double redirect fixing and commons delinking, to avoid
conflicts. Local wikis would NOT be required to follow this standard, but
they would be jawdroppingly incompetent if they did not. This is much like
how commons works: wikis aren't required to use/move images on/to commons,
but they are recommended to do so. That has worked pretty well.
This will be a problem even when SUL becomes a reality... that is if it
becomes a reality.
To: effeietsanders
I bet it is the same code (redirect.py). You can interwiki-link pages like
that with it too. The amount of attention I pay to it is directly
proportional to the number of wikis my bot operates on (otherwise it is a
waste of time, as explained above, since another interwiki bot would simply
revert). I do not have the time for that at the moment, because I have 665
wikis on which to request a bot flag, which I'd rather not do.
Indeed trust is important, in fact critical. How would a wiki with half a
user determine whether or not to trust a bot operator? Say the bot operator
is from de.wikipedia and the local wiki user is from a far eastern language
(e.g. ml.wikipedia) that does not even use Latin script. How would he be
able to determine whether or not to trust that user?
I think the operation of an interwiki bot needs interwiki consensus. If a
user is deemed trustworthy in an interwiki meta discussion, I think it is
safe to say he/she/it is trustworthy on all wikis. If a local wiki still
decides to refuse that interwiki consensus, they can and are more than
welcome to block the bot in question.
Let's take your example. Suppose a wiki states their local restriction of a
"3-bot max" on meta. All users who want to operate an interwiki bot would
see it and react to it. That wiki would be ignored by interwiki bot
operators, as per their request; bot operators would not even bother to file
a request for an interwiki bot flag on that wiki. So this would save
everyone a lot of time. As for the 699 other wikis, such a restriction is
not the case and they do not have a problem with the interwiki bots.
I have still yet to see one logical explanation why communities need to
"approve" a specific script repetitively.
It only makes sense to handle interwiki affairs, like interwiki bots, on an
interwiki medium such as meta.
- White Cat
On 9/8/07, effe iets anders <effeietsanders(a)gmail.com> wrote:
>
> From Andre's explanation I do understand that his robot is less automatic
> and that he fixes more things manually. That means that although you might
> run the same script, the number of errors is not necessarily the same.
> Besides that, there is also a matter of trust involved (do you trust that
> the person will only run that specific script? Do you trust that (s)he
> will do the necessary updates in time? And what about conflicting
> situations?). So there are differences enough. But even if they did not
> allow you, just because they want a maximum of three bots or so (which
> seems understandable to me), I would consider that a valid argument.
>
> But *even* if they wouldn't want a bot to have a flag for no single
> reason - they just don't want it - so be it. I feel that this is up to
> the *community* and not to us to decide, unless the software changes
> dramatically, as Peter described before.
>
> Eia
>
> 2007/9/8, White Cat < wikipedia.kawaii.neko(a)gmail.com>:
> >
> > I think it is beyond silly to demand that people make over 700 individual
> > human edits just so they can run an interwiki bot. It takes well over weeks
> > if not months of work to file all the requests. All these bots operate the
> > same code. I have still yet to see one logical explanation why communities
> > need to "approve" a specific script repetitively. Bots A and B make
> > identical edits since they run the same code.
> >
> > No, I cannot write a script. Fundamentally, bots are what you call a
> > "script". What you suggest is the use of an unauthorized bot, something
> > expressly banned. I can't believe you are even suggesting it.
> >
> > If the local community is unhappy with a bot, they can simply block it, or
> > ask on meta for it to be removed from the wikis that support interwiki
> > bots. If the local wiki does not have a single admin, they are not truly
> > ready for a bot request discussion. The bots would make rare appearances
> > on such wikis, given their article counts, anyway.
> >
> > Wikipedia/Wikimedia isn't a democracy. If devs are allowed to "force"
> > software upgrades down the local communities' throats, I truly do not see
> > why interwiki bot operators are not allowed to do the same.
> >
> > - White Cat
> >
> > On 9/8/07, effe iets anders < effeietsanders(a)gmail.com> wrote:
> >
> > > I think that the above situations have described perfectly that bots
> > > are not perfect :) And although I think that the advantages outweigh
> > > the disadvantages, that doesn't mean that every community (with 0 to a
> > > bazillion members) agrees with that conclusion. I think that it is of
> > > the utmost importance that communities are independent, and are at
> > > least able to protest against another new bot user. I know this is a
> > > pain in the ass, I know this means more work for you guys, and I know
> > > that you don't like this. But when determining these kinds of things,
> > > I think that you should not only look from the point of view of the
> > > bot owner, but even more from the POV of the community (yes, even if
> > > there is only half a person there). Put the request on the appropriate
> > > page (that is, either a bot request page, some much-visited community
> > > page, or even possibly the talk:Main_Page in the extreme case) and
> > > give those folks the ability to protest against the new bots. If they
> > > don't want them, well, it's their wiki, their choice. If that is
> > > because of wrong information, well, either inform them well or leave
> > > it there. I think it is totally wrong if stewards are forcing bots
> > > down their throats.
> > >
> > > And btw, I am confident that you are able to write some script to make
> > > filing the requests somewhat easier in the first place... For the
> > > stewards it makes no difference btw, because we have to grant the
> > > rights separately anyways...
> > >
> > > Effeietsanders
> > >
> > > 2007/9/8, White Cat < wikipedia.kawaii.neko(a)gmail.com >:
> > > >
> > > > Yes, what's breaking the bot is human error, and as a fellow
> > > > interwiki-bot operator I think it would be of great help if we were
> > > > given some slack on bot flag bureaucracy. You could just use the bot
> > > > to fix the bad interwiki links rather than fixing them manually. The
> > > > policy would not solve everything, but it would be a good step in
> > > > the right direction.
> > > >
> > > > - White Cat
> > > >
> > > > On 9/7/07, Tuvic < tuvic.tuvic(a)gmail.com> wrote:
> > > > >
> > > > > Indeed, that's right. Just remember that interwiki-bots just
> > > > > spread the bad link, they don't make it: it is human users who
> > > > > make the bad link.
> > > > >
> > > > > It happened to me on several occasions: I had just spent 20
> > > > > minutes untangling a web of interwiki-linked articles, and some
> > > > > user just put a bad link back, because he/she thinks that the link
> > > > > should be there. Very annoying, and not always revertible: after
> > > > > all, I'm just an interwiki-bot operator, while it's their home
> > > > > wiki most of the time.
> > > > >
> > > > > So, not all problems would be avoided by having a general bot
> > > > > policy.
> > > > >
> > > > > Greetings, Tuvic
> > > > >
> > > > > 2007/9/7, White Cat <wikipedia.kawaii.neko(a)gmail.com >:
> > > > > > Bots aren't sentient, so they can act stupidly. There are
> > > > > > situations where you have a bad interwiki link. Unless it is
> > > > > > removed from every single instance where it forms a chain, it
> > > > > > will eventually return to the list (which makes sense: the bots
> > > > > > treat the wrong link as a new member of the chain). However, if
> > > > > > all interwiki bots were able to operate on all wikis, such
> > > > > > problems could be very easily avoided.
> > > > > >
> > > > > > - White Cat
> > > > > >
> > > > >
> > > > > _______________________________________________
> > > > > foundation-l mailing list
> > > > > foundation-l(a)lists.wikimedia.org
> > > > > http://lists.wikimedia.org/mailman/listinfo/foundation-l
> > > > >