[Foundation-l] Bot policy on bots operating interwiki

White Cat wikipedia.kawaii.neko at gmail.com
Sat Sep 8 14:28:18 UTC 2007


To Thomas Dalton:

I think SUL is a distant dream at this point. Even when it becomes
available, we will still have problems with how the rules are set now. How
compatible are the local wikis' policies with each other?

I am en-N, tr-4 and ja-1. I do not understand local policies outside those
languages (and I understand Japanese only with great difficulty, so I do
not get all that much of it). Most interwiki bot operators know just a
single language. All local bot policies should be advertised on, or linked
from, a page on meta, preferably with an English translation. After all, I
can only follow rules that I can both find and understand. It would be
beyond silly to require interwiki bot operators to know every language out
there.

All bot policies on all local wikis should be compatible with each other.
There should be a meta bot standard for a few specific tasks, such as
interwiki linking, double-redirect fixing and commons delinking, to avoid
conflicts. Local wikis would NOT be required to follow this standard, but
they would be jawdroppingly incompetent not to. This is much like how
commons works: wikis aren't required to use commons or to move their images
there, but they are recommended to do so. That has worked pretty well.

This will be a problem even when SUL becomes a reality... that is, if it
ever becomes a reality.

To: effeietsanders

I bet it is the same code (redirect.py). You can interwiki-link pages like
that with it too. The amount of attention I pay to it is directly
proportional to the number of wikis my bot operates on (otherwise it is a
waste of time, since, as explained above, another interwiki bot would
simply revert). I do not have time for that at the moment, because I have
665 wikis on which to request a bot flag, which I would rather not do.
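
To make the revert problem concrete, here is a rough sketch of what an
interwiki bot effectively does when it walks a chain. The names here are
hypothetical and this is not the actual pywikipedia code (which also does
conflict handling), but the union step is the point:

    def collect_chain(start_page, get_links):
        """Follow interwiki links transitively; return every page reached."""
        seen = set()
        todo = [start_page]
        while todo:
            page = todo.pop()
            if page in seen:
                continue
            seen.add(page)
            todo.extend(get_links(page))  # interwiki links found on that page
        return seen

Every page in the chain is then updated to link to all the others. So if
even one wiki still carries a bad link, the union picks it up and the next
bot pass writes it back everywhere, unless the bot can edit every wiki in
the chain.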

Indeed trust is important, in fact critical. How would a wiki with half a
user determine whether or not to trust a bot operator? Say the bot operator
is from de.wikipedia and the local user is from a wiki whose language does
not even use the Latin script (e.g. ml.wikipedia). How would he be able to
determine whether or not to trust that user?

I think the operation of an interwiki bot needs interwiki consensus. If a
user is found trustworthy in an interwiki discussion on meta, I think it is
safe to say he/she/it is trustworthy on all wikis. If a local wiki still
decides to reject that interwiki consensus, it can, and is more than
welcome to, block the bot in question.

Let's take your example. Suppose a wiki posts its local restriction of a
"3 bot max" on meta. All users who want to operate an interwiki bot would
see it and react to it: that wiki would be skipped by interwiki bot
operators, as per its request, and they would not even bother to file a
request for an interwiki bot flag there. This would save everyone a lot of
time. As for the 699 other wikis, no such restriction applies, and they do
not have a problem with interwiki bots.

I have yet to see one logical explanation of why communities need to
"approve" a specific script over and over again.

It only makes sense to handle interwiki affairs, such as interwiki bots, on
an interwiki medium such as meta.

    - White Cat


On 9/8/07, effe iets anders <effeietsanders at gmail.com> wrote:
>
> From Andre's explanation I understand that his robot is less automatic
> and that he resolves more cases manually. That means that although you
> might run the same script, the number of errors is not necessarily the
> same. Besides that, there is also a matter of trust involved (do you
> trust that (s)he will only run that specific script? Do you trust that
> (s)he will do the necessary updates in time? And what about conflicting
> situations?). So there are differences enough. And even if they did not
> allow you, just because they want a maximum of three bots or so (which
> seems understandable to me), I would consider that a valid argument.
>
> But *even* if they wouldn't want a bot to have the bot bit for no
> particular reason, if they just don't want it, so be it. I feel that this
> is up to the *community*, and not to us, to decide, unless the software
> changes dramatically, as Peter described before.
>
> Eia
>
> 2007/9/8, White Cat <wikipedia.kawaii.neko at gmail.com>:
> >
> > I think it is beyond silly to demand that people make over 700
> > individual human edits just so they can run an interwiki bot. It takes
> > weeks if not months of work to file all the requests. All these bots
> > run the same code. I have yet to see one logical explanation of why
> > communities need to "approve" a specific script over and over again.
> > Bots A and B make identical edits, since they run the same code.
> >
> > No, I cannot write a script for that. Fundamentally, bots are what you
> > would call a "script". What you suggest is the use of an unauthorized
> > bot, something explicitly banned. I can't believe you are even
> > suggesting it.
> >
> > If the local community is unhappy with a bot, they can simply block it
> > or ask on meta for it to be removed from the wikis that support
> > interwiki bots. If the local wiki does not have a single admin, then it
> > is not truly ready for a bot request discussion. Given their article
> > counts, the bots would make only rare appearances on such wikis anyway.
> >
> > Wikipedia/Wikimedia isn't a democracy. If devs are allowed to "force"
> > software upgrades down the local communities' throats, I truly do not
> > see why interwiki bot operators are not allowed to do the same.
> >
> >     - White Cat
> >
> > On 9/8/07, effe iets anders <effeietsanders at gmail.com> wrote:
> >
> > > I think the situations described above have shown perfectly that
> > > bots are not perfect :) And although I think that the advantages
> > > outweigh the disadvantages, that doesn't mean that every community
> > > (with 0-bizillion members) agrees with that conclusion. I think it is
> > > of the utmost importance that communities are independent, and are at
> > > least able to object to another new bot user. I know this is a pain
> > > in the ass, I know this means more work for you guys, and I know that
> > > you don't like it. But when deciding this kind of thing, I think you
> > > should not only look at it from the point of view of the bot owner,
> > > but even more from the POV of the community (yes, even if there is
> > > only half a person there). Put the request on the appropriate page
> > > (either a bot request page, some much-visited community page, or even
> > > talk:Main_Page in the extreme case) and give those folks the ability
> > > to object to the new bots. If they don't want them, well, it's their
> > > wiki, their choice. If that is because of wrong information, well,
> > > either inform them properly or leave it at that. I think it is
> > > totally wrong if stewards force bots down their throats.
> > >
> > > And btw, I am confident that you are able to write some script that
> > > makes filing the requests somewhat easier in the first place... For
> > > the stewards it makes no difference btw, because we have to grant the
> > > rights separately anyway...
> > >
> > > Effeietsanders
> > >
> > > 2007/9/8, White Cat <wikipedia.kawaii.neko at gmail.com>:
> > > >
> > > > Yes, what's breaking the bot is human error. And as a fellow
> > > > interwiki-bot operator, I think it would be of great help if we were
> > > > given some slack on the bot-flag bureaucracy. You could just use the
> > > > bot to fix the bad interwiki links rather than fixing them manually.
> > > > The policy would not solve everything, but it would be a good step
> > > > in the right direction.
> > > >
> > > >       - White Cat
> > > >
> > > > On 9/7/07, Tuvic <tuvic.tuvic at gmail.com> wrote:
> > > > >
> > > > > Indeed, that's right. Just remember that interwiki-bots only
> > > > > spread a bad link, they don't create it: it is human users who
> > > > > make the bad link.
> > > > >
> > > > > It has happened to me on several occasions: I had just spent 20
> > > > > minutes untangling a web of interwiki-linked articles, and some
> > > > > user simply put a bad link back, because he/she thought the link
> > > > > should be there. Very annoying, and not always revertible: after
> > > > > all, I'm just an interwiki-bot operator, while it's their home
> > > > > wiki most of the time.
> > > > >
> > > > > So, not all problems would be avoided by having a general bot
> > > > > policy.
> > > > >
> > > > > Greetings, Tuvic
> > > > >
> > > > > 2007/9/7, White Cat <wikipedia.kawaii.neko at gmail.com>:
> > > > > > Bots aren't sentient, so they can act stupidly. There are
> > > > > > situations where you have a bad interwiki link. Unless it is
> > > > > > removed from every single instance in the chain it forms, it
> > > > > > will eventually return to the list (which makes sense: the bots
> > > > > > treat the wrong link as a new member of the chain). However, if
> > > > > > all interwiki bots were able to operate on all wikis, such
> > > > > > problems could be avoided very easily.
> > > > > >
> > > > > >    - White Cat
> > > > > >
> > > > >