-----Original Message-----
From: KNOTT, T [mailto:TKNOTT@qcl.org.uk]
Sent: Thursday, February 12, 2004 12:08 PM
To: English Wikipedia
Subject: [WikiEN-l] sock puppets
I am reasonably certain that [[User:R Gunther]] is Mr Natty health
trying to avoid his ban. Should I just ban him now, or should I request a
developer to look at his IP?
Would you like me to become a List Administrator for Wikitech-l? I
already administer Wikien-l, so I have experience with the web
interface.
I would be able to clear up things like the excessive bounce problem for
you (and others).
Ed Poor
Wikien-l Admin
-----Original Message-----
From: Brion Vibber [mailto:brion@pobox.com]
Sent: Tuesday, February 10, 2004 8:40 PM
To: Wikimedia developers
Subject: [Wikitech-l] Mail problems
Just FYI, I've been having a lot of trouble with the list kicking me
for "excessive bounces". I'm trying to get it resolved, but in the
meantime I'm not receiving all mails to the lists.
-- brion vibber (brion @ pobox.com)
"Brion Vibber" <brion(a)pobox.com> wrote:
> robots.txt isn't based on user-agent. It works on the honor system; if
> the client doesn't obey robots.txt, there is no blocking caused by
> robots.txt.
>
> Here's the current user-agent block list:
>
> # we don't like these user-agents:
(...snip...)
> acl stayaway browser Python-urllib
I think this is from Rob's robot, from before a more specific user-agent
was defined for it. As such, I don't think there's much use in keeping it
here.
Andre Engels
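For reference, a user-agent block of this kind is normally enforced in Squid
with a `browser` ACL plus an `http_access` rule. A minimal sketch, assuming
the snippet quoted above comes from squid.conf:

```
# Deny any request whose User-Agent header matches the regex "Python-urllib".
# This is enforcement at the proxy, unlike robots.txt, which is honor-system only.
acl stayaway browser Python-urllib
http_access deny stayaway
```

A client that ignores robots.txt still gets stopped by this rule, because the
proxy rejects the request before it reaches the wiki.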
This is a message based on a conversation about revamping the Wiktionary
service so that it contains more structured data. In its current form it is
a wiki about words. This does not allow easy connections to be made
between thesaurus words, translations, etc.
Hopefully, with some help, brainstorming, and experience we can hammer out a
better system to store, look-up, and access dictionary words.
I've got loads of ideas about how to implement some of this, but others know
more about how to include rollbacks, versioning, and deletion of material.
If people are interested, I can post some more in-depth ideas about what (I
think) would need to be built and get some feedback about planning for other
languages, special cases, migration, and future additions.
-brian
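To make the "structured data" idea concrete, here is a minimal sketch of one
possible entry model. All field names here are hypothetical illustrations,
not an actual Wiktionary schema:

```python
# Sketch of a structured dictionary entry: senses carry their own
# synonym (thesaurus) links and per-language translations, so
# cross-references become queryable data rather than free wiki text.
from dataclasses import dataclass, field

@dataclass
class Sense:
    definition: str
    synonyms: list = field(default_factory=list)      # thesaurus links
    translations: dict = field(default_factory=dict)  # language code -> word

@dataclass
class Entry:
    word: str
    language: str
    senses: list = field(default_factory=list)

entry = Entry(word="house", language="en", senses=[
    Sense(definition="a building for living in",
          synonyms=["dwelling", "residence"],
          translations={"de": "Haus", "nl": "huis"}),
])

# With structure, a translation lookup is a direct query:
print(entry.senses[0].translations["de"])  # -> Haus
```

Versioning, rollback, and deletion would then apply to these records rather
than to wiki pages, which is the part others on the list know more about.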
There may be some brief downtime or odd behavior over the next few
hours as we try swapping the new servers into production. Hopefully
everything will go pretty smoothly.
-- brion vibber (brion @ pobox.com)
Do we forbid certain spiders access to the site based on User-Agent? A
user in a German forum reported recently that he couldn't access
Wikipedia at all, always receiving a "Forbidden" message. It turned out
that his WebWasher proxy (an ad-banner blocker) was to blame. The proxy
sends the User-Agent
"Mozilla/4.0 (compatible; MSIE 5.0; Windows 98; DigExt) WebWasher 3.0"
Webwasher cannot be used to spider and download sites.
Axel
I asked this question a few days ago, & since it is vital, I am
bothering all of you once again with it.
I have an idea of a possible way to settle part of a dispute, which I
know in theory is possible, but I don't know if it also is in
practical terms.
What I would like to do is find the IP addresses for several people
who made changes to Wikipedia, then do a reverse lookup in order to
determine whether these people come from the same subnet. This might
be convincing enough to the two people I'm working with to settle
the matter. (I could then also show whether or not these people have made
edits while not logged in, as one party claims.)
Can this be done?
Geoff
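In principle, yes. The subnet part of the check can be sketched with Python's
standard library; the addresses and the /24 prefix below are illustrative
assumptions ("same subnet" could mean any prefix length):

```python
# Check whether a set of IPv4 addresses fall inside the same network.
import ipaddress

def same_subnet(ips, prefix=24):
    """Return True if all addresses share one IPv4 network of the given prefix."""
    nets = {ipaddress.ip_interface(f"{ip}/{prefix}").network for ip in ips}
    return len(nets) == 1

# A reverse (PTR) lookup can additionally hint at the owning host/ISP, e.g.:
#   import socket
#   hostname, _, _ = socket.gethostbyaddr("203.0.113.17")

print(same_subnet(["203.0.113.17", "203.0.113.201"]))  # -> True
print(same_subnet(["203.0.113.17", "198.51.100.5"]))   # -> False
```

Note that a shared subnet is suggestive, not conclusive: two users behind the
same ISP or proxy can legitimately share one.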