Then I think the thing to do is urge the developers to provide some
sort of global block list... Perhaps this is an issue to discuss on
meta.
From what you describe, Wikimedia would have to take a hard line
against *all* proxies on *all* Wikimedia Foundation wikis. This would
then have to be enforced by a global IP block list...
On 1/18/09, Christopher Grant <chrisgrantmail(a)gmail.com> wrote:
We have bots that do that, but grawp still gets through (part of the
reason is that these proxies need to be blocked globally; otherwise
grawp can still abuse SUL and Tor to create accounts and make the
required 10 edits before he has to find an unblocked proxy on enwiki).
- Chris
On Mon, Jan 19, 2009 at 11:25 AM, K. Peachey <p858snake(a)yahoo.com.au> wrote:
can
continue to use unblocked proxies until we block them all. (Blocking
*all* proxies is nigh on impossible because computers get compromised
daily... so new "open proxies" are created daily.)
Maybe it would work if we could hook in something like
<http://www.1freeproxy.com/feed/atom/> (an RSS feed of just proxies)
so that listed proxies are automatically blocked, which I believe is
Wikipedia's policy anyway.
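To make the feed-hooking idea concrete, here is a rough sketch of the
parsing half: pull ip:port pairs out of an Atom feed of open proxies so
they could be handed to a block-list mechanism. This is purely
hypothetical Python; the regex, the feed layout, and the
`extract_proxies` helper are my assumptions, not how any existing
Wikimedia bot actually works.

```python
import re

# Assumed shape: proxies appear somewhere in the feed text as "ip:port".
PROXY_RE = re.compile(r"\b(\d{1,3}(?:\.\d{1,3}){3}):(\d{2,5})\b")

def extract_proxies(atom_xml: str) -> list[str]:
    """Return unique ip:port strings found anywhere in the feed text."""
    seen, out = set(), []
    for ip, port in PROXY_RE.findall(atom_xml):
        proxy = f"{ip}:{port}"
        if proxy not in seen:
            seen.add(proxy)
            out.append(proxy)
    return out

# Inline sample standing in for a fetched feed (addresses from the
# documentation ranges, not real proxies).
sample = """<feed xmlns="http://www.w3.org/2005/Atom">
  <entry><title>203.0.113.7:3128</title></entry>
  <entry><title>198.51.100.22:8080</title></entry>
</feed>"""

print(extract_proxies(sample))
# → ['203.0.113.7:3128', '198.51.100.22:8080']
```

The real work, of course, would be polling the feed on a schedule and
feeding the results into a *global* block list, which is exactly the
mechanism being asked for above.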
_______________________________________________
WikiEN-l mailing list
WikiEN-l(a)lists.wikimedia.org
To unsubscribe from this mailing list, visit:
https://lists.wikimedia.org/mailman/listinfo/wikien-l
--
Sent from my mobile device