"George Herbert" george.herbert@gmail.com wrote in message news:38a7bf7c0901192006p3fe2a3ft987dfeea3a11f6d0@mail.gmail.com...
On Sun, Jan 18, 2009 at 6:36 PM, Christopher Grant <chrisgrantmail@gmail.com> wrote:
We have bots that do that, but Grawp still gets through (part of the reason is that these proxies need to be blocked globally; otherwise Grawp can still abuse SUL and Tor to create accounts and make the required 10 edits before he has to find a proxy that is unblocked on enwiki).
- Chris
On Mon, Jan 19, 2009 at 11:25 AM, K. Peachey p858snake@yahoo.com.au wrote:
can continue to use unblocked proxies until we block them all. (Blocking *all* proxies is nigh on impossible, because computers get compromised daily, so new "open proxies" appear daily.)
Maybe it would help if we could hook a source like http://www.1freeproxy.com/feed/atom/ (an RSS feed of just proxies) in so that listed proxies are automatically blocked, which I believe is Wikipedia's policy anyway.
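The feed-hooking idea above could be sketched roughly as follows. This is a hypothetical illustration, not the feed's real format: it simply scrapes IPv4 addresses out of whatever Atom XML the feed returns, so a blocking bot could pass them on. The sample entry is invented.

```python
# Hypothetical sketch: pull IP addresses out of an Atom feed of open
# proxies so a bot could queue them for blocking.  The entry format
# here is an assumption; a real feed would need real parsing.
import re

IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def extract_proxy_ips(atom_xml):
    """Return the unique IPv4 addresses mentioned anywhere in the feed text."""
    return sorted(set(IP_RE.findall(atom_xml)))

# Toy stand-in for an entry from http://www.1freeproxy.com/feed/atom/
sample = """<feed><entry><title>Open proxy 203.0.113.7:8080</title>
<summary>Also seen: 198.51.100.23</summary></entry></feed>"""

print(extract_proxy_ips(sample))  # ['198.51.100.23', '203.0.113.7']
```

Polling the feed on a timer and diffing against already-blocked addresses would be the other half of the bot, omitted here.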
Perhaps we could add a front-end proxy check to all connections from previously unknown IPs.
If the account isn't on the known proxy users exemption list, then zap the IP...
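The front-end check described above could look something like this in outline. All names (the exemption list, the known-proxy set) are invented for illustration; this is a sketch of the decision logic only, not of how MediaWiki actually hooks connections.

```python
# Hedged sketch of the suggested front-end check: on a connection from
# a previously unseen IP, consult a known-proxy list and a proxy-users
# exemption list before deciding to block.  All names are hypothetical.

def should_block(ip, known_proxies, exempt_accounts, account=None):
    """Block the IP if it is a known proxy and the account (if any)
    is not on the proxy-users exemption list."""
    if ip not in known_proxies:
        return False
    return account not in exempt_accounts

known_proxies = {"203.0.113.7"}
exempt = {"TrustedBot"}

print(should_block("203.0.113.7", known_proxies, exempt))                # True
print(should_block("203.0.113.7", known_proxies, exempt, "TrustedBot"))  # False
print(should_block("192.0.2.1", known_proxies, exempt))                  # False
```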
I am not sure what you mean. I imagine that checkusers can construct a list of accounts that were created from a particular IP address; if not, that would be a useful tool. With dynamic IPs, though, it would yield nothing but clues that do not fit together reliably, and IP reassignments would be a headache for the tool's designer in any case.
Or are you guessing that proxies are identifiable as such? They are not. Start with the case of a live proxy.

HTTPS is the main reliable means of verifying anyone's identity, and it brings a level of inconvenience to opening accounts, plus the burden of keeping the private partner to your public key private. There is a proposed addition to HTTP, the "X-Forwarded-For" header. It is effectively a remake of a standard header that Lynx can send, but does not send by default (AFAIK, Explorer does not support e-mail addresses in HTTP headers). If an ISP filled out the e-mail address, that could work with a higher degree of authenticity, but it would somehow have to be restricted to sites that have the right to demand it. It is something of a technical nightmare, because software for inserting this header is not common, and the privacy measures are another ball of string. To demand it, we would in effect be requiring all ISPs to be _active mod_ proxies. Similarly, to demand HTTPS would require certificate authorities.

[[Digital Signature]] [[Secure wikipedia]]
_______
http://ecn.ab.ca/~brewhaha/Privileged%20Information%20for%20Newbies.HTM
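For concreteness, here is a minimal sketch of how a receiver might read an X-Forwarded-For chain. Note the header is a de facto convention added by proxies, not a verified credential: any hop can forge the entries to its left, which is part of the difficulty described above. The sample addresses are invented.

```python
# Sketch of reading the (de facto, forgeable) X-Forwarded-For header.
# Proxies append the client address they saw, so the leftmost entry is
# only the *claimed* original client; at best, the entry added by the
# proxy nearest the server reflects a real connection.

def parse_xff(header_value):
    """Split an X-Forwarded-For chain into an ordered list of hops,
    leftmost being the claimed original client."""
    return [hop.strip() for hop in header_value.split(",") if hop.strip()]

chain = parse_xff("203.0.113.7, 198.51.100.23, 192.0.2.1")
print(chain[0])   # claimed client:        203.0.113.7
print(chain[-1])  # hop nearest the server: 192.0.2.1
```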