Hoi,

The category system is, as far as I am concerned, of little interest: it is not helpful for selecting one from a bunch. It is a sick dog and it is in misery.

Thanks,
GerardM
On 14 October 2011 01:14, WereSpielChequers <werespielchequers@gmail.com> wrote:
From: Andrew Crawford <acrawford@laetabilis.com>
Date: Wed, 12 Oct 2011 11:07:54 -0300
Subject: Re: [Foundation-l] Image filtering without undermining the category system
To: Wikimedia Foundation Mailing List <foundation-l@lists.wikimedia.org>
In general I think this is the best and most practical proposal so far.
Hi Andrew,
Thanks, I appreciate that.
Having filter users do the classifying is the only practical option. In my opinion, it is unfortunately still problematic.
- It is quite complicated from the user's point of view. Not only do they have to register an account, but they have to find and understand these options. For the casual reader who just doesn't want to see any more penises, or pictures of Mohammed, that is quite a lot to ask. The effort it would take to implement a system like this might outweigh the benefit to the small number of readers who would actually go through this process.
Yes, my wording of the options is not ideal, and I'm hoping we can make it more user-friendly. But the process isn't very complex. If we create
http://en.wikipedia.org/wiki/Special:Preferences#mw-prefsection-filter
it need be no more complex than
http://en.wikipedia.org/wiki/Special:Preferences#mw-prefsection-watchlist
I'm pretty sure we can make it simpler than buying some censorship software with a credit card and then installing it on your PC.
- It is obviously subject to gaming. How long would it take 4chan to figure out they can create new accounts, and start thumbs-upping newly-uploaded pictures of penises while mass thumbs-downing depictions of Mohammed?
Subject to gaming, well it's bound to be. But vulnerable to gaming, hopefully not. Fans of penises are welcome to add their preferences. That's why I didn't include the option "Hide all images except those that a fellow filterer has whitelisted".
If some people find naked bodies wholesome but crucifixes troubling, and others the reverse, the filter will pick up on that as an easy scenario. Once you've indicated which you are happy to see, it will start giving a high score to images deemed objectionable by people who've made similar choices to you, and to images deemed wholesome by people whose tastes run counter to yours. Conversely, it will give low scores to images cleared by people whose tastes are highly similar to yours, or to images objected to by people whose tastes are the reverse of yours.
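(To illustrate rather than specify: below is a rough Python sketch of that similarity-weighted scoring. Every name in it, such as verdicts, similarity and score_image, is hypothetical and not part of the actual proposal or of MediaWiki; it is just one way the behaviour described above could be computed.)

    from collections import defaultdict

    # verdicts[user][image] = +1 ("don't show me this again") or -1 ("fine to show me")
    verdicts = defaultdict(dict)

    def similarity(user_a, user_b):
        # Agreement between two filterers on the images both have rated:
        # +1 = identical tastes, -1 = opposite tastes, 0 = no overlap.
        shared = verdicts[user_a].keys() & verdicts[user_b].keys()
        if not shared:
            return 0.0
        return sum(verdicts[user_a][img] * verdicts[user_b][img] for img in shared) / len(shared)

    def score_image(user, image):
        # Predicted objectionability of `image` for `user`: verdicts from
        # like-minded filterers count as given, verdicts from filterers
        # with opposite tastes count reversed.
        total = weight = 0.0
        for other, their_verdicts in verdicts.items():
            if other == user or image not in their_verdicts:
                continue
            sim = similarity(user, other)
            total += sim * their_verdicts[image]
            weight += abs(sim)
        return total / weight if weight else 0.0

So with one filterer hiding crucifixes but clearing nudes, and another doing the reverse, an image the second filterer hides comes out with a low score for the first, which is the "reverse of yours" behaviour described above.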
- How can we prevent the use of this data for censorship purposes?
We prevent the use of this data for censorship by not releasing the knowledge base, only showing logged-in users the results that are relevant to them, and not saying how we've come up with a score. If we only had a small number of images and a limited set of reasons why people could object to them, it would be simple to deduce the data in our knowledge base; but we have a large and complex system, and some aspects would be inherently difficult to hack by automated means. An experienced human looking at an image with a filter score would sometimes be able to guess what common reasons had caused a filterer or filterers not to want to see it again, but a computer would struggle, and often anyone but the filterer who'd applied that score would be baffled. If you had access to that individual's filter list it might be obvious that they were blocking images that triggered their vertigo, depicted people associated with a particular sports team, or showed train engines that lacked a boiler. But without the context of knowing which filter lists an image was on, it would be difficult to get meaningful information out of the system.
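(Again purely illustrative, continuing the earlier sketch: the only thing the read side would ever hand out is a per-user yes/no decision, never the raw score or the verdicts behind it. The threshold value here is invented.)

    HIDE_THRESHOLD = 0.5  # illustrative cut-off only, not a figure from the proposal

    def should_hide(user, image):
        # Anonymous readers always see everything; logged-in filterers get a
        # decision derived from their own preferences, with no score exposed.
        if user is None:
            return False
        return score_image(user, image) > HIDE_THRESHOLD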
Would we keep the reputation information of each image secret? I imagine many Wikipedians would want to access that data for legitimate editorial reasons.
Well, of course any of the editors could themselves have the filter set on and would know what the score was relative to their preferences. But otherwise the information would be secret. I don't see how we could give editors access to the reputation information without it leaking to censors, or indeed divulging it generally. Remember, the person with vertigo might not want that publicly known, and the pyromaniac who blocked images that might trigger their pyromania would almost certainly not want their filter to be public. As for "legitimate editorial reasons", I think it would be quite contentious if anyone started making editorial decisions based on the filter results, so best not to enable that - but I'll clarify that in the proposal.
Thanks for your feedback,
WereSpielChequers
Cheers,
Andrew (Thparkth)

On Tue, Oct 11, 2011 at 5:55 PM, WereSpielChequers <werespielchequers@gmail.com> wrote:
OK, in a spirit of compromise, I have designed an image filter which should meet most of the needs that people have expressed and resolve most of the objections that I'm aware of. Just as importantly, it should actually work.
http://meta.wikimedia.org/wiki/User:WereSpielChequers/filter
WereSpielChequers
Thanks for that and for your comments on