[Foundation-l] Possible solution for image filter
WereSpielChequers
werespielchequers at gmail.com
Wed Sep 21 09:10:47 UTC 2011
Forking and creating "safe" versions of all our wikis has the same
disadvantage as any other fork: the wisdom of crowds is dissipated whenever
the crowd is further divided. In that sense this would be as much a mistake
as it was to spin Outreach, Strategy and Ten off as separate wikis rather
than projects on Meta. Better to encompass both "safe" and existing wikis
within the same wiki by making the image filter an opt-in user choice; that
way you achieve all the advantages of "safe" and unsafe wikis without any of
the overhead. I think you'll find that was always the intention - I don't
recall anyone arguing that everyone should be compelled to opt in to the
filter and pick at least one thing they object to.
Commons is a different matter, and I can understand the concern there that
this might lead to arguments over the categorisation of particular
images. Personally I think it would be progress to replace arguments
about whether an image is within scope with arguments about its category.
But this does depend on the way in which the filter is implemented. If we
implement a filter which offers 8-15 broad choices to those who opt in,
then those filter categories probably don't currently exist on Commons, so by
implication we as a community would be vetting all of Commons to see what fits
into them. Such a system also conflicts with other things we are
doing, in particular the GLAM collaborations and the large releases of
images that we are getting from various institutions. But if we go down the
more flexible personal image filter route, there is far less reason to
fork Commons, as it makes no difference on Commons whether an image is
blocked by one reader's personal preferences or by a million readers'. There
would still be the issue that not everything is categorised, but if we
release this as a beta test and don't over-promise its functionality that
should not be a problem - we just need to make clear that it is currently x%
effective and will improve as people identify material they don't want to see
again, and categories where they want to check the caption or alt text
first in order to decide whether to view the image.
WereSpielChequers
------------------------------
>
> Message: 3
> Date: Wed, 21 Sep 2011 03:47:07 +0200
> From: Milos Rancic <millosh at gmail.com>
> Subject: [Foundation-l] Possible solution for image filter
> To: Wikimedia Foundation Mailing List
> <foundation-l at lists.wikimedia.org>
> Message-ID:
> <CAHPiQ2HLhuFYiMKoKBDo1i9=1QA-8U0z8cPSTm3Mep3w2++ncA at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> I am serious now; please read what follows as a serious proposal.
>
> I was talking today with a friend about the image filter, and we came
> to a possible solution. It assumes, of course, that those who are in
> favor of censorship honestly intend to let particular people access
> Wikipedia articles despite the restrictions they face at their
> workplace or in their country. If their intentions are not honest,
> this is a waste of time, but at least I could say that I tried.
>
> * Create en.safe.wikipedia.org (ar.safe.wikiversity.org, and so on).
> Those sites would have censored images and/or the image filter
> implemented. The sites would be a kind of proxy for the equivalent
> Wikimedia projects without "safe" in the middle. People who access
> those sites would have the same privileges as people who access the
> sites without "safe" in the domain name. Thus, everybody who wants a
> "family friendly Wikipedia" would have it on a separate site, and
> everybody who wants to keep Wikipedia free would have it free.
>
> * Create safe.wikimedia.org. That would be the site for
> censoring/categorizing Commons images. It shouldn't be Commons itself,
> but a virtual fork of it. The fork would consist of hashes of image
> names paired with the images themselves. Thus, the image on Commons
> named "Torre_de_H%C3%A9rcules_-_DivesGallaecia2012-62.jpg" would be
> "fd37dae713526ee2da82f5a6cf6431de.jpg" on safe.wikimedia.org. The
> image preview located on upload.wikimedia.org with the name
>
> "thumb/8/80/Torre_de_H%C3%A9rcules_-_DivesGallaecia2012-62.jpg/800px-Torre_de_H%C3%A9rcules_-_DivesGallaecia2012-62.jpg"
> would be translated to "thumb/a1f3216e3344ea115bcac778937947f1.jpg"
> on safe.wikimedia.org. (Note: MD5 is probably not the best hashing
> algorithm; some other algorithm could be deployed.)
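The name-to-hash translation above can be sketched in a few lines of Python. This is a minimal illustration, not Wikimedia code: the function name `safe_name` is hypothetical, MD5 is used only because the post's example digests have MD5's 32-hex-digit shape, and keeping the original file extension is an assumption drawn from the ".jpg" suffix on the hashed names in the example.

```python
import hashlib
import urllib.parse

def safe_name(image_name: str) -> str:
    """Map a Commons file name to a hashed 'safe' name (hypothetical).

    The URL-encoded name is decoded, hashed, and the original file
    extension is re-attached so the result is still servable as an image.
    """
    decoded = urllib.parse.unquote(image_name)  # e.g. %C3%A9 -> é
    digest = hashlib.md5(decoded.encode("utf-8")).hexdigest()
    ext = image_name.rsplit(".", 1)[-1]         # keep the file extension
    return f"{digest}.{ext}"

# The post's example input; the exact digest shown in the post may differ,
# since the hashing details (encoding, normalization) are not specified.
hashed = safe_name("Torre_de_H%C3%A9rcules_-_DivesGallaecia2012-62.jpg")
```

The same function would apply to thumbnail paths, hashing the path portion and serving the result under safe.wikimedia.org.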
>
> * The link between the real image name and its hash would exist only
> inside the Wikimedia system. It would be easy to find the relation
> image => hash, but very hard to find the relation in the other
> direction. Thus, no entity outside Wikimedia would be able to build a
> censorship repository keyed to Commons; they could build one only
> against safe.wikimedia.org, which is already censored.
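One caveat to the point above: with an unkeyed hash such as plain MD5, any outsider can recompute name => hash for every public Commons file name and so rebuild the mapping in both directions. A keyed hash, with the key held only on Wikimedia's servers, would actually keep the mapping internal. This is my assumption about how the gap could be closed, not part of the original proposal; the key and function name below are hypothetical.

```python
import hmac
import hashlib

# Hypothetical server-side secret; in practice this would live in
# Wikimedia's private configuration and never be published.
SECRET_KEY = b"server-side-secret"

def keyed_safe_name(image_name: str) -> str:
    """Keyed variant of the name hashing: without SECRET_KEY, outsiders
    cannot enumerate Commons file names to reconstruct the mapping."""
    digest = hmac.new(SECRET_KEY, image_name.encode("utf-8"),
                      hashlib.md5).hexdigest()
    ext = image_name.rsplit(".", 1)[-1]
    return f"{digest}.{ext}"
```

With a different key the same file name maps to a different hash, so a leaked mapping could also be invalidated by rotating the key.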
>
> Besides the technical benefits, only those interested in censoring
> images would have to work on it; the Commons community would be spared
> that job. The only reason for those in favor of censorship to reject
> such an idea would be their wet dream of using the Commons community
> to censor images for them. If they want to censor images, they should
> find people interested in doing that; they shouldn't force one
> community to do it.
>
> The drawbacks are those of any censorship mechanism open to abuse:
> companies, states, etc. which want to use the system for their own
> goals would be able to do so by blocking everything which doesn't have
> the "safe" infix. But, as said, that is a drawback of *any* censorship
> mechanism. Those who access images through the "safe" wrapper would
> have to write image names in their hashed form; but that's a small
> price for "family friendliness", I suppose.
>
> Thoughts?
>
More information about the wikimedia-l mailing list