I am serious now; please read what follows as a serious proposal.
I was talking today with a friend about the image filter, and we came
to a possible solution. Of course, this assumes that those who are in
favor of censorship honestly intend to let particular people access
Wikipedia articles despite the problems they face at the workplace or
in their country. If their intentions are not honest, this is a waste
of time, but at least I could say that I tried.
* Create en.safe.wikipedia.org (and so on for the other projects).
Those sites would have censored images and/or the image filter
implemented. They would be a kind of proxy for the equivalent
Wikimedia projects without "safe" in the middle of the domain name.
People who access those sites would have the same privileges as
people who access the sites without "safe" in the domain name. Thus,
everybody who wants a "family friendly Wikipedia" would have it on a
separate site; everybody who wants to keep Wikipedia free would have
it free.
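The domain scheme above implies a simple rule for mapping a "safe"
host to the real project it proxies. A minimal sketch, assuming the
rule is just "drop the 'safe' label" (the helper name is my own, not
part of the proposal):

```python
# Hypothetical sketch: map a "safe" proxy host to its backend project
# by removing the "safe" label from the domain name.
def backend_host(safe_host: str) -> str:
    """en.safe.wikipedia.org -> en.wikipedia.org"""
    labels = safe_host.split(".")
    if "safe" not in labels:
        raise ValueError("not a safe.* host: " + safe_host)
    labels.remove("safe")
    return ".".join(labels)

print(backend_host("en.safe.wikipedia.org"))  # en.wikipedia.org
```

The proxy would fetch pages from the backend host and serve them under
the "safe" domain, with images filtered or renamed as described below.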
* Create safe.wikimedia.org. That would be the site for
censoring/categorizing Commons images. It shouldn't be Commons itself,
but a virtual fork of it. The fork would consist of hashes of image
names pointing to the images themselves. Thus, the image on Commons named
"Torre_de_H%C3%A9rcules_-_DivesGallaecia2012-62.jpg" would be
"fd37dae713526ee2da82f5a6cf6431de.jpg" on safe.wikimedia.org
; an image preview located on upload.wikimedia.org would likewise be
translated, in this case to
"thumb/a1f3216e3344ea115bcac778937947f1.jpg". (Note: md5 is not likely
to be the best hashing algorithm; some other one could be deployed.)
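The naming scheme above can be sketched as follows. The proposal does
not specify exactly what is fed to the hash, so this sketch (hashing
the full file name and keeping the extension) is my assumption and is
not expected to reproduce the example hashes verbatim:

```python
import hashlib

def safe_name(commons_name: str) -> str:
    """Map a Commons file name to its opaque name on safe.wikimedia.org.
    The extension is kept so the file still serves with the right type."""
    stem = hashlib.md5(commons_name.encode("utf-8")).hexdigest()
    ext = commons_name.rsplit(".", 1)[-1]
    return stem + "." + ext
```

Any stronger hash could be dropped in for md5 without changing the
scheme; only the length of the opaque name would differ.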
* The link between a real image name and its hash would exist only
inside the Wikimedia system. It would be easy to find the relation
image => hash, but very hard to find the relation in the other
direction. Thus, no entity outside Wikimedia would be able to build a
censorship repository against Commons; they could do that only against
safe.wikimedia.org, which is already censored.
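One caveat: a plain hash of public file names is not actually one-way
in practice, because anyone could hash every known Commons title and
build the reverse table. A keyed hash (HMAC), with the key held only
inside Wikimedia, would give the one-way property the bullet above
asks for. A sketch, with a hypothetical key and function name of my
own choosing:

```python
import hashlib
import hmac

SECRET_KEY = b"held-only-inside-wikimedia"  # hypothetical secret

def keyed_safe_name(commons_name: str) -> str:
    """Without SECRET_KEY, outsiders cannot recompute the mapping, so
    hashing all public titles no longer yields a reverse table."""
    digest = hmac.new(SECRET_KEY, commons_name.encode("utf-8"),
                      hashlib.sha256).hexdigest()
    return digest + "." + commons_name.rsplit(".", 1)[-1]
```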
Besides the technical benefits, only those interested in censoring
images would have to work on it. The Commons community would be spared
that job. The only reason for those in favor of censorship to reject
such an idea would be their wet dream of using the Commons community
to censor images for them. If they want to censor images, they should
find people interested in doing that; they shouldn't force one
community to do it.
The drawbacks are those of any abusable censorship mechanism:
companies, states etc. that want to use the system for their own goals
could do so by blocking everything that doesn't have the "safe" infix.
But, as said, that's a drawback of *any* censorship mechanism. Those
who access content through the "safe" wrapper would have to write
image names in their hashed format; but that's a small price for
"family friendliness", I