I do have concerns about censorship and can see both sides of the equation, but to suggest a filter "*built by user itself to "filter" images*" is an unrealistic endeavour, and holding such a position rests on the arrogant assumption that people have the ability and resources to create their own filter. It's also ignorant of the needs of many of the people who benefit from the work of our contributors; remember, our basic principle is to make knowledge freely available. Misconceptions about Wikipedia's demographic being the male computer geek aged between 16 and 24 are only reinforced by such comments. When we indulge such misconceptions we knowingly discriminate against people outside that group, and when we discriminate we no longer make the sum of human knowledge freely available.
Whether a person chooses to use a filter or not should be their own choice. I know some of the content is an issue in certain environments, and being personally able to filter out such content would enable greater participation. If a voluntary, self-selected filter enables minorities within our society to participate, then that must be a good thing; those same people can then participate and expand content within their knowledge/resource base, which benefits all of us.
On 5 July 2011 06:35, Alex Brollo alex.brollo@gmail.com wrote:
2011/7/2 Casey Brown lists@caseybrown.org
Although it isn't "official" or at all definitive, I believe the "personal image filter mockup" would be interesting to look at if you haven't already: http://www.mediawiki.org/wiki/Personal_image_filter
I took a look; at first glance I dislike such an idea. I think that a good set of categories, freely classifying as many kinds of "offensive content" as needed from an endless list of personal idiosyncrasies, coupled with some simple user-side js tool, freely built by user itself to "filter" images, should be sufficient. Users should simply be encouraged to add such categories to offending images and to build filtering tools, by themselves or with some help from willing friends; problem solved, IMHO.
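As a rough sketch of what this category-based, user-side approach might look like (the category names and the blocklist below are hypothetical illustrations, not an existing gadget or MediaWiki API):

```typescript
// Hypothetical sketch of the filter Alex describes: images carry
// content categories, and each user keeps a personal blocklist of
// categories they do not want rendered. Category names are invented
// for illustration only.

/** Decide whether an image should be hidden for this user. */
function shouldHideImage(
  imageCategories: string[],
  userBlockedCategories: Set<string>,
): boolean {
  // Hide the image if any of its categories is on the user's blocklist.
  return imageCategories.some((cat) => userBlockedCategories.has(cat));
}

// A user-side script (e.g. in a personal common.js) could then walk the
// page and swap matching images for a click-to-reveal placeholder.
const myBlocklist = new Set(["Category:Graphic violence"]);

console.log(
  shouldHideImage(
    ["Category:Anatomy", "Category:Graphic violence"],
    myBlocklist,
  ),
); // true: one of the image's categories is blocked
console.log(shouldHideImage(["Category:Landscapes"], myBlocklist)); // false
```

The point of keeping the decision logic this small is that all the real work sits in the community-maintained categories; the user's tool only matches them against a personal list.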
And what about Wikipedia, or sister projects, using "offensive images"? Should such a filtering procedure be extended to articles that use offensive images too, on any wiki?
Alex brollo
Commons-l mailing list
Commons-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/commons-l