Thomas Morton wrote:
I think a key part of resolving this is to avoid calling the labels "potentially objectionable". I mean - anything can be potentially objectionable, it depends on the individual.
Indeed. The term "objectionable" is more applicable than "offensive" is (because one needn't be offended by an image to object to its sight), but neither concept can be accurately defined on behalf of the projects' readers as a whole.
Obviously we cast this in the nudity/Mohammed light, because those are the most high profile examples.
But another example; clowns.
Some people are terrified of clowns, even images of them. You wouldn't describe images of clowns as "potentially objectionable", but it would be great for coulrophobes to go "oh hey Wikipedia, I don't like clowns, so can you hide pics of them for me please? Thanks".
Some people are squeamish - so OK, let them hide images involving blood/gore. Foot phobia? (That's common enough.) Hide images of naked feet.
And so on.
Another example, mentioned several times, is "spiders." An aversion to spiders is extremely common.
But even if we were to confine the image filter system to subjects that actually offend people (and further restrict this by mandating that the relevant belief be common in at least one culture), the list is staggering.
Many people are offended by photographs of unveiled women. Will one of the "5–10 categories" be dedicated to such images? If not, why not? Because we're deeming that cultural belief unworthy of accommodation?
I haven't even touched on the logistics. (Imagine a need to tag every image containing an unveiled woman.)
This should not be about filtering "potentially objectionable" images, but about giving readers a way to filter their experience in a way that makes them feel safe and happy. And that is the light in which to cast & develop the feature.
Agreed. And one of the most important aspects to acknowledge is the infeasibility of labeling/grouping images based on what we believe people will want to filter.
David Levy