Stephen Bain wrote:
> And once again, the labelling doesn't need to be perfect (nothing on a wiki is) if an option to hide all images by default is implemented (which at present there seems to be broad support for, from most quarters).
With such an option in place, why take on the task of labeling images on readers' behalf? As previously discussed, it would be a logistical nightmare and an enormous resource drain.
Furthermore, I don't regard "If you don't like our specific filters, you can just use the blanket one" as a remotely appropriate message to convey to the millions of readers whose beliefs we fail to individually accommodate.
> The accuracy of filtering can then be disclaimed, with a recommendation that people can hide all images if they want a guarantee.
Or we can simply accept the idea's infeasibility and provide the non-broken image filter implementation alone.
Note that disclaimers won't satisfy the masses. (If they did, we wouldn't be having this discussion.) As soon as the WMF introduces such a feature, large segments of the public will expect it to function flawlessly and become outraged when it doesn't.
I foresee sensationalistic media reports about our "child protection filters that let through smut and gore." (I realize that we aren't advertising "child protection filters." Nonetheless, that's a likely perception, regardless of any disclaimers.)
> And of course artworks are being used as examples because they're going to present the corner cases. But all of these discussions seem to be proceeding on the basis that there are nothing but corner cases, when really (I would imagine) pretty much everything that will be filtered will be either:
> - actual images of human genitals [1],
> - actual images of dead human bodies, or
> - imagery subject to religious restriction.
> Almost all will be in the first two categories, and most of those in the first one, and will primarily be photographs.
>
> [1] Which, naturally, includes actual images of people undertaking all sorts of activities involving human genitals.
Firstly, I don't know why you've singled out genitals. People commonly regard depictions of other portions of the human anatomy (such as buttocks and female breasts) as objectionable.
Secondly, "imagery subject to religious restriction" (which doesn't constitute a viable "category" in this context) includes "images of unveiled women." You state that "almost all will be in the first two categories," but I'm confident that we host significantly more images of unveiled women than images of human genitals and images of dead human bodies combined.
Thirdly, there's been a great deal of discussion regarding other images to which people commonly object, such as those depicting violence (whatever that means), surgical procedures, spiders and snakes, to name but a few.
> On the basis that the community, by and large, is not comprised wholly of idiots, I'm sure it will be capable of holding a sensible discussion as to whether images of mummies (not to forget bog bodies and Pompeii castings, as further examples) would be in or out of such a category.
...thereby arriving at an inherently subjective, binary determination that fails to meet many readers' expectations.
> And again, perfection is not necessary. If someone has "dead bodies" filtered and sees the filtered image placeholder with the caption "this is an Egyptian mummy", they can elect to show that particular image, or decide that they would like to turn off the filter. Or if such a "dead bodies" filter is described as not including Egyptian mummies, someone could decide to hide all images by default.
Or we could simply provide that functionality alone, thereby enabling the same scenario.
> This doesn't have to be difficult.
Indeed.
David Levy