[Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)
Andreas Kolbe
jayen466 at yahoo.com
Wed May 12 11:39:04 UTC 2010
--- On Tue, 11/5/10, wjhonson at aol.com <wjhonson at aol.com> wrote:
> If there is enough of a perceived need for content filtering, someone will fill that void. That someone does not need to be us. Google already does this job with their image browser, without any need for providers to actively "tag" images. How do they do that? I have no idea, but they do it. I would suggest that a "child-safe" approach to Commons is simply to use the Google image browser with a "moderate filter" setting. Try it; it works.
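For reference, the quoted suggestion amounts to a site-restricted image query with Google's safe-search parameter set. A minimal sketch in Python (the "tbm" and "safe" parameter names and values are my assumption about Google's current URL scheme, which changes without notice):

    # Sketch: open a site-restricted Google image search with safe
    # search forced on. Parameter names/values are assumptions about
    # Google's URL scheme, not a documented interface.
    import webbrowser
    from urllib.parse import urlencode

    params = {
        "q": "site:commons.wikimedia.org anatomy",  # example query
        "tbm": "isch",     # image search (assumed)
        "safe": "active",  # enforce safe search (assumed)
    }
    webbrowser.open("https://www.google.com/search?" + urlencode(params))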
It doesn't work if you enter Commons through the main page, or through an image page, and then browse its categories. The best-thumbed pages of library books are usually the ones with nude images; it's human nature. Commons is no different if you look at its top-1000 most-viewed pages.
With respect to minors, the libertarian view that anyone should be able to see whatever they want to see is simply a fringe position. Every country legally defines some content as "harmful" to minors* and expects providers to behave in a way that prevents that harm. Arguing about whether the harm is real is an idle debate of no interest to, say, teachers, who are legally bound by these standards and can face professional repercussions if they fail in their duty of care.
> I would suggest that any parent who is allowing their "young children", as one message put it, to browse without any filtering mechanism is deciding to trust that child, or else does not care if the child encounters objectionable material. The child's browsing activity is already open to five million porn site hits as it stands; Commons isn't creating that issue, and Commons cannot solve it. It is the parents' responsibility to have the appropriate self-selected mechanisms in place. And I propose that all parents who care already *do*. So this issue is a non-issue. It doesn't actually exist in any concrete example, just in the minds of a few people with spare time.
As I see it, a working filter system for adult content would relieve teachers and librarians of the headache involved in making Commons or Wikipedia available to minors. Do we have figures on how many schools or libraries in various countries block access to Wikimedia sites over concerns about content harmful to minors? Is this a frequently voiced concern, or are we making more of it than it is?
The most sensible access control system would be one that can be set up on the physical computer a minor uses. (Linking it to user account data would not work, as IP users should have normal access.) And if the same child is allowed by their parents to surf the net freely at home, that is perfectly fine. It is the parents' choice, and every parent handles this differently.
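As a crude illustration of what a device-level control looks like, here is a minimal sketch that blocks the whole site via the hosts file (Unix-like path assumed; a real filter would obviously need to be far more selective):

    # Crude device-level block: point commons.wikimedia.org at an
    # unroutable address in the hosts file. Needs admin rights, and is
    # all-or-nothing: it blocks the whole site, not specific content.
    # Unix-like path assumed (Windows uses
    # C:\Windows\System32\drivers\etc\hosts).
    HOSTS_FILE = "/etc/hosts"

    with open(HOSTS_FILE, "a") as hosts:
        hosts.write("0.0.0.0 commons.wikimedia.org\n")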
If an outside developer were to create such a filter product, that would be great too. I just wonder how they would cope with categories and images being renamed, new categories being created, etc. And does anyone actually know how Google manages to filter out images in safe search?
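One way a filter could cope with renames is to resolve category membership live against the MediaWiki API rather than keeping a static list. A minimal sketch, with placeholder blocklist names rather than any official taxonomy:

    # Sketch: check one Commons file against a category blocklist by
    # querying the live MediaWiki API, so category renames show up on
    # the next lookup instead of silently breaking a static list.
    # Blocklist names are placeholders; pagination (clcontinue) and
    # parent categories are ignored for brevity.
    import requests

    API = "https://commons.wikimedia.org/w/api.php"
    HEADERS = {"User-Agent": "category-filter-sketch/0.1"}
    BLOCKLIST = {"Category:Nudity", "Category:Sex"}  # hypothetical

    def categories_of(file_title):
        """Return the set of categories a Commons file page is in."""
        params = {
            "action": "query",
            "titles": file_title,
            "prop": "categories",
            "cllimit": "max",
            "format": "json",
        }
        data = requests.get(API, params=params, headers=HEADERS,
                            timeout=10).json()
        cats = set()
        for page in data["query"]["pages"].values():
            for cat in page.get("categories", []):
                cats.add(cat["title"])
        return cats

    def is_filtered(file_title):
        """True if the file sits in any blocklisted category."""
        return bool(categories_of(file_title) & BLOCKLIST)

    print(is_filtered("File:Example.jpg"))

Querying live means a rename is picked up on the next lookup, at the cost of an API round trip per image; a cached copy with periodic refresh would be the obvious middle ground.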
Andreas

* See the Miller test for minors reproduced at
http://commons.wikimedia.org/wiki/Commons_talk:Sexual_content#Pornography