[Wikimedia-l] Who invoked "principle of least surprise" for the image filter?
Tobias Oelgarte
tobias.oelgarte at googlemail.com
Sat Jun 16 22:51:35 UTC 2012
Am 16.06.2012 23:36, schrieb Tom Morris:
> On Saturday, 16 June 2012 at 20:21, Tobias Oelgarte wrote:
>> That means they already found a solution to their problem that covers
>> the whole web at once. As you might have noticed, it isn't perfect. I
>> guess that it could easily be improved over time. But the image filter
>> had a different goal. It wouldn't help the schools, since the content
>> is still accessible. So why do we discuss schools and children all
>> the time and speak about it as a net nanny?
>
> Don't you get it? An image filter you can trivially opt out of by clicking the big button labelled "show image" is a perfect way of preventing children from getting to naughty pictures…
Is this irony? My comment included some irony as well. ;-)
How would a "show image" button protect children from getting to naughty
pictures? The first thing a child would do is press that button out of
curiosity alone. Real child protection software is meant to hide such
content without even giving the child the possibility to access it. That
is what so-called "net nanny" software will do, since it is usually meant
to block access when no parent is present to watch over their children
as they explore minefields. At least that is the great story the adverts
tell.
> Seriously though, I'm slightly surprised that commercial censorware providers haven't bothered to add the nudey stuff from Commons. Pay a few bored minimum wage people to go through and find all the categories with the naughty stuff and stick all those images in their filter. It'd only take a few hours, given the extensive work already done by the Commons community neatly sorting things into categories with names like "Nude works including Muppets" and "Suggestive use of feathers" etc.
Yes, they could do that. But the Internet is large. They usually use a
combination of blacklisting and whitelisting, and that is where the devil
hides in the details. Whitelisting delivers perfect results (as long as
the content doesn't change overnight), but it is much more expensive,
since every new page has to be checked. Blacklisting is far easier, since
it doesn't block access to new pages or images. But at the same time it
has its flaws, because any unknown website (the biggest part of the web)
can be accessed regardless of content.
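To make the difference concrete, here is a minimal sketch of the two
policies (my own illustration in Python; the example domains and helper
names are invented, not taken from any real filtering product):

# Purely illustrative; the domain sets and function names are made up.
BLACKLIST = {"known-bad.example"}    # pages already reviewed and blocked
WHITELIST = {"known-good.example"}   # pages already reviewed and approved

def blacklist_allows(domain):
    # Default-allow: anything unknown (the biggest part of the web) gets through.
    return domain not in BLACKLIST

def whitelist_allows(domain):
    # Default-deny: every new page stays blocked until someone has checked it,
    # which is why the whitelist approach is so much more expensive to maintain.
    return domain in WHITELIST

for domain in ("known-bad.example", "known-good.example", "brand-new.example"):
    print(domain, "blacklist:", blacklist_allows(domain),
          "whitelist:", whitelist_allows(domain))

The brand-new domain is the interesting case: the blacklist waves it
through, while the whitelist blocks it until someone has reviewed it.
That is exactly the trade-off described above.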
> It's almost as if the censorware manufacturers are selling ineffective products to people who don't know any better, products that serve as a peace-of-mind placebo in place of effective access control. Oh, wait, that would be the inner cynic speaking.
Exactly, that is the case. I have never seen "censorware" that works
flawlessly (not even China can do this right). Either it allows too much
(an incomplete blacklist) or it is unnecessarily limited (an incomplete
whitelist producing an angry mob). Additionally, it has to suit the views
of the parents and match the age of the child. The only "software" which
does this perfectly is the brain of the parents, which tracks the actions
of the child, stops them when necessary and gives useful advice (even
better than Clippy).
nya~