[Foundation-l] Potential ICRA labels for Wikipedia

David Goodman dgoodmanny at gmail.com
Mon May 10 05:56:53 UTC 2010

There is no general agreement here that any system of filtering for
any purpose is ever necessary, and I think it is totally contrary to
the general idea behind the free culture movement.

But people have the liberty to do as they please with our content,
and if someone wants to filter it for their own purposes we cannot
and should not prevent them.  Neither should we assist them.

For JV to suggest assisting censorship by doing something that will
not "feel" like censorship is not in my opinion forthright. We should
have good descriptors because that's part of the context for the
images, but this should be decided without the least concern about
anything other than finding the images a user might want to find.
Agreed that one part of that is avoiding retrieving what they do not
want to receive, but there are many such criteria, such as size, date,
and the like. It can be argued that we have some responsibility to
those of our users who cannot access unfiltered content, but the
least judgmental way is to provide an option ourselves to display
images as text descriptors only, rather than leave it to
browsers--especially since a text-only view is appropriate for other
purposes as well.
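A minimal sketch of that text-descriptors-only option (the field
names and rendering functions here are hypothetical illustrations,
not MediaWiki's actual interface):

```python
# Hypothetical sketch: a per-reader preference that replaces each
# image with its text descriptor instead of rendering the image.
# Nothing is filtered out of the corpus; the descriptor simply
# stands in for the picture when text_only is set.

def render_image(image, text_only=False):
    """Render one image either as markup or as its text descriptor."""
    if text_only:
        # Show the descriptive text in place of the image itself.
        return f"[image: {image['descriptor']}]"
    return f'<img src="{image["src"]}" alt="{image["descriptor"]}">'

def render_page(images, text_only=False):
    """Render a page's images under the reader's display preference."""
    return "\n".join(render_image(img, text_only) for img in images)

images = [
    {"src": "Anatomy_plate.jpg",
     "descriptor": "19th-century anatomical engraving"},
    {"src": "Kitten.jpg", "descriptor": "tabby kitten on grass"},
]

print(render_page(images, text_only=True))
```

The point of the sketch is that the choice sits with the reader's own
display settings, not with any change to the content itself.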

We could show the proper approach by working on better descriptors
for more important things than sexual images first.  The necessary
distinctions for any filtering service that aims at restricting
content in a way that is not grossly heavy-handed would require a
very detailed separation of the various types of breast images, and
I do not see why distinguishing between such things as the different
degrees of nudity is all that important in an encyclopedic sense.

David Goodman, Ph.D, M.L.S.

On Sun, May 9, 2010 at 11:23 PM, Jussi-Ville Heiskanen
<cimonavaro at gmail.com> wrote:
> Sue Gardner wrote:
>> Hi Derk-Jan,
>> Thank you for starting this thread.
>> There is obviously a range of options -- let's say, on a 10-point
>> scale, ranging from 0 (do nothing but enforce existing policy) to 10
>> (completely purge everything that's potentially objectionable to
>> anyone, anywhere).  Somewhere on that continuum are possibilities like
>> i) we tag pages so that external entities can filter, ii) we point
>> parents towards content filtering systems they can use, but make no
>> changes ourselves, iii) we implement our own filtering system so that
>> people can choose to hide objectionable content if they want, or iv)
>> we implement our own filtering system, with a default to a "safe" or
>> "moderate" view and the option for people to change their own
>> settings.  Those are just a few: there are lots of options.  (e.g.,
>> Google Images and Flickr I believe do different versions of option iv.
>>  I'm not saying that means we should do the same; it does not
>> necessarily mean that.)
>> I would love to see a table of various options, with pros and cons
>> including feedback from folks like EFF.  If anyone feels like starting
>> such a thing, I would be really grateful :-)
> Hi Sue,
> Is it okay if I first explain why none of the examples
> you mention are a good fit for us; and then pull a
> rabbit out of my hat, and explain how one of them
> can be salvaged and made into an excellent system?
> Rating by level is fixed, and it will never be culturally
> sensitive. And on Wikipedia, no matter how it is rigged,
> people who edit will just get frustrated, for both the
> right and the wrong reasons.
> Using words like "safe" etc. will certainly offend
> cultures which are very strict, for instance in
> terms of how much flesh can be seen of women.
> Pointing parents to filtering systems is half a
> solution, and the problem is that we would have to
> keep vetting what those systems base their filtering
> on, so our site doesn't look ridiculous in some form
> or another, either accidentally failing and offending
> the viewer (ask me sometime, I have tales to tell),
> or going to the other extreme and leaving the
> viewer without a perfectly nice result.
> The last problem, but certainly not the least one.
> All of these are a *hard* *sell*. They are a hard
> sell to the wikimedian community. They are also
> a hard sell to a huge sector of our readers, and
> those who love us, even enough to give us small
> donations. Our community must matter to us,
> our readers must matter to us, those who love
> us should matter to us, and well, those who
> give us small donations -- I am not in a place
> to tell how much they matter to us.
> So now we come to the rabbit time!!!
> If it is a hard sell, find a way to soften it, without
> forcing the issue. How? First, be very canny about
> how the tags are named. Tits vs. Breasts, Butt vs.
> Rear-end, Baretits vs. Topless, and so forth. Second,
> do *not* limit the tags to content tags which are
> useful for _avoiding_ content, but also add in
> positive tags (I know, for some all those above are
> positive tags ;-) puppies, kittens, funny, horsies,
> etc.
> I am sure someone can think of even better and
> smoother tag names. But the advantages of this
> approach are that it doesn't *feel* like censorship,
> but more like a value added service. I won't
> talk about how the system of selecting which stuff
> to see should be constructed, but I am sure
> someone has ideas. I do think though that the
> system should be reversible, that is essential to
> sell it not as censorship, so that those who only
> want to see naughty bits can do so, or for that
> matter somebody can see only cute animals.
> There is one additional benefit in terms of our
> community too. It is much more fun to add those
> kinds of tags, and much less drudge-work.
> Yours,
> Jussi-Ville Heiskanen
> _______________________________________________
> foundation-l mailing list
> foundation-l at lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
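The reversible tag-selection scheme Jussi-Ville describes above could
look like this in miniature (a hedged sketch; the tag names, data
shapes, and function are invented for illustration):

```python
# Hypothetical sketch: media items carry neutral tags, and each
# reader keeps their own include/exclude preferences. Nothing is
# removed from the corpus; clearing the preferences restores the
# full view, which is what makes the scheme reversible.

def visible(media, include=None, exclude=None):
    """Return the items matching this reader's tag preferences."""
    result = []
    for item in media:
        tags = set(item["tags"])
        if exclude and tags & set(exclude):
            continue  # reader opted out of these tags
        if include and not (tags & set(include)):
            continue  # reader asked to see only certain tags
        result.append(item)
    return result

media = [
    {"title": "Kitten.jpg", "tags": ["kittens", "cute-animals"]},
    {"title": "Topless.jpg", "tags": ["topless"]},
]

# One reader hides "topless"; another views only cute animals;
# a reader with no preferences set sees everything.
print(visible(media, exclude=["topless"]))
print(visible(media, include=["cute-animals"]))
print(visible(media))
```

Because the same mechanism serves both "hide these tags" and "show
only these tags", it reads as a value-added service rather than a
one-way filter.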
