From: Tobias Oelgarte <tobias.oelgarte(a)googlemail.com>
That is again cherry-picking an example. But what do you expect to find? Say that someone actually searches for an image of this practice. Should he find it in the last spot? A good search algorithm treats everything equally and delivers the closest matches. A more intelligent search would deliver images of showers first if you search for "shower", since it knows the difference between the terms "golden shower" and "shower". That's how it should work. It's definitely not an error of the search engine itself, but it could be improved to deliver better-matching results, without any marking. Extending it to exclude marked content leads back to the basic question(s), which should be unnecessary.
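The distinction described here, treating "golden shower" as a different term from "shower" rather than as two independent words, is essentially phrase-aware matching. A minimal illustrative sketch, not how the actual Commons search works, and with a stand-in phrase list:

```python
# Illustrative sketch: treat known multi-word terms as single tokens so a
# query for "shower" does not match descriptions containing "golden shower".
# KNOWN_PHRASES is an assumed stand-in, not real search data.

KNOWN_PHRASES = ["golden shower"]

def tokenize(text):
    """Split text into tokens, keeping known phrases as single tokens."""
    text = text.lower()
    tokens = []
    for phrase in KNOWN_PHRASES:
        if phrase in text:
            tokens.append(phrase)
            text = text.replace(phrase, " ")
    tokens.extend(text.split())
    return tokens

def matches(query, description):
    """A query matches only if it appears as its own token."""
    return query.lower() in tokenize(description)

print(matches("shower", "A tiled shower cabin"))        # True
print(matches("shower", "Golden shower scene, 1920s"))  # False
```

Under this scheme the second file would simply never rank for "shower", without any content marking.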
I would expect that someone who has entered "shower" as their search term is looking for images of showers, and that someone looking for images of wet sex would enter "golden shower" as their search term. So if we present an image of people urinating on each other in response to a search like "shower", we violate the principle of least astonishment.
Evidence would be a statistic showing how many people are actually happy with the results, with "happy" meaning: "I will use it again and was not so offended that I stopped using it."
One thing not to forget here is that we may turn away users who might otherwise contribute. If the users that *remain* are happy, that does not mean that we have not lost many others, nor that those who remained are representative of the broader population.
If the naming of those images is a problem, then we can just rename them to something more useful. We have templates and bots for that. Marking the images would not help in this case, but doing what we can already do would be a simple, working solution: rename them.
Difficult too; would you suggest giving all sexual images code names?
Everyone should know that "Barbie" is an often-used term or part of a pseudonym. That the search reacts to both is quite right. The word itself does not distinguish between multiple meanings. But that's again not the problem.
Someone entering "Barbie" as the search term is probably looking for images of Barbie dolls, not images of porn actresses like Lanny Barbie, Fetish Babie or Barbie Love. I think it's not unreasonable to expect the latter group of people to enter both parts of the name.
> One thing that might help would be for the search function to privilege files that are shown in top-level categories containing the search term: e.g. for "cucumber", first display all files that are in category "cucumber", rather than those contained in subcategories, like "sexual penetrative use of cucumbers", regardless of the file name (which may not have the English word "cucumber" in it).
Refining the search should definitely be an option. After reading Brandon's comment I must also wonder why it doesn't consider categories. Those are the places where content is already pre-sorted by ourselves. It would definitely be worth the effort, since it would do two things at once:
1. It would most likely give better results, even if the description or filename is not translated.
2. A search function that finds content more effectively would also minimize the effect we are talking about.
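A minimal sketch of how such a category boost could rank results. The scoring weights, field names and example files here are hypothetical assumptions for illustration, not the actual MediaWiki search code:

```python
# Hypothetical sketch: rank files so that those sitting directly in a
# category whose name matches the query come before files that only
# match via a subcategory name or via the filename.

def rank_results(query, files):
    """files: list of dicts with 'name' and 'categories' (direct categories only)."""
    q = query.lower()

    def score(f):
        s = 0
        # Strongest signal: the file is directly in a category named after the query.
        if any(q == c.lower() for c in f["categories"]):
            s += 100
        # Weaker signal: the query appears inside a longer category name
        # (e.g. a specialized subcategory containing the word).
        elif any(q in c.lower() for c in f["categories"]):
            s += 10
        # Weakest signal: the query only appears in the filename.
        if q in f["name"].lower():
            s += 1
        return s

    return sorted(files, key=score, reverse=True)

# Example data (invented): the plain photo outranks the subcategory hit.
files = [
    {"name": "IMG_1234.jpg", "categories": ["Sexual penetrative use of cucumbers"]},
    {"name": "Cucumber slice.jpg", "categories": ["Cucumber"]},
]
print(rank_results("cucumber", files)[0]["name"])  # Cucumber slice.jpg
```

Because category membership is curated by hand, this signal works even when the filename or description is not in English, which is the point made in item 1 above.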
We are in agreement on that point. I've asked Brandon (on the gendergap list) if this would be a lot of work, but he has previously indicated that finding time to reprogram this might be difficult. Nevertheless, I think it is something we should pursue. Anything you can do to help is appreciated.
I'm a little against categories that are purely introduced to divide content into sexual (offensive) and non-sexual (not offensive) content. If the practice/depiction has its own specialized term, then it is acceptable. But introducing pseudo-categories just blows up the category tree and effectively hides content. If we implement the first idea and introduce special categories, then we are effectively back at filtering and non-neutral judgment.
I don't think it makes sense to feature women with cucumbers inserted in their vagina in the top-level cucumber category. Again, principle of least astonishment.
PS: I was wondering which mail client you use. Usually the structure is destroyed and the order of mails (re:) is not kept, which makes it hard to follow conversations.
I know, it's a pain. This should be my last post to this list with the Yahoo client; I've gotten myself a Gmail account and will use that from now on.
Cheers,
Andreas
[1]
>>
>>http://meta.wikimedia.org/w/index.php?title=Controversial_content%2FBrainstorming&action=historysubmit&diff=2996411&oldid=2995984
>>[2]
>>http://lists.wikimedia.org/pipermail/foundation-l/2011-October/069699.html
>>
>>Am 17.10.2011 02:56, schrieb Andreas Kolbe:
>>> Personality conflicts aside, we're noting that non-sexual search terms in Commons can prominently return sexual images of varying explicitness, from mild nudity to hardcore, and that this is different from entering a sexual search term and finding that Google fails to filter some results.
>>>
>>> I posted some more Commons search terms where this happens on Meta; they include
>>>
>>> Black, Caucasian, Asian;
>>>
>>> Male, Female, Teenage, Woman, Man;
>>>
>>> Vegetables;
>>>
>>> Drawing, Drawing style;
>>>
>>> Barbie, Doll;
>>>
>>> Demonstration, Slideshow;
>>>
>>> Drinking, Custard, Tan;
>>>
>>> Hand, Forefinger, Backhand, Hair;
>>>
>>> Bell tolling, Shower, Furniture, Crate, Scaffold;
>>>
>>> Galipette – French for "somersault"; this leads to a collection of 1920s pornographic films which are undoubtedly of significant historical interest, but are also pretty much as explicit as any modern representative of the genre.
>>>
>>> Andreas
>>
>
_______________________________________________
Commons-l mailing list Commons-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/commons-l