Dear readers
Yesterday, on September 15th, 2011, the German Wikipedia closed the poll (Meinungsbild) "Einführung persönlicher Bildfilter" ("Introduction of personal image filters"). [1] It asked whether the personal image filter may be introduced or whether it should not be introduced.
A strong majority of 86% voted not to allow the personal image filter [2], despite the fact that the board has already decided to introduce the feature.
The questions are:
* Will the board or the WMF proceed with the introduction of the personal image filter against the will of its second-largest community?
* If the WMF/board does not care about the first question: will it affect the way the personal image filter is implemented? For example: not for all projects, or a different implementation than the one suggested in the "image filter referendum".
* Will there be an attempt to follow this example and put the same question to other communities?
Greetings from Tobias Oelgarte
[1] http://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Einf%C3%BChrung_pers%C... [2] http://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Einf%C3%BChrung_pers%C...
On Fri, Sep 16, 2011 at 10:08, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
A strong majority of 86% voted not to allow the personal image filter [2], despite the fact that the board has already decided to introduce the feature.
I believe it is a fair assumption that we have voted for developing the feature, so that the Wikipedias that need it can activate and use it, while those that do not want to use it will not request its activation, or will request its deactivation. I see no technical reason not to do that, so I see no reason not to do it this way.
Peter
On 16.09.2011 10:40, Peter Gervai wrote:
I believe it is a fair assumption that we have voted for developing the feature, so that the Wikipedias that need it can activate and use it, while those that do not want to use it will not request its activation, or will request its deactivation. I see no technical reason not to do that, so I see no reason not to do it this way.
Where exactly has such a vote taken place?
Just a few bits about the "Meinungsbild" in the de-WP: only active authors are allowed to vote, since the results of a "Meinungsbild" are binding, unlike the results of ordinary polls ("Umfrage").
Among the 14% who did not oppose the filter, I did not really see much actual support for it either. The general tone among the people who did not vote against it was that they don't mind such a tool being introduced if there's really demand for it.
The 86% rejection rate means that the feature will not be activated in the de-WP, and that the WMF would be in a heap of trouble if it tried to force the second-largest project to adopt something that the people who actually shape the project simply do not want. I believe it is also safe to assume that those 86% are not likely to do the dirty work of tagging pictures with categories to support the filter.
Regards, Oliver
On 16 September 2011 09:40, Peter Gervai grinapo@gmail.com wrote:
On Fri, Sep 16, 2011 at 10:08, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
A strong majority of 86% voted not to allow the personal image filter [2], despite the fact that the board has already decided to introduce the feature.
I believe it is a fair assumption that we have voted for developing the feature,
Citation needed.
- d.
On Fri, Sep 16, 2011 at 11:23, David Gerard dgerard@gmail.com wrote:
I believe it is a fair assumption that we have voted for developing the feature,
Citation needed.
Well I am the universally official source for my own beliefs.
But then are you stating that the WMF will make it compulsory for all projects to activate the feature? (A citation is welcome, sure, but not required.)
g
On 16 September 2011 10:27, Peter Gervai grinapo@gmail.com wrote:
On Fri, Sep 16, 2011 at 11:23, David Gerard dgerard@gmail.com wrote:
I believe it is a fair assumption that we have voted for developing the feature,
Citation needed.
Well I am the universally official source for my own beliefs.
I mean the claim that "we" have voted for developing the feature, obviously. If you have no evidence for this claim, say so.
I would also suggest, more generally, that a strategy of asserting that a consensus in favour of the feature was reached, when this is strongly not the case, is unlikely to convince people - particularly when the discussion is about strong evidence of consensus *against*.
If you're so sure there's strong consensus in favour of it, I suggest we run a series of polls, similar to the de:wp poll, which I'm sure will show that lots of people want the feature.
- d.
On Fri, Sep 16, 2011 at 11:31, David Gerard dgerard@gmail.com wrote:
On 16 September 2011 10:27, Peter Gervai grinapo@gmail.com wrote:
On Fri, Sep 16, 2011 at 11:23, David Gerard dgerard@gmail.com wrote:
I believe it is a fair assumption that we have voted for developing the feature,
Citation needed.
Well I am the universally official source for my own beliefs.
I mean the claim that "we" have voted for developing the feature, obviously.
Please read what I wrote; you have quoted it. If I wanted to write "we have voted as" then I would have written just that. I didn't.
If you have no evidence for this claim, say so.
Well, I do not have the original poll handy, but as far as I remember it was about what we think would be good to have, what we would like to see implemented. I do not remember any question about making it compulsory. Do you?
I would also suggest, more generally, that a strategy of asserting that a consensus in favour of the feature was reached, when this is strongly not the case, is unlikely to convince people - particularly when the discussion is about strong evidence of consensus *against*.
I am not sure what the point is of debating this with _me_. (Apart from my person, I mean.) I am not German. I am not active on DEWP. I voted for the feature, and I believe it's good to have it. You are trying to teach me a lesson about your own troubles, but I really cannot help with that.
The only thing I can base my views on is the global poll about the feature, and yes, it wasn't a strong consensus. But even if it were, I do not think we should change the otherwise very well working practice of the WMF *not* messing with local projects, apart from the very basic principles like the five pillars. This feature isn't *that* important - this is my opinion, please spare me the request for a citation.
If you're so sure there's strong consensus in favour of it, I suggest we run a series of polls, similar to the de:wp poll, which I'm sure will show that lots of people want the feature.
Ironically this was what I was talking about, and what you were rejecting.
All I say is that if a local project votes not to use a feature then it shouldn't have to. If you disagree with that you can simply say it, but do not try (and fail) to describe what I want to convince people about, please.
<g>
On 16.09.2011 11:59, Peter Gervai wrote:
I am not German. I am not active on DEWP. I voted for the feature, and I believe it's good to have it. You are trying to teach me a lesson about your own troubles, but I really cannot help with that.
The only thing I can base my views on is the global poll about the feature, and yes, it wasn't a strong consensus. But even if it were, I do not think we should change the otherwise very well working practice of the WMF *not* messing with local projects, apart from the very basic principles like the five pillars. This feature isn't *that* important - this is my opinion, please spare me the request for a citation.
You could never vote for the feature. The referendum did not ask whether you want it or don't want it. It only asked whether you see this feature as important. (Important because you want it, or important because you don't want it?)
I see no consensus in the referendum. The opinions are widely spread and divided. Additionally, it did not ask whether something else would be more important. Asking whether something is important is a very different matter from asking whether something is more important than something else. Please remember that before coming to conclusions.
If you're so sure there's strong consensus in favour of it, I suggest we run a series of polls, similar to the de:wp poll, which I'm sure will show that lots of people want the feature.
Ironically this was what I was talking about, and what you were rejecting.
All I say is that if a local project votes not to use a feature then it shouldn't have to. If you disagree with that you can simply say it, but do not try (and fail) to describe what I want to convince people about, please.
<g>
Asking other projects whether they want that filter or not would be a good thing to do. The referendum did not ask this question at all. Additionally, it would be time to release the per-project voting data.
86% of the German contributors opposed the feature. Does the same pattern apply to the global poll, or was it just the difference in the question? We don't know as long as the per-project data isn't released. I have repeatedly asked for this data for more than two weeks. So far, no additional data has been released. It is starting to piss me off.
Tobias
On Fri, Sep 16, 2011 at 12:15, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
I see no consensus in the referendum. The opinions are widely spread and divided.
Well, you have to see that such controversial features will never have a huge consensus in such a large and diverse community. Even a simple majority would be an awesome result. :-)
g
On 16.09.2011 12:22, Peter Gervai wrote:
I see no consensus in the referendum. The opinions are widely spread and divided.
Well, you have to see that such controversial features will never have a huge consensus in such a large and diverse community. Even a simple majority would be an awesome result. :-)
g
In a poll that asks about importance and is divided more or less into two groups? That's a very strange interpretation.
Tobias
On Fri, Sep 16, 2011 at 3:15 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
86% of the German contributors opposed the feature. Does the same pattern apply to the global poll, or was it just the difference in the question? We don't know as long as the per-project data isn't released. I have repeatedly asked for this data for more than two weeks. So far, no additional data has been released. It is starting to piss me off.
Tobias
Tobias -- we all want to see the by-language correlations. It hasn't been done yet, as far as I know (I haven't seen anything further myself, nor has the rest of the board). This information isn't being kept from you or hidden; the analysis just doesn't exist yet. Patience!
-- phoebe
Yes, and as was mentioned previously, this should be a very easy query to run. People are wondering why the data isn't already available.
2011/9/16 phoebe ayers phoebe.wiki@gmail.com
On Fri, Sep 16, 2011 at 3:15 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
86% of the German contributors opposed the feature. Does the same pattern apply to the global poll, or was it just the difference in the question? We don't know as long as the per-project data isn't released. I have repeatedly asked for this data for more than two weeks. So far, no additional data has been released. It is starting to piss me off.
Tobias
Tobias -- we all want to see the by-language correlations. It hasn't been done yet, as far as I know (I haven't seen anything further myself, nor has the rest of the board). This information isn't being kept from you or hidden; the analysis just doesn't exist yet. Patience!
-- phoebe
On Sep 16, 2011 7:39 PM, "phoebe ayers" phoebe.wiki@gmail.com wrote:
On Fri, Sep 16, 2011 at 3:15 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
86% of the German contributors opposed the feature. Does the same pattern apply to the global poll, or was it just the difference in the question? We don't know as long as the per-project data isn't released. I have repeatedly asked for this data for more than two weeks. So far, no additional data has been released. It is starting to piss me off.
Tobias
Tobias -- we all want to see the by-language correlations. It hasn't been done yet, as far as I know (I haven't seen anything further myself, nor has the rest of the board). This information isn't being kept from you or hidden; the analysis just doesn't exist yet. Patience!
He didn't ask for information. He asked for data. That obviously exists. Last I heard, it still needed anonymising. Perhaps that should be prioritised.
On 16.09.2011 20:38, phoebe ayers wrote:
On Fri, Sep 16, 2011 at 3:15 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
86% of the German contributors opposed the feature. Does the same pattern apply to the global poll, or was it just the difference in the question? We don't know as long as the per-project data isn't released. I have repeatedly asked for this data for more than two weeks. So far, no additional data has been released. It is starting to piss me off.
Tobias
Tobias -- we all want to see the by-language correlations. It hasn't been done yet, as far as I know (I haven't seen anything further myself, nor has the rest of the board). This information isn't being kept from you or hidden; the analysis just doesn't exist yet. Patience!
-- phoebe
It took you three days to get out the first results, with most of the time spent reading comments. Now I have simply asked for a table from the database and it takes more than two weeks, while it could calm down the controversy about the poll and provide good arguments. The longer it takes, the worse it gets. So you should hurry up on this simple matter. Just the raw data, a simple table
Q1 | Q2 | Q3 | Q4 | Q5 | Project (> 20 voters or "other") | date (no time)
with the rows in random order would satisfy us. We can do the analysis ourselves based on this data, provide the plots and so on.
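For illustration, a minimal sketch of the per-project breakdown such a table would enable, assuming a hypothetical anonymised CSV dump with one row per ballot in exactly the column order above. The file name and the numeric encoding of the answers are assumptions; no such dump has been published.

# Sketch only: assumes a hypothetical dump "referendum_ballots.csv"
# with one row per ballot: Q1, Q2, Q3, Q4, Q5, project, date.
import csv
from collections import defaultdict

def per_project_means(path):
    """Average answer to each of the five questions, broken down by project."""
    sums = defaultdict(lambda: [0.0] * 5)  # per-project answer totals
    counts = defaultdict(int)              # ballots counted per project
    with open(path, newline="") as f:
        for q1, q2, q3, q4, q5, project, _date in csv.reader(f):
            for i, answer in enumerate((q1, q2, q3, q4, q5)):
                sums[project][i] += float(answer)
            counts[project] += 1
    return {p: [s / counts[p] for s in totals] for p, totals in sums.items()}

# e.g. per_project_means("referendum_ballots.csv") would give the per-project
# averages needed to compare the global poll against the de-WP Meinungsbild.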
What we don't want is a final conclusion without the anonymised raw data.
Tobias
Peter Gervai wrote:
But then are you stating that the WMF will make it compulsory for all projects to activate the feature? (A citation is welcome, sure, but not required.)
"The feature will be developed for, and implemented on, all projects."
http://meta.wikimedia.org/wiki/Image_filter_referendum/en
David Levy
On 16 September 2011 11:23, David Gerard dgerard@gmail.com wrote:
On 16 September 2011 09:40, Peter Gervai grinapo@gmail.com wrote:
On Fri, Sep 16, 2011 at 10:08, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
A strong majority of 86% voted not to allow the personal image filter [2], despite the fact that the board has already decided to introduce the feature.
I believe it is a fair assumption that we have voted for developing the feature,
Citation needed.
Meanwhile, over on Bugzilla… https://bugzilla.wikimedia.org/show_bug.cgi?id=30208 Am I the only one who thinks this is getting somewhat out of hand?
Stuff like "I'm getting tired of your aggressive comments and borderline personal attacks (…) All you did at Wikimania was to publish a flyer full of proven lies to reinforce your mantra" (responding to things I wouldn't at all consider aggressive) is perhaps a sign that tempers are more than a little frayed.
Michel
On 16.09.2011 10:40, Peter Gervai wrote:
On Fri, Sep 16, 2011 at 10:08, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
A strong majority of 86% voted not to allow the personal image filter [2], despite the fact that the board has already decided to introduce the feature.
I believe it is a fair assumption that we have voted for developing the feature, so that the Wikipedias that need it can activate and use it, while those that do not want to use it will not request its activation, or will request its deactivation. I see no technical reason not to do that, so I see no reason not to do it this way.
Peter
It wasn't just a poll. It was also a discussion in search of arguments. One big issue is the question of how we decide what is or might be objectionable. From the point of view of an encyclopedia nothing is objectionable, as long as it is a fact and represented that way.
Another issue is how the questions compare to those of the referendum. The referendum showed that the global community is divided. But more than two weeks after the referendum we still have no results per project. This makes it impossible to compare both polls and come to a conclusion about the reasons for the different outcomes: was it just the (manipulative) questions of the referendum, or do the Germans play a very different role in the global context? That is something we could answer. I have asked for these results multiple times. But still no reaction whatsoever. This sucks.
Tobias
On Fri, Sep 16, 2011 at 11:40 AM, Peter Gervai grinapo@gmail.com wrote:
On Fri, Sep 16, 2011 at 10:08, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
A strong majority of 86% voted not to allow the personal image filter [2], despite the fact that the board has already decided to introduce the feature.
I believe it is a fair assumption that we have voted for developing the feature, so that the Wikipedias that need it can activate and use it, while those that do not want to use it will not request its activation, or will request its deactivation. I see no technical reason not to do that, so I see no reason not to do it this way.
Not so; the resolution was to implement, not merely to develop.
What a strange assumption from Peter. I don't believe for one minute that the WMF would commission a global referendum and then ignore the results. If there has been an official statement along these lines, I would love to be pointed to it.
Cheers, Fae
On 16 September 2011 02:14, Fae fae@wikimedia.org.uk wrote:
What a strange assumption from Peter. I don't believe for one minute that the WMF would commission a global referendum and then ignore the results. If there has been an official statement along these lines, I would love to be pointed to it.
Yikes, this is a very fast-moving thread. I haven't read it all yet, but I wanted to jump in and confirm that there has not yet been an official statement responding to the referendum results. There will be, but there isn't yet.
Currently, the referendum team is still doing some analysis of the results -- there are some questions we are hoping to get answered around language breakdown. And I am currently reading lots and lots of write-in comments.
If I had to guess, I would imagine there will be a statement within about two weeks. But that's not a commitment, just an estimate.
Thanks, Sue
-- Sue Gardner Executive Director Wikimedia Foundation
415 839 6885 office 415 816 9967 cell
Imagine a world in which every single human being can freely share in the sum of all knowledge. Help us make it a reality!
Hi all;
There are more issues with images in the German Wikipedia.
It is funny how the German Wikipedia doesn't allow certain images[1] (an image added by me[2] in de:, and later removed by another user[3]) because they follow the most restrictive of the copyright laws of Germany, Austria and Switzerland[note 1], but they are now against giving people the choice to hide images.
I think that we can make a nice move here. We can enable the image filter in the German Wikipedia for all those who don't want to see images that are copyrighted under German law, while allowing other people to see all of Commons' splendour. Using the image filter to improve the rights of readers of the German Wikipedia. Very cool, right? ; )
Regards, emijrp
[1] http://de.wikipedia.org/wiki/Wikipedia:Bildrechte#Wikipedia_richtet_sich_nac... [2] http://de.wikipedia.org/w/index.php?title=Alexander_Knox&oldid=81377280 [3] http://de.wikipedia.org/wiki/Alexander_Knox
[note 1] I heard there are German-speaking users outside Europe, right? I heard too that, from Germany, you can follow interwiki links and see those images in other Wikipedias, right? So, what is the sense of that policy? Aren't the servers in the USA?
2011/9/16 Tobias Oelgarte tobias.oelgarte@googlemail.com
Dear readers
Yesterday, on September 15th, 2011, the German Wikipedia closed the poll (Meinungsbild) "Einführung persönlicher Bildfilter" ("Introduction of personal image filters"). [1] It asked whether the personal image filter may be introduced or whether it should not be introduced.
A strong majority of 86% voted not to allow the personal image filter [2], despite the fact that the board has already decided to introduce the feature.
The questions are:
- Will the board or the WMF proceed with the introduction of the personal image filter against the will of its second-largest community?
- If the WMF/board does not care about the first question: will it affect the way the personal image filter is implemented? For example: not for all projects, or a different implementation than the one suggested in the "image filter referendum".
- Will there be an attempt to follow this example and put the same question to other communities?
Greetings from Tobias Oelgarte
[1]
http://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Einf%C3%BChrung_pers%C... [2]
http://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Einf%C3%BChrung_pers%C...
On Fri, Sep 16, 2011 at 12:39, emijrp emijrp@gmail.com wrote:
I think that we can make a nice move here. We can enable the image filter in the German Wikipedia for all those who don't want to see images that are copyrighted under German law, while allowing other people to see all of Commons' splendour. Using the image filter to improve the rights of readers of the German Wikipedia. Very cool, right? ; )
Oh, you found the first useful purpose of the image filter!
On 16.09.2011 12:42, Milos Rancic wrote:
On Fri, Sep 16, 2011 at 12:39, emijrp emijrp@gmail.com wrote:
I think that we can make a nice move here. We can enable the image filter in the German Wikipedia for all those who don't want to see images that are copyrighted under German law, while allowing other people to see all of Commons' splendour. Using the image filter to improve the rights of readers of the German Wikipedia. Very cool, right? ; )
Oh, you found the first useful purpose of the image filter!
He did not. Optionally hiding the image wouldn't make it legal. The filter has nothing to do with this case.
On Fri, Sep 16, 2011 at 12:50, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 16.09.2011 12:42, Milos Rancic wrote:
On Fri, Sep 16, 2011 at 12:39, emijrp emijrp@gmail.com wrote:
I think that we can make a nice move here. We can enable the image filter in the German Wikipedia for all those who don't want to see images that are copyrighted under German law, while allowing other people to see all of Commons' splendour. Using the image filter to improve the rights of readers of the German Wikipedia. Very cool, right? ; )
Oh, you found the first useful purpose of the image filter!
He did not. Optionally hiding the image wouldn't make it legal. The filter has nothing to do with this case.
Then, make it opt-out :P
On Fri, Sep 16, 2011 at 2:10 PM, Milos Rancic millosh@gmail.com wrote:
On Fri, Sep 16, 2011 at 12:50, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 16.09.2011 12:42, Milos Rancic wrote:
On Fri, Sep 16, 2011 at 12:39, emijrp emijrp@gmail.com wrote:
I think that we can make a nice move here. We can enable the image filter in the German Wikipedia for all those who don't want to see images that are copyrighted under German law, while allowing other people to see all of Commons' splendour. Using the image filter to improve the rights of readers of the German Wikipedia. Very cool, right? ; )
Oh, you found the first useful purpose of the image filter!
He did not. Optionally hiding the image wouldn't make it legal. The filter has nothing to do with this case.
Then, make it opt-out :P
And what if the English Wikipedia chooses to opt out?
On Fri, Sep 16, 2011 at 13:45, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
On Fri, Sep 16, 2011 at 2:10 PM, Milos Rancic millosh@gmail.com wrote:
On Fri, Sep 16, 2011 at 12:50, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 16.09.2011 12:42, Milos Rancic wrote:
On Fri, Sep 16, 2011 at 12:39, emijrp emijrp@gmail.com wrote:
I think that we can make a nice move here. We can enable the image filter in the German Wikipedia for all those who don't want to see images that are copyrighted under German law, while allowing other people to see all of Commons' splendour. Using the image filter to improve the rights of readers of the German Wikipedia. Very cool, right? ; )
Oh, you found the first useful purpose of the image filter!
He did not. Optionally hiding the image wouldn't make it legal. The filter has nothing to do with this case.
Then, make it opt-out :P
And what if the English Wikipedia chooses to opt out?
It's about implementing the image filter on images which have copyright problems in Germany (but not in the US), not about nudity.
On Fri, Sep 16, 2011 at 5:15 PM, Milos Rancic millosh@gmail.com wrote:
On Fri, Sep 16, 2011 at 13:45, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
On Fri, Sep 16, 2011 at 2:10 PM, Milos Rancic millosh@gmail.com wrote:
On Fri, Sep 16, 2011 at 12:50, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
He did not. Optionally hiding the image wouldn't make it legal. The filter has nothing to do with this case.
Then, make it opt-out :P
And what if the English Wikipedia chooses to opt out?
It's about implementing the image filter on images which have copyright problems in Germany (but not in the US), not about nudity.
So?
What is your obsession with nudity about? The filter isn't about nudity.
On Fri, Sep 16, 2011 at 16:27, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
What is your obsession with nudity about? The filter isn't about nudity.
Ah, you are right! I completely missed the point of the image filter because of my obsession... It's about sacred places of the indigenous peoples of Australia and similar things.
Back to the initial "point": "Make it opt-out" was about users of the German Wikipedia, not about the projects. Whatever the point of the filter is. If it were illegal to see apples in Germany, then they could impose the filter on all of the users and allow them to opt out of not seeing apples.
On Fri, Sep 16, 2011 at 5:47 PM, Milos Rancic millosh@gmail.com wrote:
On Fri, Sep 16, 2011 at 16:27, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
What is your obsession with nudity about? The filter isn't about nudity.
Ah, you are right! I completely missed the point of the image filter because of my obsession... It's about sacred places of the indigenous peoples of Australia and similar things.
Back to the initial "point": "Make it opt-out" was about users of the German Wikipedia, not about the projects. Whatever the point of the filter is. If it were illegal to see apples in Germany, then they could impose the filter on all of the users and allow them to opt out of not seeing apples.
Let me quote you the initial point:
"A strong majority of 86% percent voted to not allow the personal image filter [2] , despite the fact that the board already decided to introduce the feature."
On 16.09.2011 16:47, Milos Rancic wrote:
On Fri, Sep 16, 2011 at 16:27, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
What is your obsession with nudity about? The filter isn't about nudity.
Ah, you are right! I completely missed the point of the image filter because of my obsession... It's about sacred places of the indigenous peoples of Australia and similar things.
Back to the initial "point": "Make it opt-out" was about users of the German Wikipedia, not about the projects. Whatever the point of the filter is. If it were illegal to see apples in Germany, then they could impose the filter on all of the users and allow them to opt out of not seeing apples.
You are making really crucial mistakes in your argumentation. We have no legal problem with viewing the images. We have a legal problem with providing them. If you implement it as opt-out, then you would strip away the rights of the readers. A complete failure. But again: we are talking about the opt-in image filter feature and the fact that the core of the second-largest project isn't willing to accept it.
On Fri, Sep 16, 2011 at 7:55 PM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 16.09.2011 16:27, Jussi-Ville Heiskanen wrote:
So? What is your obsession with nudity about? The filter isn't about nudity.
You may not have noticed it. It _is_ about nudity _and_ many more controversial topics as well. Saying that it _is not_ about nudity would be a blatant lie.
Take a deep breath and read back up the chain. You started this thread, and I am doing my level best to try to stop it being derailed from the subject of the German vote and its repercussions.
On 16.09.2011 16:15, Milos Rancic wrote:
On Fri, Sep 16, 2011 at 13:45, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
On Fri, Sep 16, 2011 at 2:10 PM, Milos Rancic millosh@gmail.com wrote:
On Fri, Sep 16, 2011 at 12:50, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 16.09.2011 12:42, Milos Rancic wrote:
On Fri, Sep 16, 2011 at 12:39, emijrp emijrp@gmail.com wrote:
I think that we can make a nice move here. We can enable the image filter in the German Wikipedia for all those who don't want to see images that are copyrighted under German law, while allowing other people to see all of Commons' splendour. Using the image filter to improve the rights of readers of the German Wikipedia. Very cool, right? ; )
Oh, you found the first useful purpose of the image filter!
He did not. Optionally hiding the image wouldn't make it legal. The filter has nothing to do with this case.
Then, make it opt-out :P
And what if the English Wikipedia chooses to opt out?
It's about implementing the image filter on images which have copyright problems in Germany (but not in the US), not about nudity.
And it got awesomely off-topic. Now we are discussing an opt-in filter to allow copyrighted images on the German Wikipedia? Please stick to the topic.
On Fri, Sep 16, 2011 at 18:52, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 16.09.2011 16:15, Milos Rancic wrote:
It's about implementing the image filter on images which have copyright problems in Germany (but not in the US), not about nudity.
And it got awesomely off-topic. Now we are discussing an opt-in filter to allow copyrighted images on the German Wikipedia? Please stick to the topic.
Not opt-in, but opt-out :P
Sorry for hijacking the thread with a joke :)
I think that everything is so obvious that it has become tiresome to discuss it:
* There is a significant disproportion in position between editors with a couple of edits and the core of the community.
* It's not likely that it would be ~85% against, but a similar poll on the English Wikipedia would likely finish with ~60% against. Hypothetical referendums on projects in many European languages would finish similarly to the referendum on the German Wikipedia, as in this case the macho-patriarchal culture, dominant in large parts of Europe, corresponds with the libertarian positions dominant among the core editors.
* It's likely that staff and Board already know that a correlation between the results of the German Wikipedia referendum and the global survey could be drawn to support the previous two conclusions. Thus, they don't want to publish that part of the data.
* There is still a significant minority of core editors who want the filter at any cost.
* The Board is divided and doesn't know what to decide.
I would repeat the best possible solution to end this: implement it on the English Wikipedia -- you (those who want that filter) have some numbers which would support that action -- and leave the rest of the projects alone.
On 16 September 2011 18:13, Milos Rancic millosh@gmail.com wrote:
- It's likely that staff and Board already know that a correlation between the results of the German Wikipedia referendum and the global survey could be drawn to support the previous two conclusions. Thus, they don't want to publish that part of the data.
That's a terrible thing to think of them. Of course, it would be immediately alleviated by publishing the data.
That's a terrible thing to think of them. Of course, it would be immediately alleviated by publishing the data.
Hang on, I thought that http://wikimediafoundation.org/wiki/Values (which underpins the Mission) means that the WMF is obliged by its own published bylaws to openly publish the data in question?
Cheers, Fae
On 16.09.2011 19:54, Fae wrote:
That's a terrible thing to think of them. Of course, it would be immediately alleviated by publishing the data.
Hang on, I thought that http://wikimediafoundation.org/wiki/Values (which underpins the Mission) means that the WMF is obliged by its own published bylaws to openly publish the data in question?
Cheers, Fae
It's a secret poll, and so far only the summary of the questions (?-10) and a snapshot of the comments (not the comments themselves) have been released. Overall, very insufficient data. That's why I requested the results by project and the overall distribution of participation. It is a database query that takes seconds or minutes (if you're unlucky, some hours) and could still ensure anonymity, as I explained in the discussion on Philippe's talk page at Meta. [1] I made the same request on the discussion page of the "referendum" more than two weeks ago, to which Philippe replied. [2]
So after more than two weeks we still do not see any further results. Something that could make you nervous and very angry, if you consider the current situation.
[1] http://meta.wikimedia.org/wiki/User_talk:Philippe [2] http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/Results/en#Resul...
On 16.09.2011 19:13, Milos Rancic wrote:
- There is a significant disproportion in position between editors with a couple of edits and the core of the community.
That still has to be proven. I asked for localized (project-based) data from the poll to see whether there are huge cultural differences or whether there is a general bias towards the filter. This was more than two weeks ago, and I have repeatedly reminded Philippe to release this data. So far nothing has been released, and one excuse has followed the other. That's why I can't support or oppose your statement. But assuming that it is true would be as wrong as saying that it is false.
So I repeat my request again: "Philippe, can you hear me? Release the data as soon as possible; we need it."
- It's not likely that it would be ~85% against, but a similar poll on the English Wikipedia would likely finish with ~60% against. Hypothetical referendums on projects in many European languages would finish similarly to the referendum on the German Wikipedia, as in this case the macho-patriarchal culture, dominant in large parts of Europe, corresponds with the libertarian positions dominant among the core editors.
You would have to prove that your facts are indeed true. But if you accept that there is a huge difference between cultures, how can you impose a filter on a culture that doesn't need it or want it?
How would you expect to find a good compromise in decisions on what to filter and what not? Do you intend to put an extremist conservative Arab and the most liberal German inside the same room, close the door, go away, come back after two weeks and see whether they have found a compromise on yes or no? How should this work?
The referendum showed that cultural neutrality is important to the voters. But how do you expect to find a compromise between hell and heaven, without having hell and heaven inside the discussions at Commons on earth?
- It's likely that staff and Board already know that a correlation between the results of the German Wikipedia referendum and the global survey could be drawn to support the previous two conclusions. Thus, they don't want to publish that part of the data.
I doubt that. But if they do, I will call them "assholes for betrayal". Just to make that clear. It would also not fit the story of who has access to the data and who has not.
- There is still a significant minority of core editors who want the filter at any cost.
A "significant minority" is a curios choice of words.
"A significant minority tries to abolish the constitution by any cost". Now ask yourself if you would follow their wishes. Thats the same sentence, you said, with different actors. Still happy with it?
- The Board is divided and doesn't know what to decide.
We don't know what the board thinks. It does not communicate with us (the authors), it has not reacted to the discussions at Meta, it has not answered serious questions, and in general it is somewhere between a legend and a forgotten ghost that no one can see, even if present.
I would repeat the best possible solution to end this: implement it on the English Wikipedia -- you (those who want that filter) have some numbers which would support that action -- and leave the rest of the projects alone.
That would imply not implementing it on Commons. Otherwise the categorization/labeling/... could be misused by local providers inside regions that didn't intend to use this feature.
Tobias
On Fri, Sep 16, 2011 at 7:56 PM, Tobias Oelgarte <tobias.oelgarte@googlemail.com> wrote:
You would have to prove that your facts are indeed true. But if you accept that there is a huge difference between cultures, how can you impose a filter on a culture that doesn't need it or want it?
Just like a normal addition to MediaWiki: those who don't want to use it don't have to.
How would you expect to find a good compromise in decisions on what to filter and what not? Do you intend to put an extremist conservative Arab and the most liberal German inside the same room, close the door, go away, come back after two weeks and see whether they have found a compromise on yes or no? How should this work?
Quite simple: add one filter for each, and describe for each what it filters; then let every user decide for themselves whether to filter the one, the other, neither or both.
The referendum showed that cultural neutrality is important to the voters. But how do you expect to find a compromise between hell and heaven, without having hell and heaven inside the discussions at Commons on earth?
See above - if your filters are not almost the same, don't use the same filter, but create two different ones.
On Fri, Sep 16, 2011 at 08:19:05PM +0200, Andre Engels wrote:
On Fri, Sep 16, 2011 at 7:56 PM, Tobias Oelgarte <tobias.oelgarte@googlemail.com> wrote:
You would have to prove that your facts are indeed true. But if you accept that there is a huge difference between cultures, how can you impose a filter on a culture that doesn't need it or want it?
Just like a normal addition to MediaWiki: those who don't want to use it don't have to.
In this case there's this Evil Cat system we may need to set up. I don't want anyone to use that. Especially not evil people.
sincerely, Kim Bruning
On 16.09.2011 20:19, Andre Engels wrote:
On Fri, Sep 16, 2011 at 7:56 PM, Tobias Oelgarte <tobias.oelgarte@googlemail.com> wrote:
You would have to prove that your facts are indeed true. But if you accept that there is a huge difference between cultures, how can you impose a filter on a culture that doesn't need it or want it?
Just like a normal addition to MediaWiki: those who don't want to use it don't have to.
I would not have any problem if we did not play into the hands of censors (local ISPs, a simple proxy, regimes, institutions, ...) by actually labeling content as objectionable. That takes control over the content away from the user, while no censor would invest the money if they had to label the content themselves.
How would you expect to find a good compromise in decisions on what to filter and what not? Do you intend to put an extremist conservative Arab and and the most liberal German inside the same room, close the door, go away, come back after two weeks and look if they could find a compromise about Yes or No? How should this work?
Quite simple: add one filter for each, and describe for each what it filters; then let every user decide for themselves whether to filter the one, the other, neither or both.
You should know that there are hundreds of phobias, cultural conflicts and other categories of possibly objectionable content. Do you expect us to manage all these categories of filtering, or would you say that it will be narrowed down to be user-friendly and manageable, leaving out some categories and ignoring the complaints of some minorities?
The referendum showed that cultural neutrality is important to the voters. But how do you expect to find a compromise between hell and heaven, without having hell and heaven inside the discussions at Commons on earth?
See above - if your filters are not almost the same, don't use the same filter, but create two different ones.
See my comment above. Maybe we should put these questions together as one issue.
On Fri, Sep 16, 2011 at 9:13 PM, Tobias Oelgarte <tobias.oelgarte@googlemail.com> wrote:
I would not have any problem if we did not play into the hands of censors (local ISPs, a simple proxy, regimes, institutions, ...) by actually labeling content as objectionable. That takes control over the content away from the user, while no censor would invest the money if they had to label the content themselves.
So how do you expect those censors to use this?
How would you expect to find a good compromise in decisions on what to filter and what not? Do you intend to put an extremist conservative Arab and the most liberal German inside the same room, close the door, go away, come back after two weeks and see whether they have found a compromise on yes or no? How should this work?
Quite simple: add one filter for each, and describe for each what it filters; then let every user decide for themselves whether to filter the one, the other, neither or both.
You should know that there are hundreds of phobias, cultural conflicts and other categories of possibly objectionable content. Do you expect us to manage all these categories of filtering, or would you say that it will be narrowed down to be user-friendly and manageable, leaving out some categories and ignoring the complaints of some minorities?
The referendum showed that cultural neutrality is important to the voters. But how do you expect to find a compromise between hell and heaven, without having hell and heaven inside the discussions at Commons on earth?
See above - if your filters are not almost the same, don't use the same filter, but create two different ones.
See my comment above. Maybe we should put these questions together as one issue.
Sorry, I dropped some hot food on myself as I wrote this, and then apparently accidentally hit send.
On Fri, Sep 16, 2011 at 9:57 PM, Andre Engels andreengels@gmail.com wrote:
On Fri, Sep 16, 2011 at 9:13 PM, Tobias Oelgarte <tobias.oelgarte@googlemail.com> wrote:
I would not have any problem if we did not play into the hands of censors (local ISPs, a simple proxy, regimes, institutions, ...) by actually labeling content as objectionable. That takes control over the content away from the user, while no censor would invest the money if they had to label the content themselves.
So how do you expect those censors to use this?
You should know that there are hundreds of phobias, cultural conflicts and other categories of possibly objectionable content. Do you expect us to manage all these categories of filtering, or would you say that it will be narrowed down to be user-friendly and manageable, leaving out some categories and ignoring the complaints of some minorities?
I'd say, drop the idea that the filter is supposed to be perfect. A little-used filter can get rough content the first time around, preferably specified by the person asking for the filter; then people using the filter can suggest adding or removing images. Volunteers can go and work on the filters if they want, but if they don't, the filter will just be changed by such suggestions.
Then again, there is the alternative of only including filters with at least a certain amount of expected usage. I see no problem with not having a filter for everyone who asks for it. I don't think that doing things perfectly and not doing them at all are the only options.
I'd say, drop the idea that the filter is supposed to be perfect. A little-used filter can get rough content the first time around, preferably specified by the person asking for the filter; then people using the filter can suggest adding or removing images. Volunteers can go and work on the filters if they want, but if they don't, the filter will just be changed by such suggestions.
Then again, there is the alternative of only including filters with at least a certain amount of expected usage. I see no problem with not having a filter for everyone who asks for it. I don't think that doing things perfectly and not doing them at all are the only options.
I don't expect it to work perfectly. Nothing is perfect by default. But even if it worked perfectly, we would provide a simple tool (the filter labels/categories) to censors, improving their work, while we, the volunteers, would indirectly support them in doing so.
For example: the head of a group of people (a state, a religious group, ...) is trying to censor Wikipedia because it might damage his position. What would be easier than to claim on the mailing list that a filter for xyz is seriously needed? Now he can start to add images to this filter, calling on volunteers who have to obey. In the end we represent the opinion of the head of the group (not of the individuals, who fear the head), publish it as consensus and help them to justify their position.
What would someone living inside such a group think if the content is already labeled in a way that says he should not look at it? Isn't that social pressure put on the free mind, especially if other members of the group are around?
On Sat, Sep 17, 2011 at 6:16 AM, Andre Engels andreengels@gmail.com wrote:
I'd say, drop the idea that the filter is supposed to be perfect. A little-used filter can get rough content the first time around, preferably specified by the person asking for the filter; then people using the filter can suggest adding or removing images. Volunteers can go and work on the filters if they want, but if they don't, the filter will just be changed by such suggestions.
Indeed. I think some of the problems some people are predicting have been drastically exaggerated.
As long as the option to hide all images is also implemented, we can quite simply add a disclaimer when anyone goes to turn on a filter indicating that if complete exclusion is particularly important to them, they should choose the option to hide everything by default.
On 16.09.2011 22:53, Stephen Bain wrote:
On Sat, Sep 17, 2011 at 6:16 AM, Andre Engels andreengels@gmail.com wrote:
I'd say, drop the idea that the filter is supposed to be perfect. A little-used filter can get rough content the first time around, preferably specified by the person asking for the filter; then people using the filter can suggest adding or removing images. Volunteers can go and work on the filters if they want, but if they don't, the filter will just be changed by such suggestions.
Indeed. I think some of the problems some people are predicting have been drastically exaggerated.
As long as the option to hide all images is also implemented, we can quite simply add a disclaimer when anyone goes to turn on a filter indicating that if complete exclusion is particularly important to them, they should choose the option to hide everything by default.
Would we do that for text as well? Where is the fundamental difference between text and images? Both can be objectionable or offensive to some readers/viewers. Is there a real difference?
Wouldn't a simple button to hide all images be enough to reach our goal, without the need to introduce categories?
On 16/09/2011 23:55, Tobias Oelgarte wrote:
On 16.09.2011 22:53, Stephen Bain wrote:
Indeed. I think some of the problems some people are predicting have been drastically exaggerated.
As long as the option to hide all images is also implemented, we can quite simply add a disclaimer when anyone goes to turn on a filter indicating that if complete exclusion is particularly important to them, they should choose the option to hide everything by default.
Would we do that for text as well? Where is the fundamental difference between text and images? Both can be objectionable or offensive to some readers/viewers. Is there a real difference?
Wouldn't a simple button to hide all images be enough to reach our goal, without the need to introduce categories?
I have already demonstrated how the image filter could be used as a text filter. Incidentally, most of these arguments have already been laid out clearly on Meta, de: and en:.
On 16.09.2011 21:57, Andre Engels wrote:
On Fri, Sep 16, 2011 at 9:13 PM, Tobias Oelgarte <tobias.oelgarte@googlemail.com> wrote:
I would not have any problem if we did not play into the hands of censors (local ISPs, a simple proxy, regimes, institutions, ...) by actually labeling content as objectionable. That takes control over the content away from the user, while no censor would invest the money if they had to label the content themselves.
So how do you expect those censors to use this?
Just ask yourself what our Wikipedia interface would do. The server provides the pages (HTML documents with <img> tags) along with labels. Depending on the user's settings, some kind of JavaScript will hide the images. These "passed along" labels could simply be used to exclude the image as a whole, making the "show image" button disappear. Since Wikipedia serves more or less static pages, due to the caching it seriously needs, the labels will have to be passed along that way.
Now you should think about the topic and try to understand why this opens the door to a new kind of censorship. Blocking Wikipedia as a whole is a problem for most providers. It will cause users to change provider or to insist on having access to it. That is pressure put onto the access provider. The provider itself isn't able to filter the images or the content, since that takes a lot of working time, and time costs money. But if we choose to label the content for free, we open a new field for partial censorship. Users could still access Wikipedia, but they won't see everything. As a result there would be some complaints, but far fewer complaints than if Wikipedia weren't present at all.
A good compromise for a censor.
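To make the concern concrete, here is a minimal sketch of what such an intermediary could do, assuming hypothetical markup in which each image container carries its filter category as an attribute. The attribute name and the HTML structure are invented for illustration; the actual feature was never specified at this level of detail.

# Sketch of a censoring proxy that drops every image container whose
# (hypothetical) filter label matches a blocked category, caption and all.
import re

BLOCKED = {"nudity", "violence"}  # categories the censor wants gone

PATTERN = re.compile(
    r'<div class="thumb" data-filter-category="(?P<cat>[^"]+)">'
    r".*?</div>\s*</div>",  # lazily match through the caption div to the container's close
    re.DOTALL,
)

def censor(html: str) -> str:
    """Remove labeled containers, leaving no visible trace of the image."""
    return PATTERN.sub(
        lambda m: "" if m.group("cat") in BLOCKED else m.group(0), html
    )

page = """
<div class="thumb" data-filter-category="nudity">
  <img src="//upload.wikimedia.org/example.jpg">
  <div class="thumbcaption">Example caption</div>
</div>
<p>The article text passes through untouched.</p>
"""
print(censor(page))  # only the paragraph survives

The point of the sketch: once the labels travel with the page, the hard editorial work has already been done for the censor; the mechanical part is a few lines.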
How would you expect to find a good compromise in decisions on what to filter and what not? Do you intend to put an extremist conservative Arab and the most liberal German inside the same room, close the door, go away, come back after two weeks and see whether they have found a compromise on yes or no? How should this work?
Quite simple: add one filter for each, and describe for each what it filters; then let every user decide for themselves whether to filter the one, the other, neither or both.
You should know that there are hundreds of phobias, cultural conflicts and other categories of possibly objectionable content. Do you expect us to manage all these categories of filtering, or would you say that it will be narrowed down to be user-friendly and manageable, leaving out some categories and ignoring the complaints of some minorities?
The referendum showed that cultural neutrality is important to the voters. But how do you expect to find a compromise between hell and heaven, without having hell and heaven inside the discussions at Commons on earth?
See above - if your filters are not almost the same, don't use the same filter, but create two different ones.
See my comment above. Maybe we should put these questions together as one issue.
On Sat, Sep 17, 2011 at 6:21 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
Depending on the user's settings, some kind of JavaScript will hide the images. These "passed along" labels could simply be used to exclude the image as a whole, making the "show image" button disappear.
That would depend on the implementation, but even if the 'show image' button were not present, the caption (which includes a link to the image description page) would still be there, indicating that an image had been blocked.
The provider itself isn't able to filter the images or the content, since that takes a lot of working time, and time costs money. But if we choose to label the content for free, we open a new field for partial censorship.
Blocking of HTTP requests to images subject to any filters by an ISP or some other intermediary could be fairly trivially avoided by requesting the image from a mirror, or via a proxy etc. The community has plenty of talented JavaScript coders who could implement such a workaround.
Moreover, as above, the caption will still be present (and, depending on the implementation, the 'show image' button will be present but ineffective), so the user will know that an image has been blocked. To avoid this, the ISP or intermediary would have to alter the HTML in transit to remove the caption and conceal the censorship. But if they have the capability and the desire to do that, then there are many more potent avenues for censorship they could already engage in, particularly avenues involving modification of the article text. The marginal risk presented here does not seem to be high.
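For what it's worth, the workaround Stephen describes could be as simple as rewriting image URLs before the browser requests them. A sketch, with a hypothetical mirror hostname; in practice this would run as a browser user script rather than as standalone Python:

# Sketch only: reroute image requests from a blocked host to a mirror.
# "mirror.example.org" is hypothetical; no such mirror is implied to exist.
BLOCKED_HOST = "upload.wikimedia.org"
MIRROR_HOST = "mirror.example.org"

def reroute_images(html: str) -> str:
    """Point image URLs at the mirror; captions and buttons are untouched."""
    return html.replace(BLOCKED_HOST, MIRROR_HOST)

print(reroute_images('<img src="//upload.wikimedia.org/a/ab/Example.jpg">'))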
On Sat, Sep 17, 2011 at 6:32 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
What would someone living inside such a group think if the content is already labeled in a way that says he should not look at it? Isn't that social pressure put on the free mind, especially if other members of the group are around?
I find this 'social pressure to activate filters' line of argument quite flimsy. If a person were under such social pressure, how are they not already under enough pressure to avoid using Wikimedia projects (or at least articles where such pictures would be expected to be present) entirely?
On 16.09.2011 22:52, Stephen Bain wrote:
On Sat, Sep 17, 2011 at 6:21 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
Depending on the user's settings, some kind of JavaScript will hide the images. These "passed along" labels could simply be used to exclude the image as a whole, making the "show image" button disappear.
That would depend on the implementation, but even if the 'show image' button were not present, the caption (which includes a link to the image description page) would still be there, indicating that an image had been blocked.
Do you believe that it would help you if the image page itself were also blocked based on the same labeling, or that it wouldn't be easy to remove these traces as well? Speaking as a censor, this would have high priority for me. Showing others that I am trying to hide something? I would be stupid if I didn't do it that way. Leaving traces of what I have done would cause me possible problems.
That's why a censor would not be so stupid as to leave you a link to click on to view the image. He will remove the image together with its surrounding HTML elements.
The provider itself isn't able to filter the images or the content, since that takes a lot of working time, and time costs money. But if we choose to label the content for free, we open a new field for partial censorship.
Blocking of HTTP requests to images subject to any filters by an ISP or some other intermediary could be fairly trivially avoided by requesting the image from a mirror, or via a proxy etc. The community has plenty of talented JavaScript coders who could implement such a workaround.
Therefore you would have to make the request from a mirror. But where does your browser get the information about where to find the image? It lies within the filtered HTML document. Oops, sorry. It isn't there anymore, because it was filtered out.
Moreover, as above, the caption will still be present (and, depending on the implementation, the 'show image' button will be present but ineffective), so the user will know that an image has been blocked. To avoid this, the ISP or intermediary would have to alter the HTML in transit to remove the caption and conceal the censorship. But if they have the capability and the desire to do that, then there are many more potent avenues for censorship they could already engage in, particularly avenues involving modification of the article text. The marginal risk presented here does not seem to be high.
I already stated that it is trivial to remove the caption (the whole image and its container elements). Your JavaScript experts could write working code for that in half a day or less. That would not be a great effort. Altering text is much more difficult, because it isn't labeled. It changes over time. An image is included in a repeating pattern.
But now I must wonder about your argument. You say that the risk is marginal and that this isn't a high-priority issue, yet according to the Board's decision it is seen as a task of high importance. I wonder why that is.
On Sat, Sep 17, 2011 at 6:32 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
What would someone living inside such a group think if the content is already labeled as something he should not look at? Isn't that social pressure on the free mind, especially when other members of the group are around?
I find this 'social pressure to activate filters' line of argument quite flimsy. If a person were under such social pressure, why are they not already under enough pressure to avoid using Wikimedia projects (or at least articles where such pictures would be expected to be present) entirely?
For exactly the same reason that providers tend not to block Wikipedia entirely: blocking such projects partially, presenting only the 'suitable' information, is a much better choice. If you look at something on Wikipedia and no such feature exists, you need no excuse. But if there is a filter, you need an excuse for why you didn't use it.
On Fri, Sep 16, 2011 at 19:56, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 16.09.2011 19:13, Milos Rancic wrote:
You would have to prove that your facts are indeed true. But if you accept that there is such a huge difference between cultures, how can you impose a filter on a culture that doesn't need or want it?
Differences between cultures are not so relevant if we are talking about Wiki[pm]edians. Similar results could be expected everywhere. I mean, you won't find that one large enough project shows strong cultural differences in comparison to another. Wikipedian/Wikimedian culture doesn't necessarily connect people (although it does), but it creates a common set of values. While communities may differ, the reasons behind the differences are the same, just seen from different points of view.
How would you expect to find a good compromise in deciding what to filter and what not? Do you intend to put an extremely conservative Arab and the most liberal German in the same room, close the door, go away, come back after two weeks, and see whether they have found a compromise on yes or no? How is this supposed to work?
"Extremist conservative Arab" is not likely a Wikipedian. Pan-Arabist yes, but "extremist conservative" not. Besides that, there is no difference between extremist conservative German and extremist conservative Arab, although the first is more likely Wikipedian than the second. The main reason for the filter are extremist conservative Americans, although majority of Americans share libertarian ideas.
But I agree with you in the sense that more permissive cultures shouldn't suffer because of less permissive cultures. Then again, the problem is that Wikimedian culture is predominantly permissive, which is the main problem with the referendum.
- It's likely that staff and Board already know that a correlation between the results of the German Wikipedia referendum and the global survey could be drawn to support the previous two conclusions. Thus, they don't want to publish that part of the data.
I doubt that. But if they do, I will call them "assholes for betrayal", just to make that clear. It would also not suit the story of who has access to the data and who does not.
That's not betrayal, but fear. By now they simply don't know what to do, because they think that all the options are bad. But that's their problem. I would be lying if I said I don't enjoy it.
- There is still a significant minority of core editors who want the filter at any cost.
A "significant minority" is a curios choice of words.
"A significant minority tries to abolish the constitution by any cost". Now ask yourself if you would follow their wishes. Thats the same sentence, you said, with different actors. Still happy with it?
The image filter -- as designed for users -- is not a big deal. Thus, I don't have a strong opinion on the filter itself. Let them have it if they want it so much! But not on the projects which don't want it.
- Board is divided and doesn't know what to decide.
We don't know what the Board thinks. It does not communicate with us (the authors), it has not reacted to the discussions at Meta, it has not answered serious questions, and in general it is somewhere between a legend and a forgotten ghost that no one can see, even when present.
It's not so hard to guess if you followed them for some time:
- Ting: ambivalent; would be much happier without the whole drama
- Jan-Bart: not his business, will support whatever others support
- Phoebe: in favor
- Stu: not his business, will support whatever others support
- Bishakha: slightly in favor tactically, but very hesitant to do anything against the community's will
- Matt: doesn't know what's going on, as he doesn't read the Board's emails; will support whatever others support, but only after a phone call
- Sj: would turn a blind eye to the image filter, but against imposing it against the community's will
- Arne: would turn a blind eye to the image filter if it doesn't affect the German Wikipedia (as the German Wikipedia rejected it)
- Jimmy: in favor
- Kat: would turn a blind eye to the image filter, but against imposing it against the community's will
I would repeat the best possible solution to end this: Implement it on English Wikipedia -- you (those who want that filter) have some numbers which would support that action -- and leave the rest of the projects alone.
That would imply not implementing it on Commons. Otherwise the categorization/labeling/... could be misused by local providers inside regions that did not intend to use this feature.
True. But they would be able to use it even if it had been implemented just on English Wikipedia, as it would point to the images at upload.wikimedia.org. Interesting...
Anyway, that's too hard for me to think about. Fortunately, I finished fourth in the last election, so I don't have to think about it.
On Fri, Sep 16, 2011 at 09:01:04PM +0200, Milos Rancic wrote:
It's not so hard to guess if you followed them for some time:
- Ting: ambivalent; would be much happier without the whole drama
- Jan-Bart: not his business, will support whatever others support
- Phoebe: in favor
- Stu: not his business, will support whatever others support
- Bishakha: slightly in favor tactically, but very hesitant to do anything against the community's will
- Matt: doesn't know what's going on, as he doesn't read the Board's emails; will support whatever others support, but only after a phone call
- Sj: would turn a blind eye to the image filter, but against imposing it against the community's will
- Arne: would turn a blind eye to the image filter if it doesn't affect the German Wikipedia (as the German Wikipedia rejected it)
- Jimmy: in favor
- Kat: would turn a blind eye to the image filter, but against imposing it against the community's will
You're better than I am! I only got Phoebe, Sj, Kat, and Jimmy. How'd you get the rest?
sincerely, Kim Bruning
On Fri, Sep 16, 2011 at 20:17, Kim Bruning kim@bruning.xs4all.nl wrote:
On Fri, Sep 16, 2011 at 09:01:04PM +0200, Milos Rancic wrote:
It's not so hard to guess if you followed them for some time:
- Ting: ambivalent; would be much happier without the whole drama
- Jan-Bart: not his business, will support whatever others support
- Phoebe: in favor
- Stu: not his business, will support whatever others support
- Bishakha: slightly in favor tactically, but very hesitant to do anything against the community's will
- Matt: doesn't know what's going on, as he doesn't read the Board's emails; will support whatever others support, but only after a phone call
- Sj: would turn a blind eye to the image filter, but against imposing it against the community's will
- Arne: would turn a blind eye to the image filter if it doesn't affect the German Wikipedia (as the German Wikipedia rejected it)
- Jimmy: in favor
- Kat: would turn a blind eye to the image filter, but against imposing it against the community's will
You're better than I am! I only got Phoebe, Sj, Kat, and Jimmy. How'd you get the rest?
I am a telepath, and distance is not a problem for me :)
On 16.09.2011 21:01, Milos Rancic wrote:
On Fri, Sep 16, 2011 at 19:56, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
On 16.09.2011 19:13, Milos Rancic wrote:
You would have to prove that your facts are indeed true. But if you accept that there is such a huge difference between cultures, how can you impose a filter on a culture that doesn't need or want it?
Differences between cultures are not so relevant if we are talking about Wiki[pm]edians. Similar results could be expected everywhere. I mean, you won't find that one large enough project shows strong cultural differences in comparison to another. Wikipedian/Wikimedian culture doesn't necessarily connect people (although it does), but it creates a common set of values. While communities may differ, the reasons behind the differences are the same, just seen from different points of view.
This would imply that the referendum indeed asked the wrong questions. If everyone shared the same values, then I must wonder about the strong difference in results. We have a referendum which indicates that many are in favor of this feature (rating it as important), and we have a Meinungsbild at the German Wikipedia closed with 86% against the filter. This is a huge difference. If it is not based on cultures being so different, what would be the reason? The questions and their interpretation?
That is one of the reasons why I'm so interested in the per-project raw data and the overall participation (number of votes per project).
How would you expect to find a good compromise in deciding what to filter and what not? Do you intend to put an extremely conservative Arab and the most liberal German in the same room, close the door, go away, come back after two weeks, and see whether they have found a compromise on yes or no? How is this supposed to work?
"Extremist conservative Arab" is not likely a Wikipedian. Pan-Arabist yes, but "extremist conservative" not. Besides that, there is no difference between extremist conservative German and extremist conservative Arab, although the first is more likely Wikipedian than the second. The main reason for the filter are extremist conservative Americans, although majority of Americans share libertarian ideas.
But I agree with you in the sense that more permissive cultures shouldn't suffer because of less permissive cultures. Then again, the problem is that Wikimedian culture is predominantly permissive, which is the main problem with the referendum.
It was just an example (a deliberately pointed one). The current proposal (as represented inside the referendum) did not assume any cultural difference. My question about this is how we want to create filter categories which are culturally neutral. One common (easy to describe) example is nudity. What will be considered nude by a Catholic priest and by an ordinary atheist, both from Germany? Will they come to the same conclusion when they look at swimsuits? I guess we can assume that they would have different opinions and a need for discussion.
Did we need this discussion until now, and for all images? No, we did not. We discussed, on the articles, what would be a good illustration for the subject. But now we are not talking about whether something is a good illustration; we are talking about whether it is objectionable to someone else. We judge for others what they would find objectionable. That is inherently against the rule of NPOV. It isn't our job as an encyclopedia: we present the facts with a neutral attitude toward the topic, and we state the arguments of two or more sides. A filter knows only yes or no to this question. We would make a "final" decision about what people don't want to see. That is not our job!
- It's likely that staff and Board already know that a correlation between the results of the German Wikipedia referendum and the global survey could be drawn to support the previous two conclusions. Thus, they don't want to publish that part of the data.
I doubt that. But if they do, I will call them "assholes for betrayal", just to make that clear. It would also not suit the story of who has access to the data and who does not.
That's not betrayal, but fear. By now they simply don't know what to do, because they think that all the options are bad. But that's their problem. I would be lying if I said I don't enjoy it.
I would also have to lie, but some progress would be nice. We already presented different alternative models to Ting at the Nürnberg WikiCon 2011. So far his reaction described exactly what you think: they don't know what to do in the current situation. What we could do (I might not speak for all German users, but for many) is to implement a very simplified approach: a simple button to hide all images, or none. If you think you might be reading about a topic that could be controversial, you could enable this function and then display only those images you are sure will not offend you. You could show articles to your children without the fear that some image might slip through the filter. Additionally, we would have no problems with NPOV and no additional work. It would be very easy to implement. We would not play into the hands of censors or potential censors, and we would solve many of the issues. (Muhammad? No images, no Muhammad. Children? No images, no child will see porn.) And there would be no need to invest valuable time in the feature and a big categorization process, with small or large wars inside this new playground.
The only thing this proposal can't do: it won't make "ugly words" (words you don't want to read) disappear. But that was never under consideration from the beginning.
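A rough sketch of how little machinery such an all-or-nothing switch would need; the hide-all-images control is hypothetical, and the preference is kept in localStorage so it survives page loads:

    // One switch that hides or shows every image. No per-image tagging or
    // categorization is needed, which is the point of the proposal.
    function setImagesHidden(hidden) {
      document.querySelectorAll('img').forEach(function (img) {
        img.style.visibility = hidden ? 'hidden' : '';
      });
      localStorage.setItem('hideAllImages', hidden ? '1' : '0');
    }

    // Restore the saved preference, then wire up the (hypothetical) toggle.
    setImagesHidden(localStorage.getItem('hideAllImages') === '1');
    document.getElementById('hide-all-images').addEventListener('click', function () {
      setImagesHidden(localStorage.getItem('hideAllImages') !== '1');
    });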
- There is still a significant minority of core editors who want the filter at any cost.
A "significant minority" is a curios choice of words.
"A significant minority tries to abolish the constitution by any cost". Now ask yourself if you would follow their wishes. Thats the same sentence, you said, with different actors. Still happy with it?
The image filter -- as designed for users -- is not a big deal. Thus, I don't have a strong opinion on the filter itself. Let them have it if they want it so much! But not on the projects which don't want it.
I see the problem as the devil being in the details. It is not the filtering at the level of a single project; it is the whole infrastructure (not only the technical one) which creates the problems. It is the prejudice about content, the labeling of content as objectionable, and so on. This will still happen even if the German community does not participate or use/enable the filter.
The basic thought process behind this whole idea is what _I'm_ opposed to.
- Board is divided and doesn't know what to decide.
We don't know what the Board thinks. It does not communicate with us (the authors), it has not reacted to the discussions at Meta, it has not answered serious questions, and in general it is somewhere between a legend and a forgotten ghost that no one can see, even when present.
It's not so hard to guess if you followed them for some time:
- Ting: ambivalent; would be much happier without the whole drama
- Jan-Bart: not his business, will support whatever others support
- Phoebe: in favor
- Stu: not his business, will support whatever others support
- Bishakha: slightly in favor tactically, but very hesitant to do anything against the community's will
- Matt: doesn't know what's going on, as he doesn't read the Board's emails; will support whatever others support, but only after a phone call
- Sj: would turn a blind eye to the image filter, but against imposing it against the community's will
- Arne: would turn a blind eye to the image filter if it doesn't affect the German Wikipedia (as the German Wikipedia rejected it)
- Jimmy: in favor
- Kat: would turn a blind eye to the image filter, but against imposing it against the community's will
I don't know where you got this information, but I would not be surprised if it is as you present it. At least in the cases of Ting and Jimbo you are probably right. Over time I have learned about Jimbo and his attitude towards such topics, so I have no doubt that he would trade intellectual freedom for some more donations.
I saw Ting's reaction at the WikiCon, and I have to say that he wasn't really prepared and was not happy with the drama. But saying that it is already decided, no matter what the German community says, was a punch in the face of the audience. I guess the audience will remember that.
I would repeat the best possible solution to end this: Implement it on English Wikipedia -- you (those who want that filter) have some numbers which would support that action -- and leave the rest of the projects alone.
That would imply not implementing it on Commons. Otherwise the categorization/labeling/... could be misused by local providers inside regions that did not intend to use this feature.
True. But they would be able to use it even if it had been implemented just on English Wikipedia, as it would point to the images at upload.wikimedia.org. Interesting...
Anyway, that's too hard for me to think about. Fortunately, I finished fourth in the last election, so I don't have to think about it.
That is my main personal issue with the whole filter idea: it is based on arbitrary, non-neutral labeling of content, with POV as the measure of judgment.
On Fri, Sep 16, 2011 at 10:09 PM, Tobias Oelgarte < tobias.oelgarte@googlemail.com> wrote:
This would imply that the referendum indeed asked the wrong questions. If everyone shared the same values, then I must wonder about the strong difference in results. We have a referendum which indicates that many are in favor of this feature (rating it as important), and we have a Meinungsbild at the German Wikipedia closed with 86% against the filter. This is a huge difference. If it is not based on cultures being so different, what would be the reason? The questions and their interpretation?
There might be a difference because of the differences in voting requirements: those were very low for the 'referendum', so a possibly large percentage of voters were not hardcore Wikimedians but mostly readers who at most occasionally edit. On the other hand, this would also increase the chance of sockpuppetry. Another reason could indeed be the questions themselves: opponents of the plan may not have voted in the referendum because the whole issue seemed to have been decided anyway. Then again, proponents might be less likely to vote in the German poll because it is non-anonymous, in an environment which seemed opposed to their point of view.
It was just an example (a deliberately pointed one). The current proposal (as represented inside the referendum) did not assume any cultural difference. My question about this is how we want to create filter categories which are culturally neutral. One common (easy to describe) example is nudity. What will be considered nude by a Catholic priest and by an ordinary atheist, both from Germany? Will they come to the same conclusion when they look at swimsuits? I guess we can assume that they would have different opinions and a need for discussion.
As said before, just get different categories, and let people choose among them. The priest could then choose to block "full nudity", "female toplessness", "people in underwear" and "people in swimwear", but not "images containing naked bellies" or "unveiled women", whereas the atheist could for example choose to only block "photographs of sexual organs" and watch the rest.
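In code, that per-reader choice would reduce to checking an image's labels against the reader's personal block list. A sketch, using the example category names above as hypothetical labels:

    // An image is hidden for a reader only if one of its labels is in that
    // reader's personal list of blocked categories.
    var priestPrefs  = ['full nudity', 'female toplessness',
                        'people in underwear', 'people in swimwear'];
    var atheistPrefs = ['photographs of sexual organs'];

    function isHidden(imageLabels, blockedCategories) {
      return imageLabels.some(function (label) {
        return blockedCategories.indexOf(label) !== -1;
      });
    }

    isHidden(['people in swimwear'], priestPrefs);  // true: hidden for the priest
    isHidden(['people in swimwear'], atheistPrefs); // false: shown to the atheist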
Did we need this discussion until now, and for all images? No, we did not. We discussed, on the articles, what would be a good illustration for the subject. But now we are not talking about whether something is a good illustration; we are talking about whether it is objectionable to someone else. We judge for others what they would find objectionable. That is inherently against the rule of NPOV. It isn't our job as an encyclopedia: we present the facts with a neutral attitude toward the topic, and we state the arguments of two or more sides. A filter knows only yes or no to this question. We would make a "final" decision about what people don't want to see. That is not our job!
I find it strange that you consider this an objection to a filter. Surely giving someone an imperfect choice about what they consider objectionable means deciding _less_ for them than judging in advance that nothing is objectionable?
I don't know where you got this information, but I would not be surprised if it is as you present it. At least in the cases of Ting and Jimbo you are probably right. Over time I have learned about Jimbo and his attitude towards such topics, so I have no doubt that he would trade intellectual freedom for some more donations.
How are we giving away intellectual freedom with this?
That is my main personal issue with the whole filter idea: it is based on arbitrary, non-neutral labeling of content, with POV as the measure of judgment.
What is POV about labelling something as an image containing a nude human, or as an illustration meant to represent a religious figure?
On 16.09.2011 22:37, Andre Engels wrote:
There might be a difference because of the differences in voting requirements: those were very low for the 'referendum', so a possibly large percentage of voters were not hardcore Wikimedians but mostly readers who at most occasionally edit. On the other hand, this would also increase the chance of sockpuppetry. Another reason could indeed be the questions themselves: opponents of the plan may not have voted in the referendum because the whole issue seemed to have been decided anyway. Then again, proponents might be less likely to vote in the German poll because it is non-anonymous, in an environment which seemed opposed to their point of view.
The requirements were different, that's right, but I don't believe that this would influence the overall outcome. I monitored some articles about sexual topics for a while (especially Hentai, Lolicon, Futanari, and so on) that had controversial images in them. Additionally, I looked up the number of views per day and compared it with the number of complaints over time. In the last year we had one brief request to remove the well-known image from Lolicon because it was considered porn (which it isn't). That was the first and only complaint in the whole time, despite the fact that each of these articles gets more than 1,000 views a day.
On en-WP the story is a little different. The articles get on average double or triple the number of viewers, but you will see at least one complaint per month per article. That is still very, very low: about one complaint per approximately 90,000 views (including the constantly repeating complainers). Even assuming that only 1 in 100 offended viewers actually leaves a comment, that is still only about 100 in 90,000, i.e. approximately 1 in 1,000 (0.1%), who are offended.
This gives a vague indication of how many users the filter might actually be a helpful tool for. Compared to the German Wikipedia, that rate is still relatively high. (I did not count the zeros after the decimal point.)
Given that, can we assume that the filter is really necessary, or that the different voting requirements of the two polls are a valid explanation for the different results?
If you compare my simple (not watertight) calculation to the result of the German poll, you might think it can't be right, since 'only' 86% opposed the filter. That's right, but you also have to consider the public comments. Only a few found it important that the filter be implemented because they see a use in it. Most of the non-opposing votes (the 14%) were along the lines of 'let them do it, it won't hurt', and not strictly in favor of the filter itself.
It was just an example (a deliberately pointed one). The current proposal (as represented inside the referendum) did not assume any cultural difference. My question about this is how we want to create filter categories which are culturally neutral. One common (easy to describe) example is nudity. What will be considered nude by a Catholic priest and by an ordinary atheist, both from Germany? Will they come to the same conclusion when they look at swimsuits? I guess we can assume that they would have different opinions and a need for discussion.
As said before, just get different categories, and let people choose among them. The priest could then choose to block "full nudity", "female toplessness", "people in underwear" and "people in swimwear", but not "images containing naked bellies" or "unveiled women", whereas the atheist could for example choose to only block "photographs of sexual organs" and watch the rest.
And again (now I must repeat myself) I have to say that we are not able to manage such a number of categories. Every category drastically increases and complicates the work. But even if we were able to categorize millions of images, we would still have to present the options to a reader who is willing to filter the content. A dialog with hundreds or more categories to sort through does not seem to be a viable option. Who would really take the time to go through all the options as an IP user, who loses their settings when switching browsers or workplaces? That would be anything but user-friendly.
So you will have to make a compromise (I assume 10-20 categories), in which you don't have this finely graded model. You alone proposed 7 categories just for nudity. What about other topics? You will easily reach high numbers.
That is a high effort for categorization/labeling as well as for the user. It would just not work that way, and I don't believe that we can proceed under the assumption of having an infinite number of labels.
Did we need this discussion until now, and for all images? No, we did not. We discussed, on the articles, what would be a good illustration for the subject. But now we are not talking about whether something is a good illustration; we are talking about whether it is objectionable to someone else. We judge for others what they would find objectionable. That is inherently against the rule of NPOV. It isn't our job as an encyclopedia: we present the facts with a neutral attitude toward the topic, and we state the arguments of two or more sides. A filter knows only yes or no to this question. We would make a "final" decision about what people don't want to see. That is not our job!
I find it strange that you consider this an objection to a filter. Surely giving someone an imperfect choice about what they consider objectionable means deciding _less_ for them than judging in advance that nothing is objectionable?
Judging that nothing is objectionable is our job as the writers of an encyclopedia. We do not hide facts based on the view that they might be objectionable to some users. This is the most basic rule that every encyclopedia has followed, unless it was under the control of a dictatorial system.
That's exactly why library organizations and the editorial staffs of encyclopedias are opposed to labeling content with non-neutral categories. For example, have a look at
http://www.ala.org/ala/issuesadvocacy/intfreedom/librarybill/interpretations...
which has held for more than 50 years that labeling by objectionableness violates the basic rules of intellectual freedom. We have the same rules; one of them is NPOV, which covers this topic as well.
I don't know where you got this information, but I would not be surprised if it is as you present it. At least in the cases of Ting and Jimbo you are probably right. Over time I have learned about Jimbo and his attitude towards such topics, so I have no doubt that he would trade intellectual freedom for some more donations.
How are we giving away intellectual freedom with this?
Follow the link above and you will hopefully understand what this is about. Intellectual freedom means presenting knowledge in a way that respects the facts but does not defer to the audience and its wishes or likings. It means presenting knowledge without bending before people who think that something might be objectionable. Not censoring yourself, and making knowledge available to everybody: that is the definition of intellectual freedom.
That is my main personal issue with the whole filter idea: it is based on arbitrary, non-neutral labeling of content, with POV as the measure of judgment.
What is POV about labelling something as an image containing a nude human, or as an illustration meant to represent a religious figure?
The POV starts when you draw a line between what is nude and what is not. Is someone wearing a half-transparent swimsuit nude? (yes/no) Is it still (not) nude if the fabric is a tiny bit thinner/thicker? (yes/no)
You see that even this simple question can't be answered with a simple yes or no. Now think about what you use to separate nude content from non-nude content. Do you have a strict catalog with criteria to follow? Isn't it your point of view where nudity begins and where it ends? Do you think that your neighbor would come to exactly the same conclusion? He won't. The two views will not lie far apart, but this is also the simplest question.
What about violence? Is a boxing match inside a ring violence? Does an image from after the fight depict violence, even though there is blood on the floor and both boxers are shaking hands? Now it gets a little more complicated.
Now we could move on to the next topic, and I can assure you that it won't get easier, only much harder. You could play the judge, but I can guarantee that you won't be able to claim neutrality on these matters anymore.
To be truthful to yourself, you will have to admit that you chose "nudity" as the example because it is much easier to handle than any other topic. And still you needed a lot of categories to divide between different grades of nudity.
André Engels wrote:
As said before, just get different categories, and let people choose among them. The priest could then choose to block "full nudity", "female toplessness", "people in underwear" and "people in swimwear", but not "images containing naked bellies" or "unveiled women", whereas the atheist could for example choose to only block "photographs of sexual organs" and watch the rest.
As Tobias Oelgarte noted, it simply isn't feasible to categorize images in this manner (keeping in mind that those are merely examples of the countless categories that we would need to apply to millions of images, with thousands more uploaded every day), let alone to present such a large quantity of filter options to readers.
I find it strange that you consider this an objection to a filter. Surely giving someone an imperfect choice about what they consider objectionable means deciding _less_ for them than judging in advance that nothing is objectionable?
You're mischaracterizing the status quo. We haven't determined that "nothing is objectionable" to anyone; we rightly assume that _everything_ is potentially objectionable to someone (and refrain from favoring certain objections over others).
What is POV about labelling something as an image containing a nude human, or as an illustration meant to represent a religious figure?
Tobias Oelgarte described one key problem. Another lies in the labeling of some things and not others. Unless we were to create and apply a label for literally everything that someone finds objectionable, we'd be taking the non-neutral position that only certain objections (the ones for which filters exist) are reasonable.
You mentioned a hypothetical "unveiled women" category. Do you honestly believe that the idea of tagging images in this manner is remotely realistic?
What about images depicting miscegenation (another concept to which many people strongly object)? Are we to have such a category?
David Levy
On Sat, Sep 17, 2011 at 8:16 PM, David Levy lifeisunfair@gmail.com wrote:
I find it strange that you consider this an objection to a filter. Surely giving someone an imperfect choice about what they consider objectionable means deciding _less_ for them than judging in advance that nothing is objectionable?
You're mischaracterizing the status quo. We haven't determined that "nothing is objectionable" to anyone; we rightly assume that _everything_ is potentially objectionable to someone (and refrain from favoring certain objections over others).
Thereby giving those who have objections nothing, just because there are others whom we can't give what they want. If we had the same attitude towards article creation, we would not have published Wikipedia until we had articles on every subject we could think of.
What is POV about labelling something as an image containing a nude human, or as an illustration meant to represent a religious figure?
Tobias Oelgarte described one key problem. Another lies in the labeling of some things and not others. Unless we were to create and apply a label for literally everything that someone finds objectionable, we'd be taking the non-neutral position that only certain objections (the ones for which filters exist) are reasonable.
We don't say they're unreasonable; we say that we don't cater to them, or at least not yet. That may be non-neutral, but no more non-neutral than the fact that one subject has an article and another does not, or that one picture is used to illustrate an article and another is not, or that one category is deemed important enough to be used to categorize our articles, books, words and images and another is not.
Or even clearer: that one language has a Wikipedia and another does not. Did we make a non-neutral choice that only certain languages (the ones for which Wikipedias exist) are valid languages for spreading knowledge?
You mentioned a hypothetical "unveiled women" category. Do you honestly believe that the idea of tagging images in this manner is remotely realistic?
I'd say it is, provided there are people wanting to use the filter, and not minding the fact that in the beginning it will be far from perfect.
What about images depicting miscegenation (another concept to which many people strongly object)? Are we to have such a category?
I'd say if there are people actually wanting to use such a filter, then yes, I would think we might well get one.
I wrote:
You're mischaracterizing the status quo. We haven't determined that "nothing is objectionable" to anyone; we rightly assume that _everything_ is potentially objectionable to someone (and refrain from favoring certain objections over others).
André Engels replied:
Thereby giving those who have objections nothing just because there are others who we can't give what they want.
I don't advocate maintaining the status quo. I support an alternative image filter implementation (endorsed by WMF trustee Samuel Klein), which would accommodate _everyone_ and require no determinations on the part of the community (let alone analysis/tagging of millions of files, with thousands more uploaded every day).
Please see the relevant discussion: http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/en/Categories#ge... or http://goo.gl/t6ly5
Unless we were to create and apply a label for literally everything that someone finds objectionable, we'd be taking the non-neutral position that only certain objections (the ones for which filters exist) are reasonable.
We don't say they're unreasonable; we say that we don't cater to them, or at least not yet.
On what basis shall we determine the objections to which we cater?
Presumably, we would start by weeding out objections held by relatively small numbers of people. (This is problematic, but let's set aside that issue.)
Then what? To select only certain objections from the resultant pool, what criterion other than reasonableness (a completely subjective judgement) remains?
You might argue that we needn't narrow the aforementioned pool, but a large number of filter categories is unsustainable (both in its creation/maintenance and in its presentation to readers).
That may be non-neutral, but no more non-neutral than the fact that one subject has an article and another does not, or that one picture is used to illustrate an article and another is not, or that one category is deemed important enough to be used to categorize our articles, books, words and images and another is not.
All of those decisions are based — at least in part — on objective criteria. More importantly, all of those decisions are intrinsic to the WMF projects' operation.
Conversely, the image filter implementation that you advocate is neither essential nor technically feasible.
Or even clearer: that one language has a Wikipedia and another does not. Did we make a non-neutral choice that only certain languages (the ones for which Wikipedias exist) are valid languages for spreading knowledge?
You're describing an imbalance that exists through misfortune, *not* by design. Ideally, we would include every written language utilized as a primary means of communication. Sadly, some projects simply aren't viable (because they lack sufficient user bases).
No analogous situation forces us to treat readers differently based on their personal beliefs regarding what images are/aren't objectionable.
You mentioned a hypothetical "unveiled women" category. Do you honestly believe that the idea of tagging images in this manner is remotely realistic?
I'd say it is, provided there are people wanting to use the filter, and not minding the fact that in the beginning it will be far from perfect.
So we eventually will analyze millions of images (and monitor the thousands uploaded on a daily basis) to tag each and every one containing an unveiled woman?
What about images depicting miscegenation (another concept to which many people strongly object)? Are we to have such a category?
I'd say if there are people actually wanting to use such a filter, then yes, I would think we might well get one.
I admire your consistency, but I regard this approach as stupendously infeasible.
David Levy
On Sun, Sep 18, 2011 at 12:20 AM, David Levy lifeisunfair@gmail.com wrote:
I wrote:
You're mischaracterizing the status quo. We haven't determined that "nothing is objectionable" to anyone; we rightly assume that _everything_ is potentially objectionable to someone (and refrain from favoring certain objections over others).
André Engels replied:
No analogous situation forces us to treat readers differently based on their personal beliefs regarding what images are/aren't objectionable.
You mentioned a hypothetical "unveiled women" category. Do you honestly believe that the idea of tagging images in this manner is remotely realistic?
I'd say it is, provided there are people wanting to use the filter, and not minding the fact that in the beginning it will be far from perfect.
So we eventually will analyze millions of images (and monitor the thousands uploaded on a daily basis) to tag each and every one containing an unveiled woman?
What about images depicting miscegenation (another concept to which many people strongly object)? Are we to have such a category?
I'd say if there are people actually wanting to use such a filter, then yes, I would think we might well get one.
Wikimedia *used* to hold the position that we would not aid China in blocking images of the Tiananmen Massacre, and we went to great lengths to ensure that Chinese users of Wikipedia could evade blocks on viewing. I am not sure you are on the right track with regard to our traditions and values here.
On Sun, Sep 18, 2011 at 3:49 AM, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
Wikimedia *used* to hold the position that we would not aid China in blocking images of the Tiananmen Massacre, and we went to great lengths to ensure that Chinese users of Wikipedia could evade blocks on viewing. I am not sure you are on the right track with regard to our traditions and values here.
There's a big difference between the two: the Chinese case was about people wanting to decide what _others_ could see, while the filter is about people wanting to decide what _they themselves_ will see.
On Sun, Sep 18, 2011 at 10:46 AM, Andre Engels andreengels@gmail.com wrote:
On Sun, Sep 18, 2011 at 3:49 AM, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
Wikimedia *used* to hold the position that we would not aid China in blocking images of the Tiananmen Massacre, and we went to great lengths to ensure that Chinese users of Wikipedia could evade blocks on viewing. I am not sure you are on the right track with regard to our traditions and values here.
There's a big difference between the two: the Chinese case was about people wanting to decide what _others_ could see, while the filter is about people wanting to decide what _they themselves_ will see.
Okay. Is there a commitment on the part of the Foundation that they will help students well past any reasonable age of consent get past the filters, when schools with a religious orientation use our filtering scheme and the usual browser add-ons to make material on Wikipedia impossible to view?
Once you tag something, you lose control over what that tag is used for. We might be talking about primary schools, but we could just as easily be blocked by colleges and high schools, where limiting hormonal damage would be a joke.
On Sun, Sep 18, 2011 at 11:44 AM, Jussi-Ville Heiskanen < cimonavaro@gmail.com> wrote:
Okay. Is there a commitment on the part of the Foundation that they will help students well past any reasonable age of consent get past the filters, when schools with a religious orientation use our filtering scheme and the usual browser add-ons to make material on Wikipedia impossible to view?
Not as far as I know.
Once you tag something, you lose control over what that tag is used for. We might be talking about primary schools, but we could just as easily be blocked by colleges and high schools, where limiting hormonal damage would be a joke.
So? The plans as they are include giving the user the option to remove the filter, or see blocked images, whenever they like.
On 18.09.2011 14:02, Andre Engels wrote:
On Sun, Sep 18, 2011 at 11:44 AM, Jussi-Ville Heiskanen< cimonavaro@gmail.com> wrote:
Okay. Is there a commitment on the part of the Foundation that they will help students well past any reasonable age of consent get past the filters, when schools with a religious orientation use our filtering scheme and the usual browser add-ons to make material on Wikipedia impossible to view?
Not as far as I know.
Once you tag something, you lose control over what that tag is used for. We might be talking about primary schools, but we could just as easily be blocked by colleges and high schools, where limiting hormonal damage would be a joke.
So? The plans as they are include giving the user the option to remove the filter, or see blocked images, whenever they like.
While such content is excluded/filtered out by the local school network? Then their choice is between not viewing the image with the filter enabled and not viewing the image with the filter disabled...
On 18.09.2011 09:46, Andre Engels wrote:
On Sun, Sep 18, 2011 at 3:49 AM, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
Wikimedia *used* to hold the position that we would not aid China in blocking images of the Tiananmen Massacre, and we went to great lengths to ensure that Chinese users of Wikipedia could evade blocks on viewing. I am not sure you are on the right track with regard to our traditions and values here.
There's a big difference between the two: the Chinese case was about people wanting to decide what _others_ could see, while the filter is about people wanting to decide what _they themselves_ will see.
And who decides which image belongs in which category: the one who uses the filter, or the one who tags the image?
Additionally: does the reader get any choice if China uses the tags to exclude content before it ever reaches the reader? Wouldn't we be responsible if the feature were misused this way, since we know how easily it can be misused?
On Sun, Sep 18, 2011 at 11:45 AM, Tobias Oelgarte < tobias.oelgarte@googlemail.com> wrote:
On 18.09.2011 09:46, Andre Engels wrote:
On Sun, Sep 18, 2011 at 3:49 AM, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
Wikimedia *used* to hold the position that we would not aid China in blocking images of the Tiananmen Massacre, and we went to great lengths to ensure that Chinese users of Wikipedia could evade blocks on viewing. I am not sure you are on the right track with regard to our traditions and values here.
There's a big difference between the two: the Chinese case was about people wanting to decide what _others_ could see, while the filter is about people wanting to decide what _they themselves_ will see.
And who decides which image belongs in which category: the one who uses the filter, or the one who tags the image?
In itself, the one who tags the image; but we happen to have a system for that in Wikimedia. It is called discussion and trying to reach consensus. Who decides whether a page is in a category? Who decides whether a page has an image? Who decides whether something is described on a page? All the same.
Additionally: does the reader get any choice if China uses the tags to exclude content before it ever reaches the reader? Wouldn't we be responsible if the feature were misused this way, since we know how easily it can be misused?
I don't think it's that easy, and if it were, the better response would be to make it harder to misuse rather than to throw out the baby with the bathwater.
On 18.09.2011 13:56, Andre Engels wrote:
In itself, the one who tags the image; but we happen to have a system for that in Wikimedia. It is called discussion and trying to reach consensus. Who decides whether a page is in a category? Who decides whether a page has an image? Who decides whether something is described on a page? All the same.
Our typical system of categories is designed to make it easier to /find/ (related) articles or media. Good luck trying that with a system that is designed to /hide/ things. And this doesn't seem like an awful waste of precious time to you? For a feature that is not all that likely to be popular on a global scale?
Regards, Oliver
2011/9/18 Oliver Koslowski o.nee@t-online.de:
On 18.09.2011 13:56, Andre Engels wrote:
In itself, the one who tags the image; but we happen to have a system for that in Wikimedia. It is called discussion and trying to reach consensus. Who decides whether a page is in a category? Who decides whether a page has an image? Who decides whether something is described on a page? All the same.
Our typical system of categories is designed to make it easier to /find/ (related) articles or media. Good luck trying that with a system that is designed to /hide/ things. And this doesn't seem like an awful waste of precious time to you? For a feature that is not all that likely to be popular on a global scale?
+1 At the beginning, I was quite neutral about a filter: I had no idea how it would work, and I wouldn't use it, but what if somebody else wants it?
But after reading nearly all the comments on this list, I think that the arguments for a filter do not hold water. The practical implementation would be a nightmare, and the purpose is not really within the Wikimedia mission. The thread above on how to create categories for a filter is full of irrational assumptions, impracticable propositions, and impossible solutions. It seems it is time to drop the whole idea...
Regards, Oliver
Regards,
Yann
2011/9/18 Oliver Koslowski o.nee@t-online.de:
On 18.09.2011 13:56, Andre Engels wrote:
In itself, the one who tags the image; but we happen to have a system for that in Wikimedia. It is called discussion and trying to reach consensus. Who decides whether a page is in a category? Who decides whether a page has an image? Who decides whether something is described on a page? All the same.
Our typical system of categories is designed to make it easier to /find/ (related) articles or media. Good luck trying that with a system that is designed to /hide/ things. And this doesn't seem like an awful waste of precious time to you? For a feature that is not all that likely to be popular on a global scale?
+1 At the beginning, I was quite neutral about a filter: I had no idea how it would work, and I wouldn't use it, but what if somebody else wants it?
But after reading nearly all the comments on this list, I think that the arguments for a filter do not hold water. The practical implementation would be a nightmare, and the purpose is not really within the Wikimedia mission. The thread above on how to create categories for a filter is full of irrational assumptions, impracticable propositions, and impossible solutions. It seems it is time to drop the whole idea...
Regards, Oliver
Regards,
Yann
I agree.
I do support "censorship". There is absolutely no excuse for hosting an image of Mohammad as a dog, but this is a Rube Goldburg boondoggle.
Fred
On 18.09.2011 15:58, Fred Bauder wrote:
I do support "censorship". There is absolutely no excuse for hosting an image of Mohammad as a dog, but this is a Rube Goldburg boondoggle.
Nothing wrong with hosting that picture (http://en.wikipedia.org/wiki/File:Muh-hund-original-rondellliten.JPG ). It's really about where it's used.
Should it be used in the article on Muhammad? Heck no. But in the article (http://en.wikipedia.org/wiki/Lars_Vilks_Muhammad_drawings_controversy ) describing the controversy that arose around that very picture? Sure enough, because it serves an encyclopedic purpose there.
Regards, Oliver
On 18 September 2011 14:38, Yann Forget yannfo@gmail.com wrote:
At the beginning, I was quite neutral about a filter: I had no idea how it would work, and I wouldn't use it, but what if somebody else wants it? But after reading nearly all comments on this list, I think that the arguments for a filter do not hold water. The pratical implemention would be a nightmare, and the purpose not really within Wikimedia mission. The thread above on how to create categories for a filter is full of irrational assumptions, impracticable propositions, and impossible solutions. It seems it is time to drop the whole idea...
The problem is that "offensive image" is a magical category.
The concept of a "magical category" is useful and important. It's something that sounds simple in ordinary language, but turns out to be a nightmare to implement and result in stupidity.
I got the phrase from http://lesswrong.com/lw/td/magical_categories/ , which discusses the problem of magical categories in artificial intelligence. But it's vastly useful in general.
A magical category is something that is put forward as objective, and thus reducible to computer instructions, but everyone turns out to have a different subjective interpretation of it. Worse yet, there is not even a reducible way to process as-yet-unknown examples, other than "I know it when I see it".
Magical categories are common in rhetoric and interpersonal human politics. An example is when someone appears to solve a problem in words, but you go "what!" because you know the details are subjective, squishy and not actually reducible in any way at all.
Apart from a simple "all images on/off" filter, *every* proposed offensive image category in this discussion has been a magical category: subjective, individual, argued over, and with a ton of "I know it when I see it."
In passing this resolution, the board appears to have fallen for the illusion that magical categories are possible to implement. This is unfortunate.
- d.
On Sun, Sep 18, 2011 at 3:02 PM, Oliver Koslowski o.nee@t-online.de wrote:
On 18.09.2011 13:56, Andre Engels wrote:
In itself, the one who tags the image; but we happen to have a system for that in Wikimedia. It is called discussion and trying to reach consensus. Who decides whether a page is in a category? Who decides whether a page has an image? Who decides whether something is described on a page? All the same.
Our typical system of categories is designed to make it easier to /find/ (related) articles or media. Good luck trying that with a system that is designed to /hide/ things.
I don't see a difference. Either I want to see images showing so-and-so, or I do not. It's all about saying whether images show so-and-so.
And this doesn't seem like an awful waste of precious time to you? For a feature that is not all that likely to be popular on a global scale?
It depends. If people want to do it, it is their choice how to use their volunteer time. If they don't, then bad luck to those using the feature.
I do agree that there are dozens of things in Wikipedia/Wikimedia/MediaWiki that I'd rather see; I chose the second-lowest rating in the referendum, and might well have chosen the lowest had I not expected that to be understood as "I am against this". I do think there are many better things to do with our time and means.
2011/9/18 Andre Engels andreengels@gmail.com:
On Sun, Sep 18, 2011 at 3:02 PM, Oliver Koslowski o.nee@t-online.de wrote:
On 18.09.2011 13:56, Andre Engels wrote:
In itself, the one who tags the image; but we happen to have a system for that in Wikimedia. It is called discussion and trying to reach consensus. Who decides whether a page is in a category? Who decides whether a page has an image? Who decides whether something is described on a page? All the same.
Our typical system of categories is designed to make it easier to /find/ (related) articles or media. Good luck trying that with a system that is designed to /hide/ things.
I don't see a difference. Either I want to see images showing so-and-so, or I do not. It's all about saying whether images show so-and-so.
Then we have a problem, because these are completely different things.
And this doesn't seem like an awful waste of precious time to you? For a feature that is not all that likely to be popular on a global scale?
It depends. If people want to do it, it is their choice how to use their volunteer time. If they don't, then bad luck to those using the feature.
This seems, at best, to have been written without real thought about the practicalities.
Take any controversial subject, be it nudity or Muhammad. If people do not want to see the images, I doubt very much that they will review them to add categories. If people don't mind seeing the images, I also doubt that they will spend time adding categories. So who would add the categories for the filter? Go figure...
I do agree that there are dozens of things in Wikipedia/Wikimedia/MediaWiki that I'd rather see; I chose the second-lowest rating in the referendum, and might well have chosen the lowest had I not expected that to be understood as "I am against this". I do think there are many better things to do with our time and means.
-- André Engels, andreengels@gmail.com
Regards,
Yann
André Engels wrote:
I don't see a difference. Either I want to see images showing so-and-so, or I do not. It's all about saying whether images show so-and-so.
In our normal categorization scheme, "so-and-so" might equal "a beach" or "a woman wearing a swimsuit." Such classifications are straightforward, uncontroversial and not mutually exclusive.
Under the proposed setup, "so-and-so" might equal "nudity." We'd need to decide whether the aforementioned photograph of a swimsuit-clad woman on a beach qualified, even if she merely appeared in the background (and therefore wouldn't be a factor in our normal categorization).
If people want to do it, it is their choice how to use their volunteer time. If they don't, then bad luck to those using the feature.
It's reasonable to oppose the change on the basis that it would divert time better spent on other tasks, particularly given the likelihood that many non-supporters would participate in the process to prevent misuse.
Additionally, if and when the WMF proudly announces the filters' introduction, the news media and general public won't accept "bad luck to those using the feature" as an excuse for its failure.
David Levy
On 19 September 2011 06:28, David Levy lifeisunfair@gmail.com wrote:
Additionally, if and when the WMF proudly announces the filters' introduction, the news media and general public won't accept "bad luck to those using the feature" as an excuse for its failure.
Oh, yes. The trouble with a magical category is not just that it's impossible to implement well - but that it's fraught as a public relations move.
What is the WMF going to be explicitly - and *implicitly* - promising readers? What is the publicity plan? Has this actually been mapped out at all?
- d.
I understand that the details (well, quite big and relevant details) of this concept were the topic of the survey. So it probably has not been mapped out yet (because it was/is unknown), but that would be the next step.
I would also like to make a side note: if the main argument of the German Wikipedians is that this categorization in itself is evil because it can be used by governments and ISPs etc., then I have to disappoint you: even if only one project wants to make the implementation of a filter possible for its readers, the categorization will appear.
Further, the categorization of images will likely happen on Commons (my guess), so even if the German Wikipedia opts out (although personally I think it would be more interesting to run a reader survey among German-language visitors before deciding on that), it would not help in that specific scenario.
Lodewijk
Many contributors to the poll mentioned that categorization by sensitivities is already a big problem in itself. First, as you mentioned, it can be misused, either by third parties, who could use it for aggressive filtering (images completely hidden or cut out), or directly on the wiki itself. Since we have many images but comparatively few active users, it would be very easy for influential groups to push their POV. Such minorities can easily gain a local majority, and there is no way to defend against them with argumentation or sources. We have no arguments or sources about the sensitivities of single images.
The second problem will be the categorization process. We would be categorizing the images for others, not ourselves, and we also have no sources for argumentation. But there is another problem. We already discuss the inclusion of images on the related articles' discussion pages. While some image might not be appropriate for inclusion in one article, it might be the perfect, valuable, necessary-for-understanding, maybe offensive illustration for another article. Categorization far away from the article, not visible to users who don't enable the filter, will not be related to article needs. So we will discuss at the article first and then again on the new battlefield. It's not hard to believe that this will cost us much more time and effort than anything genuinely worthwhile we could do in the meantime. It's a fight against the symptoms of a cultural problem without actually tackling it; we just push it away.
Zitat von Tobias Oelgarte tobias.oelgarte@googlemail.com:
The second problem will be the categorization process. We would be categorizing the images for others, not ourselves, and we also have no sources for argumentation. But there is another problem. We already discuss the inclusion of images on the related articles' discussion pages. While some image might not be appropriate for inclusion in one article, it might be the perfect, valuable, necessary-for-understanding, maybe offensive illustration for another article.
From what I understood, the image filter will not have subjective criteria like "a little offensive", "very offensive", "pornography", but neutrally decidable criteria like "depicts nude female breasts", "depicts the face of Muhammad", "depicts mutilated dead body". If you select these criteria carefully, there should be no need for any "sources" for your decision to put a file in the criterion's category. Either the image depicts the category topic or it doesn't.
Marcus Buck User:Slomox
But some depictions of such things are offensive or pornographic and some not at all.
Fred
We discussed this already and came to the conclusion that you would need hundreds of these categories to filter out most of the "objectionable content". But that is manageable neither on our side nor by the user. You run into a deadlock: either we end up with some rather subjective categories, or with a whole lot of them that we can't manage (at least not while remaining user-friendly, and without wasting a whole lot of resources on a tiny group of readers).
On Tue, Sep 20, 2011 at 12:47 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
We discussed this already and came to the conclusion that you would need hundreds of these categories to filter out most of the "objectionable content".
And once again, the labelling doesn't need to be perfect (nothing on a wiki is) if an option to hide all images by default is implemented (which at present there seems to be broad support for, from most quarters).
The accuracy of filtering can then be disclaimed, with a recommendation that people can hide all images if they want a guarantee. Coarse-grained labelling is then good enough, and we can even adopt the position that where there is no consensus, the image will not be filtered.
On Tue, Sep 20, 2011 at 1:17 AM, David Gerard dgerard@gmail.com wrote:
I'd estimate the chances as pretty high that we're going to get a thorough exploration of every possible axis that's measured for a filter.
So you're thinking of applying this only to photos, then?
No. And of course artworks are being used as examples because they're going to present the corner cases. But all of these discussions seem to be proceeding on the basis that there are nothing but corner cases, when really (I would imagine) pretty much everything that will be filtered will be either:
- actual images of human genitals [1],
- actual images of dead human bodies, or
- imagery subject to religious restriction.
Almost all will be in the first two categories, and most of those in the first one, and will primarily be photographs.
On Tue, Sep 20, 2011 at 1:29 AM, Fae fae@wikimedia.org.uk wrote:
Er, Egyptian mummies are real bodies that would need real photographs.
For a wealth of horrific examples that need to be censored, please enjoy viewing http://commons.wikimedia.org/wiki/Category:Mummies
On the basis that the community, by and large, is not comprised wholly of idiots, I'm sure it will be capable of holding a sensible discussion as to whether images of mummies (not to forget bog bodies and Pompeii castings, as further examples) would be in or out of such a category.
And again, perfection is not necessary. If someone has "dead bodies" filtered and sees the filtered image placeholder with the caption "this is an Egyptian mummy", they can elect to show that particular image, or decide that they would like to turn off the filter. Or if such a "dead bodies" filter is described as not including Egyptian mummies, someone could decide to hide all images by default. This doesn't have to be difficult.
-- [1] Which, naturally, includes actual images of people undertaking all sorts of activities involving human genitals.
A "dead human bodies" category that excludes mummies "because we're not idiots" is, by definition, not neutral.
On 19 September 2011 17:42, M. Williamson node.ue@gmail.com wrote:
A "dead human bodies" category that excludes mummies "because we're not idiots" is, by definition, not neutral.
I agree. It sounds like the only solution is to pour away a hefty chunk of those charitably donated WMF millions on a few hundred thousand variations of the categories, so that anyone who likes, say, Bain's version of "common sense" can read BainChildFriendlyWiki.org instead of the horrid open Wikipedia with its dreadful nudity, mutilated bodies, heresies and images of educational and cultural notability.
Alternatively anyone who has "common sense" can take Wikipedia for free and hack it about in their own time and cash in by selling it to schools that would like to benefit from a *guaranteed* child friendly and religiously tolerant (out of date) version.
Cheers, Fae
Hasn't this already happened, albeit on a voluntary basis, and with free distribution? http://schools-wikipedia.org/
On 19 September 2011 18:57, Phil Nash phnash@blueyonder.co.uk wrote:
Hasn't this already happened, albeit on a voluntary basis, and with free distribution? http://schools-wikipedia.org/
If that were sufficient for whatever purpose the Board is thinking of, this proposal wouldn't have happened.
So we need a detailed response from them as to:
1. what they were thinking of that this doesn't cover;
2. why the market hasn't provided it.
- d.
It does not meet the criteria previously discussed; I immediately find unrestricted depictions (and photographs) of bare-breasted women and close-up photographs of dead bodies. They made the mistake of putting culture before religious taboos and images that might scare children:
* http://schools-wikipedia.org/wp/m/Mummy.htm
* http://schools-wikipedia.org/images/150/15070.png.htm
* http://schools-wikipedia.org/images/893/89336.jpg.htm
* http://schools-wikipedia.org/images/925/92551.jpg.htm
Cheers, Fae
On 19 September 2011 18:24, Fae fae@wikimedia.org.uk wrote:
Alternatively anyone who has "common sense" can take Wikipedia for free and hack it about in their own time and cash in by selling it to schools that would like to benefit from a *guaranteed* child friendly and religiously tolerant (out of date) version.
Indeed. I'm still after an explanation as to why, if this is in such demand, the market has failed to provide it.
- d.
Am 19.09.2011 18:08, schrieb Stephen Bain:
On Tue, Sep 20, 2011 at 12:47 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
We discussed this already and came to the conclusion that you would need hundreds of these categories to filter out most of the "objectionable content".
And once again, the labelling doesn't need to be perfect (nothing on a wiki is) if an option to hide all images by default is implemented (which at present there seems to be broad support for, from most quarters).
If we implement a "hide all images" option, then we have already solved 95% of all the possible use cases mentioned before. Why then take on the doubtful work of categorizing for an even smaller potential need? I support the "hide all" option. But if we have this feature then, especially then, I see absolutely no need for any categorization. We would just be creating a hobby project for censors, with even less support from the community. I definitely can't follow your reasoning in this case.
Am 19.09.2011 18:08, schrieb Stephen Bain:
No. And of course artworks are being used as examples because they're going to present the corner cases. But all of these discussions seem to be proceeding on the basis that there are nothing but corner cases, when really (I would imagine) pretty much everything that will be filtered will be either:
- actual images of human genitals [1],
- actual images of dead human bodies, or
- imagery subject to religious restriction.
Almost all will be in the first two categories, and most of those in the first one, and will primarily be photographs.
Let's reverse the examples: if you create only such categories, then depictions like the ones linked below will not be filtered, even though some will find them offensive, because they don't count as "actual images of human genitals". Doesn't this raise the question of whether the criteria are sufficient?
* http://commons.wikimedia.org/wiki/File:Futanari.png * http://commons.wikimedia.org/wiki/File:Hentai_-_yuuree-redraw.jpg
Stephen Bain wrote:
And once again, the labelling doesn't need to be perfect (nothing on a wiki is) if an option to hide all images by default is implemented (which at present there seems to be broad support for, from most quarters).
With such an option in place, why take on the task of labeling images on readers' behalf? As previously discussed, it would be a logistical nightmare and enormous resource drain.
Furthermore, I don't regard "If you don't like our specific filters, you can just use the blanket one." as a remotely appropriate message to convey to the millions of readers whose beliefs we fail to individually accommodate.
The accuracy of filtering can then be disclaimed, with a recommendation that people can hide all images if they want a guarantee.
Or we can simply accept the idea's infeasibility and provide the non-broken image filter implementation alone.
Note that disclaimers won't satisfy the masses. (If they did, we wouldn't be having this discussion.) As soon as the WMF introduces such a feature, large segments of the public will expect it to function flawlessly and become outraged when it doesn't.
I foresee sensationalistic media reports about our "child protection filters that let through smut and gore." (I realize that we aren't advertising "child protection filters." Nonetheless, that's a likely perception, regardless of any disclaimers.)
And of course artworks are being used as examples because they're going to present the corner cases. But all of these discussions seem to be proceeding on the basis that there are nothing but corner cases, when really (I would imagine) pretty much everything that will be filtered will be either:
- actual images of human genitals [1],
- actual images of dead human bodies, or
- imagery subject to religious restriction.
Almost all will be in the first two categories, and most of those in the first one, and will primarily be photographs. [1] Which, naturally, includes actual images of people undertaking all sorts of activities involving human genitals.
Firstly, I don't know why you've singled out genitals. People commonly regard depictions of other portions of the human anatomy (such as buttocks and female breasts) as objectionable.
Secondly, "imagery subject to religious restriction" (which doesn't constitute a viable "category" in this context) includes "images of unveiled women." You state that "almost all will be in the first two categories," but I'm confident that we host significantly more images of unveiled women than images of human genitals and images of dead human bodies combined.
Thirdly, there's been a great deal of discussion regarding other images to which people commonly object, such as those depicting violence (whatever that means), surgical procedures, spiders and snakes, to name but a few.
On the basis that the community, by and large, is not comprised wholly of idiots, I'm sure it will be capable of holding a sensible discussion as to whether images of mummies (not to forget bog bodies and Pompeii castings, as further examples) would be in or out of such a category.
...thereby arriving at an inherently subjective, binary determination that fails to meet many readers' expectations.
And again, perfection is not necessary. If someone has "dead bodies" filtered and sees the filtered image placeholder with the caption "this is an Egyptian mummy", they can elect to show that particular image, or decide that they would like to turn off the filter. Or if such a "dead bodies" filter is described as not including Egyptian mummies, someone could decide to hide all images by default.
Or we could simply provide that functionality alone, thereby enabling the same scenario.
This doesn't have to be difficult.
Indeed.
David Levy
but neutrally decidable criteria like "depicts nude female breasts", "depicts the face of Muhammad", "depicts mutilated dead body".
...
User:Slomox
All of these would be problematic. If these were the default criteria for a school to enforce on its pupils when using school computers, one could imagine many 18th-century paintings or depictions of gods being excluded due to "nude female breasts", and one would have to exclude all photographs of ancient Egyptian mummies, since they were all "mutilated" as part of the embalming process.
Cheers, Fae
On 19 September 2011 15:50, Fae fae@wikimedia.org.uk wrote:
All of these would be problematic. If these were the default criteria for a school to enforce on its pupils when using school computers, one could imagine many 18th-century paintings or depictions of gods being excluded due to "nude female breasts", and one would have to exclude all photographs of ancient Egyptian mummies, since they were all "mutilated" as part of the embalming process.
How much is "mutilated"? A scratch? Ten scratches? A hundred scratches? St Sebastian? http://commons.wikimedia.org/wiki/File:Sebastia.jpg
This is what I mean by "magical categories". "I know it when I see it" isn't enough. "Lots of us know it when we see it" is a blatant neutrality violation.
You need to be able to answer these sorts of questions if you advocate a filtering system involving any assessment of image content.
- d.
On Tue, Sep 20, 2011 at 12:56 AM, David Gerard dgerard@gmail.com wrote:
How much is "mutilated"? A scratch? Ten scratches? A hundred scratches? St Sebastian? http://commons.wikimedia.org/wiki/File:Sebastia.jpg
I'm struggling to recall an example in any of these threads that's not an artwork.
On 19 September 2011 16:14, Stephen Bain stephen.bain@gmail.com wrote:
On Tue, Sep 20, 2011 at 12:56 AM, David Gerard dgerard@gmail.com wrote:
How much is "mutilated"? A scratch? Ten scratches? A hundred scratches? St Sebastian? http://commons.wikimedia.org/wiki/File:Sebastia.jpg
I'm struggling to recall an example in any of these threads that's not an artwork.
I'd estimate the chances as pretty high that we're going to get a thorough exploration of every possible axis that's measured for a filter.
So you're thinking of applying this only to photos, then?
- d.
I'm struggling to recall an example in any of these threads that's not an artwork.
...
Stephen Bain
Er, Egyptian mummies are real bodies that would need real photographs.
For a wealth of horrific examples that need to be censored, please enjoy viewing http://commons.wikimedia.org/wiki/Category:Mummies
Fae
Marcus Buck wrote:
From what I understood, the image filter will not have subjective criteria like "a little offensive", "very offensive", "pornography", but neutrally decidable criteria like "depicts nude female breasts", "depicts the face of Muhammad", "depicts mutilated dead body".
The WMF outline cites "5–10 categories," with "sexual imagery" and "violent imagery" as examples: http://meta.wikimedia.org/wiki/Image_filter_referendum/en
As Tobias Oelgarte noted, even with the type of specificity to which you refer, the omission of countless widespread objections (a non-neutral, discriminatory practice) would be unavoidable. Who's going to analyze millions of images (with thousands more uploaded every day) to tag the ones containing unveiled women?
Tobias also pointed out that even if we _were_ to include every widespread objection, the resultant quantity of filter categories would be unmanageable.
As David Gerard noted, terms like "mutilated" are highly subjective. At what level of injury does a murder victim qualify? Fae mentioned ancient Egyptian mummies. Do they count?
Does an autopsy constitute "mutilation"? Is a dead man's circumcised penis "mutilated"? (Many people would argue that it is.)
Do non-photographic images qualify? If so, this article contains several images that arguably should be filtered: http://en.wikipedia.org/wiki/Descent_from_the_Cross
Speaking of artwork, Fae mentioned the depictions of "nude female breasts" contained therein. Do those count? What about photographs of breasts taken in medical contexts? Are those equivalent to those taken in sexual contexts? If not, how do we define a "sexual context"?
Do you foresee clear consensus in these areas?
This is what Fred Bauder means when he notes that "some depictions of such things are offensive or pornographic and some not at all." But this is entirely subjective. We can expect endless debates on these subjects and countless others (and probably tag wars as well).
Conversely, the alternative image filter implementation on which I've harped would enable readers to decide for themselves on an individual, case-by-case basis: http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/en/Categories#ge... or http://goo.gl/t6ly5
David Levy
On Mon, Sep 19, 2011 at 01:45:18PM -0400, David Levy wrote:
Speaking of artwork, Fae mentioned the depictions of "nude female breasts" contained therein. Do those count? What about photographs of breasts taken in medical contexts? Are those equivalent to those taken in sexual contexts? If not, how do we define a "sexual context"?
Also, patriotic contexts: http://en.wikipedia.org/wiki/Marianne especially: http://en.wikipedia.org/wiki/File:Eug%C3%A8ne_Delacroix_-_La_libert%C3%A9_gu...
Blocking Marianne in France might be tricky. NOT blocking her elsewhere? Also tricky.
sincerely, Kim Bruning
Censoring Freedom incarnate... for a free encyclopedia... there is no shortage of irony there...
Am 18.09.2011 13:56, schrieb Andre Engels:
On Sun, Sep 18, 2011 at 11:45 AM, Tobias Oelgarte <tobias.oelgarte@googlemail.com> wrote:
Am 18.09.2011 09:46, schrieb Andre Engels:
On Sun, Sep 18, 2011 at 3:49 AM, Jussi-Ville Heiskanen <cimonavaro@gmail.com> wrote:
Wikimedia *used* to hold the position that we wouldn't aid China to block images of the Tiananmen Massacre, and went to great lengths to assure that Chinese users of Wikipedia could evade blocks to viewing. I am not sure you are on the right track with regards to our traditions and values here.
There's a big difference between the two, in that the Chinese case was about people wanting to decide what _others_ could see, while the filter is about people wanting to decide what _they themselves_ would see.
And who decides which image belongs to which category? The one who uses the filter, or the one who tags the image?
In itself, the one who tags the image; but we happen to have a system for that in Wikimedia. It is called discussion and trying to reach consensus. Who decides whether a page is in a category? Who decides whether a page has an image? Who decides whether something is described on a page? All the same.
There you have a lot of room for compromise. You might shorten an argument or decide to expand another. Most importantly, you have arguments that you can quote. The decision to include an image is (or should be) measured by its value for illustration.
Filter-tagging is the opposite. You have no room for compromise: an image either belongs to a category or it does not. What would a compromise look like?
Which arguments will be used in these discussions? Should we expect quotes/sources to establish whether this particular image is offensive/objectionable or not?
Have a try. I uploaded http://commons.wikimedia.org/wiki/File:Anime_Girl.svg some time ago. Now put neutral arguments on the table as to whether you would tag it as "nudity", or why you would not do so. It's a very simple task compared to others. So let's hear your argumentation, and what the compromise should look like if we do not come to the same conclusion.
I'm bold. I state as the first argument that it does not belong to the category "nudity", because the depicted figure wears clothes.
Additionally: is the reader still able to choose, if China uses the tags to exclude content before it ever reaches the reader? Wouldn't we be responsible if the feature were misused this way, since we know how easily it can be misused?
I don't think it's that easy, and if it were, the best thing would be to make it harder to misuse rather than to throw out the baby with the bathwater.
Maybe it would be a lot easier to use a contraceptive. Then you won't have a baby that might be thrown out with the bathwater.
Andre Engels wrote:
Thereby giving those who have objections nothing, just because there are others to whom we can't give what they want. If we had the same attitude towards article creation, we would not have published Wikipedia until we had articles on all subjects we could think of.
They are given plenty; in fact, there are all sorts of filters already in place that lead to people not being exposed to media they do not wish to be exposed to, starting with laws, what people regard as appropriate for encyclopedic content, and local cultural norms affecting the latter, to name just a few examples. They also get to see images they do not object to without additional effort, they have the option to hide them all, and they can be careful which articles they load, avoiding those likely to feature media they do not want. Exposure to things you are uncomfortable with, where the feeling is not overwhelmingly shared, like when you click a link to a foreign Wikipedia, is also giving them something, most likely something good (consider images that cannot legally be displayed publicly in some jurisdiction; they will be so displayed in others, including on Wikipedias where they are relevant, with no filtering for users from the jurisdiction that banned them, up to the point where a court that Wikimedia cares about is likely to order the image taken down).
You can consider the last point from the other direction: if you don't like to see media showing human suffering, horror, or graphic violence, and so you filter them, what should you see when reading about Nazi concentration camps? Profile pictures of Nazi leaders, architecture, maps maybe, but please, not the prisoners? What about people who find it wrong to make it easy for others to scroll down a Wikipedia article without being visually confronted with the human suffering, if there is a consensus to display it in the article in this manner at all? Or, for that matter, who find that in this context it is wrong to require the reader to tell the computer "Yes, please show me piles of rotting corpses of starved people!"? Note that it may be the user of the filter who thinks this, in which case not giving them a filter that would do this addresses one of their needs as well (while failing to address another need; giving them a context-aware filter that avoids this problem would work, of course, but then the system would be harder to use, making it worse, and so on).
So we already do plenty to ensure people are not overexposed to media that they reasonably do not wish to be exposed to (note that I use "wish" broadly; someone suffering from phobias, for instance, has more of a need than a wish to avoid certain kinds of media). To a point, really, where I am unsure what is left that can realistically be optimized even further, and I am somewhat surprised the "referendum" had so many people voting that this is of the highest priority (whatever that means, given that it wasn't compared to other things that should be of high priority), though since it was already decided to introduce a filter because there is a problem, it can be assumed that some of the votes are subject to confirmation bias.
(I do not know people who would say they frequently encounter articles on Wikipedia featuring images that would disturb them no matter where they appear, and who would thus prefer to have those pictures hidden for them. I do know people who would prefer that in medical articles concerning humans, images that go beyond nudity, like those showing the symptoms of a disease or a corpse that has been cut open, appear towards the end in a specially labeled section, and people who do not like insects much, and thus do not browse around articles on insects. Neither of the two examples leads me to think that a filter as proposed is the solution.)
We don't say they're unreasonable, we say that we don't cater to them, or at least not yet. That may be non-neutral, but no more non-neutral than the fact that one subject has an article and another does not, or that one picture is used to illustrate an article and another is not, or that one category is deemed important enough to be used to categorize our articles, books, words and images and another is not.
Or even clearer: that one language has a Wikipedia and another does not. Did we make a non-neutral choice that only certain languages (the ones for which Wikipedias exist) are valid languages for spreading knowledge?
These analogies are all invalid, as individual preference is rejected as an argument in all of those cases, while the filter is solely concerned with individual preference (or rather, aggregated individual preferences). Tagging an image with tag X, knowing that this will lead to the image being hidden from users who chose to hide all X images, is a matter of whether those users, who are in a minority, want the image to be hidden; if the majority agreed that an image should be hidden, it would not be shown at all, and there would be no need for a personal filter. Added to that is the problem that the people affected by the tagging are the least likely to engage in discussions about the tagging; and the further problem that this is by its very nature a leaky system, as editors will consider whether an image is filtered for some users in their decisions about whether to include it to begin with (add a, to a minority, "controversial" image to an article that lacks controversial images and you will likely get reverted, for instance).
On Sun, Sep 18, 2011 at 4:16 AM, David Levy lifeisunfair@gmail.com wrote:
Tobias Oelgarte described one key problem. Another lies in the labeling of some things and not others. Unless we were to create and apply a label for literally everything that someone finds objectionable, we'd be taking the non-neutral position that only certain objections (the ones for which filters exist) are reasonable.
NPOV involves determining whether viewpoints are widely held, are held by substantial or significant minorities, or are held by an extremely small or vastly limited minority and therefore not suitable to be covered in articles. This is an editorial decision-making process that all editors perform all the time. Determining which filters to work on is entirely analogous to this process, which is inherently neutral.
Am 18.09.2011 02:45, schrieb Stephen Bain:
On Sun, Sep 18, 2011 at 4:16 AM, David Levy lifeisunfair@gmail.com wrote:
Tobias Oelgarte described one key problem. Another lies in the labeling of some things and not others. Unless we were to create and apply a label for literally everything that someone finds objectionable, we'd be taking the non-neutral position that only certain objections (the ones for which filters exist) are reasonable.
NPOV involves determining whether viewpoints are widely held, are held by substantial or significant minorities, or are held by an extremely small or vastly limited minority and therefore not suitable to be covered in articles. This is an editorial decision-making process that all editors perform all the time. Determining which filters to work on is entirely analogous to this process, which is inherently neutral.
You must be kidding me to describe that as "inherently neutral". You miss the point that articles are built upon reputable sources. Therefore the sources are the ones that state the different points of view. We quote those points and gather them. We only exclude viewpoints which did not already pass an editorial process.
Categorizing the images is not the same as gathering viewpoints from sources. It is the viewpoint of the contributor.
Given your argumentation, we would only need to write about one opinion: our own.
Stephen Bain wrote:
NPOV involves determining whether viewpoints are widely held, are held by substantial or significant minorities, or are held by an extremely small or vastly limited minority and therefore not suitable to be covered in articles. This is an editorial decision-making process that all editors perform all the time. Determining which filters to work on is entirely analogous to this process, which is inherently neutral.
Gauging a viewpoint's level of coverage by reliable sources is achievable via objective criteria. We don't take anyone's side and aren't bound by practical limitations on the number of widely covered views we can document (irrespective of the quantity of articles required).
Conversely, category-based filtering would require us to accept/reject our readers' views in a binary fashion. (A type of "objectionable" image would either receive a filter or not.) This would convey a formal determination that x beliefs warrant accommodation and y beliefs don't, which isn't remotely the same as documenting these views in a neutral manner.
Perhaps you have in mind that we could accommodate objections that are "widely held" or "held by substantial or significant minorities," thereby excluding only the ones "held by an extremely small or vastly limited minority." As I noted in another reply, setting aside any philosophical issues, this isn't technically feasible. For logistical reasons, the numerical limit would be far lower (with an example of "5–10 categories" cited by the WMF).
The above doesn't even touch on the categories' population, which would entail non-stop argumentation over whether particular images belong in particular categories. Once again, contrary to the creation of articles documenting a wide range of views, the decision would be binary: filter or don't filter. Unlike our normal categorization scheme's large number of objective classifications, this would rely on a small number of subjective ones (created not to provide neutral descriptions, but to label the images' subjects "potentially objectionable").
There's no need for any of this. We can accommodate _everyone_ via a vastly simpler, fully neutral setup. If you haven't already, please see the relevant discussion: http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/en/Categories#ge... or http://goo.gl/t6ly5
David Levy
On Fri, Sep 16, 2011 at 07:13:26PM +0200, Milos Rancic wrote:
I would repeat the best possible solution to end this: Implement it on English Wikipedia -- you (those who want that filter) have some numbers which would support that action -- and leave the rest of the projects alone.
Thanks for throwing us en.wikipedians under the bus dude! ;-)
Might be nice to wait with enwiki implementation too until we have something that we know will not have Evil Repercussions.
Else we end up forking enwiki, if my spidey^Wwiki senses don't betray me. O:-)
sincerely, Kim Bruning
On Fri, Sep 16, 2011 at 9:10 PM, Kim Bruning kim@bruning.xs4all.nl wrote:
Might be nice to wait with enwiki implementation too until we have something that we know will not have Evil Repercussions.
Else we end up forking enwiki, if my spidey^Wwiki senses don't betray me. O:-)
Definitely. I will personally bankroll any fork that will have the momentum of one third of editorship behind it, or in a pinch just a quarter of editorship, if nearly half of the core editors are included in that quarter.
I guess I should qualify the above by saying I will not bankroll a hostile fork. I will not bankroll somebody's private project. It has to have real momentum; otherwise there is no point in forking, and we might as well try to work things out *within* the community.
Just to be clear, I might not join a hypothetical fork myself as an editor, but I feel the ability to fork is so precious a boon for projects like us, that it would be worth it to bankroll one purely on the principle of the thing.
On 17 September 2011 09:17, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
Just to be clear, I might not join a hypothetical fork myself as an editor, but I feel the ability to fork is so precious a boon for projects like us, that it would be worth it to bankroll one purely on the principle of the thing.
We need people to try the technical basics of a fork, i.e. taking an en:wp dump, an images dump, a copy of MediaWiki and relevant extensions and making it functional and documenting the procedure.
- d.
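For the text side, a rough sketch of those first steps might look like the following (a sketch only: it assumes a stock MediaWiki install with its standard maintenance scripts, and the "latest" path is a convenience alias, so check dumps.wikimedia.org for the current run first):

    # Fetch the latest English Wikipedia article dump.
    wget http://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2

    # Import it into a working MediaWiki install. importDump.php ships with
    # MediaWiki and reads the XML from stdin; for a dump this size, a bulk
    # loader such as mwdumper is usually much faster.
    bzcat enwiki-latest-pages-articles.xml.bz2 | php maintenance/importDump.php

    # Rebuild derived data after the import.
    php maintenance/rebuildrecentchanges.php

That still leaves the images, which is where it gets hard.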
On Sat, Sep 17, 2011 at 7:11 PM, David Gerard dgerard@gmail.com wrote:
... We need people to try the technical basics of a fork, i.e. taking an en:wp dump, an images dump, ..
Is there an images dump?
On 17 September 2011 10:16, John Vandenberg jayvdb@gmail.com wrote:
On Sat, Sep 17, 2011 at 7:11 PM, David Gerard dgerard@gmail.com wrote:
We need people to try the technical basics of a fork, i.e. taking an en:wp dump, an images dump, ..
Is there an images dump?
If there isn't, there should be.
(I'm now trying to work out how to get the images without using up all my bandwidth allowances ever.)
- d.
On Sat, Sep 17, 2011 at 11:23, David Gerard dgerard@gmail.com wrote:
(I'm now trying to work out how to get the images without using up all my bandwidth allowances ever.)
Something like: rsync -av --bwlimit=50 (note that the number is in KiB/s, so 50 caps the transfer at roughly 0.4 Mbit/s; for ~5 Mbit/s you would use --bwlimit=625)
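Spelled out a little more (a sketch; the remote module path below is a placeholder, since a mirror actually offering the media tree would need to be found):

    # Rate-limited, resumable pull of an image tree over rsync.
    # --bwlimit is in KiB/s: 625 KiB/s is roughly 5 Mbit/s.
    # The source URL below is a placeholder, not a real mirror.
    rsync -av --partial --bwlimit=625 \
        rsync://mirror.example.org/wikimedia-media/ ./images/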
David Gerard wrote:
(I'm now trying to work out how to get the images without using up all my bandwidth allowances ever.)
It's easy enough to get a VPS with unlimited bandwidth. It's a few terabytes of data, though, depending on what you're talking about. Thumbnails, current images, older versions of images, deleted images, math renderings, etc. The sanest solution probably involves mailing a hard drive to someone and then having them mail it back.
MZMcBride
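The back-of-the-envelope arithmetic behind the hard-drive suggestion (illustrative figures only: say 3 TB of media over a saturated 50 Mbit/s link):

    # 3 TB = 3 * 8,000,000 Mbit; at 50 Mbit/s that is 480,000 seconds,
    # i.e. about 5.5 days of continuous transfer. At a more typical
    # 5 Mbit/s it is ten times that -- nearly two months.
    echo $(( 3 * 8000000 / 50 / 86400 ))   # prints 5 (days, truncated)

At those rates, a drive in the post wins easily.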
Of course if you're only thinking about forking Wikipedia, rather than Commons, you can just use InstantCommons [1]. For English Wikipedia you'd still have a lot of fair use to copy, but German and many other languages wouldn't have that problem.
That said, there really ought to be an image dump available too.
Pete / the wub
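For what it's worth, InstantCommons itself is a one-line switch on the forked wiki (a sketch assuming a standard MediaWiki 1.16 or later install; with it enabled, Commons files are fetched remotely on demand rather than copied in advance):

    # Enable InstantCommons in the wiki's LocalSettings.php.
    cat >> LocalSettings.php <<'EOF'
    $wgUseInstantCommons = true;
    EOF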
I think we want to investigate forking Commons too.
If any of you fine folks could use a VPS, what specs do you need? I prolly have a little space to make one. :-) Debian OK for starters?
I'm not sure forking is a good idea right here, right now, but it's always useful to check that failing over to an external backup is still tractable.
sincerely, Kim Bruning
On 16/09/2011 18:13, Milos Rancic wrote:
I would repeat the best possible solution to end this: Implement it on English Wikipedia -- you (those who want that filter) have some numbers which would support that action -- and leave the rest of the projects alone.
I hope that was another joke. If you look at the discussion on Meta, many of the strongest opponents were en: illuminati such as David Levy.
That is a legal issue. We do that to comply with the law, since that image isn't in the public domain under German jurisdiction. This has nothing to do with hiding perfectly legal content. Additionally, an optional filter would not help to make it legal; filter or no filter wouldn't change a thing.
Two different topics, one wrong assumption.
Tobias
Am 16.09.2011 12:39, schrieb emijrp:
Hi all;
There are more issues with images in German Wikipedia.
It is funny how the German Wikipedia doesn't allow images[1] (an image added by me[2] on de:, and later removed by another user[3]) because they follow the most restrictive of the copyright laws of Germany, Austria and Switzerland[note 1], but they are now against giving people the choice to hide images.
I think that we can make a nice move here. We can enable the image filter on the German Wikipedia for all those who don't want to see images that are copyrighted under German law, meanwhile allowing other people to see all of Commons' splendour. Using the image filter to improve the rights of readers of the German Wikipedia. Very cool, right? ; )
Regards, emijrp
[1] http://de.wikipedia.org/wiki/Wikipedia:Bildrechte#Wikipedia_richtet_sich_nac... [2] http://de.wikipedia.org/w/index.php?title=Alexander_Knox&oldid=81377280 [3] http://de.wikipedia.org/wiki/Alexander_Knox
[note 1] I hear there are German-speaking users outside Europe, right? I hear too that, from Germany, you can follow interwiki links and see those images in other Wikipedias, right? So what is the sense of that policy? Aren't the servers in the USA?
2011/9/16 Tobias Oelgarte tobias.oelgarte@googlemail.com
That is a legal issue. We do that to comply with the law, since that image isn't in the public domain under German jurisdiction.
Who is "we"? And why does German jurisdiction matter here?
Am 16.09.2011 12:57, schrieb emijrp:
2011/9/16 Tobias Oelgarte tobias.oelgarte@googlemail.com
That is a legal issue. We do that to comply with the law, since that image isn't in the public domain under German jurisdiction.
Who is "we"? And why does German jurisdiction matter here?
It's called "Schutzlandprinzip" [1], the principle of the protecting country. Sorry that there is no English article about this topic as a whole, but you might read:
* http://en.wikipedia.org/wiki/Berne_Convention_for_the_Protection_of_Literary...
* http://en.wikipedia.org/wiki/Agreement_on_Trade-Related_Aspects_of_Intellect...
* http://en.wikipedia.org/wiki/World_Intellectual_Property_Organization_Copyri...
* http://en.wikipedia.org/wiki/Paris_Convention_for_the_Protection_of_Industri...
Since when is the German Wikipedia under the domain of German jurisdiction? The German Wikipedia is an international project hosted in the United States. Am I missing something here?
Ryan Kaldari
You miss the point that German contributors are bound by German law. They have to play along with German law -- or DACH (Germany, Austria, Switzerland) law -- if they don't want to get into personal trouble. Since we don't want the provider, the WMF, to be in trouble, we comply with US law as well. That's why we are bound to comply with both sets of laws.
For instance, we can't rely on fair use, since there is no fair use in Germany. We have exceptions for the press (which we aren't) and for educational purposes (which we might be, but not always). Additionally, the project decided to favour free content over content usable only under a very tight rationale.
Am 16.09.2011 20:04, schrieb Ryan Kaldari:
Since when is the German Wikipedia under the domain of German jurisdiction? The German Wikipedia is an international project hosted in the United States. Am I missing something here?
Ryan Kaldari
You're stupid!
Not really, no. What does German law have to do with reading or editing, from somewhere in Asia, a website hosted in Tampa or Virginia? I can understand that German nationals cannot use that image, but what about other German speakers?
Strainu
Very simple: there is the Schutzlandprinzip. [1] Too bad that EN has no article for it. But you can read the following articles to understand why this example was stupid, and why we are bound to these laws even if the servers were in nirvana:
* http://en.wikipedia.org/wiki/Berne_Convention_for_the_Protection_of_Literary...
* http://en.wikipedia.org/wiki/Agreement_on_Trade-Related_Aspects_of_Intellect...
* http://en.wikipedia.org/wiki/World_Intellectual_Property_Organization_Copyri...
* http://en.wikipedia.org/wiki/Paris_Convention_for_the_Protection_of_Industri...
[1] http://de.wikipedia.org/wiki/Schutzlandprinzip
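As a concrete reading of the Schutzlandprinzip ("protection-country principle") argument, here is a toy sketch with invented data: the law a project has to respect follows its audience as well as its servers, so moving the servers would not shrink the set.

# Toy illustration of the protection-country principle -- the audience
# table below is invented, not an official mapping.
HOSTING_COUNTRY = "US"

PRINCIPAL_AUDIENCE = {                      # hypothetical assignment
    "de.wikipedia.org": {"DE", "AT", "CH"},
    "en.wikipedia.org": {"US", "GB", "AU"},
}

def applicable_law(project: str) -> set:
    # Respect the law where the servers are *and* where the readers are.
    return {HOSTING_COUNTRY} | PRINCIPAL_AUDIENCE.get(project, set())

print(applicable_law("de.wikipedia.org"))   # {'US', 'DE', 'AT', 'CH'} (set order varies)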
Again, who are "we"? And why do German laws matter for servers in the USA? Why does only the German Wikipedia exclude those images? Are the English Wikipedia or Commons blocked in Germany?
By the way, not all German Wikipedians/readers are under German law.
And I don't remember any Wikipedia removing content to respect Chinese local laws when Wikipedia was blocked there.
German laws matter for German citizens. It's still not clear what that means for the rest of us (except that de.wp took a decision).
It means that the German Wikipedia has to respect DACH (Germany, Austria, Switzerland) law as well as US law. The first affects the authors directly; the second applies because the servers are hosted in the US. If we don't comply with US law, the page might be taken down under US jurisdiction. If we don't comply with DACH law, we ourselves might be prosecuted.
That's why we have to respect both laws, and that's also why Commons needs to comply with the law of any country that signed the previously linked treaties.
Quite simple, isn't it? But this has nothing to do with the original topic anymore. We were talking about the fact that a big majority of the German-speaking contributors don't want that filter, and about how the WMF intends to react to this.
Tobias
Direct this question at Commons. Why are images deleted there that are perfectly legal under US law but not under DACH (Germany, Austria, Switzerland) law, and also the other way around? Ask them, and you will get your answer.
I mention this because it is common practice that you might not have heard about. You really should try to understand the matter of copyright, and that it has nothing to do with _self_-censoring or the image filter.
We (the German authors) are bound to German law, since the page is directed at a major German readership (Schutzlandprinzip). But we also have to take care of US law, since the servers are hosted inside the US. The latter applies to the content the WMF hosts, not to us German contributors; but to be nice, we consider US law as well.
Tobias
Tobias, that I can understand. What I don't understand is why one should call emijrp stupid instead of explaining that this is a German Wikipedia rule meant to protect the majority of users from inadvertently breaking the law of their country.
And I still don't understand why you called that example "stupid".
Strainu
I call this example "stupid" because it was constructed to make a point, even though proposing it that way showed a lack of knowledge, and because it creates an awkward subtopic with no relation to the initial problem (see the headline) whatsoever.
It rests on the blatant assumption/imputation that the image filter would be fine since we already censor ourselves, which isn't true. We just respect copyright law to protect the readers and reusers. (Print a book with that image and publish it inside Germany: if the original author/photographer notices, you might be in trouble.)
Tobias
Am 16.09.2011 13:22, schrieb emijrp:
Again, who are "we"? And why do German laws matter for servers in the USA? Why does only the German Wikipedia exclude those images? Are the English Wikipedia or Commons blocked in Germany?
It's not like the picture is used in every Wikipedia either.
How about we just agree that the German Wikipedia project decided to apply the German/Austrian/Swiss laws when it comes to images?
And again, on Commons you will always find deletions based on local law rather than U.S. law, and vice versa.
Regards, Oliver
On Fri, Sep 16, 2011 at 3:22 PM, emijrp emijrp@gmail.com wrote:
By the way, not all German Wikipedians/readers are under German laws.
Most are. And if the German Wikipedia community believes that, in order for de.wp to be more useful, it should be made redistributable under the law of most German-speaking countries, that is a reasonable choice for them.
--vvv
Why are images that are permissible in DACH under freedom of panorama (Panoramafreiheit) deleted on Commons? Why are images that are more than 100 years old, with an unknown author, deleted on Commons?
So what we need is a filter that censors such images away for the countries that have no freedom of panorama.
Liesel
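Read literally, Liesel's reversed filter would be a per-country visibility rule rather than a per-user preference. A deliberately simplistic sketch in the same ironic spirit, with invented tags and an invented country table (France is used as an example of a country that, at the time, had no general freedom of panorama):

# Sketch of the reversed "filter": hide freedom-of-panorama images from
# readers in countries whose law does not recognise it. Invented data.
FOP_COUNTRIES = {"DE", "AT", "CH"}                    # hypothetical list
IMAGE_TAGS = {"Building_photo.jpg": {"fop-only"}}     # invented tagging

def visible(filename: str, reader_country: str) -> bool:
    tags = IMAGE_TAGS.get(filename, set())
    if "fop-only" in tags and reader_country not in FOP_COUNTRIES:
        return False            # "censored away", as Liesel puts it
    return True

assert visible("Building_photo.jpg", "DE")
assert not visible("Building_photo.jpg", "FR")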