Ladies and Gentlemen,
The committee running the vote on the features for the Personal Image Filter have released their interim report and vote count. You may see the results at http://meta.wikimedia.org/wiki/Image_filter_referendum/Results/en. Please note that the results are not final: the vote count has been finalized, but the analysis of comments is ongoing.
Posted on behalf of the committee,
Philippe
___________________
Philippe Beaudette
Head of Reader Relations
Wikimedia Foundation, Inc.
philippe@wikimedia.org
On 4 September 2011 05:33, Philippe Beaudette pbeaudette@wikimedia.org wrote:
The committee running the vote on the features for the Personal Image Filter have released their interim report and vote count. You may see the results at http://meta.wikimedia.org/wiki/Image_filter_referendum/Results/en.
The bimodal distribution in the first graph suggests this feature will continue to be controversial (to say the least), with fans saying "we had the majority" and foes saying "there is clearly not a consensus".
So. What happens now?
- d.
On the other hand: "I think it is important not to..." isn't necessarily interpreted the same as "I think it is not important to...". Which of those answers was meant by the respondents that chose "0"?
\Mike
On 04/09 2011 11:17, David Gerard wrote:
On 4 September 2011 05:33, Philippe Beaudette pbeaudette@wikimedia.org wrote:
The committee running the vote on the features for the Personal Image Filter have released their interim report and vote count. You may see the results at http://meta.wikimedia.org/wiki/Image_filter_referendum/Results/en.
The bimodal distribution in the first graph suggests this feature will continue to be controversial (to say the least), with fans saying "we had the majority" and foes saying "there is clearly not a consensus".
So. What happens now?
- d.
foundation-l mailing list foundation-l@lists.wikimedia.org Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
Blame the wording of the survey for not being clear enough? _____ *Béria Lima*
*Imagine a world in which every person is given the possibility of free access to the sum of all human knowledge. That's what we are doing http://wikimediafoundation.org/wiki/Nossos_projetos.*
On 4 September 2011 11:35, Mikael mikael79@gmail.com wrote:
On the other hand: "I think it is important not to..." isn't necessarily interpreted the same as "I think it is not important to...". Which of those answers was meant by the respondents that chose "0"?
\Mike
On 04/09 2011 11:17, David Gerard wrote:
On 4 September 2011 05:33, Philippe Beaudette pbeaudette@wikimedia.org wrote:
The committee running the vote on the features for the Personal Image Filter have released their interim report and vote count. You may see the results at http://meta.wikimedia.org/wiki/Image_filter_referendum/Results/en.
The bimodal distribution in the first graph suggests this feature will continue to be controversial (to say the least), with fans saying "we had the majority" and foes saying "there is clearly not a consensus".
So. What happens now?
- d.
I said from the beginning that this poll was too badly designed for anyone to be able to draw useful conclusions from whatever the results are. I think that has been proven correct.
A very large proportion of voters said they don't consider the feature important. If they simply mean "not important" then the result could be considered a mandate to proceed. If they actually mean they are opposed to the feature, which seems likely given the number of negative comments, then there is not even a clear majority in favour.
While I personally am in favour of this feature, I urge the Foundation not to proceed with it without further consultation. To ask the community for their views and then not actually take those views into account (which you can't do, since you can't tell what they are) would be an insult to the community and would significantly harm relations between the Foundation and those it exists to serve.
The Foundation needs to be mature enough to admit that they've screwed up this survey, apologise and try again. Next time, start by figuring out what you want to achieve by asking the questions, and then choose the questions accordingly.
On Sep 4, 2011 11:39 AM, "Béria Lima" berialima@gmail.com wrote:
Blame the wording of the survey for not being clear enough? _____ *Béria Lima*
*Imagine a world in which every person is given the possibility of free access to the sum of all human knowledge. That's what we are doing http://wikimediafoundation.org/wiki/Nossos_projetos.*
On 4 September 2011 11:35, Mikael mikael79@gmail.com wrote:
On the other hand: "I think it is important not to..." isn't necessarily interpreted the same as "I think it is not important to...". Which of those answers was meant by the respondents that chose "0"?
\Mike
On 04/09 2011 11:17, David Gerard wrote:
On 4 September 2011 05:33, Philippe Beaudette pbeaudette@wikimedia.org wrote:
The committee running the vote on the features for the Personal Image Filter have released their interim report and vote count. You may see the results at http://meta.wikimedia.org/wiki/Image_filter_referendum/Results/en.
The bimodal distribution in the first graph suggests this feature will continue to be controversial (to say the least), with fans saying "we had the majority" and foes saying "there is clearly not a consensus".
So. What happens now?
- d.
Hello,
Frankly, I am quite unhappy about the referendum and share the concerns expressed by Thomas. I think that the Foundation did not take seriously those Wikimedians who are opposed to the filter. The Foundation avoided the direct question of whether someone is for or against the filter at all; this most important question was denied to the community.
From this perspective, the first question can be seen as manipulative.
On German language Wikipedia, there is a poll of its own. http://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Einf%C3%BChrung_pers%C...
It will run until September 15th, but so far the results are as follows: against the filter, 231; for the filter, 44; undecided, 14.
Kind regards Ziko
2011/9/4 Thomas Dalton thomas.dalton@gmail.com:
I said from the beginning that this poll was too badly designed for anyone to be able to draw useful conclusions from whatever the results are. I think that has been proven correct.
A very large proportion of voters said they don't consider the feature important. If they simply mean "not important" then the result could be considered a mandate to proceed. If they actually mean they are opposed to the feature, which seems likely given the number of negative comments, then there is not even a clear majority in favour.
While I personally am in favour of this feature, I urge the Foundation not to proceed with it without further consultation. To ask the community for their views and then not actually take those views into account (which you can't do, since you can't tell what they are) would be an insult to the community and would significantly harm relations between the Foundation and those it exists to serve.
The Foundation needs to be mature enough to admit that they've screwed up this survey, apologise and try again. Next time, start by figuring out what you want to achieve by asking the questions and then choose the questions accordingly.
On 4 September 2011 14:08, Ziko van Dijk zvandijk@googlemail.com wrote:
Frankly, I am quite unhappy about the referendum and share the concerns expressed by Thomas. I think that the Foundation did not take seriously those Wikimedians who are opposed to the filter. The Foundation avoided the direct question of whether someone is for or against the filter at all; this most important question was denied to the community. From this perspective, the first question can be seen as manipulative.
I see no reason to assume it was anything more than badly thought-out, rather than intended as manipulative.
Could someone please detail the process by which the questions were written?
On German language Wikipedia, there is a poll of its own. http://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Einf%C3%BChrung_pers%C... It will run until September 15th, but so far the results are as follows: against the filter, 231; for the filter, 44; undecided, 14.
Could someone also please detail why this - the key question surrounding the whole issue - was not asked?
- d.
On Sun, Sep 04, 2011 at 03:08:54PM +0200, Ziko van Dijk wrote:
Hello,
On German language Wikipedia, there is a poll of its own. http://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Einf%C3%BChrung_pers%C...
Assuming that the .de community is similar to the wikimedia community at large, I think that the difference in results can largely be explained directly by the design of the !referendum.
The emerging discrepancies between the german vote and the !referendum, together with the known deficiencies in the !referendum design warrant some -slight- cause for concern, perhaps.
I'd like to run some sort of audit to allay potential concerns. Any ideas as to practicability and/or execution?
sincerely, Kim Bruning
On Sun, Sep 4, 2011 at 10:43 AM, Kim Bruning kim@bruning.xs4all.nl wrote:
On Sun, Sep 04, 2011 at 03:08:54PM +0200, Ziko van Dijk wrote:
Hello,
On German language Wikipedia, there is a poll of its own.
http://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Einf%C3%BChrung_pers%C...
Assuming that the .de community is similar to the wikimedia community at large, I think that the difference in results can largely be explained directly by the design of the !referendum.
The emerging discrepancies between the german vote and the !referendum, together with the known deficiencies in the !referendum design warrant some -slight- cause for concern, perhaps.
I'd like to run some sort of audit to allay potential concerns. Any ideas as to practicability and/or execution?
sincerely, Kim Bruning
What type of audit? If you're speaking of data security/integrity, that's handled by SPI and there could be no tampering. If you're speaking of design, etc., there's room for a conversation. :)
On 09/04/2011 07:43 PM, Kim Bruning wrote:
Assuming that the .de community is similar to the wikimedia community at large […]
That is where I disagree. The personal image filter doesn't make much sense in German Wikipedia, since German culture is generally pretty liberal with respect to depictions of sexuality, (partially) violence, and of course Muhammad. So it's clear that there is little or no need for a filter; hence the rejection.
The other extreme would be Acehnese Wikipedia (remember the boycott?[1]). There it might be a unique opportunity to help dissolve a very difficult conflict.
The image filter makes sense for some Wikipedia versions and not for others, all depending on the cultural background. The difficult question is what to do with very multi-cultural Wikipedias, like English.
--Tobias
[1] http://lists.wikimedia.org/pipermail/foundation-l/2010-July/thread.html#5984...
On 4 September 2011 20:11, church.of.emacs.ml church.of.emacs.ml@googlemail.com wrote:
On 09/04/2011 07:43 PM, Kim Bruning wrote:
Assuming that the .de community is similar to the wikimedia community at large […]
That is where I disagree. The personal image filter doesn't make much sense in German Wikipedia, since German culture is generally pretty liberal with respect to depictions of sexuality, (partially) violence, and of course Muhammad. So it's clear that there is little or no need for a filter; hence the rejection.
What about Swastikas?
On 4 September 2011 20:42, Thomas Dalton thomas.dalton@gmail.com wrote:
On 4 September 2011 20:11, church.of.emacs.ml church.of.emacs.ml@googlemail.com wrote:
That is where I disagree. The personal image filter doesn't make much sense in German Wikipedia, since German culture is generally pretty liberal with respect to depictions of sexuality, (partially) violence, and of course Muhammad. So it's clear that there is little or no need for a filter; hence the rejection.
What about Swastikas?
http://de.wikipedia.org/wiki/Swastika looks good to me (visually and in Google translation).
(I realise you may have been asking more broadly than an educational context.)
- d.
http://de.wikipedia.org/wiki/Swastika looks good to me (visually and in Google translation).
(I realise you may have been asking more broadly than an educational context.)
You'll find them only in educational contexts, as they are prohibited by law in any other context in Germany, and this special law is enforced on dewp by the administration (meaning the use of such pictures in a non-educational context is banned and leads to user blocks, and in some cases to police investigation).
Th.
On 4 September 2011 20:50, David Gerard dgerard@gmail.com wrote:
On 4 September 2011 20:42, Thomas Dalton thomas.dalton@gmail.com wrote:
On 4 September 2011 20:11, church.of.emacs.ml church.of.emacs.ml@googlemail.com wrote:
That is where I disagree. The personal image filter doesn't make much sense in German Wikipedia, since German culture is generally pretty liberal with respect to depictions of sexuality, (partially) violence, and of course Muhammad. So it's clear that there is little or no need for a filter; hence the rejection.
What about Swastikas?
http://de.wikipedia.org/wiki/Swastika looks good to me (visually and in Google translation).
(I realise you may have been asking more broadly than an educational context.)
I never said there was anything wrong with the German Wikipedia. I was suggesting that swastikas might be something German people would want to filter out, even if none of them are offended by sex, violence, or images of Muhammad. Even if that's not the case, there are all kinds of other things people might want to filter out. Sex, violence and Muhammad are just some of the most obvious examples, so they tend to be the ones we talk about.
On Sun, Sep 04, 2011 at 08:57:22PM +0100, Thomas Dalton wrote:
(I realise you may have been asking more broadly than an educational context.)
I never said there was anything wrong with the German Wikipedia. I was suggesting that swastikas might be something German people would want to filter out, even if none of them are offended by sex, violence, or images of Muhammad.
I...I don't think so. <speechless>
sincerely, Kim Bruning
On 4 September 2011 20:57, Thomas Dalton thomas.dalton@gmail.com wrote:
I never said there was anything wrong with the German Wikipedia. I was suggesting that swastikas might be something German people would want to filter out, even if none of them are offended by sex, violence, or images of Muhammad. Even if that's not the case, there are all kinds of other things people might want to filter out. Sex, violence and Muhammad are just some of the most obvious examples, so they tend to be the ones we talk about.
Well, yes, quite plausibly (I'm not German so I can't say from personal experience). That said, you can't go to an article called [[Swastika]] and not expect to see swastikas, any more than you can go to an article called [[Cock ring]] and not expect to see a cock ring.
The trouble is that at its edges, education is fundamentally disconcerting, upsetting and subversive. And that this is a matter only of degree, not of kind.
- d.
On 4 September 2011 21:12, David Gerard dgerard@gmail.com wrote:
Well, yes, quite plausibly (I'm not German so I can't say from personal experience). That said, you can't go to an article called [[Swastika]] and not expect to see swastikas, any more than you can go to an article called [[Cock ring]] and not expect to see a cock ring.
The trouble is that at its edges, education is fundamentally disconcerting, upsetting and subversive. And that this is a matter only of degree, not of kind.
I agree, and I would never turn on such a filter. That doesn't mean that other people shouldn't be allowed to if they want to.
On Sun, Sep 04, 2011 at 09:16:42PM +0100, Thomas Dalton wrote:
The trouble is that at its edges, education is fundamentally disconcerting, upsetting and subversive. And that this is a matter only of degree, not of kind.
I agree, and I would never turn on such a filter. That doesn't mean that other people shouldn't be allowed to if they want to.
Right, but then they won't be educated.
But, if they don't want to be educated, erm, why are they using an encyclopedia in the first place?
sincerely, Kim Bruning
On 4 September 2011 20:28, Kim Bruning kim@bruning.xs4all.nl wrote:
On Sun, Sep 04, 2011 at 09:16:42PM +0100, Thomas Dalton wrote:
I agree, and I would never turn on such a filter. That doesn't mean that other people shouldn't be allowed to if they want to.
Right, but then they won't be educated. But, if they don't want to be educated, erm, why are they using an encyclopedia in the first place?
Because they *demand* an Internet that acts as a one-way filter to their bubble, enhancing without contradicting!
- d.
On 04/09/2011 4:28 PM, David Gerard wrote:
Because they *demand* an Internet that acts as a one-way filter to their bubble, enhancing without contradicting!
Even if they did (which I believe to not even be true of readers at large -- just of a tiny but loud minority), I don't see how we're obligated to oblige.
In fact, I think we are morally bound not only to not provide a mechanism to do so, but to make it hard for someone else to do so. Our objective is to give free access to knowledge to all, not to provide echo chambers where points of view can hear themselves in safety from the world outside.
-- Coren / Marc
On 4 September 2011 20:28, Kim Bruning kim@bruning.xs4all.nl wrote:
On Sun, Sep 04, 2011 at 09:16:42PM +0100, Thomas Dalton wrote:
The trouble is that at its edges, education is fundamentally disconcerting, upsetting and subversive. And that this is a matter only of degree, not of kind.
I agree, and I would never turn on such a filter. That doesn't mean that other people shouldn't be allowed to if they want to.
Right, but then they won't be educated.
But, if they don't want to be educated, erm, why are they using an encyclopedia in the first place?
They won't be educated *as much*. They can still be educated. If they don't use Wikipedia at all because of fear of seeing things they don't want to see (or because their parents fear they will see things their parents think they shouldn't see), then they aren't getting educated by Wikipedia at all. Seeing almost all of Wikipedia is better than seeing none of it.
On Sun, Sep 04, 2011 at 09:29:25PM +0100, Thomas Dalton wrote:
On 4 September 2011 20:28, Kim Bruning kim@bruning.xs4all.nl wrote:
On Sun, Sep 04, 2011 at 09:16:42PM +0100, Thomas Dalton wrote:
The trouble is that at its edges, education is fundamentally disconcerting, upsetting and subversive. And that this is a matter only of degree, not of kind.
I agree, and I would never turn on such a filter. That doesn't mean that other people shouldn't be allowed to if they want to.
Right, but then they won't be educated.
But, if they don't want to be educated, erm, why are they using an encyclopedia in the first place?
They won't be educated *as much*. They can still be educated. If they don't use Wikipedia at all because of fear of seeing things they don't want to see (or because their parents fear they will see things their parents think they shouldn't see), then they aren't getting educated by Wikipedia at all. Seeing almost all of Wikipedia is better than seeing none of it.
Seeing *almost* all of wikipedia introduces potential bias, which can actually be much worse than seeing none of wikipedia at all.
I think we have a rule about that.
aka: Thomas Dalton Wrote:
... they ... can still ... fear... their parents...
Hmm, I'd need just a few more words to *really* misquote you. ;-)
sincerely, Kim Bruning
On 4 September 2011 20:38, Kim Bruning kim@bruning.xs4all.nl wrote:
On Sun, Sep 04, 2011 at 09:29:25PM +0100, Thomas Dalton wrote:
They won't be educated *as much*. They can still be educated. If they don't use Wikipedia at all because of fear of seeing things they don't want to see (or because their parents fear they will see things their parents think they shouldn't see), then they aren't getting educated by Wikipedia at all. Seeing almost all of Wikipedia is better than seeing none of it.
Seeing *almost* all of wikipedia introduces potential bias, which can actually be much worse than seeing none of wikipedia at all. I think we have a rule about that.
Yes (maybe). It's not at all clear that this use case shouldn't simply be ignored, to avoid the possibility of compromising the encyclopedia.
I have to ask: if there's such a demand for a censored Wikipedia, where are the third-party providers? Anyone? This is a serious question. Even workplace filtermakers don't censor Wikipedia, as far as I know.
- d.
Yes (maybe). It's not at all clear that this use case shouldn't simply be ignored, to avoid the possibility of compromising the encyclopedia.
I have to ask: if there's such a demand for a censored Wikipedia, where are the third-party providers? Anyone? This is a serious question. Even workplace filtermakers don't censor Wikipedia, as far as I know.
Some workplace filters don't allow certain subjects to be searched. I work at a major museum institution, and I cannot view subject matter about certain sex topics (e.g. "sexual differences"), even though I'm the Wikipedian in Residence and on WP most of my day.
I don't know why people are wigging out so badly about the image filter. If people want to use it, great, and if you don't, DON'T. But perhaps I'm misunderstanding something about the idea. I voted for it, and it seems the people who dislike the idea are the only ones speaking out on the list.
The idea that there is a choice is very empowering, just like people filtering television cable programming and internet access for their children. Sometimes these things appear when you least expect them, and allowing our users the choice is great. I will probably never use it (even though I just found out there are plenty of things that gross me out that end up on Wikipedia by way of Commons images), but I support the option.
And to say that a 4-year-old being restricted from seeing nudity on Wikipedia is "not educating them" just makes me laugh out loud. Just like I wouldn't want my 4-year-old (and no, I don't have kids, but I have nieces, nephews, etc.) watching porn, playing violent video games or watching John Waters movies. :P (And I love John Waters!)
It's really fascinating how freely Wikipedians and Wikimedians love to throw around the word censorship. Someone should do a study on that.
Sarah
On Sun, Sep 04, 2011 at 04:51:27PM -0400, Sarah Stierch wrote:
Yes (maybe). It's not at all clear that this use case should not be ignored to avoid the possibility of compromising the encyclopedia.
Some workplace filters don't allow certain subjects to be searched. I work at a major museum institution, and I cannot view subject matter about certain sex topics (e.g. "sexual differences"), even though I'm the Wikipedian in Residence and on WP most of my day.
That could be inconvenient at times. My favorite example so far is when a filter blocked me from looking up "The internet is for porn".
Which (FYI) is a song from an award-winning Broadway musical: http://en.wikipedia.org/wiki/The_internet_is_for_porn
(Just in case you were thinking strange thoughts)
A much worse case is when I'm trying to download software. You can imagine that in a multi-megabyte download, there's bound to be a sequence of bytes 83, 69, 88, 33 (ASCII for "SEX!") in there somewhere. Not to mention things like strong language in the Linux kernel source code. http://www.vidarholen.net/contents/wordcount/
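Kim's byte-sequence point can be made concrete with a short sketch (a hypothetical illustration of the kind of crude filter being described, not any real product's code): a filter that naively scans raw bytes for the ASCII sequence "SEX!" matches both genuine text and bytes that occur purely by chance inside binary downloads.

```python
# Hypothetical sketch of a naive content filter that scans raw bytes
# for the ASCII sequence "SEX!" (83, 69, 88, 33).

NEEDLE = bytes([83, 69, 88, 33])  # b"SEX!"

def naive_filter_blocks(blob: bytes) -> bool:
    """Return True if this filter would block the download."""
    return NEEDLE in blob

# A plain-text hit, as the filter intends:
assert naive_filter_blocks(b"The internet is for SEX! apparently")

# A false positive: the same four bytes occurring by chance inside
# binary data, e.g. somewhere in a compressed software download.
binary_blob = bytes([0x00, 0x7F, 83, 69, 88, 33, 0xFE])
assert naive_filter_blocks(binary_blob)
```

In uniformly random data any fixed 4-byte sequence appears roughly once per 2**32 positions, so a site serving large volumes of binary downloads will eventually trip such a filter, which is exactly the workplace-filter failure Kim describes.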
This means I get to either go home or (these days) use tethered 3G and sneakernet to actually do my <censored> job. ;-)
I don't know why people are wigging out so badly about the image filter. If people want to use it, great, and if you don't, DON'T. But perhaps I'm misunderstanding something about the idea. I voted for it, and it seems the people who dislike the idea are the only ones speaking out on the list.
I really wish people would read previous discussions.
* There's nothing wrong with the filter program itself.
* The problem is with categorizing things to work with such a program.
* This is called prejudicial labelling.
* The ALA defines prejudicial labelling as "a censoring tool".
* This definition has existed for over half a century.
We have also had huge discussions where it is explained in detail *why* and *how* such categories can be used for censorship. We have also discussed how a category system that starts out innocent and neutral can be subverted to serve in a censorship role. No one has found solutions for how to prevent that from happening. The ALA certainly hasn't been able to do so in the last 60 years. We might be smarter than the ALA, but it's a hard problem.
Sincerely, Kim Bruning
I really wish people would read previous discussions.
* There's nothing wrong with the filter program itself.
* The problem is with categorizing things to work with such a program.
* This is called prejudicial labelling.
* The ALA defines prejudicial labelling as "a censoring tool".
* This definition has existed for over half a century.
We have also had huge discussions where it is explained in detail *why* and *how* such categories can be used for censorship. We have also discussed how a category system that starts out innocent and neutral can be subverted to serve in a censorship role. No one has found solutions for how to prevent that from happening. The ALA certainly hasn't been able to do so in the last 60 years. We might be smarter than the ALA, but it's a hard problem.
Thanks for the clarification. This is not an area of strong research or interest for me, so a better understanding is always really great. I just get frustrated, and today has been one of those days (sorry to take it out on you guys! =)
I really wish people would read previous discussions.
Don't be passive-aggressive ;) Some of these threads have a *ton* of replies; eventually a lot of us just hit the delete button until something jumps out. I, and others, appreciate clarifications. Wikipedia and related sister projects are bad at explaining things, so it's always nice to have kind folks like you explain things to some of us (again!) :).
Sarah
On 4 September 2011 22:18, Sarah Stierch sarah.stierch@gmail.com wrote:
I really wish people would read previous discussions.
Don't be passive aggressive ;)
I think it's an entirely reasonable statement, given what Kim's cited in his reply is stuff that came up in the last week.
- d.
On Sun, Sep 4, 2011 at 5:23 PM, David Gerard dgerard@gmail.com wrote:
On 4 September 2011 22:18, Sarah Stierch sarah.stierch@gmail.com wrote:
I really wish people would read previous discussions.
Don't be passive aggressive ;)
I think it's an entirely reasonable statement, given what Kim's cited in his reply is stuff that came up in the last week.
I travel a lot, I work a lot, and sometimes other things take priority over reading really long Foundation-L threads. I apologize if I didn't make the time and burdened Kim in any way.
And again, thanks, Kim, for repeating everything you've already said in the past; lazy people like me really appreciate it, and I'd give you a barnstar for repetition if they made one. =)
in #wikilove (and frustration sometimes!),
-Sarah
On Sun, Sep 04, 2011 at 05:30:11PM -0400, Sarah Stierch wrote:
in #wikilove (and frustration sometimes!),
You're certainly very graceful, online. :-)
sincerely, Kim Bruning
On 4 September 2011 21:18, Kim Bruning kim@bruning.xs4all.nl wrote:
I really wish people would read previous discussions.
But it's all "LOL so simple" if you don't.
- d.
On 4 September 2011 21:18, Kim Bruning kim@bruning.xs4all.nl wrote:
I really wish people would read previous discussions.
I read the discussions, I just don't see any merit in the arguments. Of course the labels are prejudiced, that's the whole point. People can choose which prejudice they want and filter on those labels.
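Thomas's model of an opt-in, per-reader filter can be sketched in a few lines (a hypothetical illustration; the image names, label names, and function are all invented, not anything from the actual proposal): each image carries labels, and each reader chooses which labels, if any, to hide.

```python
# Hypothetical sketch of a per-reader, opt-in label filter: images
# carry labels, and each reader picks a personal set of labels to hide.

def visible_images(images, hidden_labels):
    """Return names of images whose labels don't intersect the reader's hidden set."""
    return [name for name, labels in images if not (labels & hidden_labels)]

catalog = [
    ("muhammad_portrait.jpg", {"religious-imagery"}),
    ("surgery_photo.jpg", {"medical", "graphic"}),
    ("landscape.jpg", set()),
]

# A reader who opts to hide religious imagery still sees everything else:
assert visible_images(catalog, {"religious-imagery"}) == [
    "surgery_photo.jpg", "landscape.jpg"]

# A reader with no filter enabled sees all images:
assert visible_images(catalog, set()) == [
    "muhammad_portrait.jpg", "surgery_photo.jpg", "landscape.jpg"]
```

The mechanism itself is trivial; as the rest of the thread argues, the contentious part is who assigns the labels and how that labelling scheme can be reused by third parties.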
On Sun, Sep 04, 2011 at 10:50:26PM +0100, Thomas Dalton wrote:
On 4 September 2011 21:18, Kim Bruning kim@bruning.xs4all.nl wrote:
I really wish people would read previous discussions.
I read the discussions, I just don't see any merit in the arguments. Of course the labels are prejudiced, that's the whole point. People can choose which prejudice they want and filter on those labels.
Yes, exactly! You're smart! :-)
Now, one definition of censorship is:
* Filtering on the basis of prejudicial labels.
We're not actually allowed to censor, because censorship is evil.
If we want to do this, we'll need to figure out a way to make an image filter which does not use prejudicial labels.
sincerely, Kim Bruning
-- A consensus a day keeps the arbcom away.
On Sep 4, 2011 11:02 PM, "Kim Bruning" kim@bruning.xs4all.nl wrote:
On Sun, Sep 04, 2011 at 10:50:26PM +0100, Thomas Dalton wrote:
On 4 September 2011 21:18, Kim Bruning kim@bruning.xs4all.nl wrote:
I really wish people would read previous discussions.
I read the discussions, I just don't see any merit in the arguments. Of course the labels are prejudiced, that's the whole point. People can choose which prejudice they want and filter on those labels.
Yes, exactly! You're smart! :-)
Now, one definition of censorship is:
- Filtering on the basis of prejudicial labels.
We're not actually allowed to censor, because censorship is evil.
If we want to do this, we'll need to figure out a way to make an image filter which does not use prejudicial labels.
Or we just reject that definition as obviously not applicable. If people are choosing for themselves whether to filter and, if so, what on, then it clearly isn't censorship.
On Sun, Sep 04, 2011 at 11:54:44PM +0100, Thomas Dalton wrote:
Yes, exactly! You're smart! :-)
Now, one definition of censorship is:
- Filtering on the basis of prejudicial labels.
We're not actually allowed to censor, because censorship is evil.
If we want to do this, we'll need to figure out a way to make an image filter which does not use prejudicial labels.
Or we just reject that definition as obviously not applicable. If people are choosing for themselves whether to filter and, if so, what on, then it clearly isn't censorship.
[citation needed]
I don't see why it isn't applicable. You have a censorship tool (your prejudicial labelling scheme), and you are applying it for its intended purpose (albeit mildly).
I think that's pretty much sufficient to cross the line into actual censorship. Even if you can't quite see how right now, the ALA probably can and has. (I can easily think of some scenarios myself, if you like. In fact, I gave some tangential examples on this list today.)
But... even if we can't agree that *that* is actually across the line, the same censorship tool can still be used by others for more sinister purposes. High quality prejudicial categorization would most certainly be a boon for 3rd party censors, in many many ways.
So the options you are advocating are either (arguably) actual censorship, or (if we can't agree to that) the enabling of 3rd party censorship.
The board themselves in their decision are very careful not to cross those lines. My one issue with the board is merely that I think it is very hard _not_ to cross the line.
Of course, some people don't see the danger, and blithely cross the line anyway. (Thus proving my point for me much better than anything I could say myself O:-) )
sincerely, Kim Bruning
citation: http://www.ala.org/ala/issuesadvocacy/intfreedom/librarybill/interpretations...
On Sep 5, 2011 12:20 AM, "Kim Bruning" kim@bruning.xs4all.nl wrote:
[citation needed]
I don't see why it isn't applicable. You have a censorship tool (your prejudicial labelling scheme), and you are applying it for its intended purpose (albeit mildly).
Please define "censorship" because I think the word must mean something very different to you than it does to me. To me it means one person stopping another person from seeing something the first person doesn't want the second person to see. That clearly doesn't apply here since there is only one person.
On 5 September 2011 00:26, Thomas Dalton thomas.dalton@gmail.com wrote:
Please define "censorship" because I think the word must mean something very different to you than it does to me. To me it means one person stopping another person from seeing something the first person doesn't want the second person to see. That clearly doesn't apply here since there is only one person.
There are clearly at least two: the second being the person or persons selecting the default filter list.
- d.
On 5 September 2011 00:46, David Gerard dgerard@gmail.com wrote:
There are clearly at least two: the second being the person or persons selecting the default filter list.
Why does there need to be a default list? Just let anyone create filters and perhaps use a keyword and rating system to help people find useful ones (although I'd expect a lot of the discussions about what filters to use to happen off Wikipedia, among the groups that want that particular thing filtered).
If we start making decisions about what people might want to filter then I agree we have a problem, but I don't see any need for us to make those decisions.
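Thomas's idea of freely created, user-discoverable filters can be sketched in a few lines. This is a purely hypothetical illustration: all names, types, and the keyword/rating mechanism are invented here, not an existing MediaWiki feature.

```python
# Hypothetical sketch of user-created filter lists with keyword search
# and ratings; no list is a default and none is privileged.
from dataclasses import dataclass, field

@dataclass
class FilterList:
    name: str
    keywords: set                    # e.g. {"spiders", "arachnophobia"}
    hidden_images: set               # image titles this list would collapse
    ratings: list = field(default_factory=list)  # 1-5 scores from users

    @property
    def avg_rating(self):
        return sum(self.ratings) / len(self.ratings) if self.ratings else 0.0

def find_filters(catalog, keyword):
    """Return matching filter lists, best-rated first; anyone can add to the catalog."""
    matches = [f for f in catalog if keyword in f.keywords]
    return sorted(matches, key=lambda f: f.avg_rating, reverse=True)

def hidden_for(subscriptions):
    """Union of everything the user's chosen lists hide; empty set if opted out."""
    return set().union(*(f.hidden_images for f in subscriptions))
```

Discovery and curation of the lists themselves could then happen anywhere, on or off the wiki, as Thomas suggests.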
On Monday 05 September 2011 03:53 AM, Kim Bruning wrote:
I don't see why it isn't applicable. You have a censorship tool (your prejudicial labelling scheme), and you are applying it for its intended purpose (albeit mildly).
Hi Kim, I find your discussion of labelling schemes (and the American Library Association's guidelines) extremely useful and interesting. Thank you for taking the time to explain this carefully. It has helped clear up, for me, questions similar to those that Sarah and others raised on this list earlier.
citation: http://www.ala.org/ala/issuesadvocacy/intfreedom/librarybill/interpretations...
In relation to the ALA link (which is an exemplar of concision and moral clarity), I have a few related questions.
1) Would the article rating tool (Good? Useful? Reliable? etc.), or indeed any other comparable qualitative rating/ranking (e.g. GA/FA status), similarly classify as prejudicial labelling? I ask because I can see the article rating tool fitting under the same category, but I can't see how it would lead to the same results. An archive or library would never employ a qualitative rating like we do, but it makes sense on a place like Wikipedia; I guess that's because we're not a traditionally constructed archive or library, though very similar in some respects.
2) In relation to labelling and filtering in a system as being discussed on this list, how would it be if Wikimedia/Wikipedia did not actually facilitate prejudicial labels on images itself but instead built a system that allowed individual users to do so in a way only viewable/useful to that individual user as well as specific others they shared it with? Suppose there was a system like this:
(a) I, as an individual user, can apply whatever labels I like to images on Wikipedia, but these labels will be visible only to me when logged in.
(b) Wikimedia/Wikipedia provides a filter I can use, but it works with my values - i.e. I have to be logged in and have trained it to respond to my labels, as I applied them.
(c) I can elect to keep my filtering values (i.e. labels) private or public.
(d) If public, other users who wish to filter but lack the desire/knowledge to do so themselves can apply my values, depending on their inclinations and mine (i.e. different kinds of filter values: porn, violence, religion, etc., and who knows what). That is, a kind of social network for filtering Wikipedia.
(e) But in the end, Wikipedia as such - in the general un-logged-in sense, and to users who elect not to use labels for filtering in this manner - is unaffected by the labels that different users have constructed, since they have no official place in it; the labels that users apply exist solely in that user's space.
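A minimal sketch of the scheme in (a)-(e), with every name invented for illustration (this is a thought experiment, not an actual MediaWiki design):

```python
# Per-user labels: (a) private to their creator, (b) consulted only by
# that user's own filter, (c) shareable only if made public, (e) invisible
# to logged-out readers.
class UserLabels:
    def __init__(self, user, public=False):
        self.user = user
        self.public = public              # (c) private unless opted in
        self.labels = {}                  # image title -> set of tags

    def label(self, image, tag):
        self.labels.setdefault(image, set()).add(tag)

def is_hidden(image, viewer, filtered_tags):
    if viewer is None:                    # (e) logged-out: never filtered
        return False
    return bool(viewer.labels.get(image, set()) & filtered_tags)

def adopt(mine, theirs):
    """(d) Copy another user's labels into my own space, if they are public."""
    if not theirs.public:
        raise PermissionError("that user's labels are private")
    for image, tags in theirs.labels.items():
        for tag in tags:
            mine.label(image, tag)
```

The key property is in `is_hidden`: the site-wide rendering path never consults anyone's labels unless the viewer has explicitly created or adopted them.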
My question is, would this be preferable? Or still prejudicial?
Cheers, Achal
On Mon, Sep 5, 2011 at 12:37 AM, Achal Prabhala aprabhala@gmail.com wrote:
On Monday 05 September 2011 03:53 AM, Kim Bruning wrote:
On Sun, Sep 04, 2011 at 11:54:44PM +0100, Thomas Dalton wrote:
Yes, exactly! You're smart! :-)
Now, one definition of censorship is :
- Filtering on the basis of prejudicial labels.
We're not actually allowed to censor, because censorship is evil.
If we want to do this, we'll need to figure out a way to make an image
filter
which does not use prejudicial labels.
Or we just reject that definition as obviously not applicable. If people are choosing for themselves whether to filter and, if so, what on then it clearly isn't censorship.
[citation needed]
I don't see why it isn't applicable. You have a censorship tool (your prejudicial labelling scheme), and you are applying it for its intended purpose (albeit mildly).
Hi Kim, I find your discussion of labelling schemes (and the American Library Associations guidelines) extremely useful and interesting. Thank you for taking the time to explain this carefully. It has helped clear up, for me, similar questions to the kind that Sarah and others raised on this list earlier.
I think that's pretty much sufficient to cross the line into actual censorship. Even if you can't quite see how right now, AMA probably can and has. (I can easily think of some scenarios myself, if you like. In fact, I gave some tangential examples on this list today.)
But... even if we can't agree that *that* is actually across the line, the same censorship tool can still be used by others for more sinister purposes. High quality prejudicial categorization would most certainly be a boon for 3rd party censors, in many many ways.
So the options you are advocating are either (arguably) actual censorship, or (if we can't agree to that) the enabling of 3rd party censorship.
The board themselves in their decision are very careful not to cross those lines. My one issue with the board is merely that I think it is very hard _not_ to cross the line.
Of course, some people don't see the danger, and blithely cross the line anyway. (Thus proving my point for me much better than anything I could say myself O:-) )
sincerely, Kim Bruning
citation: http://www.ala.org/ala/issuesadvocacy/intfreedom/librarybill/interpretations...
In relation to the ALA link (which is an exemplar of concision and moral clarity), I have a few related questions.
- Would the article rating tool (Good? Useful? Reliable? etc.), or indeed any other comparable qualitative rating/ranking (e.g. GA/FA status), similarly classify as prejudicial labelling? I ask because I can see the article rating tool fitting under the same category, but I can't see how it would lead to the same results. An archive or library would never employ a qualitative rating like we do, but it makes sense on a place like Wikipedia; I guess that's because we're not a traditionally constructed archive or library, though very similar in some respects.
Achal -- yes, I believe a strong case can be made that qualitative rating would fall under the ALA's intent (in traditional libraries, a book might be labeled as "award winner" -- that's an objective fact. It would not be labeled as "good".)
The difference lies in our role as active editors (vs the librarian role as curators), making active choices; a reference work is a different kind of project from a library. It also lies in a difference in intent -- what the ALA speaks out about is labeling that is intended to restrict access. None of our labeling intends to restrict access to anything for anyone.
-- phoebe
On Mon, Sep 05, 2011 at 06:25:05AM -0700, phoebe ayers wrote:
The difference lies in our role as active editors (vs the librarian role as curators), making active choices; a reference work is a different kind of project from a library. It also lies in a difference in intent -- what the ALA speaks out about is labeling that is intended to restrict access. None of our labeling intends to restrict access to anything for anyone.
I guess this is where we get to the point where I disagree with you Phoebe :)
We both agree that restricting access is evil.
I think you believe there is a way in which we can make a labelling scheme for filtering that is not intended to restrict access.
I believe that filtering is, by definition, a form of restricting access. The proposed filter itself is fairly benign. However, the same labels that are used on Wikipedia to help good people avoid being exposed to bad pictures can equally be used by bad people to restrict access to good pictures.
I have the impression you believe in the good in people. :) I do too. Rotten apples are very rare!
In this case though, I think it only takes just one rotten apple to ruin everyone's day. So we need to plan to ensure that there is no way the rare rotten apple can subvert our work.
I know you believe that this is possible. We have a smart community, surely someone can come up with a working solution.
I'm not so sure. My experience is that filters and their databases tend to have all kinds of unintended side effects and collateral damage. I've never seen it go right. Wikipedia would be the first time that it ever did. I'm not saying it's entirely impossible. Just that apparently it is very hard. And if we accidentally miss something, it's going to ruin our day, our month or even our year.
If we succeed, we anger our friends, and our enemies will only clamor slightly less loudly. I'm not sure we will reach many new people. I have seen some reports, but none answered that particular question afaik. (Have I missed anything?)
If we happen to fail in the wrong way, one worst case scenario is that our mission becomes doomed. (If I were evil, I'd know exactly how to make that happen)
So it's a high-risk, low-reward kind of play, in my personal assessment. The board has said that they want this. I think they surely must have a different risk assessment. :-)
So that explains some of my practical reasons for being somewhat skeptical, not of the filter, but of the category system behind it.
sincerely, Kim Bruning
On 4 September 2011 22:50, Thomas Dalton thomas.dalton@gmail.com wrote:
On 4 September 2011 21:18, Kim Bruning kim@bruning.xs4all.nl wrote:
I really wish people would read previous discussions.
I read the discussions, I just don't see any merit in the arguments. Of course the labels are prejudiced, that's the whole point. People can choose which prejudice they want and filter on those labels.
I'm glad the ALA-unbiased method of selecting labels is clear to everyone. Oh, wait.
- d.
On 4 September 2011 22:52, David Gerard dgerard@gmail.com wrote:
I'm glad the ALA-unbiased method of selecting labels is clear to everyone. Oh, wait.
The selection of labels isn't supposed to be unbiased. Users select whichever labels they want. All you have to do is make sure it's easy for people to create new labels if none of the existing ones fit their needs, and you're sorted.
On Sun, Sep 4, 2011 at 11:54 PM, Thomas Dalton thomas.dalton@gmail.com wrote:
The selection of labels isn't supposed to be unbiased. Users select whichever labels they want. All you have to do is make sure it's easy for people to create new labels if none of the existing ones fit their needs, and you're sorted.
That won't work, for several reasons. First, the proposal as made in the referendum talks about 5 to 10 categories. Thus, after 10 people have created their labels, there are none left. Second, even if they create labels, there needs to be someone to do the labelling - asking someone who doesn't want to see certain pictures to select all such pictures himself by hand doesn't seem to be a very effective way of working if he really does not want to see the pictures.
On Sep 4, 2011 11:34 PM, "Andre Engels" andreengels@gmail.com wrote:
That won't work, for several reasons. First, the proposal as made in the referendum talks about 5 to 10 categories. Thus, after 10 people have created their labels, there are none left.
That was just an example. There's no reason the final implementation has to work that way.
Second, even if they create labels, there needs to be someone to do the labelling - asking someone who doesn't want to see certain pictures to select all such pictures himself by hand doesn't seem to be a very effective way of working if he really does not want to see the pictures.
I don't think that will actually be a problem. There are plenty of people that want to tell other people what not to look at but don't feel bound by the same rules themselves. They can label the images.
On 04.09.11 at 22:18, Kim Bruning wrote:
- There's nothing wrong with the filter program itself
That's wrong. What's wrong with the whole programme is that the Foundation did not ask Wikimedians whether they wanted it or not. They just did it. Period. Now it will be there soon, and the whole "referendum" was indeed not about whether to introduce the filter, but rather about what kind of filter you liked most (i.e. would you rather have a red one or a green one?). This is no way to run a community these days. We want to participate and we want to have a say in these things.
Regards, Jürgen.
On 5 September 2011 14:57, Juergen Fenn juergen.fenn@gmx.de wrote:
On 04.09.11 at 22:18, Kim Bruning wrote:
- There's nothing wrong with the filter program itself
That's wrong. What's wrong with the whole programme is that the Foundation did not ask Wikimedians whether they wanted it or not. They just did it. Period. Now it will be there soon, and the whole "referendum" was indeed not about whether to introduce the filter, but rather about what kind of filter you liked most (i.e. would you rather have a red one or a green one?). This is no way to run a community these days. We want to participate and we want to have a say in these things.
Let's be clear here: the WMF hasn't done anything yet. This feature has not been implemented. Everyone agreed there should be some community consultation, and that's what this poll was supposed to be. If the poll had been done properly, we wouldn't have a problem. The only problem is that the poll was so poorly designed that it will need to be completely re-done to draw any useful conclusions.
On Tue, Sep 6, 2011 at 3:39 AM, Thomas Dalton thomas.dalton@gmail.com wrote:
If the poll had been done properly, we wouldn't have a problem. The only problem is that the poll was so poorly designed that it will need to be completely re-done to draw any useful conclusions.
It provides a quite satisfactory 'yes' in answer to the question of whether it is worth the devs' time beginning development. We're merely talking about a proposed software feature here.
On 05/09/2011 2:08 PM, Stephen Bain wrote:
It provides a quite satisfactory 'yes' in answer to the question of whether it is worth the devs' time beginning development. We're merely talking about a proposed software feature here.
What? It does give an answer, despite the flaws in the question asked: "this is far from clear, and not a penny should be spent on this before community concerns are addressed and a real consultation takes place". I.e.: the opposite.
-- Coren / Marc
On Sep 5, 2011 7:08 PM, "Stephen Bain" stephen.bain@gmail.com wrote:
It provides a quite satisfactory 'yes' in answer to the question of whether it is worth the devs' time beginning development. We're merely talking about a proposed software feature here.
I don't think the results say anything of the sort. The main division was on the general question, not any of the ones about details. The issue of whether this feature should exist at all is still very much open.
On 5 September 2011 19:08, Stephen Bain stephen.bain@gmail.com wrote:
It provides a quite satisfactory 'yes' in answer to the question of whether it is worth the devs' time beginning development. We're merely talking about a proposed software feature here.
I didn't see that question on the survey.
I'm still waiting to hear from someone on how the questions were actually selected.
- d.
On Tue, Sep 6, 2011 at 5:59 AM, David Gerard dgerard@gmail.com wrote:
I didn't see that question on the survey.
The first question asked people how important they considered it to be that the projects offer the feature. The perceived importance of offering a new software feature indicates the level/quantity of dev resources that should be allocated to developing it.
On Sep 5, 2011 9:19 PM, "Stephen Bain" stephen.bain@gmail.com wrote:
The first question asked people how important they considered it to be that the projects offer the feature. The perceived importance of offering a new software feature indicates the level/quantity of dev resources that should be allocated to developing it.
That's only true if there is general agreement that the feature would be nice to have and the only question is whether it is worth the effort. That is not the case here.
On 5 September 2011 21:23, Thomas Dalton thomas.dalton@gmail.com wrote:
That's only true if there is general agreement that the feature would be nice to have and the only question is whether it is worth the effort. That is not the case here.
Indeed. The "1" peak is, I submit, not people saying "I really don't care".
- d.
On Tue, Sep 6, 2011 at 6:23 AM, Thomas Dalton thomas.dalton@gmail.com wrote:
That's only true if there is general agreement that the feature would be nice to have and the only question is whether it is worth the effort. That is not the case here.
The referendum was pretty clearly predicated on the basis that the feature was going forward:
"The Board of Trustees has directed the Wikimedia Foundation to develop and implement a personal image hiding feature."
"[The referendum was held] to gather more input in to the development and usage of an opt-in personal image hiding feature".
And from the resolution:
"We ask the Executive Director, in consultation with the community, to develop and implement a personal image hiding feature..."
(not "We ask the Executive Director, so long as the can't-recognise-the-irony-in-fighting-censorship-by-stopping-people-choosing-what-they-want-to-see crowd gives their blessing, to develop and implement...")
The questions all relate to the development of the feature, save for the 'culturally neutral' question: the first is about how to prioritise it, and the others are about setting out the specs for the feature.
On 5 September 2011 21:35, Stephen Bain stephen.bain@gmail.com wrote:
The referendum was pretty clearly predicated on the basis that the feature was going forward:

"The Board of Trustees has directed the Wikimedia Foundation to develop and implement a personal image hiding feature."

"[The referendum was held] to gather more input in to the development and usage of an opt-in personal image hiding feature".
However, you initially claimed the referendum itself constituted support for the feature:
"It provides a quite satisfactory 'yes' in answer to the question of whether it is worth the devs' time beginning development."
We're pointing out that it doesn't provide any such thing at all.
- d.
On Sep 6, 2011 6:43 AM, "David Gerard" dgerard@gmail.com wrote:
However, you initially claimed the referendum itself constituted support for the feature:
"It provides a quite satisfactory 'yes' in answer to the question of whether it is worth the devs' time beginning development."
We're pointing out that it doesn't provide any such thing at all.
It indicated importance. The mean response to the first question of 5.7 and the median response of 6 point to the community considering it moderately important that the feature be offered, which suggests moderate dedication of dev resources to its development.
On 5 September 2011 22:09, Stephen Bain stephen.bain@gmail.com wrote:
It indicated importance. The mean response to the first question of 5.7 and the median response of 6 point to the community considering it moderately important that the feature be offered, which suggests moderate dedication of dev resources to its development.
The mean and median are statistical gibberish in a distribution that is pathologically bimodal. You should know better than to claim that the numbers you quote are meaningful.
- d.
On Sep 6, 2011 7:11 AM, "David Gerard" dgerard@gmail.com wrote:
The mean and median are statistical gibberish in a distribution that is pathologically bimodal. You should know better than to claim that the numbers you quote are meaningful.
16% of respondents chose '0' and 20% chose '10'. Nearly 2/3 of respondents chose a response other than one of the two extremes.
(There's also a third spike at '5', representing a typical/normal/no different than other features level of importance, with the remaining responses being weighted towards the upper end of the scale.)
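David's objection can be illustrated with a few lines of toy arithmetic (synthetic numbers only, not the actual referendum responses): in a purely bimodal distribution the mean lands on a value nobody chose, although Stephen's point stands that the real data also had a spike at 5 and weight above it.

```python
from statistics import mean, median

# Synthetic, purely bimodal responses on the 0-10 scale:
# 45 people answer 0 and 55 answer 10; nobody picks anything in between.
polarised = [0] * 45 + [10] * 55
print(mean(polarised))    # 5.5 - "moderately important", yet no one chose it
print(median(polarised))  # 10
```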
On Mon, Sep 5, 2011 at 4:11 PM, David Gerard dgerard@gmail.com wrote:
On 5 September 2011 22:09, Stephen Bain stephen.bain@gmail.com wrote:
It indicated importance. The mean response to the first question of 5.7 and the median response of 6 point to the community considering it moderately important that the feature be offered, which suggests moderate dedication of dev resources to its development.
The mean and median are statistical gibberish in a distribution that is pathologically bimodal. You should know better than to claim that the numbers you quote are meaningful.
- d.
Note: the following quote is not aimed at anyone: "Figures lie and liars figure." The point is that we're all too smart on this list to fall for statistical arguments, no matter what we're discussing. So instead we chase our tails. In all this discussion, Kim Bruning is speaking in plain smart speak in a humorous manner, so I'm pretty much following his posts. The point of rational discussion that I find interesting is his demi-glace reduction in the very last email he sent. I'm working with Gmail, so quoting it is annoying.
On Sep 5, 2011 9:35 PM, "Stephen Bain" stephen.bain@gmail.com wrote:
The questions all relate to the development of the feature, save for the 'culturally neutral' question: the first is about how to prioritise it, and the others are about setting out the specs for the feature.
That first question would have only been useful for prioritisation if it had given a selection of features to compare it to.
Hello,
I also detest the use of the word "censorship", which is obviously out of place here. It's simply about what individuals want to see or not. Some Wikimedians are rather short-sighted or ignorant of the fact that other people may think and feel differently.
Still, I would have preferred to let the Wikimedians vote on the introduction in general. Or, better, prepare the tool and then let the individual communities decide.
The German-language Wikipedia indeed has a rather homogeneous community, compared to more global linguistic communities.
Kind regards Ziko
2011/9/4 Sarah Stierch sarah.stierch@gmail.com:
Yes (maybe). It's not at all clear that this use case should not be ignored to avoid the possibility of compromising the encyclopedia.
I have to ask: if there's such a demand for a censored Wikipedia, where are the third-party providers? Anyone? This is a serious question. Even workplace filtermakers don't censor Wikipedia, as far as I know.
Some workplace filters don't allow certain subjects to be searched. I work at a major museum institution and cannot view subject matter about certain sex topics (e.g. "sexual differences"), even though I'm the Wikipedian in Residence, so I'm on WP most of my day.
I don't know why people are wigging out so badly about the image filter. If people want to use it, great, and if you don't, DON'T. But perhaps I'm misunderstanding something about the idea. I voted for it, and it seems the people who dislike the idea are the only ones speaking out on the list.
The idea that there is a choice is very empowering, just like people filtering television cable programming and internet access for their children. Sometimes these things appear when you least expect them, and allowing our users the choice is great. I will probably never use it (even though I just found out there are plenty of things that gross me out that end up on Wikipedia by way of Commons images), but I support the option.
And to say that a 4-year-old being restricted from seeing nudity on Wikipedia is "not educating them" just makes me laugh out loud. Just like I wouldn't want my 4-year-old (and no, I don't have kids, but I have nieces, nephews, etc.) watching porn, playing violent video games or watching John Waters movies. :P (And I love John Waters!)
It's really fascinating how freely Wikipedians and Wikimedians love to throw around the word censorship. Someone should do a study on that.
Sarah
-- GLAMWIKI Partnership Ambassador for the Wikimedia Foundation (http://www.glamwiki.org), Wikipedian-in-Residence, Archives of American Art (http://en.wikipedia.org/wiki/User:SarahStierch), and Sarah Stierch Consulting
*Historical, cultural & artistic research & advising.*
http://www.sarahstierch.com/
On Sun, Sep 04, 2011 at 11:15:28PM +0200, Ziko van Dijk wrote:
Hello,
I also detest the use of the word "censorship", which is obviously out of place here. It's simply about what individuals want to see or not.
Right; strictly speaking the issue is with the danger of "prejudicial labelling", which is a "tool for censorship" (according to the ALA); prejudicial labelling is not, in itself, censorship.
I can imagine how people can get tired/are lazy/don't want to split hairs. :-)
sincerely, Kim Bruning
Starting from "is there a tool?" has some truth :) If there were an itch, somebody would scratch it. If there were a real need, a tool would already exist, written by somebody. And if somebody really wanted to use such a tool, that person would switch it on.
Because, contrary to a MediaWiki parser or WYSIWYG editor, it's easy to do. It's easier to do than e.g. WikiTrust: http://www.wikitrust.net/. And it's easier to do than handling the MediaWiki bug list, with over 500 (five hundred) bugs unhandled [1].
Maybe this whole story is much ado about nothing: http://en.wikipedia.org/wiki/Much_Ado_About_Nothing ...
[1] buglist - https://bugzilla.wikimedia.org/buglist.cgi?columnlist=opendate%2Cvotes%2Cbug...
rupert.
On Sun, Sep 4, 2011 at 23:15, Ziko van Dijk zvandijk@googlemail.com wrote:
Hello,
I also detest the use of the word "censorship", which is obviously out of range here. It's simply about what individuals want to see or not. Some Wikimedians are rather short sighted or ignorant towards the fact that other people may think and feel differently.
Still, I had preferred to let the Wikimedians vote on the introduction in general. Or, better, prepare the tool and then let the single communities decide.
German language Wikipedia has indeed a rather homogenous community, compared to more global linguistic communities.
Kind regards Ziko
2011/9/4 Sarah Stierch sarah.stierch@gmail.com:
Yes (maybe). It's not at all clear that this use case should not be ignored to avoid the possibility of compromising the encyclopedia.
I have to ask: if there's such a demand for a censored Wikipedia, where are the third-party providers? Anyone? This is a serious question. Even workplace filtermakers don't censor Wikipedia, as far as I know.
Some workplace filters don't allow for certain subjects to be searched. I work at a major museum institution, I cannot view subject matter about certain sex topics (and I'm the Wikipedian in Residence, so I'm on WP most of my day). (i.e. "sexual differences").
I don't know why people are wigging out so badly about the image filter. If people want to use it, great, and if you don't, DON'T. But perhaps I'm misunderstanding something about the idea. I voted for it, and it seems the people who dislike the idea are the only ones speaking out on the list.
The idea that there is a choice is very empowering. Just like people filter television cable programming, and internet access, for their children. Sometimes these appear when you least expect them, and to allow our users the choice is great. I will probably never use it (even though I just found out there are plenty of things that gross me out that end up on Wikipedia by way of Commons images), but I support the option.
And to say that a 4 year old being restricted from seeing nudity on Wikipedia is "not educating them" just makes me laugh out loud. Just like I wouldn't want my 4 year old (and no, I don't have kids, but I have nieces, nephews, etc) watching porn, playing violent video games or watching John Waters movies. :P (And I love John Waters!).
It's really fascinating how freely Wikipedians and Wikimedians love to throw around the word censorship. Someone should do a study on that.
Sarah
-- GLAMWIKI Partnership Ambassador for the Wikimedia Foundation http://www.glamwiki.org | Wikipedian-in-Residence, Archives of American Art http://en.wikipedia.org/wiki/User:SarahStierch | and Sarah Stierch Consulting
*Historical, cultural & artistic research & advising.*
http://www.sarahstierch.com/
_______________________________________________ foundation-l mailing list foundation-l@lists.wikimedia.org Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
-- Ziko van Dijk The Netherlands http://zikoblog.wordpress.com/
Am 04.09.11 22:51 schrieb Sarah Stierch:
I don't know why people are wigging out so badly about the image filter. If people want to use it, great, and if you don't, DON'T. But perhaps I'm misunderstanding something about the idea. I voted for it, and it seems the people who dislike the idea are the only ones speaking out on the list.
Sarah, besides other issues you might like to consider that it was not possible at all to vote for the filter in the first place because this question has never been asked.
Regards, Jürgen.
On 4 September 2011 21:38, David Gerard dgerard@gmail.com wrote:
Yes (maybe). It's not at all clear that this use case should not be ignored to avoid the possibility of compromising the encyclopedia.
I have to ask: if there's such a demand for a censored Wikipedia, where are the third-party providers? Anyone? This is a serious question. Even workplace filtermakers don't censor Wikipedia, as far as I know.
It is worth noting here that even if they wanted to partially restrict access to "live" Wikipedia, it would currently be impractical to do so - there's no easy way of identifying all the problematic sections other than with fairly haphazard keyword matching, meaning that it's an all-or-nothing affair, and "all" is decidedly unpopular (though we do hear of it sometimes).
As to why no-one is distributing a "filtered" version of Wikipedia, I think that falls more under the general heading of "where are the major third-party reusers that anyone actually cares about?" - the non-existence of a commercial filtered version is less of a surprise when we consider the dearth of commercial packaged versions at all...
On 05/09/2011 10:55 AM, Andrew Gray wrote:
As to why no-one is distributing a "filtered" version of Wikipedia, I think that falls more under the general heading of "where are the major third-party reusers that anyone actually cares about?" - the non-existence of a commercial filtered version is less of a surprise when we consider the dearth of commercial packaged versions at all...
You'd think a "safe" version would be a valuable service that many would be willing to pay for, given the hordes of people beating down our doors demanding just that...
oh, wait.
-- Coren / Marc
On 5 September 2011 11:02, Marc A. Pelletier marc@uberbox.org wrote:
On 05/09/2011 10:55 AM, Andrew Gray wrote:
As to why no-one is distributing a "filtered" version of Wikipedia, I think that falls more under the general heading of "where are the major third-party reusers that anyone actually cares about?" - the non-existence of a commercial filtered version is less of a surprise when we consider the dearth of commercial packaged versions at all...
You'd think a "safe" version would be a valuable service that many would be willing to pay for, given the hordes of people beating down our doors demanding just that...
oh, wait.
They already exist, and have for years. We call them "mirrors."
Risker/Anne
On 05/09/2011 11:04 AM, Risker wrote:
They already exist, and have for years. We call them "mirrors."
Does anyone actually /use/ those mirrors except by accident of search engine? I've never seen any evidence that they get any significant traffic.
IMO, they are not much higher up the Internet totem pole than typosquatters.
-- Coren / Marc
On Mon, Sep 5, 2011 at 5:04 PM, Risker risker.wp@gmail.com wrote:
On 5 September 2011 11:02, Marc A. Pelletier marc@uberbox.org wrote:
On 05/09/2011 10:55 AM, Andrew Gray wrote:
As to why no-one is distributing a "filtered" version of Wikipedia, I think that falls more under the general heading of "where are the major third-party reusers that anyone actually cares about?" - the non-existence of a commercial filtered version is less of a surprise when we consider the dearth of commercial packaged versions at all...
You'd think a "safe" version would be a valuable service that many would be willing to pay for, given the hordes of people beating down our doors demanding just that...
oh, wait.
They already exist, and have for years. We call them "mirrors."
Yes, but most mirrors are just that - mirrors. As far as I know, there is no Wikipedia mirror that actually contains extra functionality - like improved searching, WYSIWYG editing, automatic translation, image filtering, or whatever else one could think of.
There are, however, generic internet filters - organizations which serve as internet providers and filter out "unsafe" pages (usually with a religious motivation). These usually have problems, though, because they are recognized as open proxies, and thus blocked. This is a popular service in parts of NL - and it potentially keeps editors away, because they have no on-site way of filtering. But maybe some think we shouldn't want those people as editors anyway... (yes, that last is sarcasm)
Please note that the group of Wikipedians, the authors, is somewhat self-selected, and we're just running a self-fulfilling prophecy. Wikipedians will often be relatively more liberal - but why should we force liberal views upon other people? I don't like the filters, and I wouldn't want them (except when someone comes up with a troll-filter) - but I do think that people have the right not to see/hear things, just as you should have the right to say them.
I do however not understand why we are having the fundamental discussion all over again. I think it is pretty clear there is a large group of people who want the technology developed - we could next discuss where we want it implemented (it seems dewp isn't too excited about it, for example; others might be). Let us focus on having a good implementation rather than on the things that (whether we like it or not) already seem to have been decided for us.
Lodewijk
Am 5. September 2011 18:00 schrieb Andre Engels andreengels@gmail.com:
On Mon, Sep 5, 2011 at 5:04 PM, Risker risker.wp@gmail.com wrote:
On 5 September 2011 11:02, Marc A. Pelletier marc@uberbox.org wrote:
On 05/09/2011 10:55 AM, Andrew Gray wrote:
As to why no-one is distributing a "filtered" version of Wikipedia, I think that falls more under the general heading of "where are the major third-party reusers that anyone actually cares about?" - the non-existence of a commercial filtered version is less of a surprise when we consider the dearth of commercial packaged versions at all...
You'd think a "safe" version would be a valuable service that many would be willing to pay for, given the hordes of people beating down our doors demanding just that...
oh, wait.
They already exist, and have for years. We call them "mirrors."
Yes, but most mirrors are just that - mirrors. As far as I know, there is no Wikipedia mirror that actually contains extra functionality - like improved searching, WYSIWYG editing, automatic translation, image filtering, or whatever else one could think of.
-- André Engels, andreengels@gmail.com
On 5 September 2011 18:09, Lodewijk lodewijk@effeietsanders.org wrote:
there are however generic internet filters - foundations which serve as internet provider and filter out "unsafe" pages (usually with a religious foundation). These usually have problems though, because they are recognized as open proxy, and thus blocked. this is a popular service in parts of NL - and potentially keeps editors away because they have no on-site way of filtering. But maybe some think we shouldn't want those people as editors anyway... (yes, that last is sarcasm)
Don't even need to use those. Adblock Plus can be used as an on-machine filter. For example, the following filters will nicely take care of all the giant isopod images on Wikipedia:
*Giant_isopod.jpg*
*Bathynomus_giganteus.jpg*
*Bathynomus_giganteus_NOAA.jpg*
*Front_View_Isopod_West_Sirius_Rig_GOM.JPG*
*Isopod_from_West_Sirius_Rig_GOM.JPG*
*Bathynomus doederleinii.jpg*
*Bathynomus doederleinii (dorsal).jpg*
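[Editor's note: for readers unfamiliar with the syntax above, here is a rough sketch, in Python, of how such simple wildcard filters can be matched against image URLs. The helper names are made up for illustration, and real Adblock Plus filter syntax has many more features (||, ^, @@, filter options) than the bare `*` wildcards handled here.]

```python
import re

def wildcard_to_regex(pattern):
    """Translate a simple Adblock-style filter (``*`` wildcards only)
    into a compiled regular expression. Illustrative only; this is not
    Adblock Plus's full filter grammar."""
    # Escape regex metacharacters, then turn the escaped \* back into .*
    escaped = re.escape(pattern).replace(r"\*", ".*")
    return re.compile(escaped)

def is_blocked(url, filters):
    """Return True if any filter pattern matches somewhere in the URL."""
    return any(wildcard_to_regex(f).search(url) for f in filters)

filters = ["*Giant_isopod.jpg*", "*Bathynomus_giganteus.jpg*"]

print(is_blocked(
    "https://upload.wikimedia.org/wikipedia/commons/Giant_isopod.jpg",
    filters))  # True
print(is_blocked(
    "https://upload.wikimedia.org/wikipedia/commons/Rabbit.jpg",
    filters))  # False
```

In other words, such filtering is purely client-side string matching on resource URLs, which is why it works on any site without that site's cooperation.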
W
On Mon, Sep 05, 2011 at 07:09:13PM +0200, Lodewijk wrote:
I do however not understand why we are having the fundamental discussion all over again.
Ok, I'll bite. When did we have it the first time? :-)
sincerely, Kim Bruning
If I may bite back, the original outline of this filter, in a form not dissimilar to the one that lay at the heart of the referendum, was placed in the Wikimedia public space more than a year ago, on the Meta page devoted to the Study on Controversial Content -- and put in that space clearly as a proposal, not as a fait accompli. Since then, a discussion on its merits and demerits has taken place in a number of forums -- with, literally -- what -- a thousand, more than a thousand public comments added?
Does that not count as a fundamental discussion?
Robert Harris
Date: Mon, 5 Sep 2011 21:01:38 +0200
From: kim@bruning.xs4all.nl
To: foundation-l@lists.wikimedia.org
Subject: Re: [Foundation-l] Personal Image Filter results announced
On Mon, Sep 05, 2011 at 07:09:13PM +0200, Lodewijk wrote:
I do however not understand why we are having the fundamental discussion all over again.
Ok, I'll bite. When did we have it the first time? :-)
sincerely, Kim Bruning
-- [Non-pgp mail clients may show pgp-signature as attachment] gpg (www.gnupg.org) Fingerprint for key FEF9DD72 5ED6 E215 73EE AD84 E03A 01C5 94AC 7B0E FEF9 DD72
On Mon, Sep 05, 2011 at 04:16:52PM -0400, R M Harris wrote:
If I may bite back, the original outline of this filter...
Sure. That's why I think it's disingenuous to say there are problems with the filter per se. (And, if you read all my posts here and elsewhere, you'll see that -in fact- I repeat over and over that there are no problems with the filter per se.)
So: I'm Not Talking About That.
== So what AM I talking about? ==
Well, you see, to make the filter work, we're going to need a data-set of stuff to filter. I've been thinking REALLY hard about this part, and how to make sure it doesn't actually break.
This involves a bit of thinking ahead, like in chess: "we do this, bad guy does that, we do this next thing in response..." etc.
I haven't found a solution yet. And I'm somewhat discouraged by the fact that ALA hasn't found a solution in half a century.
Of course, we might be smarter than ALA. O:-) But it requires actual work. It's not going to magically happen.
==A challenge==
I'd be willing to set up a test server, with say you and phoebe running blue team and protecting the categorization system... and I could play red team and try to take you down. ;-)
Then we can switch places and see if I'm any better at defence. :-)
Depending on the exact scenario, I've figured out how to do things ranging from damaging third parties that use the wiki as leverage [1] to, -in one extreme case- actually locking out admins for a while [2].
We can think up some rules and make some bets on the outcome if you like. Wanna play? :-)
Sincerely, Kim Bruning
[1] Damage as expressed in SEO terms.
[2] This would require a corporate proxy acting aggressively based on data derived from the categorization system.
[3] The particular scenario might be more or less likely, but utilizing synergy between wiki & 3rd party reusers makes for some interesting hacks in general.
On 5 September 2011 17:00, Andre Engels andreengels@gmail.com wrote:
Yes, but most mirrors are just that - mirrors. As far as I know, there is no Wikipedia mirror that actually contains extra functionality - like improved searching, WYSIWYG editing, automatic translation, image filtering, or whatever else one could think of.
There have been a couple of attempts to make more-or-less curated mirrors, but they've found it hard to gain traction. It's a bit of a vicious cycle - to get readers you need lots of content, to get the resources to curate lots of content you need readers (this holds whether you rely on volunteers or whether you run it commercially). To have a chance of getting enough readers to make the project a viable going concern, you'd need to invest a lot of resources up front, banking on the assumption that:
* a) your difference from the status quo is enough to attract some fraction of users;
* b) the search engines would actually work in your favour rather than treating you as Wikipedia-With-Adwords Dump #41,875; and
* c) it wouldn't be cloned fifty-three times by next week.
This holds regardless of what it is - whether it's stable-versioning or image-filtering, any prospective reuser is gambling on an uncertain level of takeup and a massive unknown in terms of search-engine response. If you have a target audience who you know want your specific flavour of curation, you can bypass this and go straight to them - see, for example, the Wikipedia For Schools offline projects - but it's not clear how you could then use this to bootstrap a successful internet service, since projects like this are usually selections rather than whole-content curation. Some kind of partnership with a portal might work, but I don't know if anyone's tried it yet.
In short, the current model for online mirrors serves to discourage people from putting much effort into them, and so all sorts of potentially desirable (or potentially interesting, or even potentially amazingly-bad-example) experiments with reusing our content just aren't happening.
It's not a problem we can solve (and it's perhaps not one we should be trying to solve) but it does mean we shouldn't draw any firm conclusions from the absence of any specific types of project - there's an absence of *all* sorts of projects, good and bad ones alike.
On 04/09/11 21:28, Kim Bruning wrote:
On Sun, Sep 04, 2011 at 09:16:42PM +0100, Thomas Dalton wrote:
The trouble is that at its edges, education is fundamentally disconcerting, upsetting and subversive. And that this is a matter only of degree, not of kind.
I agree, and I would never turn on such a filter. That doesn't mean that other people shouldn't be allowed to if they want to.
Right, but then they won't be educated.
But, if they don't want to be educated, erm, why are they using an encyclopedia in the first place?
Perhaps different people want to be educated about different things? For example, I might want to be educated about possible treatments for arachnophobia, but I don't want to be educated about how a large hairy spider looks in close-up?
On 4 September 2011 21:16, Thomas Dalton thomas.dalton@gmail.com wrote:
On 4 September 2011 21:12, David Gerard dgerard@gmail.com wrote:
The trouble is that at its edges, education is fundamentally disconcerting, upsetting and subversive. And that this is a matter only of degree, not of kind.
I agree, and I would never turn on such a filter. That doesn't mean that other people shouldn't be allowed to if they want to.
I previously agreed - and I have a direct personal interest, as the father of a 4 year old child - but I'm now starting to wonder. I can't see this going well at all.
- d.
On 09/04/2011 09:57 PM, Thomas Dalton wrote:
I never said there was anything wrong with the German Wikipedia. I was suggesting that swastikas might be something German people would want to filter out […]
An empirical assumption I disagree with. Most people are imho only offended by swastikas if they are used for right-wing propaganda (i.e. not for educational use).
My point was: The readership of German Wikipedia has far fewer people who urgently demand such a feature than other language versions. That should be kept in mind when looking at German Wikipedia's polls and it should be accounted for when deciding if and where to deploy image filters.
--Tobias
On Sun, Sep 4, 2011 at 21:42, Thomas Dalton thomas.dalton@gmail.com wrote:
On 4 September 2011 20:11, church.of.emacs.ml church.of.emacs.ml@googlemail.com wrote:
On 09/04/2011 07:43 PM, Kim Bruning wrote:
Assuming that the .de community is similar to the wikimedia community at large […]
That is where I disagree. The personal image filter doesn't make much sense in German Wikipedia, since the German culture is generally pretty liberal with respect to depictions of sexuality, (partially) violence and of course Muhammed. So it's clear that there is simply no or a very small necessity for a filter; thus the rejection.
What about Swastikas?
Swastikas are not a problem, but scorpions seem to be recently, haha: * http://en.wikipedia.org/wiki/Virgin_Killer * http://de.wikipedia.org/wiki/Virgin_Killer
rupert
On 4 September 2011 21:20, rupert THURNER rupert.thurner@gmail.com wrote:
Swastikas are not a problem, but scorpions seem to be recently, haha:
Well, en:wp allows fair use, but de:wp doesn't. Which averts that one nicely.
- d.
On Sun, Sep 4, 2011 at 22:22, David Gerard dgerard@gmail.com wrote:
On 4 September 2011 21:20, rupert THURNER rupert.thurner@gmail.com wrote:
Swastikas are not a problem, but scorpions seem to be recently, haha:
Well, en:wp allows fair use, but de:wp doesn't. Which averts that one nicely.
Oh, and why, if the domain names for both belong to the Wikimedia Foundation, which is located in the United States?
On Sun, Sep 04, 2011 at 10:36:31PM +0200, rupert THURNER wrote:
On Sun, Sep 4, 2011 at 22:22, David Gerard dgerard@gmail.com wrote:
On 4 September 2011 21:20, rupert THURNER rupert.thurner@gmail.com wrote:
Swastikas are not a problem, but scorpions seem to be recently, haha:
Well, en:wp allows fair use, but de:wp doesn't. Which averts that one nicely.
Oh, and why, if the domain names for both belong to the Wikimedia Foundation, which is located in the United States?
%-/
Why? Because our mission is to make things free (as in speech). You may have heard about that ;-)
It's Wikipedia the free (as in speech) encyclopedia, not Wikipedia the fair use encyclopedia.
So in theory, en shouldn't be allowing fair use either. In practice, some people fubared things, and now it's grandfathered in.
Also, de.wikipedia uses Commons 100% iirc. Commons also only hosts actual free (as in speech) images. Because -hey- that's their mission.
sincerely, Kim Bruning
(Off-Topic post)
On 09/04/2011 09:53 PM, Kim Bruning wrote:
Also, de.wikipedia uses Commons 100% iirc. Commons also only hosts actual free (as in speech) images. Because -hey- that's their mission.
That's almost correct :) There are some exceptions due to a relative low "Threshold of originality" for logos in German law. E.g. this is considered public domain by German law (but not internationally, so Commons doesn't have it): http://de.wikipedia.org/wiki/Datei:Laufendes-Auge_2.jpg
On the other hand, there are very few cases, where an image on Commons can't be used in German Wikipedia. Example: http://commons.wikimedia.org/wiki/File:Albert_Einstein_photo_1920.jpg?uselan...
Regards, Tobias
2011/9/5 church.of.emacs.ml church.of.emacs.ml@googlemail.com:
(Off-Topic post)
On 09/04/2011 09:53 PM, Kim Bruning wrote:
Also, de.wikipedia uses Commons 100% iirc. Commons also only hosts actual free (as in speech) images. Because -hey- that's their mission.
That's almost correct :) There are some exceptions due to a relative low "Threshold of originality" for logos in German law. E.g. this is considered public domain by German law (but not internationally, so Commons doesn't have it): http://de.wikipedia.org/wiki/Datei:Laufendes-Auge_2.jpg
This image can certainly be covered by http://commons.wikimedia.org/wiki/Template:PD-EU-no_author_disclosure and therefore be used in Germany.
Regards,
Yann
On 4 September 2011 21:36, rupert THURNER rupert.thurner@gmail.com wrote:
On Sun, Sep 4, 2011 at 22:22, David Gerard dgerard@gmail.com wrote:
On 4 September 2011 21:20, rupert THURNER rupert.thurner@gmail.com wrote:
Swastikas are not a problem, but scorpions seem to be recently, haha:
Well, en:wp allows fair use, but de:wp doesn't. Which averts that one nicely.
Oh, and why, if the domain names for both belong to the Wikimedia Foundation, which is located in the United States?
Because it turns out that is one of those local wiki policy issues. Which averts *that* one nicely.
- d.
On 04/09/2011 3:11 PM, church.of.emacs.ml wrote:
That is where I disagree. The personal image filter doesn't make much sense in German Wikipedia, since the German culture is generally pretty liberal with respect to depictions of sexuality, (partially) violence and of course Muhammed. So it's clear that there is simply no or a very small necessity for a filter; thus the rejection.
And that's the best argument *against* the filter I've seen in a while because it reiterates that it has - at its core - the insurmountable problem that it attempts to provide a method by which "objectionable" material can be filtered without being able to define what "objectionable" means in any meaningfully culturally-neutral way. (Hint: the answer is "it cannot be done").
It wouldn't even be possible to define a meaningful "nudity" category, and that's arguably the simplest of all.
-- Coren / Marc
On Mon, Sep 5, 2011 at 9:04 AM, Marc A. Pelletier marc@uberbox.org wrote:
And that's the best argument *against* the filter I've seen in a while because it reiterates that it has - at its core - the insurmountable problem that it attempts to provide a method by which "objectionable" material can be filtered without being able to define what "objectionable" means in any meaningfully culturally-neutral way. (Hint: the answer is "it cannot be done").
It wouldn't even be possible to define a meaningful "nudity" category, and that's arguably the simplest of all.
That's not a sensible assertion. The fact that being in or out of a category is inherently a matter of degree rather than a binary thing doesn't mean that there's no difference between a picture of a rabbit and a screengrab from (freely licensed, of course) hardcore pornography.
Sure, there'd need to be some understanding of what's in and what's out of various categories, and it's not possible to make that completely objective. But that doesn't mean it's not a useful or worthwhile exercise.
On 04/09/2011 20:11, church.of.emacs.ml wrote:
On 09/04/2011 07:43 PM, Kim Bruning wrote:
Assuming that the .de community is similar to the wikimedia community at large […]
That is where I disagree. The personal image filter doesn't make much sense in German Wikipedia, since the German culture is generally pretty liberal with respect to depictions of sexuality, (partially) violence and of course Muhammed. So it's clear that there is simply no or a very small necessity for a filter; thus the rejection.
I call bullshit. If Wikipedia or its chapters should ever be found to be liable under German law, there is going to be some explaining to do:
http://en.wikipedia.org/wiki/Pornography_by_region#Germany http://blogoscoped.com/archive/2007-09-12-n23.html http://www.chillingeffects.org/international/notice.cgi?NoticeID=5990 http://www.chillingeffects.org/international/notice.cgi?NoticeID=2187
On 09/06/2011 08:29 PM, ???? wrote:
On 04/09/2011 20:11, church.of.emacs.ml wrote:
On 09/04/2011 07:43 PM, Kim Bruning wrote:
Assuming that the .de community is similar to the wikimedia community at large […]
That is where I disagree. The personal image filter doesn't make much sense in German Wikipedia, since the German culture is generally pretty liberal with respect to depictions of sexuality, (partially) violence and of course Muhammed. So it's clear that there is simply no or a very small necessity for a filter; thus the rejection.
I call bullshit. If Wikipedia or its chapters should ever be found to be liable under German law, there is going to be some explaining to do:
http://en.wikipedia.org/wiki/Pornography_by_region#Germany http://blogoscoped.com/archive/2007-09-12-n23.html http://www.chillingeffects.org/international/notice.cgi?NoticeID=5990 http://www.chillingeffects.org/international/notice.cgi?NoticeID=2187
I was talking about German culture, not German law. In other words, people don't seem to be as weirded out by sexuality as some* Americans are. Look up Freikörperkultur for an example.
* (I don't want to overgeneralize things :))
--Tobias
On 4 September 2011 13:48, Thomas Dalton thomas.dalton@gmail.com wrote:
The Foundation needs to be mature enough to admit that they've screwed up this survey, apologise and try again. Next time, start by figuring out what you want to achieve by asking the questions and then choose the questions accordingly.
... including posting the questions for sanity-checking, at the very least.
- d.
David Gerard, 04/09/2011 11:17:
The bimodal distribution in the first graph suggests this feature will continue to be controversial (to say the least), with fans saying "we had the majority" and foes saying "there is clearly not a consensus".
Or that a successful "plebiscite" (as it's called in the results page) usually has a 90-95 % support. Hmm, yet another terminology/translation problem.
Nemo
On Sun, Sep 4, 2011 at 11:17, David Gerard dgerard@gmail.com wrote:
The bimodal distribution in the first graph suggests this feature will continue to be controversial (to say the least), with fans saying "we had the majority" and foes saying "there is clearly not a consensus".
So. What happens now?
People with sufficient free time could be very imaginative in solving that existential problem for them. I've heard that German, Polish, Hungarian (etc.?) Wikipedians solved the problem by introducing flagged revisions. I suppose that the Board's approach to the problem is to endlessly analyze numbers which basically mean nothing.
On Sun, Sep 4, 2011 at 2:33 PM, Philippe Beaudette pbeaudette@wikimedia.org wrote:
Ladies and Gentlemen,
The committee running the vote on the features for the Personal Image Filter have released their interim report and vote count. You may see the results at http://meta.wikimedia.org/wiki/Image_filter_referendum/Results/en. Please note that the results are not final: although the vote count is, and has been finalized, the analysis of comments is ongoing.
Was this survey approved by the Research Committee? If so, can they give us an opinion on the survey instrument used, whether the survey population obtained is suitable, etc?
On 04/09/11 14:33, Philippe Beaudette wrote:
Please note that the results are not final: although the vote count is, and has been finalized, the analysis of comments is ongoing.
It would be nice to see a correlation analysis of some kind. For example, it would be interesting to know whether those who support the filter have differing views on cultural neutrality to those who oppose it.
-- Tim Starling
On Sun, Sep 4, 2011 at 5:43 PM, Tim Starling tstarling@wikimedia.orgwrote:
On 04/09/11 14:33, Philippe Beaudette wrote:
Please note that the results are not final: although the vote count is, and has been finalized, the analysis of comments is ongoing.
It would be nice to see a correlation analysis of some kind. For example, it would be interesting to know whether those who support the filter have differing views on cultural neutrality to those who oppose it.
-- Tim Starling
Absolutely. There's a ton of analysis left to do. I'll add that to the list though. :)
pb
On 04/09/2011 9:24 PM, Philippe Beaudette wrote:
Absolutely. There's a ton of analysis left to do. I'll add that to the list though. :) pb
Let's give the Foundation and the committee who made this survey every benefit of the doubt. Let's presume they truly and sincerely thought that the idea of this image filter was entirely uncontroversial, and that it would have near-universal support once the details were worked out. That the survey was so rife with methodological problems that it – unavoidably – biased any result strongly towards something that looks superficially supportive of the feature, yet still rendered a highly mitigated result, should signal an immediate "all stop" to forging ahead blindly with this idea.
What I'd expect now from the committee/WMF is an acknowledgement that the image filter is nowhere near the no-brainer they imagined it to be, and a commitment to not do any further work towards implementation until a real community discussion has taken place. Further, at least some signal that they even allow for the possibility that this may not be a workable idea in the first place.
-- Coren / Marc
On 5 September 2011 14:59, Marc A. Pelletier marc@uberbox.org wrote:
What I'd expect now from the committee/WMF is an acknowledgement that the image filter is nowhere near the no-brainer they imagined it to be, and a commitment to not do any further work towards implementation until a real community discussion has taken place. Further, at least some signal that they even allow for the possibility that this may not be a workable idea in the first place.
+1
- d.
I think the most important line from the analysis is the question, "What comes next?" As there was no question that asked the community if this feature was desirable or wanted, what does come next? Will this feature be implemented? If so, is it being implemented with community assent? What was the purpose of this "referendum" and what decision was arrived at through it?
Coming from a country that has frequent referendums (Ireland), I think it was quite absurd to call this survey a referendum. What question was referred to the community? What decision did the community arrive at? Whether intentional or un-intentional, I get the feeling that this survey will be used to give a gloss of "community input" to a decision that was already taken.
Don't get me wrong here, I actually don't care very much whether this feature is implemented or not but I am disappointed at the procedure. I think I preferred it in the good old days when it was made clear that Wikipedia was not a democracy or an experiment in society and when Jimbo's word was final. For one thing, there was nothing really wrong with that. But, more importantly with respect to these kinds of "referendums", it was honest.
In future, let's just call these kind of things a "survey", can we?
TL;DR: surveys are great, but don't call them a referendum if you don't ask a direct question.
Regards, Oliver
On 4 September 2011 05:33, Philippe Beaudette pbeaudette@wikimedia.org wrote:
Ladies and Gentlemen,
The committee running the vote on the features for the Personal Image Filter have released their interim report and vote count. You may see the results at http://meta.wikimedia.org/wiki/Image_filter_referendum/Results/en. Please note that the results are not final: although the vote count is, and has been finalized, the analysis of comments is ongoing.
Posted on behalf of the committee, Philippe ___________________ Philippe Beaudette Head of Reader Relations Wikimedia Foundation, Inc.
philippe@wikimedia.org
On Tue, Sep 6, 2011 at 11:22, Oliver Moran oliver.moran@gmail.com wrote:
In future, let's just call this kind of thing a "survey", can we?
TL;DR: surveys are great, but don't call them a referendum if you don't ask a direct question.
Welcome to political society! I can imagine how the survey got the name "referendum". Actually, how the referendum became a survey: it is likely that some Board members insisted on a referendum and, to please them, the others accepted the name. And then they started to work on "something which would be called a referendum, but which won't be a referendum". Simple as that, and everybody [on the Board] is happy.
Which reminds me of Radio Yerevan [1]. For example, this one:
Question: "Is it true that comrade cosmonaut Yuri Gagarin's car was stolen in Moscow during the celebrations?" Answer: "In principle yes, but it was not in Moscow, rather in Kiev, and it was not his car, but his bike, and it was not comrade cosmonaut Yuri Gagarin, but comrade high school teacher Gagarin, and his first name was not Yuri, but Leonid..."
On 06/09/11 19:22, Oliver Moran wrote:
I think the most important line from the analysis is the question, "What comes next?" As there was no question that asked the community if this feature was desirable or wanted, what does come next? Will this feature be implemented? If so, is it being implemented with community assent? What was the purpose of this "referendum" and what decision was arrived at through it?
There's a Board resolution that says "implement it", so I suppose it will be implemented.
http://wikimediafoundation.org/wiki/Resolution:Controversial_content
However, the editor community could sabotage it in various ways. For example, there's no guarantee that anyone will tag any images, or that tagged images won't be untagged by bots run by administrators. If the Board really does want a useful image-hiding feature, then it's essential that the community be persuaded that it is a good idea.
Personally, I think the filter will be mostly harmless, and that it's not worth the effort to rail against it. It will be useful for PR -- it will seem as if we are trying to accommodate all points of view even if the feature is not particularly useful for parents.
-- Tim Starling
On Tue, Sep 6, 2011 at 14:33, Tim Starling tstarling@wikimedia.org wrote:
Personally, I think the filter will be mostly harmless, and that it's not worth the effort to rail against it. It will be useful for PR -- it will seem as if we are trying to accommodate all points of view even if the feature is not particularly useful for parents.
I suppose you know that the WMF did PR research, if you claim that it will be useful for that purpose. If so, please refer to it. If not, it's just dilettantism, as usual.
The only PR action useful to right-wing media was Jimmy's purging of artworks depicting nude women [1]. But, again, I didn't see any PR analysis telling Jimmy that it was the right thing to do.
[1] http://www.foxnews.com/scitech/2010/05/07/wikipedia-purges-porn/
On 6 September 2011 13:56, Milos Rancic millosh@gmail.com wrote:
On Tue, Sep 6, 2011 at 14:33, Tim Starling tstarling@wikimedia.org wrote:
Personally, I think the filter will be mostly harmless, and that it's not worth the effort to rail against it. It will be useful for PR -- it will seem as if we are trying to accommodate all points of view even if the feature is not particularly useful for parents.
I suppose you know that the WMF did PR research, if you claim that it will be useful for that purpose. If so, please refer to it. If not, it's just dilettantism, as usual.
The only PR action useful to right-wing media was Jimmy's purging of artworks depicting nude women [1]. But, again, I didn't see any PR analysis telling Jimmy that it was the right thing to do.
[1] http://www.foxnews.com/scitech/2010/05/07/wikipedia-purges-porn/
Milos; as a strong left wing liberal I entirely support your ideas on censorship etc.
But on the other hand I've always recognised that some people do not want to see certain things... especially things like nudity or images of Muhammed. And I don't see any issues with giving them the tools to hide such things - it feels better than cramming it down their throats based on our liberal agenda :)
Sure; it needs to be done with care so as not to be gamed, and so that it does reflect a personal choice.
But I've never seen an issue with this sort of thing; from our readers' perspective it is a useful tool, to the editor community perhaps not so much - but let's consider the readers for a moment.
Tom
On Tue, Sep 6, 2011 at 15:01, Thomas Morton morton.thomas@googlemail.com wrote:
Milos; as a strong left wing liberal I entirely support your ideas on censorship etc.
But on the other hand I've always recognised that some people do not want to see certain things... especially things like nudity or images of Muhammed. And I don't see any issues with giving them the tools to hide such things - it feels better than cramming it down their throats based on our liberal agenda :)
Sure; it needs to be done with care so as not to be gamed, and so that it does reflect a personal choice.
But I've never seen an issue with this sort of thing; from our readers' perspective it is a useful tool, to the editor community perhaps not so much - but let's consider the readers for a moment.
First, I actually don't oppose the filter per se. There is a significant difference between what Jimmy did in May 2010 and this filter. From the point of view of freedom of information, it's not an issue. I was even thinking of supporting the image filter's inclusion, just to be done with it, but the grotesque mismanagement put me off the idea.
However, there are two unsolved issues and two problems in relation to the process itself.
The unsolved issue is the fact that it's not our job to censor content; it's the job of those who want to censor. And it's pretty easy for them to implement (see Appendix A for an algorithm). None of the pro-censor Board members and others addressed that issue. Besides, by introducing soft censorship we are introducing *censorship*, which could be treated variously in various parts of the world. None of them gave a decent analysis (neither here nor on internal-l) of the possible consequences of introducing censorship. But, anyway, the Board will deal with possible negative consequences, not me, so it's not my problem.
The first problem in front of us is the fact that the majority of core editors disagree with the filter. While I don't think that the filter is a big deal, the disagreement of the core editors is. Forcing the issue over the will of the core community implies that the Board has a plan for how to create a new core community (probably consisting of Concerned Women for America and similar organizations).
But, of course, they don't have such a plan, because they basically don't have any plan at all. It is a known habit of the Board to make strategic decisions (or tactical decisions with significant strategic influence) on an ad hoc basis. And that's the second and most important problem of the movement itself. The goal of employing Robert Harris was not to find out what would be best, but how to make something that would look like a political compromise. And it is not a compromise inside the movement, nor a compromise in the interest of the Wikimedia movement, but a compromise between the personal wishes of two Board members and the [vast majority of the] rest of the movement. (Again, I don't count a member of Concerned Women for America with 17 edits as a movement member.)
And that's not just unacceptable, but dangerous.
On the Research Committee list [1] there is an ongoing discussion related to John Vanderberg's question "Was this survey approved by the Research Committee?" [2]. The Research Committee wasn't asked, of course (and WereSpielChequers is working on a statement). Because, simply, politically motivated junk science requires implementation, not questions about the validity of its premises.
[1] http://lists.wikimedia.org/pipermail/rcom-l/2011-September/000327.html [2] http://lists.wikimedia.org/pipermail/foundation-l/2011-September/067889.html
* * *
[Appendix A]
Today's Commons picture of the day is Maze Coral [1]. It is in the categories Marine animals of Haiti [2] and Meandrinidae [3]. The original file on upload.wikimedia.org is at [4], and a thumbnail as used on a page is at [5].
If someone wants to create a decent censorship tool and has an aversion toward marine animals of Haiti, that person should:
1) go to the category Marine animals of Haiti;
2) list all images inside that category;
3) take the links to the images at upload.wikimedia.org;
4) add the image links to the censorship database;
5) add a .../wikipedia/commons/thumb/<$1>/<$2>/<image name>.* regex to the censorship database.
That's not our business.
[1] http://commons.wikimedia.org/wiki/File:Meandrina_meandrites_%28Maze_Coral%29... [2] http://commons.wikimedia.org/wiki/Category:Marine_animals_of_Haiti [3] http://commons.wikimedia.org/wiki/Category:Meandrinidae [4] http://upload.wikimedia.org/wikipedia/commons/e/ef/Meandrina_meandrites_%28M... [5] http://upload.wikimedia.org/wikipedia/commons/thumb/e/ef/Meandrina_meandrite...
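To make the appendix concrete, here is a minimal Python sketch of steps 4-5. The function name `block_patterns`, the example file name, and the hard-coded URL layout (one hex character, then two, then the file name, with thumbnails under a parallel `thumb/` tree with the size prefix appended) are my assumptions inferred from the example URLs above, not anything stated in the thread.

```python
import re

def block_patterns(image_url):
    """Steps 4-5 above: given the upload.wikimedia.org URL of an original
    Commons file, derive two regexes for a censorship database: one matching
    the original, one matching every thumbnail rendered from it (thumbnails
    live under .../thumb/<h1>/<h2>/<name>/<size>px-<name>)."""
    m = re.match(
        r'https?://upload\.wikimedia\.org/wikipedia/commons/'
        r'([0-9a-f])/([0-9a-f]{2})/([^/]+)$', image_url)
    if not m:
        raise ValueError("not a Commons original-file URL: " + image_url)
    h1, h2 = m.group(1), m.group(2)
    name = re.escape(m.group(3))  # file names may contain regex metacharacters
    original = rf'.*/wikipedia/commons/{h1}/{h2}/{name}$'
    thumbnails = rf'.*/wikipedia/commons/thumb/{h1}/{h2}/{name}/.*'
    return original, thumbnails

# Steps 1-3 (listing the files in a category and resolving them to URLs)
# would use the MediaWiki API: api.php?action=query&list=categorymembers
# &cmtype=file, then prop=imageinfo&iiprop=url for each file.
```

Again, this is the tool builder's job, not ours; the sketch only illustrates how little of our cooperation such a tool would actually need.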
First, I actually don't oppose the filter per se. There is a significant difference between what Jimmy did in May 2010 and this filter. From the point of view of freedom of information, it's not an issue. I was even thinking of supporting the image filter's inclusion, just to be done with it, but the grotesque mismanagement put me off the idea.
Concur; this survey has not instilled in me the idea that this will work out well in the end :S
The unsolved issue is the fact that it's not our job to censor content, it's the job of those who want to censor. And it's pretty easy to implement it (see Appendix A for algorithm).
Yes, but guaranteed you're going to end up with readers asking why on earth they have to go through and manually implement these filters; they'll want some defaults they can "just use". I posit that the majority of people wanting to use this thing will likely want to simply click "Do not show me images of X" and leave it there. This is not a scientific study of what the reader wants - we do need to do one of those - just my RL experience of how web users interact.
I recall a message in a previous thread that went into ideas of how to do this in a less centralised way (to avoid the idea of it not being our job to censor).
The first problem in front of us is the fact that the majority of core editors disagree with the filter. While I don't think that the filter is a big deal, the disagreement of the core editors is. Forcing the issue over the will of the core community implies that the Board has a plan for how to create a new core community (probably consisting of Concerned Women for America and similar organizations).
The problem I see here is that editors are a biased group to poll in relation to this - this is a tool for readers, and it should be up to the readers to comment on what they would like to see. The editorship has an anti-censorship view, and largely will not approve of using this tool themselves (Not Censored etc.). However, I suspect a large number of readers do feel differently... if only we knew the figures...
I'm not sure why we would necessarily let editors stall that feature request - or why we are primarily polling editors and not readers about this situation.
I'd like to see some user studies done to see what the wider response to this idea might be...
As an encyclopaedia we consistently forget that for *all* of us the readers are our customers, and represent the vast majority of people using Wikipedia - and we should be improving the software for them as much as for the editor community.
Tom
On Tue, Sep 6, 2011 at 16:07, Thomas Morton morton.thomas@googlemail.com wrote:
Yes, but guaranteed you're going to end up with readers asking why on earth they have to go through and manually implement these filters; they'll want some defaults they can "just use". I posit that the majority of people wanting to use this thing will likely want to simply click "Do not show me images of X" and leave it there. This is not a scientific study of what the reader wants - we do need to do one of those - just my RL experience of how web users interact.
I recall a message in a previous thread that went into ideas of how to do this in a less centralised way (to avoid the idea of it not being our job to censor).
That would mean that pornography exists just on Wikimedia Commons. Those who censor sexually explicit and other images use censorship software.
The problem I see here is that editors are a biased group to poll in relation to this - this is a tool for readers, and it should be up to the readers to comment on what they would like to see. The editorship has an anti-censorship view, and largely will not approve of using this tool themselves (Not Censored etc.). However, I suspect a large number of readers do feel differently... if only we knew the figures...
I'm not sure why we would necessarily let editors stall that feature request - or why we are primarily polling editors and not readers about this situation.
I'd like to see some user studies done to see what the wider response to this idea might be...
As an encyclopaedia we consistently forget that for *all* of us the readers are our customers, and represent the vast majority of people using Wikipedia - and we should be improving the software for them as much as for the editor community.
The *first* people to be asked about such a thing are the editors, not the readers. I mean, the first question is "Do *we* want it?". Readers' opinion could be one of the arguments in the discussion, likely one of the most important ones, but the decision should be the editors'. And the Board should act against the editors only if there is a serious threat to the project's existence. However, nobody gave any reason for overriding the editors' will with a Board decision. Nothing rational, just the personal wishes of a couple of people. And, again, if those wishes could pass without a lot of drama, I would be fine with it. However, that's not the case.
The *first* people to be asked about such a thing are the editors, not the readers. I mean, the first question is "Do *we* want it?". Readers' opinion could be one of the arguments in the discussion, likely one of the most important ones, but the decision should be the editors'. And the Board should act against the editors only if there is a serious threat to the project's existence. However, nobody gave any reason for overriding the editors' will with a Board decision. Nothing rational, just the personal wishes of a couple of people. And, again, if those wishes could pass without a lot of drama, I would be fine with it. However, that's not the case.
As always; I disagree with this view in the strongest possible way :)
Readers should always be our primary focus, and their needs should drive everything we do - from editing/writing through to policy and technical changes. They are our life blood and our reason for existing.
Just saying :)
Tom
On Tue, Sep 06, 2011 at 04:10:54PM +0100, Thomas Morton wrote:
The *first* people to be asked about such a thing are the editors, not the readers. I mean, the first question is "Do *we* want it?". Readers' opinion could be one of the arguments in the discussion, likely one of the most important ones, but the decision should be the editors'. And the Board should act against the editors only if there is a serious threat to the project's existence. However, nobody gave any reason for overriding the editors' will with a Board decision. Nothing rational, just the personal wishes of a couple of people. And, again, if those wishes could pass without a lot of drama, I would be fine with it. However, that's not the case.
As always; I disagree with this view in the strongest possible way :)
Readers should always be our primary focus, and their needs should drive everything we do - from editing/writing through to policy and technical changes. They are our life blood and our reason for existing.
I oppose any form of reader/editor dichotomy in the strongest possible way. A wiki operates on the premise that all readers are editors, and all editors are readers.
Any kind of distinction is pathological within the context of a wiki and will hasten its demise. (As we are in fact seeing) [1]
So as a matter of dogma in the context of running a wiki, readers are important in the sense that they need to be converted into editor/readers. If you want to make a distinction, it would be wise to stop running a wiki, and start looking for a different paradigm.
Possibly if we feel that certain encyclopedias are finished; we may indeed want to stop running those wikis, kill off those communities, harvest the content; and start using ye olde Nupedia model to polish the final product.
(Of course, I do have an opinion on whether or not an encyclopedia can be considered "finished" at a point in time dominated by Moore's law)
sincerely, Kim Bruning
[1] One of the things that impresses me about the current foundation staff is that they recognize the growing dichotomy as a problem, and are willing to fight to prevent it. :-)
I oppose any form of reader/editor dichotomy in the strongest possible way.
And yet you speak in support of the current system - which makes no effort to listen to readers... it enforces a dichotomy of its own!
A wiki operates on the premise that all readers are editors, and all editors are readers.
I used to think this. I've come to realise this is a naive view; there are many people who are consumers of knowledge and are incapable of, unable to, or uninterested in *creating* knowledge (i.e. being an editor). Theoretically they could be an editor (and we should of course work on the principle that they could manage that at any time) but in practice most aren't, and we need to cater for them where possible.
Let's consider this in another format; most of us eat chicken. Theoretically it is not all that hard to raise chickens, kill them and eat them. But in reality various hurdles exist to me raising chickens myself. But I still like to eat chicken, raised by poultry farmers.
Now it would be a poor group of poultry farmers who made decisions purely on what is right for them as a distinct group, on the basis that anyone could theoretically raise a chicken. Instead they make decisions that a) maximise the customer experience and b) make their job easier.
Any kind of distinction is pathological within the context of a wiki and will hasten its demise. (As we are in fact seeing) [1]
Where are we seeing this? Encouraging readers to become editors is a good thing; we are not at all good at it - conversion rates are low and the community is insular, closed, discouraging and elitist to most newcomers.
So as a matter of dogma in the context of running a wiki, readers are important in the sense that they need to be converted into editor/readers.
I don't see this as mutually exclusive - we can cater primarily for the needs of a reader, and we can encourage them to become editors as much as possible. In fact the two aims are highly complementary.
My point stands, though, even in this case; whether you are making it "better" to read WP, or converting them to editors you are still focusing on the reader, as it should be.
Possibly if we feel that certain encyclopedias are finished; we may indeed want to stop running those wikis, kill off those communities, harvest the content; and start using ye olde Nupedia model to polish the final product.
As I say; I don't see the link here between considering what our readers want....
We are making this knowledge primarily for it to be read; every time I make an edit to Wikipedia I try to consider it in the context of "is this adding knowledge, how will this affect a reader, will it impart something useful, etc." We don't do a lot of that.
Finally I want to point out that I wasn't really making a distinction between editor and reader; every editor is generally a reader. The problem in this case is that we are asking a readership question of the editing community - which represents a small and *biased* portion of the readership :) Just because someone does not want to edit the Wiki does not mean their view on what they would like to see is valueless.
We don't appear to listen to our readers much at all.
Which is sad.
Tom
On Tue, Sep 06, 2011 at 05:21:22PM +0100, Thomas Morton wrote:
Theoretically they could be an editor (and we should of course work on the principle that they could manage that at any time) but in practice most aren't and we need to cater for them where possible.
It is very hard to cater for someone when you are not engaged with them in conversation. Any attempts to do so are doomed to make an ASS of U and ME (ASSUME).
Don't ASSUME. ASK!
We have an existing mechanism by which people can engage and ASK, but many choose not to use it. By definition, they are forfeiting their rights, unfortunately (if those rights even exist).
How can I lose sleep over those people, if I don't even know if they have anything to say in the first place?
Incidentally, many attempts to "help" "readers" end up actually disenfranchising them. (and also disenfranchise anyone who might have been in a position to help them). Why? Because it puts a wedge between producers and consumers, even while we're attempting to create a prosumer class.
I do lose a bit of sleep over that.
sincerely, Kim Bruning
It is very hard to cater for someone when you are not engaged with them in conversation. Any attempts to do so are doomed to make an ASS of U and ME (ASSUME).
It is hard, sure; most users/consumers don't engage - which is why a whole industry has grown around finding out what they want and meeting that need.
But just because it is hard is not an excuse to not bother :)
Unless you are suggesting that our current use as a knowledge base is incidental to the point of Wikipedia (which seems a little out of step with our goals...).
Don't ASSUME. ASK!
We have an existing mechanism by which people can engage and ASK, but many choose not to use it. Per definition, they are forfeiting their rights, unfortunately, (if they even exist).
You have to solicit those views, hunt them down, and beat out of them what their gripes and bugbears are. They will not come to you.
This is the basics of creating a good product.
You have the process the wrong way round - leaving the consumer to be the one doing the asking. But they are mundane people flicking through articles; some might have ideas on how to improve things. But you won't find them telling us without prompting.
This is why big companies will invest millions of dollars finding out what it is their consumers want.
We are the ones who have to ASK
Incidentally, many attempts to "help" "readers" end up actually disenfranchising them (and also disenfranchise anyone who might have been in a position to help them). Why? Because it puts a wedge between producers and consumers, even while we're attempting to create a prosumer class.
Usually because the producers think they know what consumers want. Which never really works.
By contrast your approach/attitude creates the exact dichotomy you claim to oppose.
Tom
(BTW your comments are coming across as acerbic/ironic and at times a quite patronising - that is perhaps hampering people's ability to respond constructively)
On Wed, Sep 07, 2011 at 10:45:29PM +0100, Thomas Morton wrote:
It is very hard to cater for someone when you are not engaged with them in conversation. Any attempts to do so are doomed to make an ASS of U and ME (ASSUME).
It is hard, sure; most users/consumers don't engage - which is why a whole industry has grown around finding out what they want and meeting that need.
Yes, but we're not that industry. In fact, (rightly-or-wrongly) we characterize that industry as an Enemy.
You have to solicit those views, hunt them down and beat out of them what their gripes and bug bears are. They will not come to you.
To an extent, but this assumes people are stupid and don't want to help. Usually they do, if they know they can and are welcome.
This has happened in the past, and still does happen to an extent today. (Although many articles in news and blogs show that the community on -en and -nl among others are becoming more and more insular, sadly. The foundation is working to alleviate this).
This is the basics of creating a good product.
We're not creating a "product".
You have the process the wrong way round - leaving the consumer to be the one doing the asking. But they are a mundane person flicking through reading articles, some might have ideas on how to improve thins. But you won't find them telling us without prompting.
Right, if they don't care enough, they won't. If we make the barriers to entry higher than their ability to care, they won't either.
This is why big companies will invest millions of dollars finding out what it is their consumers want.
Or, in fact, billions. We have already outperformed those companies. There are no tail-lights. But -being in the lead- we risk losing a goal to chase after.
We are the ones who have to ASK
Obviously. Because there's no "us" and "them". Just an "us". And all of us need to ASK. :-)
Usually because the producers think they know what consumers want. Which never really works.
That's why I oppose producers.
By contrast your approach/attitude creates the exact dichotomy you claim to oppose.
I think that our original approach has had a proven track record. It's only after people abandoned it and/or got sloppy that things went downhill, after all.
sincerely, Kim Bruning
(BTW your comments are coming across as acerbic/ironic and at times a quite patronising - that is perhaps hampering people's ability to respond constructively)
(Ahhh, your comments irritate me a bit too. I guess we're reflecting that back and forth at each other and making it worse. Sorry about that. Let's try hard to both be more polite to each other! Was this mail better already?)
On Tue, Sep 6, 2011 at 11:07 AM, Milos Rancic millosh@gmail.com wrote:
On Tue, Sep 6, 2011 at 16:07, Thomas Morton morton.thomas@googlemail.com wrote:
Yes, but guaranteed you're going to end up with readers asking why on earth they have to go through and manually implement these filters; they'll want some defaults they can "just use". I posit that the majority of people wanting to use this thing will likely want to simply click "Do not show me images of X" and leave it there. This is not a scientific study of what the reader wants - we do need to do one of those - just my RL experience of how web users interact.
I recall a message in a previous thread that went into ideas of how to do this in a less centralised way (to avoid the idea of it not being our job to censor).
That would mean that pornography exists just on Wikimedia Commons. Those who censor sexually explicit and other images use censorship software.
The problem I see here is that editors are a biased group to poll in relation to this - this is a tool for readers, and it should be up to the readers to comment on what they would like to see. The editorship has an anti-censorship view, and largely will not approve of using this tool themselves (Not Censored etc.). However, I suspect a large number of readers do feel differently... if only we knew the figures...
I'm not sure why we would necessarily let editors stall that feature request - or why we are primarily polling editors and not readers about this situation.
I'd like to see some user studies done to see what the wider response to this idea might be...
As an encyclopaedia we consistently forget that for *all* of us the readers are our customers, and represent the vast majority of people using Wikipedia - and we should be improving the software for them as much as for the editor community.
The *first* people to be asked about such a thing are the editors, not the readers. I mean, the first question is "Do *we* want it?". Readers' opinion could be one of the arguments in the discussion, likely one of the most important ones, but the decision should be the editors'. And the Board should act against the editors only if there is a serious threat to the project's existence. However, nobody gave any reason for overriding the editors' will with a Board decision. Nothing rational, just the personal wishes of a couple of people. And, again, if those wishes could pass without a lot of drama, I would be fine with it. However, that's not the case.
While I'm very interested in hearing the opinion of our current editors, I disagree that we can collect and disseminate information in a neutral way to all the people of the world if we continue to listen solely to our core group of editors. Our current editors come from much too narrow a demographic group for us to think that we are making content decisions that represent a global view.
I realize that change is uncomfortable, but we must find ways to be more inclusive in order to achieve the WMF core mission.
A WMF-offered content filter is one way that we can reach people who otherwise would not be inclined to read or edit WMF projects. Although I may not necessarily agree with the viewing options of some of the people who use the filter, I respect their choice because I believe that they know better than I do what is best for them.
I strongly oppose any decision making process that does not look outside of WMF for ideas. The surest way for WMF to grow stagnant is to work in an echo chamber. And it is imperative for WMF staff, WMF Board, and WMF community to welcome diverse views in our discussions.
On a final note, I ask our regular community members to be welcoming and tolerant of people who they think have different ideas from their own. There is no doubt that I have learned the most when I was in dialogue with people who had vastly different opinions from mine. I think that this will be true in our community, too.
Sydney Poore User:FloNight
On Tue, Sep 6, 2011 at 18:04, Sydney Poore sydney.poore@gmail.com wrote:
While I'm very interested in hearing the opinion of our current editors, I disagree that we can collect and disseminate information in a neutral way to all the people of the world if we continue to listen solely to our core group of editors. Our current editors come from much too narrow a demographic group for us to think that we are making content decisions that represent a global view.
I realize that change is uncomfortable, but we must find ways to be more inclusive in order to achieve the WMF core mission.
A WMF-offered content filter is one way that we can reach people who otherwise would not be inclined to read or edit WMF projects. Although I may not necessarily agree with the viewing options of some of the people who use the filter, I respect their choice because I believe that they know better than I do what is best for them.
I strongly oppose any decision making process that does not look outside of WMF for ideas. The surest way for WMF to grow stagnant is to work in an echo chamber. And it is imperative for WMF staff, WMF Board, and WMF community to welcome diverse views in our discussions.
On a final note, I ask our regular community members to be welcoming and tolerant of people who they think have different ideas from their own. There is no doubt that I have learned the most when I was in dialogue with people who had vastly different opinions from mine. I think that this will be true in our community, too.
I didn't say that we shouldn't look into readers' opinions; I said that the *decision* is up to the editors, as this is not a question of life and death; it is not even a high-profile question outside the right-wing US. (Many Muslim countries already filter sexually explicit images, which means that it is not their question either.)
Contrary to your premises, I don't think that raising the number of readers and editors lies in filtering any images. All of the numbers show that it is about other things: for example, that Facebook is more attractive than editing Wikipedia. If you have some data to support your position, please let us know.
The last issue is the fact that the modern encyclopedia is well defined *ideologically*. It is a positivist phenomenon and its roots are in the scientific method. Wikipedia has Five pillars and a number of other policies which define it ideologically, as well. Those who think that such a project is unacceptable are free to use other sums of knowledge and to build their own. It is not possible to be absolutely inclusive. Being fully acceptable to ~50% of the population is also very questionable.
On Tue, Sep 6, 2011 at 1:05 PM, Milos Rancic millosh@gmail.com wrote:
I didn't say that we shouldn't look into readers' opinions; I said that the *decision* is up to the editors, as this is not a question of life and death; it is not even a high-profile question outside the right-wing US. (Many Muslim countries already filter sexually explicit images, which means that it is not their question either.)
Seeking outside opinions, and outreach efforts to bring more people into our communities, are high on my list of priorities, because the WMF contributor base is too homogeneous for me to be comfortable that our community members are making neutral decisions.
Contrary to your premises, I don't think that raising the number of readers and editors lies in filtering any images. All of the numbers show that it is about other things: for example, that Facebook is more attractive than editing Wikipedia. If you have some data to support your position, please let us know.
1) We have people speaking up publicly saying that they are not able to edit from some locations because of the presence of some images on our Projects. Numerous editors have told me this in private, too.
2) We regularly have people put up "controversial content" for deletion because they find it offensive or out of scope.
3) Image filters are commonly available on other internet websites, often by default.
The idea of offering image filters on WMF projects is much more controversial than it is on other internet websites. So I think that it is fair to suggest that we examine why we are having conflicts over this topic when other websites don't. One possible reason is that our base of editors is different from other websites'. If that is true, then I think we need to allow for this difference when we make features to appeal to readers.
The last issue is the fact that the modern encyclopedia is well defined *ideologically*. It is a positivist phenomenon and its roots are in the scientific method. Wikipedia has Five pillars and a number of other policies which define it ideologically, as well. Those who think that such a project is unacceptable are free to use other sums of knowledge and to build their own. It is not possible to be absolutely inclusive. Being fully acceptable to ~50% of the population is also very questionable.
On WMF projects, images are not collected using anything remotely close to the "Five pillars" that define content on Wikipedia projects. Much of the content is self-made, low quality, and without descriptions that would be adequate for proper captions for publication in the general media, and certainly not in scholarly works.
The way that WMF collects and uses images is one of the biggest differences between us and other organizations that have a similar mission. Libraries, museums, universities, publishers of reference works, and other educationally minded organizations do not solicit amateur images for their collections. Lack of peer review of our images prior to acquisition is at the heart of the problem and is a large part of what is causing the disconnect between the people who do not approve of our "controversial content" and our editors who upload the images.
Sydney
2011/9/7 Sydney Poore sydney.poore@gmail.com:
The way that WMF collects and uses images is one of the biggest differences between us and other organizations that have a similar mission. Libraries, museums, universities, publishers of reference works, and other educationally minded organizations do not solicit amateur images for their collections. Lack of peer review of our images prior to acquisition is at the heart of the problem and is a large part of what is causing the disconnect between the people who do not approve of our "controversial content" and our editors who upload the images.
Well, other "educationally minded organizations" do not solicit amateurs for writing encyclopedic articles either. But we do peer review images after they have been uploaded on Commons or Wikipedia.
It seems that, 10 years after Wikipedia and its sisters have been created, you still do not understand that there are wikis.
Regards,
Yann
On Wed, Sep 7, 2011 at 7:35 AM, Yann Forget yannfo@gmail.com wrote:
Well, other "educationally minded organizations" do not solicit amateurs for writing encyclopedic articles either. But we do peer review images after they have been uploaded on Commons or Wikipedia.
It seems that, 10 years after Wikipedia and its sisters have been created, you still do not understand that there are wikis.
Regards,
Yann
Hi Yann,
You are someone that does deletions on Commons of images that are out of scope. I very much appreciate your work as it helps keep some of the worst images out of Commons.
But in my view, this is not the same type of peer review that is used when creating content on Wikipedia. In general, we expect the content to come from an existing body of work that has already undergone a rigorous form of review by people who are trained to know whether the content is high quality or not.
I upload original images to Commons, too. :-)
I'm not suggesting that we abandon this system now. But we need to recognize the way that the abundance of low quality images is limiting our ability to create high quality works.
In practice, some Wikipedias also have a problem with peer reviewing content, too. Suppression of unsourced content is needed because some wikis don't have a way to prevent the addition of very inappropriate material.
IMO, reminding ourselves of the problems with the way that wikis work is essential to finding ways to improve.
Sydney User:FloNight
On Wed, Sep 7, 2011 at 05:35, Yann Forget yannfo@gmail.com wrote:
But we do peer review images after they have been uploaded on Commons or Wikipedia.
It seems that, 10 years after Wikipedia and its sisters have been created, you still do not understand that there are wikis.
Regards,
Yann
Yann, I yesterday looked at the Veganism article, only to find a photograph in the infobox, not of yummy tofu scramble as before, but a close-up of a woman's genitals, with a vibrator and what looked like a man's fingers. I clicked on it, and saw it was being hosted by the Wikimedia Foundation, uploaded from Flickr by the Flickr upload bot.
Objecting to this isn't a question of being prudish or of censorship, or of being anti-wiki. But if we want to attract mature editors, women editors, editors from outside the majority cultures on Wikipedia, and serious readers, this kind of thing is obviously very off-putting. So we risk limiting our reach by not dealing with it.
Sarah
2011/9/8 Sarah slimvirgin@gmail.com:
Yann, I yesterday looked at the Veganism article, only to find a photograph in the infobox, not of yummy tofu scramble as before, but a close-up of a woman's genitals, with a vibrator and what looked like a man's fingers. I clicked on it, and saw it was being hosted by the Wikimedia Foundation, uploaded from Flickr by the Flickr upload bot.
Actually we already have a list of "objectionable" images for blocking this kind of vandalism: http://en.wikipedia.org/wiki/MediaWiki:Bad_image_list I am not sure a new tool is needed for that, unless you find the image objectionable in itself, but this is another issue.
Objecting to this isn't a question of being prudish or of censorship, or of being anti-wiki. But if we want to attract mature editors, women editors, editors from outside the majority cultures on Wikipedia, and serious readers, this kind of thing is obviously very off-putting. So we risk limiting our reach by not dealing with it.
Sarah
Regards,
Yann
Objecting to this isn't a question of being prudish or of censorship, or of being anti-wiki. But if we want to attract (...) women editors, editors from outside the majority cultures on Wikipedia (...) this kind of thing is obviously very off-putting. So we risk limiting our reach by not dealing with it.
Speak for yourself. As I said before on the Gender Gap list, removing the "porn/naked" images will not solve the gender gap. And about culture - forgive me - but the only people who seem concerned about removing those images from the wiki are, AFAIS, American.
*Béria Lima* (PS: If you want to see some "really" bad naked/porn images, which shouldn't be on Wikipédia, talk with a Commons admin - even a female admin like me has seen things 100 times worse than the naked girl showing her 8-month pregnancy belly that you are complaining so much about: http://lists.wikimedia.org/pipermail/gendergap/2011-September/001308.html )
On 8 September 2011 04:58, Sarah slimvirgin@gmail.com wrote:
Yann, I yesterday looked at the Veganism article, only to find a photograph in the infobox, not of yummy tofu scramble as before, but a close-up of a woman's genitals, with a vibrator and what looked like a man's fingers. I clicked on it, and saw it was being hosted by the Wikimedia Foundation, uploaded from Flickr by the Flickr upload bot.
Objecting to this isn't a question of being prudish or of censorship, or of being anti-wiki. But if we want to attract mature editors, women editors, editors from outside the majority cultures on Wikipedia, and serious readers, this kind of thing is obviously very off-putting. So we risk limiting our reach by not dealing with it.
Sarah
On Thu, Sep 8, 2011 at 1:31 AM, Béria Lima berialima@gmail.com wrote:
And about culture - forgive me - but the only people who seem concerned about removing those images from the wiki are, AFAIS, American.
I'm sorry, no. This is just untrue.
I wonder, would the same sentence be acceptable if you substitute anything else for the word "American"?
pb
On Sat, Sep 10, 2011 at 4:07 PM, Philippe Beaudette philippe@wikimedia.org wrote:
On Thu, Sep 8, 2011 at 1:31 AM, Béria Lima berialima@gmail.com wrote:
And about culture - forgive me - but the only people who seem concerned about removing those images from the wiki are, AFAIS, American.
I'm sorry, no. This is just untrue.
It would be nice to see some analysis of the results per country or language.
But please can we have the data first, so that the analysis tasks can be undertaken by we the people.
http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/Results/en#Relea...
-- John Vandenberg
As soon as I've got it to give and the comments have been anonymized, absolutely.
I do not yet have a full feed that meets our needs for analysis beyond what's already done. ___________________ Philippe Beaudette Head of Reader Relations Wikimedia Foundation, Inc.
415-839-6885, x 6643
philippe@wikimedia.org
On Sat, Sep 10, 2011 at 12:04 AM, John Vandenberg jayvdb@gmail.com wrote:
On Sat, Sep 10, 2011 at 4:07 PM, Philippe Beaudette philippe@wikimedia.org wrote:
On Thu, Sep 8, 2011 at 1:31 AM, Béria Lima berialima@gmail.com wrote:
And about culture - forgive me - but the only people who seem concerned about removing those images from the wiki are, AFAIS, American.
I'm sorry, no. This is just untrue.
It would be nice to see some analysis of the results per country or language.
But please can we have the data first, so that the analysis tasks can be undertaken by we the people.
http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/Results/en#Relea...
-- John Vandenberg
Hello,
2011/9/10 Philippe Beaudette philippe@wikimedia.org:
As soon as I've got it to give and the comments have been anonymized, absolutely.
I do not yet have a full feed that meets our needs for analysis beyond what's already done.
We should have started with this before organizing a "referendum".
Regards,
Yann
On 10 September 2011 09:34, Yann Forget yannfo@gmail.com wrote:
2011/9/10 Philippe Beaudette philippe@wikimedia.org:
I do not yet have a full feed that meets our needs for analysis beyond what's already done.
We should have started with this before organizing a "referendum".
I've asked only twice now, here goes for a third time:
What was the process of coming up with the questions? Who did this? As much detail as possible please.
- d.
On Sat, Sep 10, 2011 at 6:40 PM, David Gerard dgerard@gmail.com wrote:
On 10 September 2011 09:34, Yann Forget yannfo@gmail.com wrote:
2011/9/10 Philippe Beaudette philippe@wikimedia.org:
I do not yet have a full feed that meets our needs for analysis beyond what's already done.
We should have started with this before organizing a "referendum".
I've asked only twice now, here goes for a third time:
What was the process of coming up with the questions? Who did this? As much detail as possible please.
I'm also scratching my head about this, and I would like to know who created and approved this survey instrument.
http://meta.wikimedia.org/wiki/Image_filter_referendum/Committee/en
"The Committee is responsible for planning and maintaining virtually every aspect of the image filter referendum. For example, the Committee plans the type of voting and requirements for voters, drafts and organizes all of the official referendum pages on Meta, verifies that voters meet the criteria, audits votes to ensure there are no duplicate votes or other problems, et cetera."
Does "the type of voting" include the survey instrument used?
If not, does the committee know who is responsible for it?
-- John Vandenberg
On 10 September 2011 01:40, David Gerard dgerard@gmail.com wrote:
On 10 September 2011 09:34, Yann Forget yannfo@gmail.com wrote:
2011/9/10 Philippe Beaudette philippe@wikimedia.org:
I do not yet have a full feed that meets our needs for analysis beyond what's already done.
We should have started with this before organizing a "referendum".
I've asked only twice now, here goes for a third time:
What was the process of coming up with the questions? Who did this? As much detail as possible please.
I wrote the questions, with Phoebe and SJ, in Boston at the Wikipedia in Higher Ed conference.
It's not a secret -- I wrote about it here: http://meta.wikimedia.org/w/index.php?title=Talk%3AImage_filter_referendum%2...
Thanks, Sue
On Sat, Sep 10, 2011 at 09:38:38AM -0700, Sue Gardner wrote:
I wrote the questions, with Phoebe and SJ, in Boston at the Wikipedia in Higher Ed conference.
It's not a secret -- I wrote about it here: http://meta.wikimedia.org/w/index.php?title=Talk%3AImage_filter_referendum%2...
Awesome. That puts so much into perspective :-)
Thank you for answering that question, Sue!
I'll go apply the Feynman algorithm some more now. :-)
sincerely, Kim Bruning
On 11 September 2011 17:22, Kim Bruning kim@bruning.xs4all.nl wrote:
On Sat, Sep 10, 2011 at 09:38:38AM -0700, Sue Gardner wrote:
I wrote the questions, with Phoebe and SJ, in Boston at the Wikipedia in Higher Ed conference. It's not a secret -- I wrote about it here: http://meta.wikimedia.org/w/index.php?title=Talk%3AImage_filter_referendum%2...
Awesome. That puts so much into perspective :-) Thank you for answering that question, Sue!
Yes, thank you :-)
I note SJ's comment on the lack of a "do you want this?" question:
"I too wish that the separate question had been asked. –SJ talk | translate 20:54, 5 September 2011 (UTC)"
SJ, what can now be done to ask this - vital and missing - question?
- d.
On Sun, Sep 11, 2011 at 10:53 AM, David Gerard dgerard@gmail.com wrote:
On 11 September 2011 17:22, Kim Bruning kim@bruning.xs4all.nl wrote:
On Sat, Sep 10, 2011 at 09:38:38AM -0700, Sue Gardner wrote:
I wrote the questions, with Phoebe and SJ, in Boston at the Wikipedia in Higher Ed conference. It's not a secret -- I wrote about it here: http://meta.wikimedia.org/w/index.php?title=Talk%3AImage_filter_referendum%2...
Awesome. That puts so much into perspective :-) Thank you for answering that question, Sue!
Yes, thank you :-)
I note SJ's comment on the lack of a "do you want this?" question:
"I too wish that the separate question had been asked. –SJ talk | translate 20:54, 5 September 2011 (UTC)"
SJ, what can now be done to ask this - vital and missing - question?
- d.
David -- and all --
I've been away for a week offline, so am trying to catch up. I'm picking a random point in the thread to try and answer lots of questions at once, from my own viewpoint.
Re: the problems with the referendum -- it's my understanding that the committee in charge of running the referendum will be conducting a formal postmortem. But of course as someone involved I've been doing a lot of thinking about it, and reading comments, and a lot of what I've identified is just simple hindsight.
Here are some of those things:
a) In hindsight, of course we should not have called it a referendum; it was a survey, or a poll, on various design questions. I don't think anything specific was intended by the nomenclature one way or another -- it just started out being called a referendum, and the name stuck, and by the time people identified problems with that name the pages had already been translated and it seemed too hard to change it. Perhaps we should have anyway, given all the drama around the name. But nothing special was meant by it one way or the other; certainly no deception about intent.
b) In hindsight I would have wanted us to get better analysis infrastructure set up ahead of time, if I'd realized this would be the single largest vote in Wikimedia history :) That said -- I am glad we have learned some things about conducting votes, and I think that the committee did handle the vote quite well. There are always things to improve, but they did a great job at handling voting problems gracefully and getting the results out fast, and I would like to thank them for all of their work, as well as for handling a difficult topic well -- committee members got a lot of undeserved personal flak as a result of volunteering for this job.
c) In hindsight I would have done more to clarify the role of the board in this process. The board didn't ask for the referendum to be conducted; Sue did, as part of being directed to implement the board's resolution. The board has naturally been sent the results, and I acted as board liaison to the referendum committee, and helped think through the questions -- but the referendum wasn't specifically a board project. (The board did ask for the feature to be built in the first place, however).
d) In hindsight I would have made sure that we had more careful review of the questions for their utility as survey instruments, perhaps running them past the research committee. There's not much precedent for that, but we could start!
e) The big question -- should we have asked "yes or no" or not? I pushed for not asking this directly because of the premise that we were asking for broad-scale community input on design, and because the board had already asked for the thing to be built, and because "importance" felt like a more subtle measure of where people stood. In hindsight, given all the controversy and the number of people who, if they were consulted at all, wanted to be asked simply yes or no, that was likely a mistake. People certainly made their views known in the comments and talk pages, though, and I am glad we have that rich input.
f) It's not a surprise to me, or the Board, that this is controversial; from what the referendum did measure, it seems clear that the community is fairly split. I am glad that we had the referendum though, because it did reveal that split to be bimodal and complex. I have reviewed a sampling of the comments, and along with the negatives and those opposed on practical and philosophical grounds there are many positives, and many arguments for why such a feature is needed. And remember, we did broaden the net so that both long-term heavy editors and occasional, mostly-reader editors had a chance to say their piece, which I think was a success in getting much wider and more diverse input than we generally do just here on foundation-l or on meta talk pages.
So given that, I think we owe it to the community to take both the negatives and the positives seriously; we cannot in good faith ignore either side.
Contrary to some speculation on this list, the board did try to think hard through the pros and cons before asking for what seemed like a reasonable measure with various protective principles in place (opt-in only, no change to editorial decisions). None of us want to censor. No other groups were involved. People's political views were not involved (Millosh certainly has not accurately represented mine, LOL). No donors were involved. Nor, in fact, were personal feelings deeply represented -- I personally had no special feelings one way or the other regarding the feature when I took on working on this issue for the board (and I have no special negative feelings about any of our content, except the poor quality stuff). Our intent as a board was simply, as Wikimedians, to help the readers of Wikimedia projects.
g) There are lots of great ideas -- in these threads, in the comments -- for how to implement such a feature, variations on the feature, and practical and philosophical concerns one way or another (for instance, I personally love the idea of also adding a link to turn off all images). We need to work hard at collecting these, integrating them, and making sure we can build something that does what we want, while realizing that there are always tradeoffs between usability, effectiveness, and what is possible.
Next steps: there's not a hard and fast timeline for the next steps; we are all thinking and discussing. I expect that Sue will send an update on what the foundation plans to do within the next few weeks. There is still in-depth analysis of the referendum results many of us would like to see, and that will take time. And finally there is a "next steps" page for the referendum -- there is some very helpful post-mortem stuff being posted there, so please join in. http://meta.wikimedia.org/wiki/Image_filter_referendum/Next_steps/en
Thanks for your patience; and thanks, as ever, for being Wikimedians.
-- phoebe, speaking only for herself, not for the board or committee
Is there a link somewhere to the total budget and actual staff costs of the referendum?
Thanks, Fae
Fae wrote:
Is there a link somewhere to the total budget and actual staff costs of the referendum?
This was asked very early on: http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/Archive1#Cost
I don't think it ever got a proper response. I'm not sure it's something that's knowable at this point. It's going to take X hours of development and Y hours of testing in order to produce Z. Not knowing any of the three variables makes accurate predicting fairly difficult, I think. :-) Plus you have to factor in (or factor out, maybe) volunteer resources, rewrites, etc.
MZMcBride
Fae wrote:
Is there a link somewhere to the total budget and actual staff costs of the referendum?
This was asked very early on: http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/Archive1#Cost
Sorry, not the same question as expected costs of implementation. I am asking for the actual staff costs for running and analysing the referendum itself (so far) plus any further planned costs. This should be entirely known and openly reported.
Obviously, as with any well managed project, there should be a number of kill points. If the outcome of paying further staff costs for detailed analysis is unclear, beyond working out that mistakes were made and the results are not all that meaningful, then I would consider killing the rest of the project now and putting the saved budget towards more useful programmes.
Thanks, Fae
Fae wrote:
Fae wrote:
Is there a link somewhere to the total budget and actual staff costs of the referendum?
This was asked very early on: http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/Archive1#Cost
Sorry, not the same question as expected costs of implementation. I am asking for the actual staff costs for running and analysing the referendum itself (so far) plus any further planned costs. This should be entirely known and openly reported.
Oops. My bad.
I don't know of any such link and I doubt you'll ever see one (unless you write the page yourself!). The Wikimedia cost side would largely be focused in Philippe (for organizing the referendum) and maybe in a contractor or two for some work on SecurePoll. Off-hand, I can't think of anyone else who was really involved from Wikimedia's side. I don't know if there's a cost associated with the vote hosting (by SPI, I believe), though I'd assume there is. (Unless the hosting is donated.)
Philippe organized a committee of users, so their time and resources would be calculated separately. All of their work was unpaid, as far as I know. Just another thankless task. "On Wikipedia, the reward for a job well done is another three jobs," as Mr. Gerard says. :-)
Other than costs noted above, I can't really think of too much else that went into this. Some Board people and staffers have commented on this list and on the talk page, but most of that is negligible cost and/or volunteer cost. There are grey areas to consider as well. For example, would you consider the time and resources that went into the mock-ups as part of the referendum costs?
I agree that a proper report would be nice, but I don't see it realistically happening, for a variety of reasons.
MZMcBride
On Thu, Sep 15, 2011 at 3:01 AM, MZMcBride z@mzmcbride.com wrote:
Fae wrote:
Fae wrote:
Is there a link somewhere to the total budget and actual staff costs of the referendum?
This was asked very early on: http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/Archive1#Cost
Sorry, not the same question as expected costs of implementation. I am asking for the actual staff costs for running and analysing the referendum itself (so far) plus any further planned costs. This should be entirely known and openly reported.
Oops. My bad.
I don't know of any such link and I doubt you'll ever see one (unless you write the page yourself!). The Wikimedia cost side would largely be focused in Philippe (for organizing the referendum) and maybe in a contractor or two for some work on SecurePoll. Off-hand, I can't think of anyone else who was really involved from Wikimedia's side. I don't know if there's a cost associated with the vote hosting (by SPI, I believe), though I'd assume there is. (Unless the hosting is donated.)
Philippe organized a committee of users, so their time and resources would be calculated separately. All of their work was unpaid, as far as I know. Just another thankless task. "On Wikipedia, the reward for a job well done is another three jobs," as Mr. Gerard says. :-)
Other than costs noted above, I can't really think of too much else that went into this. Some Board people and staffers have commented on this list and on the talk page, but most of that is negligible cost and/or volunteer cost. There are grey areas to consider as well. For example, would you consider the time and resources that went into the mock-ups as part of the referendum costs?
Thanks Mzm -- this is all correct. Of course the work on SecurePoll also translates over for all the other elections; and work on filter mockups was actually done beforehand and could more accurately be counted as part of the implementation itself. As for the referendum committee everyone was a volunteer, just like all the other election committees, with the exception of Philippe who did this as yet another task added to his list of things to do (so some percentage but not the total of his overall time), and Maggie who observed (as she does many projects).
Other staff time? Sue's spent hundreds of hours thinking about this; I have no idea how you would separate that out from all the things she does in her 18-hour days :) There's been a few meetings with tech staff. Everyone on the staff has had to sort through a million emails on the subject, because we're all subscribed to the same lists, but that's a cost for everyone. In terms of people-hours spent replying on the lists and such... that's awfully hard to calculate. A lot, to be sure, but mostly volunteer time. I have personally spent heaven knows how many hundred (unpaid) hours on this, but I just chalk it up to being yet another Wikimedia project, albeit one that is taking a disproportionate amount of energy... it's what we do. (Of course I am not counting my therapy bills after this is all over, LOL).
-- phoebe
Thanks Phoebe. I'm glad to hear that the WMF has used almost no donated money in staff costs running this global referendum.
As a member of the board you may want to consider what it means in terms of operational accountability if such a large exercise with massive impact on our community has no measurable costs due to staff not reporting their time against it. I am puzzled at how your programme managers ever decide when to cancel projects if the resources they are consuming are not reported.
With regard to your comments about massive and possibly excessive use of "free" volunteer time, as a UK charity we are interested in improving how we measure e-volunteer effort spent on our projects as we believe that we should take care to avoid volunteer "burn-outs" (which we see too many of) or using up all of the good will that is represented by the efforts of our volunteers, of which I am one, without maximizing the impact on our mission. Perhaps WMF could consider the same issues when judging the success of its projects?
Cheers, Fae
On Thu, Sep 15, 2011 at 9:13 AM, Fae fae@wikimedia.org.uk wrote:
Thanks Phoebe. I'm glad to hear that the WMF has used almost no donated money in staff costs running this global referendum.
As a member of the board you may want to consider what it means in terms of operational accountability if such a large exercise with massive impact on our community has no measurable costs due to staff not reporting their time against it. I am puzzled at how your programme managers ever decide when to cancel projects if the resources they are consuming are not reported.
With regard to your comments about massive and possibly excessive use of "free" volunteer time, as a UK charity we are interested in improving how we measure e-volunteer effort spent on our projects as we believe that we should take care to avoid volunteer "burn-outs" (which we see too many of) or using up all of the good will that is represented by the efforts of our volunteers, of which I am one, without maximizing the impact on our mission. Perhaps WMF could consider the same issues when judging the success of its projects?
Cheers, Fae
Fae -- I'll be really interested to hear the results of what WMUK finds on the volunteer-burnout front. It's an important issue, and one we have largely glossed over in our 10-year history -- or just dealt with as individuals.
One note -- me not knowing what kind of staff time was reported doesn't mean that reports don't exist; the staff don't report to the board directly. We work on the level of the annual plan, and divergence from it on a broad scale, so I was just speaking generally about the resources used... And while this referendum caused way more discussion than most things the WMF does, and thus had a much higher volunteer and community time cost, in terms of money and staff time it is a pretty tiny piece of the overall picture.
-- phoebe
Oh, I am reassured that the costs are tiny and that project reports might exist.
Can anyone with an understanding of WMF operational reports add a link to where the total actual cost of running the referendum can be seen, or are WMF project reports not available to the community?
It would be nice to give a true dollar value to "tiny" as we may want to commission several more on an improved basis if they can run so cheaply.
Thanks, Fae
On 15 September 2011 07:31, phoebe ayers phoebe.wiki@gmail.com wrote:
I've been away for a week offline, so am trying to catch up. I'm picking a random point in the thread to try and answer lots of questions at once, from my own viewpoint.
Thank you for this email. I'm going to pick just a few portions of it to respond to, since a lot of what I would say has already been covered at length by me and others.
c) In hindsight I would have done more to clarify the role of the board in this process. The board didn't ask for the referendum to be conducted; Sue did, as part of being directed to implement the board's resolution. The board has naturally been sent the results, and I acted as board liaison to the referendum committee, and helped think through the questions -- but the referendum wasn't specifically a board project. (The board did ask for the feature to be built in the first place, however).
The resolution did mandate Sue to consult with the community. Sue chose to do that using this "referendum", but she was required to do something.
d) In hindsight I would have made sure that we had more careful review of the questions for their utility as survey instruments, perhaps running them past the research committee. There's not much precedent for that, but we could start!
This is a recurring problem. The Foundation has a tendency to do surveys and polls without actually thinking about what it is trying to find out and how it is going to analyse the answers beforehand. You should know exactly what analysis you are going to do on the answers before you ask the questions; otherwise you have no chance of knowing what questions to ask.
e) The big question -- should we have asked "yes or no" or not? I pushed for not asking this directly because of the premise that we were asking for broad-scale community input on design, and because the board had already asked for the thing to be built, and because "importance" felt like a more subtle measure of where people stood. In hindsight, given all the controversy and the number of people who, if they were consulted at all, wanted to be asked simply yes or no, that was likely a mistake. People certainly made their views known in the comments and talk pages though, and I am glad we have that rich input.
The big problem with "importance" is that it is a relative concept and we weren't given anything to compare it to. That makes the whole thing meaningless.
f) It's not a surprise to me, or the Board, that this is controversial; from what the referendum did measure, it seems clear that the community is fairly split. I am glad that we had the referendum though, because it did reveal that split to be bimodal and complex. I have reviewed a sampling of the comments, and along with the negatives and those opposed on practical and philosophical grounds there are many positives, and many arguments for why such a feature is needed. And remember, we did broaden the net so that both long-term heavy editors and occasional, mostly-reader editors had a chance to say their piece, which I think was a success in getting much wider and more diverse input than we generally do just here on foundation-l or on meta talk pages.
The bimodality could easily be an artifact of the flawed methodology. The lack of a "yes/no" question probably resulted in a lot of people choosing to interpret the first question as a "yes/no" question with 1=no and 10=yes, ignoring 2-9. If you had asked a combined yes/no-importance question (e.g. "Should we do this? Yes, it's very important/Yes, but only if it doesn't cost too much/I don't know/No, but it won't really hurt if you do/No, it's very important that you don't") then I would guess you wouldn't get the same level of bimodality.
Just to follow up on this. Andrew Garrett is talking with Tim about the best way to extract this data while still keeping the secrecy of the ballot intact. Some of the analysis we simply may not be able to do without risking the secret ballot. We'll let you know more as we hear.
pb ___________________ Philippe Beaudette Head of Reader Relations Wikimedia Foundation, Inc.
415-839-6885, x 6643
philippe@wikimedia.org
On Sat, Sep 10, 2011 at 12:04 AM, John Vandenberg jayvdb@gmail.com wrote:
On Sat, Sep 10, 2011 at 4:07 PM, Philippe Beaudette philippe@wikimedia.org wrote:
On Thu, Sep 8, 2011 at 1:31 AM, Béria Lima berialima@gmail.com wrote:
And about culture - forgive me - but the only people who seems concerned about remove those images from wiki are AFAIS american.
I'm sorry, no. This is just untrue.
It would be nice to see some analysis of the results per country or language.
But please can we have the data first, so that the analysis tasks can be undertaken by we the people.
http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/Results/en#Relea...
-- John Vandenberg
On Wed, Sep 7, 2011 at 12:35, Sydney Poore sydney.poore@gmail.com wrote:
Seeking outside opinions, and outreach efforts to bring more people into our Communities are high on my list of priorities because WMF contributor base is too homogeneous for me to be comfortable that our community members are making neutral decisions.
Agreed. So, how can we do that? On an ad hoc basis, by lying to the community that it's about a referendum, or in a bit more organized way?
1) We have people speaking up publicly saying that they are not able to edit from some locations because of the presence of some images on our Projects. Numerous editors have told me this in private, too. 2) We regularly have people put up "controversial content" for deletion because they find it offensive or out of scope. 3) Image filters are commonly available on other internet websites, often by default.
That doesn't give a picture of how deep that problem is. Without harder evidence, I could freely conclude that it's just about a particular portion of US society which is anyway positioned far from our ideals, so not worthy of efforts. (Similarly to that, I have no intention to work on making Wikipedia closer to Serbian morons of any kind. The necessary prerequisite for using the internet and Wikipedia is not to be a moron.)
The idea of offering image filters on WMF projects is much more controversial than it is on other internet websites. So, I think that it is fair to suggest that we examine why we are having conflicts over this topic when other websites don't. One possible reason is that our base of editors is different from other websites. If that is true, then I think we need to allow for this difference when we make features to appeal to readers.
I don't see that as a problem and something unusual. We are community driven and we don't depend on Rupert Murdoch et al., unlike any other commercial sites.
On Wed, Sep 7, 2011 at 8:13 AM, Milos Rancic millosh@gmail.com wrote:
On Wed, Sep 7, 2011 at 12:35, Sydney Poore sydney.poore@gmail.com wrote:
Seeking outside opinions, and outreach efforts to bring more people into our Communities are high on my list of priorities because WMF contributor base is too homogeneous for me to be comfortable that our community members are making neutral decisions.
That doesn't give a picture of how deep that problem is. Without harder evidence, I could freely conclude that it's just about a particular portion of US society which is anyway positioned far from our ideals, so not worthy of efforts. (Similarly to that, I have no intention to work on making Wikipedia closer to Serbian morons of any kind. The necessary prerequisite for using the internet and Wikipedia is not to be a moron.)
We know that our core contributors are a homogeneous group and could be introducing biases into WMF, both in content and policy decisions.
We can start from the premise that WMF is an international organization that needs to find ways for people of all cultures to work together.
We can recognize going into every situation that our contributions are going to be seen by people who do not share the biases we have.
We can attempt to avoid making stereotypical comments about people from other cultures.
If we don't do these things then it is near impossible to be an organization where people of all cultures feel free to express their opinion and join the community. Without the opinions of these people, we will not achieve our core mission.
Sydney
User:FloNight
The idea of offering image filters on WMF projects is much more controversial than it is on other internet websites. So, I think that it is fair to suggest that we examine why we are having conflicts over this topic when other websites don't. One possible reason is that our base of editors is different from other websites. If that is true, then I think we need to allow for this difference when we make features to appeal to readers.
I don't see that as a problem and something unusual. We are community driven and we don't depend on Rupert Murdoch et al., unlike any other commercial sites.
On Wed, Sep 7, 2011 at 15:14, Sydney Poore sydney.poore@gmail.com wrote:
We know that our core contributors are a homogeneous group and could be introducing biases into WMF, both in content and policy decisions.
Of course. Those editors created Wikipedia based on their biases. There wouldn't be a Wikipedia without people whose ideology is to build a free knowledge repository. There are others, biased in other ways, and they created, for example, Conservapedia.
We can start from the premise that WMF is an international organization that needs to find ways for people of all cultures to work together.
That's demagogy. Whenever anyone from the *international* community spoke about the need for a multicultural perspective, very precise issues were raised. The dominant influx on this issue is not from the "international" community, but from one part of American society, supported tactically by people who have similar positions in relation to
Besides that, the most vocal "international" people are usually talking about imposing their POV, which is in collision with NPOV policy.
I agree that many issues exist and we should start gathering those issues. However, again, we are not talking here about protecting indigenous people of Australia from the publishing of photos of their sacred places, but about something very commonplace in the US. Thus, this has nothing to do with multiculturalism.
We can recognize going into every situation that our contributions are going to be seen by people who do not share the biases we have.
Could you list those biases? You are talking too generally. What are the biases of Wikipedians, in your view?
We can attempt to avoid making stereotypical comments about people from other cultures.
Could you define what the phrase "other cultures" means to you? I can't say that American culture is not mine, as well. From time to time I am better informed about the current events in the US than in Serbia.
If we don't do these things then it is near impossible to be an organization where people of all cultures feel free to express their opinion and join the community. Without the opinions of these people, we will not achieve our core mission.
If by "culture" you mean all parts of particular societies, then Wikipedia is not for all of them; just as the Encyclopédie was not for everyone in Paris. A particular intellectual level is needed to be able to accept the world as-is.
On Wed, Sep 07, 2011 at 09:14:14AM -0400, Sydney Poore wrote:
We know that our core contributors are a homogeneous group and could be introducing biases into WMF, both in content and policy decisions.
The bias is towards the concept of openness and an acceptance of otherness.
There are two approaches here:
* We can run this bias to self-destruction (due to its tendency to water itself down to nothing over time).
* We can strongly keep re-invigorating this bias, so that it remains operational. This requires a little oomph from time to time. As the saying goes: the price of openness and freedom is eternal vigilance, and all that.
My personal preference is to hold to the vigilant approach, and continuously work to provide an anti-bias bias.
We can start from the premise that WMF is an international organizations that needs to find ways for people of all cultures to work to together.
Um, Hi, Person from 2 or 3 of those cultures here (depending on how you count) O:-)
I've had hilarious situations where people accused me of having a United States bias[1], and modified stuff I'd written to be "more international"... at which point they rewrote it from a United States bias. ;-)
As soon as you "go down to common fundamentals" you -more often than not- don't actually go down to fundamentals, but rather you end up reaffirming your own personal fundamentals (and thus biases) instead. It's a psychology thing, possibly with a topping of epistemology.
The only solution that I've ever known to work at all is to stay frosty, stay on your toes, find (partial) consensus with your peers (those who are already present), and work to find new peers from outside that circle.
It is absolutely impossible to predict the way of thinking of people whom you have no interaction with. Don't try to get in their head, don't try to speak FOR them. Instead, work out how to engage with them, then do so.
So don't make an Ass Of U and ME (ASSUME). Do Actually Start Kommunicating (ASK)!
Incidentally, from an "interacting with people outside your peer group" perspective, most forms of (innocuous!) filtering are *disastrous* [2].
sincerely, Kim Bruning [1] This was patently impossible, as I had never set foot in the americas at that point in time. [2] http://www.thefilterbubble.com/ted-talk
ps I'm blessed with many different sets of biases:
* Commonwealth/Kiwi point of view.
* Orange/Cloggy point of view.
* Expat point of view. (Expats tend to have more in common with each other than with host nation or nation of origin.)
(See also: http://en.wikipedia.org/wiki/User:Kim_Bruning for illustration)
Sydney Poore wrote:
The idea of offering image filters on WMF projects is much more controversial than it is on other internet websites. So, I think that it is fair to suggest that we examine why we are having conflicts over this topic when other websites don't. One possible reason is that our base of editors is different from other websites.
Websites like Flickr (an example commonly cited) are commercial endeavors whose decisions are based on profitability, not an obligation to maintain neutrality (a core element of most WMF projects). These services can cater to the revenue-driving majorities (with geographic segregation, if need be) and ignore minorities whose beliefs fall outside the "mainstream" for a given country. We mustn't do that.
One of the main issues regarding the proposed system is the need to determine which image types to label "potentially objectionable" and place under the limited number of optional filters. Due to cultural bias, some people (including a segment of voters in the "referendum," some of whom commented on its various talk pages) believe that this is as simple as creating a few categories along the lines of "nudity," "sex," "violence" and "gore" (defined and populated in accordance with arbitrary standards).
For a website like Flickr, that probably works fairly well; a majority of users will be satisfied, with the rest too fragmented to be accommodated in a cost-effective manner. Revenues are maximized. Mission accomplished.
The WMF projects' missions are dramatically different. For most, neutrality is a nonnegotiable principle. To provide an optional filter for "image type x" and not "image type y" is to formally validate the former objection and not the latter. That's unacceptable.
An alternative implementation, endorsed by WMF trustee Samuel Klein, is discussed here: http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/en/Categories#ge... or http://goo.gl/t6ly5
David Levy
The idea of offering image filters on WMF projects is much more controversial than it is on other internet websites. So, I think that it is fair to suggest that we examine why we are having conflicts over this topic when other websites don't. One possible reason is that our base of editors is different from other websites.
Websites like Flickr (an example commonly cited) are commercial endeavors whose decisions are based on profitability, not an obligation to maintain neutrality (a core element of most WMF projects). These services can cater to the revenue-driving majorities (with geographic segregation, if need be) and ignore minorities whose beliefs fall outside the "mainstream" for a given country. We mustn't do that.
Brilliantly put!
One of the main issues regarding the proposed system is the need to determine which image types to label "potentially objectionable" and place under the limited number of optional filters. Due to cultural bias, some people (including a segment of voters in the "referendum," some of whom commented on its various talk pages) believe that this is as simple as creating a few categories along the lines of "nudity," "sex," "violence" and "gore" (defined and populated in accordance with arbitrary standards).
I think a key part of resolving this is to avoid calling the labels "potentially objectionable". I mean - anything can be potentially objectionable, it depends on the individual.
Obviously we cast this in the nudity/Mohammed light, because those are the most high profile examples.
But another example; clowns.
Some people are terrified of clowns, even their images. You wouldn't describe images of clowns as "potentially objectionable" but it would be great for Coulrophobes to go "oh hey Wikipedia, I don't like clowns so can you hide pics of them for me please? Thanks".
Some people are squeamish - so OK, let them hide images involving blood/gore. Foot phobia? (That's common enough.) Hide images of naked feet.
And so on.
This should not be about filtering "potentially objectionable" images, but about giving readers a way to filter their experience in a way that makes them feel safe and happy. And that is the light in which to cast & develop the feature.
Tom
Thomas Morton wrote:
I think a key part of resolving this is to avoid calling the labels "potentially objectionable". I mean - anything can be potentially objectionable, it depends on the individual.
Indeed. The term "objectionable" is more applicable than "offensive" is (because one needn't be offended by an image to object to its sight), but neither concept can be accurately defined on behalf of the projects' readers as a whole.
Obviously we cast this in the nudity/Mohammed light, because those are the most high profile examples.
But another example; clowns.
Some people are terrified of clowns, even their images. You wouldn't describe images of clowns as "potentially objectionable" but it would be great for Coulrophobes to go "oh hey Wikipedia, I don't like clowns so can you hide pics of them for me please? Thanks".
Some people are squeamish - so OK, let them hide images involving blood/gore. Foot phobia? (That's common enough.) Hide images of naked feet.
And so on.
Another example, mentioned several times, is "spiders." An aversion to spiders is extremely common.
But even if we were to confine the image filter system to subjects that actually offend people (and further restrict this by mandating that the relevant belief be common in at least one culture), the list is staggering.
Many people are offended by photographs of unveiled women. Will one of the "5–10 categories" be dedicated to such images? If not, why not? Because we're deeming that cultural belief unworthy of accommodation?
I haven't even touched on the logistics. (Imagine a need to tag every image containing an unveiled woman.)
This should not be about filtering "potentially objectionable" images, but about giving readers a way to filter their experience in a way that makes them feel safe and happy. And that is the light in which to cast & develop the feature.
Agreed. And one of the most important aspects to acknowledge is the infeasibility of labeling/grouping images based on what we believe people will want to filter.
David Levy
Agreed. And one of the most important aspects to acknowledge is the infeasibility of labeling/grouping images based on what we believe people will want to filter.
I confess to not being "on top" of the exact mechanics of this proposal... but why can we not be using normal categories?
Ok so for ease of use it is sensible to consider pre-made "bundles" of commonly filtered images (and I can see the issues there, obviously).
But for the default use, filtering on categories is fine... then we can use the normal Wiki system and stick to neutrality (Don't like English Churches? Fine, add it to your exclusion list :))
Tom
On 7 September 2011 15:40, Thomas Morton morton.thomas@googlemail.com wrote:
I confess to not being "on top" of the exact mechanics of this proposal... but why can we not be using normal categories? Ok so for ease of use it is sensible to consider pre-made "bundles" of commonly filtered images (and I can see the issues there, obviously). But for the default use, filtering on categories is fine... then we can use the normal Wiki system and stick to neutrality (Don't like English Churches? Fine, add it to your exclusion list :))
* The category system is constructed of minute subcategories, not broad categories that are then combined.
You could then say "this and everything under it." But then you run into:
* The category system is not very consistent.
* The category system is not free of loops.
* An image on en:wp could be a local image (one system of categories) or a Commons image (a completely different system of categories).
Thus, to use categories for an image filtering system would indeed require constructing a category for the specific purpose of exclusion. A big ALA "actually, that *is* censorship" alarm goes off.
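A quick sketch of why the loops matter: any "this category and everything under it" expansion has to treat the category graph as a possibly-cyclic graph, not a tree, or it never terminates. The category names and graph below are hypothetical example data, not the real Commons category tree or MediaWiki API:

```python
# Expand "this category and everything under it" over a possibly-cyclic
# category graph. The graph is hypothetical example data; a real
# implementation would query the wiki's API for subcategories instead.

def expand_category(graph, root):
    """Return every category reachable from root, surviving loops."""
    seen = set()
    stack = [root]
    while stack:
        cat = stack.pop()
        if cat in seen:
            continue  # loop (or diamond): already visited, skip
        seen.add(cat)
        stack.extend(graph.get(cat, ()))
    return seen

# A deliberately loopy example: Nudity -> Art -> Paintings -> Nudity
graph = {
    "Nudity": ["Art"],
    "Art": ["Paintings"],
    "Paintings": ["Nudity"],  # cycle back up the hierarchy
}

print(sorted(expand_category(graph, "Nudity")))  # ['Art', 'Nudity', 'Paintings']
```

Note what the loop does to the semantics even once termination is fixed: starting from the hypothetical "Nudity" category sweeps in all of "Art", which is exactly the kind of over-broad, inconsistent result that makes the existing category system unsuitable as a filter list.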
The closest we could come to a neutral filtering system is an easily accessible on/off switch for images.
- d.
On 7 September 2011 10:48, David Gerard dgerard@gmail.com wrote:
<snip>
The closest we could come to a neutral filtering system is an easily accessible on/off switch for images.
Interestingly, this proposal has come up many times completely separate to the issue of image filtering. Many users, particularly those on dial-up systems or those whose billing is related to the amount of data accessed have asked for this ability for some time. For them it is a performance/cost issue, and has nothing to do with filtering. Given some of the arguments that have been made in opposition to filtering, particularly those that seem to focus on "the content should be displayed in the way the authors intended", I'm concerned there would be equally significant opposition to even this simple matter.
Risker/Anne
On Thu, Sep 8, 2011 at 12:55 AM, Risker risker.wp@gmail.com wrote:
On 7 September 2011 10:48, David Gerard dgerard@gmail.com wrote:
<snip>
The closest we could come to a neutral filtering system is an easily accessible on/off switch for images.
Interestingly, this proposal has come up many times completely separate to the issue of image filtering. Many users, particularly those on dial-up systems or those whose billing is related to the amount of data accessed have asked for this ability for some time. For them it is a performance/cost issue, and has nothing to do with filtering. Given some of the arguments that have been made in opposition to filtering, particularly those that seem to focus on "the content should be displayed in the way the authors intended", I'm concerned there would be equally significant opposition to even this simple matter.
Turning off images should be, and can be, done by the user-agent. We have a help page describing how to do this.
-- John Vandenberg
On 7 September 2011 17:18, John Vandenberg jayvdb@gmail.com wrote:
On Thu, Sep 8, 2011 at 12:55 AM, Risker risker.wp@gmail.com wrote:
On 7 September 2011 10:48, David Gerard dgerard@gmail.com wrote:
<snip>
The closest we could come to a neutral filtering system is an easily accessible on/off switch for images.
Interestingly, this proposal has come up many times completely separate to the issue of image filtering. Many users, particularly those on dial-up systems or those whose billing is related to the amount of data accessed have asked for this ability for some time. For them it is a performance/cost issue, and has nothing to do with filtering. Given some of the arguments that have been made in opposition to filtering, particularly those that seem to focus on "the content should be displayed in the way the authors intended", I'm concerned there would be equally significant opposition to even this simple matter.
Turning off images should be, and can be, done by the user-agent. We have a help page describing how to do this.
That would be the page with the great big "this page is out of date" notice at the top, giving instructions that are not valid for the most common user agents (Firefox 2?). And it spends a great deal of time talking about altering people's personal userspace. Like David said... a nice simple switch to turn them on and off without having to log in: that's what people have asked for. Mucking about with their user agent is beyond the technical comfort level of most internet users, and in some cases is not possible. (Example - many publicly accessible computers are set up so that no programs can be added or modified without sysadmin permissions.)
Risker/Anne
On Thu, Sep 8, 2011 at 7:26 AM, Risker risker.wp@gmail.com wrote:
On 7 September 2011 17:18, John Vandenberg jayvdb@gmail.com wrote:
On Thu, Sep 8, 2011 at 12:55 AM, Risker risker.wp@gmail.com wrote:
On 7 September 2011 10:48, David Gerard dgerard@gmail.com wrote:
<snip>
The closest we could come to a neutral filtering system is an easily accessible on/off switch for images.
Interestingly, this proposal has come up many times completely separate to the issue of image filtering. Many users, particularly those on dial-up systems or those whose billing is related to the amount of data accessed have asked for this ability for some time. For them it is a performance/cost issue, and has nothing to do with filtering. Given some of the arguments that have been made in opposition to filtering, particularly those that seem to focus on "the content should be displayed in the way the authors intended", I'm concerned there would be equally significant opposition to even this simple matter.
Turning off images should be, and can be, done by the user-agent. We have a help page describing how to do this.
That would be the page with the great big "this page is out of date" notice at the top, giving instructions that are not valid for the most common user agents (Firefox 2?).
Every version of Mozilla has included the "Don't load images" option. And it is simple to find.
On 7 September 2011 17:32, John Vandenberg jayvdb@gmail.com wrote:
On Thu, Sep 8, 2011 at 7:26 AM, Risker risker.wp@gmail.com wrote:
On 7 September 2011 17:18, John Vandenberg jayvdb@gmail.com wrote:
On Thu, Sep 8, 2011 at 12:55 AM, Risker risker.wp@gmail.com wrote:
On 7 September 2011 10:48, David Gerard dgerard@gmail.com wrote:
<snip>
The closest we could come to a neutral filtering system is an easily accessible on/off switch for images.
Interestingly, this proposal has come up many times completely separate to the issue of image filtering. Many users, particularly those on dial-up systems or those whose billing is related to the amount of data accessed have asked for this ability for some time. For them it is a performance/cost issue, and has nothing to do with filtering. Given some of the arguments that have been made in opposition to filtering, particularly those that seem to focus on "the content should be displayed in the way the authors intended", I'm concerned there would be equally significant opposition to even this simple matter.
Turning off images should be, and can be, done by the user-agent. We have a help page describing how to do this.
That would be the page with the great big "this page is out of date" notice at the top, giving instructions that are not valid for the most common user agents (Firefox 2?).
Every version of Mozilla has included the "Don't load images" option. And it is simple to find.
John, you made me laugh out loud when I read that - it reminded me of how incredibly non-techie I was before I started hanging out with Wikimedians, because a few years ago it never would have occurred to me that it was possible. As it was, it took me 15 minutes to find the two ways to do that (without looking at the help page that I doubt anyone would find without knowing a lot about the project).
I do think David Gerard's suggestion is probably both (a) quite workable and (b) more likely to create user satisfaction, especially if it's a straightforward toggle.
Risker/Anne
On Thu, Sep 8, 2011 at 2:43 PM, Risker risker.wp@gmail.com wrote:
On 7 September 2011 17:32, John Vandenberg jayvdb@gmail.com wrote:
Every version of Mozilla has included the "Don't load images" option. And it is simple to find.
John, you made me laugh out loud when I read that - it reminded me of how incredibly non-techie I was before I started hanging out with Wikimedians, because a few years ago it never would have occurred to me that it was possible. As it was, it took me 15 minutes to find the two ways to do that (without looking at the help page that I doubt anyone would find without knowing a lot about the project).
http://www.google.com/search?q=firefox+disable+images
(our help page turns up on the first page of results, for me)
I do think David Gerard's suggestion is probably both (a) quite workable and (b) more likely to create user satisfaction, especially if it's a straightforward toggle.
We should be helping users use their existing tools better, not creating new tools to do the same job, less well. People on dialup need to learn how to use these tools, because it isn't just Wikipedia which is slow to load - the entire internet is full of sites which are a nightmare on dialup.
If we want to improve Wikipedia for dialup, our developer resources are better spent on a skin which emits less HTML, selects smaller or fewer images, etc.
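One concrete tactic such a skin could use is requesting narrower thumbnails. MediaWiki thumbnail URLs conventionally end in "<width>px-<filename>"; the helper below caps that width. This is a sketch for illustration only: the function, its name, and the idea of doing this client-side are assumptions, not an existing API.

```javascript
// Illustrative sketch only: cap the pixel width encoded in a MediaWiki-style
// thumbnail URL (".../<width>px-<name>"). A real low-bandwidth skin would do
// this server-side; the helper and its name are assumptions for this sketch.
function capThumbWidth(url, maxWidth) {
  return url.replace(/\/(\d+)px-([^\/]+)$/, (match, width, name) =>
    Number(width) > maxWidth ? `/${maxWidth}px-${name}` : match
  );
}
```

For example, a request for an 800px-wide thumbnail would be rewritten to a 220px one when capped at 220, while thumbnails already at or below the cap pass through unchanged.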
On 8 September 2011 01:57, John Vandenberg jayvdb@gmail.com wrote:
On Thu, Sep 8, 2011 at 2:43 PM, Risker risker.wp@gmail.com wrote:
On 7 September 2011 17:32, John Vandenberg jayvdb@gmail.com wrote:
Every version of Mozilla has included the "Don't load images" option. And it is simple to find.
John, you made me laugh out loud when I read that - it reminded me of how incredibly non-techie I was before I started hanging out with Wikimedians, because a few years ago it never would have occurred to me that it was possible. As it was, it took me 15 minutes to find the two ways to do that (without looking at the help page that I doubt anyone would find without knowing a lot about the project).
http://www.google.com/search?q=firefox+disable+images
(our help page turns up on the first page of results, for me)
I do think David Gerard's suggestion is probably both (a) quite workable and (b) more likely to create user satisfaction, especially if it's a straightforward toggle.
We should be helping users use their existing tools better, not creating new tools to do the same job, less well. People on dialup need to learn how to use these tools, because it isn't just Wikipedia which is slow to load - the entire internet is full of sites which are a nightmare on dialup.
If we want to improve Wikipedia for dialup, our developer resources are better spent on a skin which emits less HTML, selects smaller or fewer images, etc.
John, we can't fix the whole internet. We can't insist that users do a google search to find pages in our own project (you've made an argument for improving our search function further). And we shouldn't treat people who don't want to muck about in their browser software ("oh geez, now what have I done!") as too uneducated to be shown courtesy. Yes, the internet is full of sites that are a pain on dialup. But we can be a leader in giving people the opportunity to find out about Leonardo da Vinci without using up their bandwidth.
We already know that changing editorial practices is like herding cats, and getting people to use smaller images when clearly a large one is appropriate to the page, or using fewer to illustrate articles, is a particularly challenging one. The use of overlinking and massive templates at the bottom of articles is also problematic, as are the ever-increasing expectations for referencing. A toggle to turn off images, right over there on the top of the toolbox, is only one small step in the constant evolution of ways that can make our projects an easier place to be for those who aren't as well-informed or clever as the average Wikimedian. Education is best digested when it is actively sought; I'd rather feed the reader an article on a topic he wants to know about than insist that he learn how to reprogram his browser before he can open the article he wants without having enough time to take a shower before the page finishes loading.
Risker/Anne
On 7 September 2011 22:26, Risker risker.wp@gmail.com wrote:
Turning off images should be, and can be, done by the user-agent. We have a help page describing how to do this.
That would be the page with the great big "this page is out of date" notice at the top, giving instructions that are not valid for the most common user agents (Firefox 2?). And it spends a great deal of time talking about altering people's personal userspace. Like David said... a nice simple switch to turn them on and off without having to log in: that's what people have asked for. Mucking about with their user agent is beyond the technical comfort level of most internet users, and in some cases is not possible. (Example - many publicly accessible computers are set up so that no programs can be added or modified without sysadmin permissions.)
+1
This is really low-bandwidth usability. I've tried editing Wikipedia on dialup ... it's annoying enough waiting for all the Javascript these days on 1Mbit.
"Images on" "Images off" in a sidebar, switching the CSS live?
- d.
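David's sidebar-toggle idea can be sketched in a few lines of client-side JavaScript. This is a hypothetical illustration, not an existing MediaWiki feature; the class name, link labels, and wiring are all assumptions.

```javascript
// Hypothetical sketch of an "Images on" / "Images off" sidebar toggle that
// switches the CSS live. Nothing here is real MediaWiki code; the class
// name ("no-images") and link labels are illustrative assumptions.

// Pure helper: the stylesheet text for a given state, kept separate so the
// rule can be reasoned about without a DOM.
function imageToggleCss(hidden) {
  return hidden ? "body.no-images img { display: none; }" : "";
}

// Browser-only wiring: a <style> element whose contents we swap on click.
if (typeof document !== "undefined") {
  const style = document.createElement("style");
  document.head.appendChild(style);

  let hidden = false;
  const link = document.createElement("a");
  link.href = "#";
  link.textContent = "Images off";
  link.addEventListener("click", (event) => {
    event.preventDefault();
    hidden = !hidden;
    document.body.classList.toggle("no-images", hidden);
    style.textContent = imageToggleCss(hidden);
    link.textContent = hidden ? "Images on" : "Images off";
  });
  document.body.appendChild(link);
}
```

Note that hiding images with CSS only helps after they are cached; to actually save dialup bandwidth, a server-side variant would have to omit the `<img>` tags from the generated HTML entirely.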
David Gerard dgerard@gmail.com wrote:
On 7 September 2011 22:26, Risker risker.wp@gmail.com wrote:
Turning off images should be, and can be, done by the user-agent. We have a help page describing how to do this.
This is really low-bandwidth usability. I've tried editing Wikipedia on dialup ... it's annoying enough waiting for all the Javascript these days on 1Mbit.
"Images on" "Images off" in a sidebar, switching the CSS live?
Come on, I actually like editing using a text browser (at least I get a much better editor than any browser is offering currently).
Images in many such environments can be loaded and displayed on-demand. In some browsers there is an option "launch graphical browser for this URL".
The only thing that I really miss is a "?" after a "red link". It got replaced a long time ago by the CSS gimmick and it does not work in pure text browsers at all, see
https://bugzilla.wikimedia.org/show_bug.cgi?id=5366#c5
//Saper
Thus, to use categories for an image filtering system would indeed require constructing a category for the specific purpose of exclusion. Big ALA "actually, that *is* censorship" alarm goes off.
This is true, and I agree. But...
- The category system is constructed of minute subcategories, not broad categories that are then combined.
You could then say "this and everything under it." But then you run into:
- The category system is not very consistent.
- The category system is not free of loops.
- An image on en:wp could be a local image (one system of categories) or a Commons image (a completely different system of categories).
This is largely an engineering problem; and it can probably be overcome with some architecture work. As we are going to be implementing a major new feature *anyway* it's not something to reject outright, I think :)
Obviously given the complexity of the category tree system any such engineering wouldn't be infallible - but you could match it to most use cases. Ultimately it is just a collapsing tree problem, and they are ten a penny to a decent engineer :)
Tom
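The "this and everything under it" operation this exchange turns on is an ordinary graph traversal, and a visited set is what keeps the category loops mentioned above from causing trouble. A minimal sketch, assuming a toy parent-to-children map (MediaWiki's real category storage looks nothing like this):

```javascript
// Minimal sketch of "this category and everything under it", assuming a toy
// parent -> children map. The visited set makes the walk terminate even when
// the category graph contains loops.
function collectSubtree(children, root) {
  const seen = new Set([root]);
  const queue = [root];
  while (queue.length > 0) {
    const category = queue.shift();
    for (const child of children[category] || []) {
      if (!seen.has(child)) { // skip anything already visited: defuses cycles
        seen.add(child);
        queue.push(child);
      }
    }
  }
  return seen;
}

// A deliberately loopy toy graph: A -> B -> C -> A still terminates.
const toy = { A: ["B"], B: ["C"], C: ["A"] };
```

This handles loops, but not the thread's other objections: inconsistent categorization and the split between local and Commons category systems are data problems a traversal cannot fix.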
On 7 September 2011 15:55, Thomas Morton morton.thomas@googlemail.com wrote:
Obviously given the complexity of the category tree system any such engineering wouldn't be infallible - but you could match it to most use cases. Ultimately it is just a collapsing tree problem, and they are ten a penny to a decent engineer :)
The category trees are pathological in every way. Unless you try to regularise the category system for the purpose of making the filter easier to implement, which I predict will lead to *considerable* community resistance and obstruction.
- d.
On 7 September 2011 15:58, David Gerard dgerard@gmail.com wrote:
On 7 September 2011 15:55, Thomas Morton morton.thomas@googlemail.com wrote:
Obviously given the complexity of the category tree system any such engineering wouldn't be infallible - but you could match it to most use cases. Ultimately it is just a collapsing tree problem, and they are ten a penny to a decent engineer :)
The category trees are pathological in every way. Unless you try to regularise the category system for the purpose of making the filter easier to implement, which I predict will lead to *considerable* community resistance and obstruction.
As I said; you can't cover every situation. But you can engineer around the basic hierarchy - and leave the rest to a button saying "add this image to my filter".
I don't see that as a major roadblock.
Tom
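The rule sketched here (filter on the basic category hierarchy, with a per-image "add this image to my filter" escape hatch) reduces to a small decision function. All field names below are invented for illustration; none of this reflects an actual design.

```javascript
// Hedged sketch of the per-user filter decision: per-image overrides win,
// then the user's filtered categories are consulted. All field names are
// illustrative assumptions, not a real schema.
function isHidden(image, prefs) {
  if (prefs.alwaysShow.has(image.name)) return false; // user unhid this image
  if (prefs.alwaysHide.has(image.name)) return true;  // "add this image to my filter"
  return image.categories.some((c) => prefs.filteredCategories.has(c));
}
```

In practice `image.categories` would have to be the expanded subtree membership, not just the image's direct categories, which is where the category-graph problems discussed above come back in.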
foundation-l mailing list foundation-l@lists.wikimedia.org Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
On Wed, Sep 7, 2011 at 4:03 PM, Thomas Morton morton.thomas@googlemail.com wrote:
As I said; you can't cover every situation. But you can engineer around the basic hierarchy - and leave the rest to a button saying "add this image to my filter".
I'm in favour of the filter (my argument being "I'm not super-excited about having it, but I'm even less keen on parents telling their children they can't use Wikipedia") but I do worry about the implementation.
I'm not looking forward to the possibility that every picture is going to be surrounded by filter-cruft. I don't really want pictures of planets, plants, fonts, colours and anything else that's universally inoffensive being accompanied with buttons. I hope there's a more elegant solution but if we're giving the user control of their filter then I wonder how this can be avoided.
Bodnotbod
I'm not looking forward to the possibility that every picture is going to be surrounded by filter-cruft. I don't really want pictures of planets, plants, fonts, colours and anything else that's universally inoffensive being accompanied with buttons. I hope there's a more elegant solution but if we're giving the user control of their filter then I wonder how this can be avoided.
True; that is a UI engineering problem - and we have significant UI problems already so it needs to be considered carefully (so as not to compound current issues).
The easiest/neatest solution (that I have in my mind) is probably a tiny dot/arrow/icon appearing at the top right of every image when you pass over it, which then brings up a drop-down menu when you hover.
Tom
On 07/09/2011 11:17 AM, Bod Notbod wrote:
[...] but I'm even less keen on parents telling their children they can't use Wikipedia [...]
It's not the first time I see this meme expressed.
Is there a reliable source somewhere that shows that (a) this represents a significant number of parents over several cultural groups, and that (b) there is serious indication that if (a) is true those same parents are going to change their stance given the proposed implementation of the image filter?
Because, unless we have some serious statistical backing for those assertions, they are just smoke blown out of our asses to the sound of "but think of the children!"
-- Coren / Marc
Thomas Morton wrote:
This is largely an engineering problem; and it can probably be overcome with some architecture work. As we are going to be implementing a major new feature *anyway* it's not something to reject outright, I think :)
Obviously given the complexity of the category tree system any such engineering wouldn't be infallible - but you could match it to most use cases. Ultimately it is just a collapsing tree problem, and they are ten a penny to a decent engineer :)
I think some of your comments are exhibiting an unfamiliarity with the tangled mess that is MediaWiki/Wikipedia. Have you done much work on MediaWiki or worked with the replicated databases at all (particularly the databases of the larger sites)? An outside voice is great, but yours comes off as rather naïve.
MZMcBride
On 7 Sep 2011, at 23:04, MZMcBride z@mzmcbride.com wrote:
Thomas Morton wrote:
This is largely an engineering problem; and it can probably be overcome with some architecture work. As we are going to be implementing a major new feature *anyway* it's not something to reject outright, I think :)
Obviously given the complexity of the category tree system any such engineering wouldn't be infallible - but you could match it to most use cases. Ultimately it is just a collapsing tree problem, and they are ten a penny to a decent engineer :)
I think some of your comments are exhibiting an unfamiliarity with the tangled mess that is MediaWiki/Wikipedia. Have you done much work on MediaWiki or worked with the replicated databases at all (particularly the databases of the larger sites)? An outside voice is great, but yours comes off as rather naïve.
MZMcBride
I've not proposed any actual solutions, or changes, etc. I've simply said that the problem raised is an engineering problem and so needs to be considered from that perspective.
From my offhand knowledge of MediaWiki it won't be particularly easy - but as one of my professors used to say, "nothing is easy, but someone will be able to fix it".
The next step is to figure out what engineering would be needed to provide these features and whether that is possible. Has anyone seriously assessed this? (And if the answer is yes, then fine.)
Tom
(I tend to hold a positive attitude to such problems until they are solved or shown insoluble; everyone tells me my proposed solutions at work are "impossible" but they work out more often than not!)
I think that having this kind of image filter is against the mission of the Wikimedia Foundation and a capitulation by the WMF in favour of the interests of others.
Letting some users block Wikipedia content is NOT a good way to "disseminate it effectively and globally". http://wikimediafoundation.org/wiki/Mission_statement
Allowing this type of self-censorship is imposing a point of view.
It's a waste of time and resources to support the POV that certain content should be censored. It opens the door to censorship and gives ammunition to enemies of knowledge and freedom.
The users of Wikimedia projects should see "The Sum of Human Knowledge" and not just "the knowledge that I like".
Thomas Morton wrote:
I confess to not being "on top" of the exact mechanics of this proposal... but why can we not be using normal categories? Ok, so for ease of use it is sensible to consider pre-made "bundles" of commonly filtered images (and I can see the issues there, obviously). But for the default use, filtering on categories is fine... then we can use the normal Wiki system and stick to neutrality (Don't like English Churches? Fine, add it to your exclusion list :))
David Gerard replied:
- The category system is constructed of minute subcategories, not broad categories that are then combined.
You could then say "this and everything under it." But then you run into:
- The category system is not very consistent.
- The category system is not free of loops.
- An image on en:wp could be a local image (one system of categories) or a Commons image (a completely different system of categories).
Additionally:
* Our current categorization is based primarily on what images are about, *not* what they contain. For example, a photograph depicting a protest rally might include nudity on the part of someone in the background, but its categorization won't specify that. Of course, if we were to introduce a filter system reliant upon the current categories, it's likely that some users would seek to change that (resulting in harmful dilution).
* Many "potentially objectionable" subjects simply aren't reflected in the current categorization. An example is the aforementioned "unveiled women." I can't speak for every project, but Commons certainly has no such category.
David Levy
--- On Wed, 7/9/11, David Gerard dgerard@gmail.com wrote: The closest we could come to a neutral filtering system is an easily accessible on/off switch for images.
Actually, that is really not a bad idea. If a user wants to read about bukkake or fisting, rather than seeing it displayed in graphic detail on their screen, they could switch images off, just as a precaution, before they navigate to the page (especially if they sit in an open-plan office). The same if they are a Muslim and want to read about the prophet, but don't want to be surprised by an image of him; or if they're arachnophobic and want to read about the critters without being visually freaked out, etc. A.
--- On Wed, 7/9/11, David Gerard dgerard@gmail.com wrote: The closest we could come to a neutral filtering system is an easily accessible on/off switch for images. On Thursday, 08 Sep 2011 07:56, Andreas Kolbe jayen466@yahoo.com wrote: Actually, that is really not a bad idea.
+1. Actually, it's the best idea.
Sir48/Thyge
This proposal was around for a long time inside the discussion pages of the referendum itself. It would have many positive arguments:
- Truly neutral
- Easy to implement
- No time spent (lost) on sorting images
- No edit wars about categorization
I would support this solution. But I'm strongly against any biased filtering by arbitrary categories.
Tobias Oelgarte
Am 08.09.2011 16:33, schrieb dex2000@pc.dk:
--- On Wed, 7/9/11, David Gerard dgerard@gmail.com wrote: The closest we could come to a neutral filtering system is an easily accessible on/off switch for images. On Thursday, 08 Sep 2011 07:56, Andreas Kolbe jayen466@yahoo.com wrote: Actually, that is really not a bad idea.
+1. Actually, it's the best idea.
Sir48/Thyge
and: *the only solution to give parents a real possibility to protect their kids against images on Wikipedia, which they consider inappropriate for them.
Sir48/Thyge
----- Original meddelelse -----
Fra: Tobias Oelgarte tobias.oelgarte@googlemail.com Til: foundation-l@lists.wikimedia.org Dato: Tor, 08. sep 2011 16:40 Emne: Re: [Foundation-l] Personal Image Filter results announced
This proposal was around for a long time inside the discussion pages of the referendum itself. It would have many positive arguments:
- Truly neutral
- Easy to implement
- No time spent (lost) on sorting images
- No edit wars about categorization
I would support this solution. But I'm strongly against any biased filtering by arbitrary categories.
Tobias Oelgarte
Am 08.09.2011 16:33, schrieb dex2000@pc.dk:
--- On Wed, 7/9/11, David Gerard dgerard@gmail.com wrote: The closest we could come to a neutral filtering system is an easily accessible on/off switch for images. On Thursday, 08 Sep 2011 07:56, Andreas Kolbe jayen466@yahoo.com wrote: Actually, that is really not a bad idea.
+1. Actually, it's the best idea.
Sir48/Thyge
No, that wouldn't/shouldn't still be possible. Any image can be made visible while the parents are away for a few minutes. But as usual: no filter can replace guidance from parents. It's a parent's job to explain to their own kids how the world works.
Tobias Oelgarte
Am 08.09.2011 16:54, schrieb dex2000@pc.dk:
and: *the only solution to give parents a real possibility to protect their kids against images on Wikipedia, which they consider inappropriate for them.
Sir48/Thyge
----- Original meddelelse -----
Fra: Tobias Oelgartetobias.oelgarte@googlemail.com Til: foundation-l@lists.wikimedia.org Dato: Tor, 08. sep 2011 16:40 Emne: Re: [Foundation-l] Personal Image Filter results announced
This proposal was around for a long time inside the discussion pages of the referendum itself. It would have many positive arguments:
- Truly neutral
- Easy to implement
- No time spent (lost) on sorting images
- No edit wars about categorization
I would support this solution. But I'm strongly against any biased filtering by arbitrary categories.
Tobias Oelgarte
Am 08.09.2011 16:33, schrieb dex2000@pc.dk:
--- On Wed, 7/9/11, David Gerard dgerard@gmail.com wrote: The closest we could come to a neutral filtering system is an easily accessible on/off switch for images. On Thursday, 08 Sep 2011 07:56, Andreas Kolbe jayen466@yahoo.com wrote: Actually, that is really not a bad idea.
+1. Actually, it's the best idea.
Sir48/Thyge
That's a property of any filter. But other filters may not fulfil the requirement of protecting against accidental display of images.
Sir48/Thyge
----- Original meddelelse -----
Fra: Tobias Oelgarte tobias.oelgarte@googlemail.com Til: foundation-l@lists.wikimedia.org Dato: Tor, 08. sep 2011 17:00 Emne: Re: [Foundation-l] Personal Image Filter results announced
No, that wouldn't/shouldn't still be possible. Any image can be made visible while the parents are away for a few minutes. But as usual: no filter can replace guidance from parents. It's a parent's job to explain to their own kids how the world works.
Tobias Oelgarte
Am 08.09.2011 16:54, schrieb dex2000@pc.dk:
and: *the only solution to give parents a real possibility to protect their kids against images on Wikipedia, which they consider inappropriate for them.
Sir48/Thyge
----- Original meddelelse -----
Fra: Tobias Oelgartetobias.oelgarte@googlemail.com Til: foundation-l@lists.wikimedia.org Dato: Tor, 08. sep 2011 16:40 Emne: Re: [Foundation-l] Personal Image Filter results announced
This proposal was around for a long time inside the discussion pages of the referendum itself. It would have many positive arguments:
- Truly neutral
- Easy to implement
- No time spent (lost) on sorting images
- No edit wars about categorization
I would support this solution. But I'm strongly against any biased filtering by arbitrary categories.
Tobias Oelgarte
Am 08.09.2011 16:33, schrieb dex2000@pc.dk:
--- On Wed, 7/9/11, David Gerard dgerard@gmail.com wrote: The closest we could come to a neutral filtering system is an easily accessible on/off switch for images. On Thursday, 08 Sep 2011 07:56, Andreas Kolbe jayen466@yahoo.com wrote: Actually, that is really not a bad idea.
+1. Actually, it's the best idea.
Sir48/Thyge
That is true. But we should represent it as a child protection feature. It alone can't do the job. But it could help parents with children playing in the background, and users who want to read about controversial content in public. That was one of the typical, frequently made requests, if I remember the WMF's reasoning correctly.
Tobias
Am 08.09.2011 17:10, schrieb dex2000@pc.dk:
That's a property of any filter. But other filters may not fulfil the requirement of protecting against accidental display of images.
Sir48/Thyge
Oops. I meant, "we should _not_ represent it as a child protection feature".
Am 08.09.2011 17:21, schrieb Tobias Oelgarte:
That is true. But we should represent it as a child protection feature. It alone can't do the job. But it could help parents with children playing in the background, and users who want to read about controversial content in public. That was one of the typical, frequently made requests, if I remember the WMF's reasoning correctly.
Tobias
Am 08.09.2011 17:10, schrieb dex2000@pc.dk:
That's a property of any filter. But other filters may not fulfil the requirement of protecting against accidental display of images.
Sir48/Thyge
On Wed, Sep 7, 2011 at 2:46 PM, Thomas Morton morton.thomas@googlemail.com wrote:
But another example; clowns.
Some people are terrified of clowns, even their images. You wouldn't describe images of clowns as "potentially objectionable" but it would be great for Coulrophobes to go "oh hey Wikipedia, I don't like clowns so can you hide pics of them for me please? Thanks".
I have a phobia. I would like to overcome it. All my reading suggests that what I need to do is expose myself to the thing I fear, more and more, in incremental steps.
So, if Wikipedia is to be a good citizen in the online world what we should actually do for someone afeared of clowns is to make sure that they see a picture of a clown once every, say, ten articles or so *no matter what the article is about*. This should be ratcheted up gradually so that at some point all the user sees is a big picture of Ronald McDonald whenever they visit Wikipedia.
Once the user reports that they are cured we can return their service back to normal and they can then educate themselves, do their homework whatever, without trepidation.
Bodnotbod
On Wed, Sep 7, 2011 at 9:29 AM, David Levy lifeisunfair@gmail.com wrote:
Sydney Poore wrote:
The idea of offering image filters on WMF projects is much more controversial than it is on other internet websites. So I think that it is fair to suggest that we examine why we are having conflicts over this topic when other websites don't. One possible reason is that our base of editors is different from other websites'.
Websites like Flickr (an example commonly cited) are commercial endeavors whose decisions are based on profitability, not an obligation to maintain neutrality (a core element of most WMF projects). These services can cater to the revenue-driving majorities (with geographic segregation, if need be) and ignore minorities whose beliefs fall outside the "mainstream" for a given country. We mustn't do that.
Today, to be successful, organizations (both for-profit and not-for-profit) must recognize the needs of their global audience. Offering image filters where people can set their own preferences, and bypass the setting for individual images, is a brilliant way for people with different values to share the same space. No content is removed, and people can see all images if they choose to.
This approach is far better than the approach used by most other large educational institutions which currently control the viewing of controversial content through their acquisition process.
One of the main issues regarding the proposed system is the need to determine which image types to label "potentially objectionable" and place under the limited number of optional filters. Due to cultural bias, some people (including a segment of voters in the "referendum," some of whom commented on its various talk pages) believe that this is as simple as creating a few categories along the lines of "nudity," "sex," "violence" and "gore" (defined and populated in accordance with arbitrary standards).
For a website like Flickr, that probably works fairly well; a majority of users will be satisfied, with the rest too fragmented to be accommodated in a cost-effective manner. Revenues are maximized. Mission accomplished.
The WMF projects' missions are dramatically different. For most, neutrality is a nonnegotiable principle. To provide an optional filter for "image type x" and not "image type y" is to formally validate the former objection and not the latter. That's unacceptable.
An alternative implementation, endorsed by WMF trustee Samuel Klein, is discussed here:
http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/en/Categories#ge... or http://goo.gl/t6ly5
Organizations who share our mission make these type of decisions everyday. They consider the ideals that frame their mission, and then craft work policies and procedures that balance all of their ideals. IMO, that is exactly what the WMF Board and staff have been doing in regard to controversial content for the last 18 months. Because WMF has a strong, strong tradition of community involvement at every level practical, the community is being asked to help craft the policy and procedures.
Various ideas about how to label images for a personal filter have been floated around for years. The referendum asked the community for opinions about features that could be included.
I see this as goodness. Evidently, other people disagree given the large volume of posts and remarks criticizing the referendum.
Some of the criticism is fair, and I'm sure that people involved with planning the referendum will take it on board. Being experienced Wikimedians, I imagine that they will put all the comments in proper context, even words spoken in the heat of the moment. But still, we need to remember that the people working on this issue as part of their fiduciary responsibility or employment are doing so with the best intentions of the WMF in mind. And they need to be thanked for their work.
Thank you to everyone who has commented in the thread. Through dialogue with each other on this transparent mailing list, we are showing the world that it is possible to collaboratively collect and disseminate free knowledge.
Sydney Poore
User:FloNight
Sydney Poore wrote:
Today, to be successful, organizations (both for-profit and not-for-profit) must recognize the needs of their global audience. Offering image filters where people can set their own preferences, and bypass the setting for individual images, is a brilliant way for people with different values to share the same space. No content is removed, and people can see all images if they choose to.
Agreed. That's why I support the image filter implementation proposed here: http://meta.wikimedia.org/wiki/Talk:Image_filter_referendum/en/Categories#ge... or http://goo.gl/t6ly5
Because we seek to accommodate a global audience (comprising people whose beliefs are extremely diverse), I unreservedly oppose any implementation necessitating the designation of certain image types (and not others) as "potentially objectionable" or similar.
David Levy
On Thu, Sep 8, 2011 at 12:38 AM, Sydney Poore sydney.poore@gmail.com wrote:
On Wed, Sep 7, 2011 at 9:29 AM, David Levy lifeisunfair@gmail.com wrote:
Sydney Poore wrote:
The idea of offering image filters on WMF projects is much more controversial than it is on other internet websites. So I think that it is fair to suggest that we examine why we are having conflicts over this topic when other websites don't. One possible reason is that our base of editors is different from other websites'.
Websites like Flickr (an example commonly cited) are commercial endeavors whose decisions are based on profitability, not an obligation to maintain neutrality (a core element of most WMF projects). These services can cater to the revenue-driving majorities (with geographic segregation, if need be) and ignore minorities whose beliefs fall outside the "mainstream" for a given country. We mustn't do that.
Today, to be successful, organizations, both for-profit and not-for-profit, must recognize the needs of their global audience. Offering image filters where people can set their own preferences and bypass the setting for individual images is a brilliant way for people with different values to share the same space. No content is removed, and people can see all images if they choose to.
This approach is far better than the approach used by most other large educational institutions which currently control the viewing of controversial content through their acquisition process.
Wikipedia *is* successful, and an image filter was not part of its success.
I don't mind Wikimedia content being better labelled with metadata; however, the actual process of filtering should be done by the user-agent.
-- John Vandenberg
On Wed, Sep 07, 2011 at 06:35:02AM -0400, Sydney Poore wrote:
- We have people speaking up publicly saying that they are not able to edit from some locations because of the presence of some images on our Projects. Numerous editors have told me this in private, too.
Seriously? So at least one of my theoretical scenarios and potential exploits has already been spotted in the wild (albeit inadvertently, and in embryonic form). AKA, it's not theoretical.
Oh wow, blue team is SO dead. O:-)
By the way, if people are already reaping the bad karma for fubaring their own networks, why are we trying to help them?
sincerely, Kim Bruning Who wants to run a censorship wiki-wargame. (As soon as I have some time off again ;-)
On Tue, Sep 6, 2011 at 11:50 PM, Milos Rancic millosh@gmail.com wrote:
... At the Research Committee list [1] there is an ongoing discussion related to John Vandenberg's question "Was this survey approved by the Research Committee?" [2]. The Research Committee wasn't asked, of course (and WereSpielChequers is working on a statement). Because, simply, politically motivated junk science requires implementation, not questions about the validity of its premises.
[1] http://lists.wikimedia.org/pipermail/rcom-l/2011-September/000327.html [2] http://lists.wikimedia.org/pipermail/foundation-l/2011-September/067889.html
Thank you for pointing this out, Milos. I wasn't aware that RCom's email list is public. That is good.
This survey may not be feeding into scientific research publications; however, the principles of human research ethics should still apply to any survey of the public, especially when conducted by organisations funded by the public. The survey instruments used should be valid, and the survey results should be discarded if the survey population was not satisfactory.
-- John Vandenberg
On 06/09/11 22:56, Milos Rancic wrote:
On Tue, Sep 6, 2011 at 14:33, Tim Starling tstarling@wikimedia.org wrote:
Personally, I think the filter will be mostly harmless, and that it's not worth the effort to rail against it. It will be useful for PR -- it will seem as if we are trying to accommodate all points of view even if the feature is not particularly useful for parents.
I suppose that you know that WMF did PR research if you claim that it will be useful for that purpose. If so, please refer to it. If not, it's just about dilettantism, as usual.
Just dilettantism. I thought I made that clear by starting with "personally, I think".
-- Tim Starling
On Tue, Sep 6, 2011 at 15:50, Tim Starling tstarling@wikimedia.org wrote:
I suppose that you know that WMF did PR research if you claim that it will be useful for that purpose. If so, please refer to it. If not, it's just about dilettantism, as usual.
Just dilettantism. I thought I made that clear by starting with "personally, I think".
It wasn't a comment about you.
On Tue, Sep 6, 2011 at 8:33 AM, Tim Starling tstarling@wikimedia.orgwrote:
There's a Board resolution that says "implement it", so I suppose it will be implemented.
http://wikimediafoundation.org/wiki/Resolution:Controversial_content
However, the editor community could sabotage it in various ways. For example, there's no guarantee that anyone will tag any images, or that tagged images won't be untagged by bots run by administrators. If the Board really does want a useful image-hiding feature, then it's essential that the community be persuaded that it is a good idea.
Personally, I think the filter will be mostly harmless, and that it's not worth the effort to rail against it. It will be useful for PR -- it will seem as if we are trying to accommodate all points of view even if the feature is not particularly useful for parents.
-- Tim Starling
Seems just as likely that it will prove to be new fodder for critics. I'm sure you can imagine it - "Wikipedia announced a porn filter, so I let my 3rd grader use it, and found him looking at [[Pearl necklace]]* because of a link on [[Sesame street]]! And there were pictures!" - magnified across the blogosphere and conservative commentariat.
Nathan
* NSFW
On Tue, Sep 6, 2011 at 3:33 PM, Tim Starling tstarling@wikimedia.org wrote:
http://wikimediafoundation.org/wiki/Resolution:Controversial_content
However, the editor community could sabotage it in various ways. For example, there's no guarantee that anyone will tag any images, or that tagged images won't be untagged by bots run by administrators. If the Board really does want a useful image-hiding feature, then it's essential that the community be persuaded that it is a good idea.
Personally, I think the filter will be mostly harmless, and that it's not worth the effort to rail against it. It will be useful for PR -- it will seem as if we are trying to accommodate all points of view even if the feature is not particularly useful for parents.
-- Tim Starling
Because of the diktat nature of the board resolution, I think the key question omitted from the questionnaire was:
When (not if) we implement this feature, would you be willing to participate actively in a fork of Wikipedia?
Not kidding.
On Tue, Sep 13, 2011 at 1:28 PM, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
Because of the diktat nature of the board resolution, I think the key question omitted from the questionnaire was:
When (not if) we implement this feature, would you be willing to participate actively in a fork of Wikipedia?
Not kidding.
Since the leadership of the foundation seems to be unable to speak for themselves, let me speak *for* them.
"We do realize that what we did was wrong, and this is clearly not a situation where we can go on with the 'your opinions have been duly noted' haughty attitude. We apologize for ever going that route in the first place. The community rules, we serve; that is what we are being paid for.
We do realize how demoralizing it must be for the people in the trenches trying to weed out behaviours that can only charitably be called "gaming the system", when -- it is admitted -- we egregiously did that at the highest levels of Foundation government, and for that we are duly sorry. Lessons shall be learned."
On Thu, Sep 15, 2011 at 1:26 AM, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
"We do realize that what we did was wrong, and this is clearly not a situation where we can go on with the 'your opinions have been duly noted' haughty attitude. We apologize for ever going that route in the first place. The community rules, we serve; that is what we are being paid for."
"Let us now prostrate ourselves at the feet of that segment of the community which opposed this idea, who we realise are the just and righteous leaders of the Wikimedia movement due to having the loudest voices, and beg for their absolution."
On Wed, Sep 14, 2011 at 17:44, Stephen Bain stephen.bain@gmail.com wrote:
"Let us now prostrate ourselves at the feet of that segment of the community which opposed this idea, who we realise are the just and righteous leaders of the Wikimedia movement due to having the loudest voices, and beg for their absolution."
"We accept the fact that our motives, and the motives of those who supported us, lay in the suppressed unconscious part of our minds; that it shows how deep our fears are of facing the real world. But, as we said, we've learned the lesson and we'll try to face reality, no matter how painful it is. That's our job, as we are the leaders of the Wikimedia movement."
On Wed, Sep 14, 2011 at 6:56 PM, Milos Rancic millosh@gmail.com wrote:
"We accept the fact that our motives, and the motives of those who supported us, lay in the suppressed unconscious part of our minds; that it shows how deep our fears are of facing the real world. But, as we said, we've learned the lesson and we'll try to face reality, no matter how painful it is. That's our job, as we are the leaders of the Wikimedia movement."
This is not about who, or what, this is about how.
On Thu, Sep 15, 2011 at 1:56 AM, Milos Rancic millosh@gmail.com wrote:
Could we not?
This isn't very useful to anybody.
On Thu, Sep 15, 2011 at 3:05 AM, Andrew Garrett agarrett@wikimedia.org wrote:
Could we not?
This isn't very useful to anybody.
Allow me to disagree. This is a very serious dysfunction in the proper allocation of roles in the foundation, and the boil *must* be lanced.
I remember Florence Devouard (when she was still chair) describing to me a chart drawn by Brad Patrick about roles and the proper allocation of them. If either is listening, maybe they could copy the chart for circulation around the staff and current board.
I genuinely believe that this can be an extremely useful learning experience for the leadership. Not to be repeated; once should be enough.
On Wed, Sep 14, 2011 at 6:44 PM, Stephen Bain stephen.bain@gmail.com wrote:
On Thu, Sep 15, 2011 at 1:26 AM, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
"We do realize that what we did was wrong, and this is clearly not a situation where we can go on with the 'your opinions have been duly noted' haughty attitude. We apologize for ever going that route in the first place. The community rules, we serve; that is what we are being paid for."
"Let us now prostrate ourselves at the feet of that segment of the community which opposed this idea, who we realise are the just and righteous leaders of the Wikimedia movement due to having the loudest voices, and beg for their absolution."
You may have missed a couple of things. A good portion of the loudest critics of _the_ _process_ (not the result), and I hesitate to say the majority because I haven't done the numbers, were in fact people in favor of the proposal itself, but who had the integrity to recognize that a result gained by such flawed means has little or no legitimacy.
Secondly, a prostrate leadership would have even less legitimacy than one hunkering down in a bunker somewhere, surrounded by stalwart loyalists. Not that I genuinely believe the latter is the case here. I have enough faith in the integrity of many of the actors at the highest level of the foundation that my gloss on the situation is that people are likely still sorting things out in the usual manner and don't want to gross people out by showing how the sausage is made, and we will have a public sausage, er, I mean statement, shortly.
Thirdly, there never has in the past been *any* hierarchy in wikimedia, that is the beauty of it. And any attempt at empire building, now, or in the future, is doomed to fail. There is a governance structure, but that should be ring-fenced away from the community. What we have here is the governance structure trying to leap over the fence. We simply can't have that.
What is needed here is not a prostrate leadership, but one which acknowledges what happened, and offers a crisp plain apology. That wouldn't repair the damage, but it sure would staunch the bleeding.
You may have missed a couple of things. A good portion of the loudest critics of _the_ _process_ (not the result), and I hesitate to say the majority because I haven't done the numbers, were in fact people in favor of the proposal itself, but who had the integrity to recognize that a result gained by such flawed means has little or no legitimacy.
Indeed; it was not a wonderfully useful polling of opinion. Count me in that group.
Thirdly, there never has in the past been *any* hierarchy in wikimedia, that is the beauty of it. And any attempt at empire building, now, or in the future, is doomed to fail. There is a governance structure, but that should be ring-fenced away from the community. What we have here is the governance structure trying to leap over the fence. We simply can't have that.
We have mini-empires at just about every level of the foundation and communities - at least on en.wiki. Those empires are so well entrenched by now that any differing view barely gets a look-in.
What is needed here is not a prostrate leadership, but one which acknowledges what happened, and offers a crisp plain apology. That wouldn't repair the damage, but it sure would staunch the bleeding.
If only... as I recently commented at the London Wikimeet: for an organisation that is ostensibly at the bleeding edge of disseminating knowledge and openness, we have a disappointingly standard/closed higher organisation. You can hardly tell it apart from the bazillions of other NFPs....
So: I wouldn't hold out hope.
Tom
On 15 September 2011 19:26, Thomas Morton morton.thomas@googlemail.com wrote:
Thirdly, there never has in the past been *any* hierarchy in wikimedia, that is the beauty of it. And any attempt at empire building, now, or in the future, is doomed to fail. There is a governance structure, but that should be ring-fenced away from the community. What we have here is the governance structure trying to leap over the fence. We simply can't have that.
We have mini-empires at just about every level of the foundation and communities - at least on en.wiki. Those empires are so well entrenched by now that any differing view barely gets a look-in.
Indeed. It is important to remember that there is *always* a hierarchy - the difference is in whether it is explicit or implicit. Cf. "The Tyranny of Structurelessness" by Jo Freeman, an essay that every Wikimedian sufficiently interested in the nuts and bolts of the Wikimedia movement to, e.g., read foundation-l should reread on occasion. http://www.jofreeman.com/joreen/tyranny.htm
What is needed here is not a prostrate leadership, but one which acknowledges what happened, and offers a crisp plain apology. That wouldn't repair the damage, but it sure would staunch the bleeding.
If only... as I recently commented at the London Wikimeet: for an organisation that is ostensibly at the bleeding edge of disseminating knowledge and openness, we have a disappointingly standard/closed higher organisation. You can hardly tell it apart from the bazillions of other NFPs.... So: I wouldn't hold out hope.
It's important to note that every person involved is a digital native who deeply understands how Internet and volunteer things work, and is proceeding completely in good faith. And all are ridiculously smart and knowledgeable, too. That things appear hideously broken anyway is, I suspect, par for the course when humans are involved. "None of us is as stupid as all of us."
(This means I may disagree vociferously with many of you but love and admire you all nevertheless!)
- d.