We are currently discussing an evolving image filter proposal on the Meta brainstorming page* that would give users the option of creating personal filter lists (PFL). The structure and interactivity of these personal filter lists would be comparable to those of editors' personal watchlists.
The way this would work is that each project page would have an "Enable image filtering" entry in the side bar. Clicking on this would add a "Hide" button to each image displayed on the page. Clicking on "Hide" would then grey the image, and automatically add it to the user's personal filter list.
Any image added to the PFL in this way would appear greyed on any subsequent visit to the page. It would also appear greyed on any other project page where it is included, and (given an SUL account) any page containing the image in any other Wikimedia project such as Commons itself – including Commons search result listings. In each case, the user would always retain the option of clicking on a "Show" button or the placeholder itself to reveal the picture again, and simultaneously remove it from their PFL. Of course, if they change their mind, they can add it right back again, by clicking on "Hide" again. It would work like adding/removing pages in one's watchlist.
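In browser terms, the toggle could be as small as the following sketch. This is a minimal illustration only: it keeps the PFL in localStorage for simplicity (a registered user's list would live server-side), and the data-file-title attribute and every name in it are assumptions for the example, not an existing MediaWiki API.

```typescript
// Minimal sketch of the Hide/Show toggle described above.
// All names here are illustrative, not an existing MediaWiki API.
const PFL_STORAGE_KEY = 'personalFilterList';

function loadPfl(): Set<string> {
  return new Set<string>(JSON.parse(localStorage.getItem(PFL_STORAGE_KEY) ?? '[]'));
}

function savePfl(pfl: Set<string>): void {
  localStorage.setItem(PFL_STORAGE_KEY, JSON.stringify([...pfl]));
}

// Add a Hide/Show toggle next to every image on the page.
function enableImageFiltering(): void {
  const pfl = loadPfl();
  document.querySelectorAll<HTMLImageElement>('#content img').forEach((img) => {
    const title = img.dataset.fileTitle ?? img.src; // identifier for the file
    const button = document.createElement('button');

    const render = (): void => {
      const hidden = pfl.has(title);
      img.style.filter = hidden ? 'grayscale(1) opacity(0.15)' : ''; // grey the image
      button.textContent = hidden ? 'Show' : 'Hide';
    };

    button.addEventListener('click', () => {
      // Toggle membership in the PFL, exactly like watchlisting a page.
      if (pfl.has(title)) {
        pfl.delete(title);
      } else {
        pfl.add(title);
      }
      savePfl(pfl);
      render();
    });

    render();
    img.insertAdjacentElement('afterend', button);
  });
}

enableImageFiltering();
```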
Apart from enabling users to hide images and add them to their PFL as they encounter them in surfing our projects, users would also be able to edit the PFL manually, just as it is possible to edit one's watchlist manually. In this way, they could add any image file or category they want to their PFL. They could also add filter lists precompiled for them by a third party. Such lists could be crowdsourced by people interested in filtering, according to whatever cultural criteria they choose.
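A manually editable PFL might then hold three kinds of entries. The shape below, including the file names and the subscription URL, is purely hypothetical:

```typescript
// Hypothetical shape of a manually editable PFL: individual files,
// whole categories, and subscriptions to precompiled third-party lists.
interface PersonalFilterList {
  files: string[];         // e.g. 'File:Example.jpg'
  categories: string[];    // e.g. 'Category:Spiders'
  subscriptions: string[]; // URLs of third-party lists (illustrative)
}

const examplePfl: PersonalFilterList = {
  files: ['File:Example_medical_photo.jpg'],
  categories: ['Category:Spiders'],
  subscriptions: ['https://example.org/filter-lists/arachnophobia.json'],
};
```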
It became very clear during the discussions over the past few months that tagging files for the personal image filter, or creating image filter categories, was not something the community as a whole wanted to become involved in – partly because of the work involved, partly because of the arguments it would cause, and partly because it would not be possible to do this truly neutrally, given different cultural standards of offensiveness. Various people suggested that the Foundation do nothing, and leave the creation of image filters to third parties altogether.
This proposal occupies a middle ground. The Foundation provides users with the software capability to create and maintain personal filter lists, just like it enables users to maintain watchlists, but it is then up to a separate crowdsourcing effort by those who want to have a filter to find ways of populating such lists. This is consistent with the overall Wikimedia crowdsourcing approach, and a natural extension of it. Even if this crowdsourcing effort should unexpectedly fail to take off, readers will still gain the possibility of hiding images or media as they come across them with a single click, with the assurance that they won't ever see them again anywhere on our projects unless they really want to. That in itself would be a net gain. Users who don't want to have anything to do with filtering at all could switch any related screen furniture off in their preferences, to retain the same surfing experience they have now.
Under this proposal, the entire informational infrastructure for filtering would reside in readers' personal filter lists. The data structure of the wiki itself does not change at all, just like adding pages to a personal watchlist affects no one apart from the user whose watchlist it is. There are no filter tags, no specially created filter categories, and no one has to worry about defining, creating or maintaining them. The filter users do that for themselves.
For unregistered users, their PFL could be stored in a cookie. However, they would be encouraged to create an SUL account when they first enable image filtering, so they can retain the same surfing experience even after changing computers, or after accidentally deleting the cookie.
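As a sketch of that cookie fallback (names illustrative): browsers cap an individual cookie at roughly 4 KB, so an anonymous PFL could only hold a few hundred file titles, which is another argument for nudging users towards an account.

```typescript
// Sketch of cookie-backed storage for unregistered users. The ~4 KB
// per-cookie limit means this only scales to a few hundred titles; a
// server-side list tied to an account avoids that limit and survives
// cookie deletion.
function savePflToCookie(titles: string[]): void {
  const value = encodeURIComponent(JSON.stringify(titles));
  document.cookie = `pfl=${value}; path=/; max-age=${60 * 60 * 24 * 365}`;
}

function loadPflFromCookie(): string[] {
  const match = document.cookie.match(/(?:^|;\s*)pfl=([^;]*)/);
  return match ? JSON.parse(decodeURIComponent(match[1])) : [];
}
```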
Andreas
* http://meta.wikimedia.org/wiki/Controversial_content/Brainstorming#Page-spec...
Andreas K. wrote:
The way this would work is that each project page would have an "Enable image filtering" entry in the side bar. Clicking on this would add a "Hide" button to each image displayed on the page. Clicking on "Hide" would then grey the image, and automatically add it to the user's personal filter list.
I think this sounds pretty good. Is there any indication how German Wikipedians generally view an implementation like this? I can't imagine English Wikipedians caring about an additional sidebar link/opt-in feature like this.
Apart from enabling users to hide images and add them to their PFL as they encounter them in surfing our projects, users would also be able to edit the PFL manually, just as it is possible to edit one's watchlist manually. In this way, they could add any image file or category they want to their PFL. They could also add filter lists precompiled for them by a third party. Such lists could be crowdsourced by people interested in filtering, according to whatever cultural criteria they choose.
Some sort of subscription service would work well here, right? Where the list can auto-update from a central list on a regular basis. I think that's roughly how in-browser ad block lists work. Seems like it could work well. Keep who pulls what lists private, though, I suppose.
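Sketched, under the assumption of a JSON list of file titles at an illustrative URL, the subscription refresh might look like this:

```typescript
// Sketch of an ad-block-style subscription: refresh a cached copy of a
// remote list on a schedule. URL, interval, and list format (a JSON
// array of file titles) are all illustrative.
async function refreshSubscription(url: string): Promise<string[]> {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Failed to fetch ${url}: ${response.status}`);
  }
  return (await response.json()) as string[];
}

// Refresh once a day; which lists a given user pulls stays private.
setInterval(() => {
  refreshSubscription('https://example.org/filter-lists/default.json')
    .then((titles) => localStorage.setItem('pflSubscriptionCache', JSON.stringify(titles)))
    .catch(console.error);
}, 24 * 60 * 60 * 1000);
```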
For unregistered users, their PFL could be stored in a cookie.
I'm not sure you'd want to put it in a cookie, but that's an implementation detail.
Watchlist editing is generally based on looking at titles. I don't suppose you'd want a gallery of hidden images, but it would make filter-list editing easier, heh.
MZMcBride
On 24.11.2011 15:09, MZMcBride wrote:
[snip MZMcBride's reply, quoted in full above]
I'm a little bit confused by this approach. On the one side it is good to have this information stored privately and personally; on the other side we are encouraging the development of filter lists and the tagging of possibly objectionable articles. The latter wouldn't be private at all, and would be even worse than tagging single images. In fact it would act as additional pressure to ban images from articles just to keep them in the "clean" section.
Overall I see little to no advantage over the previously proposed solutions. It is much more complicated, harder to implement, more resource-intensive and not a very friendly interface for readers.
My proposal would be: Just give it up and find other ways to improve Wikipedia and to make it more attractive.
nya~
Assuming an individual wants filters, all methods such as this require them to be aware of whatever they consider to be the disturbing image(s) before deciding to apply the filter.
In those methods which filter on an image-by-image basis, this requirement rather defeats the purpose. The only way it is applicable is when someone else blocks the images first--presumably a parent, who thus needs to identify and read every potentially disturbing page before their child happens upon it. It is more likely to be conducive to outsiders providing their prebuilt lists. They have the right to use whatever we provide, but do we want to provide tools that decrease actual individual choice and encourage the more heavy-handed methods of censorship?
This suggestion has one advantage over previous ones: it goes page by page, not image by image. In some cases this might be realistic, but in others the user, especially the inexperienced user, will not realize from the page title what sort of images are likely to be found on it.
On Thu, Nov 24, 2011 at 9:59 AM, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
[snip Tobias Oelgarte's reply, quoted in full above]
On Thu, Nov 24, 2011 at 14:59, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
I'm a little bit confused by this approach. On the one side it is good to have this information stored privately and personally; on the other side we are encouraging the development of filter lists and the tagging of possibly objectionable articles. The latter wouldn't be private at all, and would be even worse than tagging single images. In fact it would act as additional pressure to ban images from articles just to keep them in the "clean" section.
Overall I see little to no advantage over the previously proposed solutions. It is much more complicated, harder to implement, more resource-intensive and not a very friendly interface for readers.
Err, think of it with an analogy to AdBlock. You can have lists stored privately (in Adblock: in your browser settings files, in an image filter: on the WMF servers but in a secret file that they'll never ever ever ever release promise hand-on-heart*) and you can have lists stored publicly (in Adblock: the various public block lists that are community-maintained so that you don't actually see any ads, in an image filter: on the web somewhere). And you can put an instruction in the former list to transclude everything on a public list and keep it up-to-date.
Given it works pretty well in Adblock, I don't quite see how that's a big deal for Wikimedia either. Performance-wise, you just have it so the logged-in user has a list of images they don't want to see, and you have a script that every hour or so downloads and caches the public list; then, when they call to retrieve the list to see what's on it, it simply concatenates the two. This seems pretty straightforward.
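That retrieval step, sketched with illustrative names:

```typescript
// The retrieval step described above: the effective filter is the
// user's private list concatenated with the cached public list,
// deduplicated, with private entries first.
function effectiveFilterList(privateList: string[], cachedPublicList: string[]): string[] {
  return [...new Set([...privateList, ...cachedPublicList])];
}
```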
And if the WMF doesn't do it - perhaps because people are whinging that me being given the option to opt-in and *not* see "My micropenis.jpg" is somehow evil and tyrannical and contrary to NOTCENSORED - it could possibly be done as a service by an outside group and then implemented on Wikipedia using userscripts. The difference is that the WMF may do it in a slightly more user-friendly way given that they have access to the servers.
* That's less sarcastic than it sounds.
Yes, it is an analogy to KnowledgeBlock, with predefinable lists, which censors' best friends are encouraged to create and local ISPs to share, to give a good understanding of what shouldn't be known.
Putting the sarcasm aside and switching to irony, I see a complicated system with very few potential users:
Problems for users/readers:
* The average reader can't even find the talk page, but he is expected to manage self-maintained filter lists?
* He needs to be shocked first before he learns that such a feature exists, or he will have to trust lists created by someone he doesn't even know.
Problems for the infrastructure:
* Every account stores an additional list of what to block. Doing the same for IPs via cookies will create a huge amount of information that needs to be managed (assuming the feature is actually used as massively as Andreas Kolbe/Jayen466 describes the demand).
* Every use of the filter will circumvent the caching, since every page requested by a reader who uses the filter will have to be created from scratch.
Problems in general:
* If we use public lists, then the approach is basically the same as categorized filtering; the only difference is that it is stored in another format. Today we serve the same eggs, sunny side down.
* Who creates the lists? Each user for himself? Given millions of images and articles, doing it alone isn't an option. Someone with a lot of free time? Sure, bearing in mind that he would have to look at exactly the pictures he doesn't want to see...
Putting the irony aside and switching to realism:
Every approach I have seen so far, apart from the "hide anything" feature, is either on the borderline of censorship, practically impossible to maintain, or generally unusable by the average reader. The one constant I've noticed is that every approach takes it as given that some kind of filter must be introduced. If option A is no good, then let's try option B; if B is also not the right way, then let's try C... Currently we are at option Z II, and it looks not very different from option B, but, very importantly, the wording is better and it sounds nicer, the way an old bike with a foxtail attached is much better than just an old bike.
I'm very curious what we are trying to achieve with this filter. Is it really to get more readers, or is it just to introduce a filter that is in some way predefinable? What is the objection to the simple "hide anything" feature? It is simple, can be implemented quickly, doesn't cost much money and serves 99% of the stated purposes of filtering. But why the hell isn't it an option for our filter fanboys and filter fangirls?
nya~
On 26.11.2011 15:41, Tom Morris wrote:
[snip Tom Morris's reply, quoted in full above]
On 26 November 2011 19:54, Tobias Oelgarte tobias.oelgarte@googlemail.com wrote:
[snip long list of concerns with this latest attempt in practice]
I'm very curious what we are trying to achieve with this filter. Is it really to get more readers, or is it just to introduce a filter that is in some way predefinable? What is the objection to the simple "hide anything" feature? It is simple, can be implemented quickly, doesn't cost much money and serves 99% of the stated purposes of filtering. But why the hell isn't it an option for our filter fanboys and filter fangirls?
+1 I'd like to see that answered properly too. What, getting back to basics, is the actual point here?
- d.
On Sat, Nov 26, 2011 at 02:41:51PM +0000, Tom Morris wrote:
You can have lists stored
(...)
on the WMF servers but in a secret file that they'll never ever ever ever release promise hand-on-heart*)
This works so well if you DO read it sarcastically. ;-)
and you can have lists stored publicly (in Adblock: the various public block lists that are community-maintained so that you don't actually see any ads, in an image filter: on the web somewhere). And you can put an instruction in the former list to transclude everything on a public list and keep it up-to-date.
Right, except then you have a public list of prejudicial labels. I think that most have agreed that that's just a little too close to the fire for comfort.
And if the WMF doesn't do it - perhaps because people are whinging that me being given the option to opt-in and *not* see "My micropenis.jpg" is somehow evil and tyrannical and contrary to NOTCENSORED
Even filter proponent Jimmy Wales is adamant about there being no censorship ("Period").
The part people are disagreeing on is "how close do we want to dance to the fire, and how many burns do we accept?"
My proposal is that perhaps we shouldn't be dancing close to the fire at all. If we want to escape BadPictures(tm), how about a nice refreshing swim instead?
more concretely:
I think filters are probably the wrong solution to the problem today (in fact, they're more like a solution looking for a problem). I now think that a combination of on-wiki policy and prudence, and improved categorisation and search on Commons, would probably not only avoid potential problems entirely, but actually be a heck of a lot more effective too.
See: http://meta.wikimedia.org/wiki/User_talk:Kim_Bruning#Image_filter
In this discussion with Atlasowa I challenge them to come up with actual numbers and facts. I think Atlasowa has proved all of us more-or-less wrong ;-)
sincerely, Kim Bruning
On Thu, Nov 24, 2011 at 2:09 PM, MZMcBride z@mzmcbride.com wrote:
Andreas K. wrote:
The way this would work is that each project page would have an "Enable image filtering" entry in the side bar. Clicking on this would add a "Hide" button to each image displayed on the page. Clicking on "Hide" would then grey the image, and automatically add it to the user's personal filter list.
I think this sounds pretty good. Is there any indication how German Wikipedians generally view an implementation like this? I can't imagine English Wikipedians caring about an additional sidebar link/opt-in feature like this.
The proposal is currently being discussed here:
http://de.wikipedia.org/wiki/Wikipedia_Diskussion:Wikipedia-Fork#.22Der_Filt...
So far, several editors who were rigorously opposed to the category-based filter idea have said that they would see no reason to oppose personal filter lists. (One editor mentioned disturbing images in the German meningitis article http://de.wikipedia.org/wiki/Meningitis as an example in the discussion.) About the same number have said that they would still be opposed on principle. It's early days though; at the time of writing, fewer than ten editors have commented.
There is some German user involvement and also some German-language discussion on Meta, with a similar pattern (and also some questions that only the board and programmers can answer):
http://meta.wikimedia.org/wiki/Controversial_content/Brainstorming#Page-spec...
Dirk Franke, a well-known German Wikipedian and member of the German Wikimedia board (Benutzer:Southpark), posted an entry on his blog, saying that in his opinion this could indeed form the basis of a constructive discussion about the image filter:
http://www.iberty.net/2011/11/image-filter-brainstorming.html
There have also been heated discussions about Sue's visit, the board, the appropriateness of image filtering, image use, a simple images on/off option etc. on the Kurier talk page:
http://de.wikipedia.org/wiki/Wikipedia_Diskussion:Kurier#Sues_Stippvisite
– again, some comments saying personal filter lists would be okay, others countering that any softening of the resistance against any kind of image filter would be a sell-out.
Andreas
On Sat, Nov 26, 2011 at 5:56 AM, Andreas K. jayen466@gmail.com wrote:
[snip Andreas K.'s reply, quoted in full above]
I think the fundamental error in this reasoning is that you seem to be under the impression that what is being considered here is something new, and that there have only been a few people commenting on these different schemes. The brutal fact is that during the seven or eight years this issue has reared its ugly head, thousands of people have opined on it, and a vast majority are strongly opposed to any such scheme, because it is at base against our core mission. Jimbo personally blocked a few of the people who suggested anything of the sort (Uwe Kils might have been the first, though there might have been somebody before I joined the project).
Using phrases like "some people", "a few people" is a pathetic representation of the reality. It isn't a minority you want to address/oppose, but a huge and strong entrenched core group. Pretending otherwise is just pure madness.
On 28 November 2011 02:12, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
Using phrases like "some people", "a few people" is a pathetic representation of the reality. It isn't a minority you want to address/oppose, but a huge and strong entrenched core group. Pretending otherwise is just pure madness.
Unfortunately, the issue is not dead.
http://commons.wikimedia.org/w/index.php?title=File:Presentation_Gardner_Han...
The Board's last public statement was that nothing has changed.
The executive director's last statement, directly to the group most visibly opposed to a filter, is that it's going ahead, because the Board's direction stands.
Note also the citation of the referendum as "support", which *should* be a thoroughly discredited piece of spin by now, but evidently isn't.
Andreas pushing the filter isn't relevant, it's true. The Board and the staff continuing to push the filter, that's highly relevant and it would be foolish to be sanguine about it.
- d.
On Mon, Nov 28, 2011 at 10:21 AM, David Gerard dgerard@gmail.com wrote:
Unfortunately, the issue is not dead.
That's correct; nobody from WMF has said otherwise. What's dead is the idea of a category-based image filter, not the idea of giving readers additional options to reversibly collapse images they may find offensive, shocking, or inappropriate in the context in which they're viewing them (e.g. at work). However, Sue has made it clear that she wants the WMF staff to work with the community to find a solution that doesn't meet strong opposition. Her presentation on the issue in Hannover begins with this slide:
http://commons.wikimedia.org/w/index.php?title=File%3APresentation_Gardner_H...
My personal view is that such a solution will need to take into account that actual current editorial practices and perceptions in our projects vary a great deal, as did the image filter poll results by language. As I pointed out before, projects like Arabic and Hebrew Wikipedia are currently collapsing content that's not even on the radar in most of these discussions (e.g. the 1866 painting L'Origine du monde in Hebrew Wikipedia), while German Wikipedia put the vulva photograph on its main page. A solution that pretends that this continuum of practice can be covered with a single approach, one which doesn't give a lot of flexibility to readers and editors, is IMO not a solution at all.
I'm not convinced that the "collapse images one-by-one" approach to developing a filter list is very valuable in and of itself, due to its lack of immediate practical impact and likely limited usability. The idea of making it easy to build, import and share such lists of images or image-categories would move the process of categorization into a market economy of sorts, where individual or organizational demand regulates the supply of available filters. This could lead to all kinds of groups advertising their own filter-lists, e.g. Scientology, Focus on the Family, etc. From there, it would be a relatively small step for such a group to take its filter list and coerce users to only access Wikipedia with the filter irreversibly in place.
While third parties are already able to coerce their users to not see certain content, creating an official framework for doing so IMO puts us dangerously close to censors: it may lead to creation of regimes of censorship that did not previously exist, and may be used to exercise pressure on WMF to change its default view settings in certain geographies since all the required functionality would already be readily available.
My personal view on this issue has always been that one of the most useful things we could do for readers is to make available to them NPOV, well-vetted and thorough advice on how to manage and personalize their net access. Wikipedia is only one site on the web, and whatever we do is not going to extend to the rest of the user's experience anyway. There are companies that specialize in filtering the Net; we could point people to those providers and give advice on how to install specific applications, summarizing criticism and praise they have received.
On the other hand, such advice would be pretty removed from the experience of the reader, and l do think there are additional reasonable things we could do. So I'm supportive of approaches which give an editing community additional flexibility in warning their readers of content they may find objectionable, and give readers the ability to hide (in the general or specific case) such content. As I said previously, this wouldn't create a new regime of filter lists or categories, merely a broad community-defined standard by which exclusion of some content may be desirable, which could vary by language as it does today.
Kim, I just read the conversation on your talk page. In general, I agree that more research into both the current practices of our editing communities as well as reader expectations and needs would be valuable. Right now we have some anecdotal data points from the projects, Robert's original research which mostly focuses on establishing definitions and principles, and the image filter poll results. I think the latter are useful data if carefully analyzed, but they do mingle low-activity users who are chiefly readers with the core editing community in ways that don't give us tremendously clear information by group. The poll also referred to a filtering concept that's now been rejected.
At the same time, I do think that we shouldn't hesitate to build some cheap prototypes to make abstract ideas more understandable. Advancing our understanding, as well as the state of the conversation, through both additional pointed research and discussion of some interactive prototypes, without spending tremendous amounts of time and money on either, feels like a response that's commensurate with the scale and importance of the issue.
Erik
On Mon, Nov 28, 2011 at 10:46 AM, Erik Moeller erik@wikimedia.org wrote:
On Mon, Nov 28, 2011 at 10:21 AM, David Gerard dgerard@gmail.com wrote:
[snip]
I'm not convinced that the "collapse images one-by-one" approach to developing a filter list is very valuable in and of itself, due to its lack of immediate practical impact and likely limited usability. The idea of making it easy to build, import and share such lists of images or image-categories would move the process of categorization into a market economy of sorts, where individual or organizational demand regulates the supply of available filters. This could lead to all kinds of groups advertising their own filter-lists, e.g. Scientology, Focus on the Family, etc. From there, it would be a relatively small step for such a group to take its filter list and coerce users to only access Wikipedia with the filter irreversibly in place.
The "collapse images one-by-one" approach would work for Wikipedians and readers who generally come across very little content in Wikipedia that's objectionable to them, except for the odd image that they have seen again and again and now feel they've seen often enough.
I'm not sure how much of a realistic issue the second point, with Scientology, FoF etc., is. As designed, the filter only hides the content from initial view; the content is still accessible by clicking on it. That wouldn't be good enough for a dyed-in-the-wool censor.
While third parties are already able to coerce their users to not see certain content, creating an official framework for doing so IMO puts us dangerously close to censors: it may lead to creation of regimes of censorship that did not previously exist, and may be used to exercise pressure on WMF to change its default view settings in certain geographies since all the required functionality would already be readily available.
If the image filter uses a user-specific personal filter list stored on the Foundation's servers, then abusing it would require that the censor can populate the user's list without the user noticing, can prevent the user from emptying their PFL again, and can disable the user's ability to click on a hidden image to reveal it. Is there something that we could do to make that more difficult, or impossible? Because then any censor would be back to square one, left to their own devices, rather than being able to ride piggy-back on our filter function.
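One server-side ingredient, sketched below with entirely hypothetical names: refuse any read or write of a PFL by anyone other than its authenticated owner. That blocks silent tampering through the site itself, though it obviously cannot help against a censor who controls the user's machine or network.

```typescript
// Sketch of a server-side guard (all names hypothetical): a PFL can
// only ever be read or written by its authenticated owner, so nobody
// else can populate it, lock it, or stop the owner from emptying it.
interface PflRequest {
  sessionUserId: number; // who is making the request
  pflOwnerId: number;    // whose list is being touched
}

function assertPflOwner(req: PflRequest): void {
  if (req.sessionUserId !== req.pflOwnerId) {
    throw new Error('A personal filter list is accessible only to its owner');
  }
}
```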
My personal view on this issue has always been that one of the most useful things we could do for readers is to make available to them NPOV, well-vetted and thorough advice on how to manage and personalize their net access. Wikipedia is only one site on the web, and whatever we do is not going to extend to the rest of the user's experience anyway. There are companies that specialize in filtering the Net; we could point people to those providers and give advice on how to install specific applications, summarizing criticism and praise they have received.
I don't understand why you would feel uncomfortable associating with hobbyists creating crowdsourced filter lists (or indeed moral guardians creating such lists, if they can be bothered to do the work), but would feel comfortable endorsing professional filter software companies.
If we are worried about people changing default settings in certain geographies, the professional filter providers are much more likely to have that capability, as it's already part of their product portfolio.
The individuals who'd offer Wikipedia editors and readers their own sets of graded filter lists (no hardcore / no softcore / no spiders / no Muhammad / etc.) on their websites, just as a hobby and for peer recognition, wouldn't have the business standing, nor the software design capability, of a professional filter software company. Their cottage-industry, volunteer outlook would arguably be more compatible with the Wikipedia mindset. And I do wonder how interested a moral guardianship organisation would be in developing a filter list for a filter that any child can override, just by clicking on the hidden image. Curiosity is a powerful impulse.
I do see that the personal filter list templates that hobbyists might put together are directly focused on Wikimedia, creating as it were an explicit inventory of Wikimedia's controversial content that has never existed before. On the other hand, Tom Morris in his blog post
http://blog.tommorris.org/post/11286767288/opt-in-image-filter-enabling-cens...
made a fairly good argument why professional censors could easily create such lists themselves, if they were that interested. It wouldn't cost them much, and they might be more inclined to rely on their own work rather than that of hobbyists.
On the other hand, such advice would be pretty removed from the experience of the reader, and l do think there are additional reasonable things we could do. So I'm supportive of approaches which give an editing community additional flexibility in warning their readers of content they may find objectionable, and give readers the ability to hide (in the general or specific case) such content. As I said previously, this wouldn't create a new regime of filter lists or categories, merely a broad community-defined standard by which exclusion of some content may be desirable, which could vary by language as it does today.
Do you favour the sort of approach Neitram suggested then? I.e.
http://meta.wikimedia.org/wiki/Controversial_content/Brainstorming#thumb.2Fh...
or
http://meta.wikimedia.org/wiki/Controversial_content/Brainstorming#Opt-in_ve... ?
That's kind of similar to the Hebrew and Arabic collapse templates, except that the user gets to opt in to it.
Best, Andreas
On Wed, Nov 30, 2011 at 7:39 PM, Andreas K. jayen466@gmail.com wrote:
If the image filter uses a user-specific personal filter list stored on the Foundation's servers, then abusing it would require that the censor can populate the user's list without the user noticing, can prevent the user from emptying their PFL again, and can disable the user's ability to click on a hidden image to reveal it. Is there something that we could do to make that more difficult, or impossible? Because then any censor would be back to square one, left to their own devices, rather than being able to ride piggy-back on our filter function.
Two misconceptions there. A genuine dyed-in-the-wool censor wouldn't give two figs about whether the user trying to access non-conformant material was aware of being restricted from accessing it or not.
Secondly, if there is no forcibly programmed barrier, a user can of course just bypass a soft barrier; but censors rarely use soft barriers, they tend to be a bit more hardnosed about it.
One of my principal objections to doing anything remotely along the lines of what the board resolutions and board meeting minutes appear to reveal about their approach -- more direct speech from that direction would be welcome, of course, so we do not work under a misapprehension about their real goals -- is this: creating the structured informational web of potentially controversial content is a highly complex task (I would say impossible, but there seems to be a viewpoint in the direction of the WMF that it could be done), and having created such a creature purely by the actions of a community that is in fact largely philosophically opposed to doing any such thing, we as a community would then be morally obligated to try to mitigate any attempts to subvert the use of such a knowledge-base. Which, I do assure you, might not prove to be a trivial task.
On Mon, Nov 28, 2011 at 3:12 AM, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
I think the fundamental error in this reasoning is that you seem to be under the impression that what is being considered here is something new, and that there have only been a few people commenting on these different schemes. The brutal fact is that during the seven or eight years this issue has reared its ugly head, thousands of people have opined on it, and a vast majority are strongly opposed to any such scheme, because it is at base against our core mission.
Our core mission is making information and knowledge available to people who want it, not pushing it down their throats against their will.
On 28 November 2011 09:34, Andre Engels andreengels@gmail.com wrote:
Our core mission is making information and knowledge available to people who want it, not pushing it down their throats against their will.
Show that there is a demand. Build a filtered Wikipedia and get rich.
(There must be a FAQ somewhere listing all the points refuted a thousand times like this. Then they could just be answered "#66" or similar.)
- d.
On Mon, Nov 28, 2011 at 10:43 AM, David Gerard dgerard@gmail.com wrote:
On 28 November 2011 09:34, Andre Engels andreengels@gmail.com wrote:
Our core mission is making information and knowledge available to people who want it, not pushing it down their throats against their will.
Show that there is a demand. Build a filtered Wikipedia and get rich.
You're saying that anything that is not wanted by more than a few people goes against our core mission?
On 28 November 2011 10:07, Andre Engels andreengels@gmail.com wrote:
You're saying that anything that is not wanted by more than a few people goes against our core mission?
No, and nor did I say anything that could reasonably be construed as that.
- d.
On Mon, Nov 28, 2011 at 11:14 AM, David Gerard dgerard@gmail.com wrote:
On 28 November 2011 10:07, Andre Engels andreengels@gmail.com wrote:
You're saying that anything that is not wanted by more than a few people goes against our core mission?
No, and nor did I say anything that could reasonably be construed as that.
I said that an image filter was not against our core mission. You reacted to that by saying that I should show that there is a demand. Then you added something about "all the points refuted a thousand times like this". Surely it is quite reasonable to conclude that you considered "there is no demand" a refutation of "it is not against our core mission". And for that to be a valid line of reasoning, you need the rule "if there is no demand for something, it goes against our core mission".
On 28 November 2011 10:51, Andre Engels andreengels@gmail.com wrote:
I said that an image filter was not against our core mission. You reacted to that by saying that I should show that there is a demand. Then you added something about "all the points refuted a thousand times like this". Surely it is quite reasonable to conclude that you considered "there is no demand" a refutation of "it is not against our core mission". And for that to be a valid line of reasoning, you need the rule "if there is no demand for something, it goes against our core mission".
At this point you appear to be stretching to keep a flame war going.
- d.
On Mon, Nov 28, 2011 at 11:58 AM, David Gerard dgerard@gmail.com wrote:
At this point you appear to be stretching to keep a flame war going.
Stretching? It seemed like a valid chain of reasoning to me. But if you don't agree, please give your line of reasoning as to how your statement was a refutation of, or even just a sensible reaction to, my statement.
On 28 November 2011 11:03, Andre Engels andreengels@gmail.com wrote:
On Mon, Nov 28, 2011 at 11:58 AM, David Gerard dgerard@gmail.com wrote:
At this point you appear to be stretching to keep a flame war going.
Stretching? It seemed like a valid chain of reasoning to me. But if you don't agree, please give your line of reasoning as to how your statement was a refutation of, or even just a sensible reaction to, my statement.
Permit me to interrupt your debate here... Perhaps we could go back to the topic of the thread itself rather than the meta-argument of whose reasoning is a sensible reaction to which statement?
Erik M's email just above has some very interesting points, both about the WMF's official stance (the category-based system is dead; local communities *already* have different approaches; somehow providing tools to reversibly collapse images is still happening), as well as his personal opinion (an official framework for sharable filter lists takes us too close to censors; existing data does not give us good info on different groups' needs; we should provide good advice to readers about personalising their web experience; building several cheap prototypes is a good step from here). [forgive me if I'm losing some nuance in making this summary]
Perhaps we could focus on those practical points - preferably on-wiki - rather than having endless debates about what different people did/didn't mean to say or getting into abstract ideological discussions.
-Liam
Liam Wyatt wrote:
Perhaps we could focus on those practical points - preferably on-wiki - rather than having endless debates about what different people did/didn't mean to say or getting into abstract ideological discussions.
Buzzkill.
There are a lot of ideas here: https://meta.wikimedia.org/wiki/Controversial_content/Brainstorming.
Someone needs to start triaging/cultivating/poking/prodding the ideas to try to come up with something that's workable (if any such solution exists at all). As the Wikimedia Foundation is the primary body pushing for this feature, it will need to assign staff and other resources. I don't think this has been done at this point, so the ball is in its court.
MZMcBride
On Mon, Nov 28, 2011 at 8:26 PM, MZMcBride z@mzmcbride.com wrote:
[snip MZMcBride's reply, quoted in full above]
I'm assuming we might have to wait till next year, more specifically till the end of the fundraiser, before this is taken up again. (sbm)
I do think some resources might have been committed to it already; there might already be a lot of interesting data points and research being accumulated related to this.
Regards Theo
On Mon, Nov 28, 2011 at 10:34:16AM +0100, Andre Engels wrote:
On Mon, Nov 28, 2011 at 3:12 AM, Jussi-Ville Heiskanen cimonavaro@gmail.com wrote:
Our core mission is making information and knowledge available to people who want it, not pushing it down their throats against their will.
Well, people actually have to surf over to wikipedia to be able to get any information, it's not like we jump them in the streets :-P
But I kid. ;-) I don't think that "pro-" versus "contra-" censorship is actually even the correct narrative.
Basically, we were all standing around looking at this screw that needs to be put into the wall; and the board came with a mandate "Let's make a hammer!"
* Some people went: "Yeah, the screw needs to go in!"
* Other people went: "No way, hammers don't work on screws!"
* A few people went: "Dude, shouldn't we use some long object that we can twist or something?"
Somehow the discussion has devolved to, respectively:
* "Why do you want the screw to stick out?" versus
* "Why do you want to hit our thumb?"...
... but -if we want to reach consensus[1]- what we really need to be discussing is: screwdrivers.
sincerely, Kim Bruning
[1] I know, boring old fuddy duddy consensus. Controversy is much more fun. ;-) But it takes away so much energy that could be used for other stuff. I'd really like to finish this and actually have some time left for editor retention -like- this year or so? :-)
I think this sounds pretty good. Is there any indication how German Wikipedians generally view an implementation like this? I can't imagine English Wikipedians caring about an additional sidebar link/opt-in feature like this.
Actually, I think they do not like it too much. I'll try to explain:
(1) Almost everybody on the German Wikipedia thinks that the original problem the filter tries to solve does not exist. So there is no positive reason to introduce an image filter.
(2) A strong majority thinks the principle itself is evil.
So regarding (1): to accept a filter, you would need either a community that doesn't care (way too late for that one..) or a lot of goodwill from the community towards the foundation. I am afraid that as long as the board doesn't move, there may be more or less infuriated opposition to the filter, but only a small minority who positively support it. And I am afraid the board would have to move publicly enough that even an "I don't care about meta, I want to write articles about 18th-century village churches" Wikipedian will notice that move.
For (2): decrease the evilness. There are two main reasons why the filter is considered evil. For one, it may allow third parties to influence the Wikipedia experience of readers. The personal filter solution deals imho pretty well with this problem, but still, interference is possible. And secondly, it judges by different values than purely encyclopedic ones. I myself think "I don't like it" is a perfectly valid judgement, but that seems to be a minority position on de.wp.
I think the personal image filter is a step in the right direction, as it addresses at least one of the three main objections. But there is still a long way to go to general acceptance.
regards,
southpark