Search engines like Google deal with this problem all the time. You can find any sort of thing in Google, but you can also search with 'safe search' turned on, which presumably helps a bit with that problem.
Our job is actually a lot easier than Google's in this regard, because they might link to anything out of billions and billions of pages. We only have a few hundred thousand.
We could have a flag or marker for 'content advisory'. As with Google, it wouldn't actually stop anyone from clicking on anything. (I don't suppose curious children are much delayed if they want to see something on Google that is filtered.)
I'm not 100% sure this makes sense, but it *is* a solution favored by a lot of mainstream websites, and it does have some acceptability on those grounds.
--Jimbo
I suggested the same thing a couple of weeks ago on the Village Pump -- here, I'll go find the area... Darn it! I can't find the discussion. Anyhow, I totally 100% support this. I don't think it is censoring. And while it may not keep us from getting blocked, it is morally responsible of us.
-- Michael Becker
Jimmy-
Search engines like Google deal with this problem all the time. You can find any sort of thing in Google, but you can also search with 'safe search' turned on, which presumably helps a bit with that problem.
Of course, it is nowhere documented what criteria or algorithms are used to filter the sites which are "unsafe", and we all know that these filters work terribly -- they either let tons of material go through, or they censor tons of legitimate content.
On a wiki, we risk "flagging wars", and by defining what "can be considered offensive" we are leaving NPOV behind. Daniel Ehrenberg suggested a similar thing a few weeks ago and said that he thought the goal of getting Wikipedia into all schools would be worth ignoring the NPOV issue. I do not agree. By implementing a "family filter" we may convince some parents to use Wikipedia -- and convince others who want their children to be raised in an open, progressive manner to ignore it. Let's not pretend that the standards of morality that may be predominant in the United States are also as common in Sweden or the Netherlands.
Still, I am not fundamentally opposed to filters on Wikipedia, but I think whatever solution is used should be NPOV - i.e. allow filtering by all types of content, not just one specific one. So, for example, if I want to filter *everything BUT* the sex content, that should also be possible. At which point I refer you to my pet project, team certification :-)
Regards,
Erik
Erik Moeller wrote:
Still, I am not fundamentally opposed to filters on Wikipedia, but I think whatever solution is used should be NPOV - i.e. allow filtering by all types of content, not just one specific one. So, for example, if I want to filter *everything BUT* the sex content, that should also be possible.
Another user may want to filter out religious content, and that would be just fine too. Is my idea of coding boxes technically feasible?
At which point I refer you to my pet project, team certification :-)
Have you elaborated on that somewhere?
Ec
Ray-
Erik Moeller wrote:
Still, I am not fundamentally opposed to filters on Wikipedia, but I think whatever solution is used should be NPOV - i.e. allow filtering by all types of content, not just one specific one. So, for example, if I want to filter *everything BUT* the sex content, that should also be possible.
Another user may want to filter out religious content, and that would be just fine too. Is my idea of coding boxes technically feasible?
You mean checkboxes for various categories below the edit submission screen? I'm afraid there'd be too many, and too much controversy around which ones we need, for that to be workable. Also integration with RC, diffs etc. would be tricky.
At which point I refer you to my pet project, team certification :-)
Have you elaborated on that somewhere?
Many times in many places ;-). Originally here: http://mail.wikipedia.org/pipermail/wikitech-l/2002-October/001089.html
Regards,
Erik
Erik Moeller wrote:
Ray-
Erik Moeller wrote:
Still, I am not fundamentally opposed to filters on Wikipedia, but I think whatever solution is used should be NPOV - i.e. allow filtering by all types of content, not just one specific one. So, for example, if I want to filter *everything BUT* the sex content, that should also be possible.
Another user may want to filter out religious content, and that would be just fine too. Is my idea of coding boxes technically feasible?
You mean checkboxes for various categories below the edit submission screen? I'm afraid there'd be too many, and too much controversy around which ones we need, for that to be workable. Also integration with RC, diffs etc. would be tricky.
Not checkboxes, which would indeed require so many that they would become rigid and/or useless. I would propose searchable write-in boxes where words and/or codes could be entered. I prefer codes to words because it's easier to keep them short, and to build them into some sort of hierarchical structure. The objection that people are most likely to raise against codes is the difficulty of remembering them all. In the early stages, getting the boxes to work and become acceptable would be more important than developing any system of codes for them.
The primary use of these boxes would be for indexing by subject. Secondarily, the system could easily be adapted for a wide variety of flags, ranging from "obscene" material to Wikipedia-certified material. In Wiktionary it could be used for language codes to ease indexing by language.
To integrate with RC, a simple "C" could indicate a code change in the way that an "M" now indicates a minor change.
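As a rough sketch of how such hierarchical codes might behave (everything below is hypothetical -- the dotted-code convention, the names, and the use of Python are just for illustration):

    # Hypothetical sketch: hierarchical subject codes for the write-in boxes.
    # Convention assumed here: a dot separates levels, so "REL.CHR"
    # (Christianity, say) falls under "REL" (religion).

    def code_matches(article_code, filter_code):
        # True if the article's code is the filter code itself, or a more
        # specific code beneath it in the hierarchy.
        return (article_code == filter_code
                or article_code.startswith(filter_code + "."))

    assert code_matches("REL.CHR", "REL")    # specific falls under general
    assert not code_matches("GEO", "REL")    # unrelated subjects don't match

Keeping codes short while still allowing structure is what the dotted hierarchy would buy: the code stays terse, but a filter or index on the general code automatically covers the more specific ones.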
At which point I refer you to my pet project, team certification :-)
Have you elaborated on that somewhere?
Many times in many places ;-). Originally here: http://mail.wikipedia.org/pipermail/wikitech-l/2002-October/001089.html
Okay, now I vaguely remember that discussion. I didn't jump into it then, which is probably why I remembered nothing. I have mixed feelings about it, but not ones that are coherent enough to make comments at this time.
Eclecticology
Ray-
Not checkboxes, which would indeed require so many that they would become rigid and/or useless. I would propose searchable write-in boxes where words and/or codes could be entered.
This sounds a lot like my proposed [[Category:Foo]] scheme, only that you want codes instead of words for the categories. My scheme has the advantage that it can be built very easily on top of the existing data structures, with no extra input fields or changes to RC required.
With this scheme, you would have
[[Category:Sex]] [[Category:Sexually explicit]] or [[Category:Religion]] or [[Category:Music]]
and meta-categories:
[[Category:Stub]] or [[Category:Delete]] or [[Category:NPOV dispute]]
Each of these category pages would be editable, but would always automatically include a sorted list of the pages that link to the category ("what links here"). It would also be nice if a text could be configured for each category, to be displayed on pages that are part of it, e.g.
[[Category:Stub]] -> This article is a stub article ...
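To illustrate how little machinery this needs (a hypothetical sketch in Python -- the [[Category:...]] syntax is the proposal above, everything else is invented):

    # Hypothetical sketch: build a category -> pages index straight from the
    # wikitext, reusing the proposed [[Category:Foo]] link syntax.
    import re
    from collections import defaultdict

    CATEGORY_RE = re.compile(r"\[\[Category:([^\]|]+)\]\]")

    def index_categories(pages):
        # pages: title -> wikitext. Returns category -> sorted titles, i.e.
        # the automatic "what links here" list for each category page.
        index = defaultdict(list)
        for title, text in pages.items():
            for cat in CATEGORY_RE.findall(text):
                index[cat.strip()].append(title)
        return {cat: sorted(titles) for cat, titles in index.items()}

    pages = {"Felching": "... [[Category:Sexuality]] [[Category:Stub]]",
             "Monkeypox": "... [[Category:Stub]]"}
    print(index_categories(pages))
    # {'Sexuality': ['Felching'], 'Stub': ['Felching', 'Monkeypox']}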
Regards,
Erik
Erik Moeller schrieb:
Ray-
Not checkboxes, which would indeed require so many that they would become rigid and/or useless. I would propose searchable write-in boxes where words and/or codes could be entered.
This sounds a lot like my proposed [[Category:Foo]] scheme
I even had the category thing implemented on the test wikipedia a long time ago, but it was unpopular.
These category links would be similar to the language links (at least, in my implementation), right? But we have been meaning for some time now to switch to a central database for managing the language links, as just putting the links into the text is kinda chaotic.
So, maybe we should consider a meta page (Special:Meta) for articles, as was suggested. This could contain categories, show the language links from the central database etc.
Anyone against meta pages? If so, why? Otherwise, I might start implementing a test version soon. (You have been warned!;-)
Magnus
I'm not sure what you are implying by making it a meta page. Please elaborate.
-- Michael Becker
Michael Becker schrieb:
I'm not sure what you are implying by making it a meta page. Please elaborate.
Instead of putting language links, categories, and who-knows-what-else-we-might-come-up-with into the article, IMHO we should put it into separate fields in the (a) database, and access this meta-information via a "special" page that is automatically generated upon request (aren't they all:-)
Advantages:
* keeps the article text free from meta-information
* avoids check/dropdown items on the article page (less confusing, especially for newcomers)
* has all meta information in one place
* machine-readable (filter-able, for example)
Disadvantages: (I don't see any real ones here...)
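A minimal sketch of the storage this implies (the table and column names below are invented for illustration, not an actual schema):

    # Hypothetical sketch: meta-information kept in its own table instead of
    # the article text, so a Special:Meta view can be generated on request.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE page_meta (
                      page_title TEXT,
                      meta_key   TEXT,    -- e.g. 'category', 'language_link'
                      meta_value TEXT)""")
    db.executemany("INSERT INTO page_meta VALUES (?, ?, ?)",
                   [("Felching", "category", "Sexuality"),
                    ("Felching", "category", "Stub")])

    def special_meta(title):
        # One article's meta-information, generated on request; keeping it in
        # its own table makes it machine-readable and hence filterable.
        rows = db.execute("SELECT meta_key, meta_value FROM page_meta"
                          " WHERE page_title = ?", (title,)).fetchall()
        return "\n".join("%s: %s" % kv for kv in rows)

    print(special_meta("Felching"))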
Magnus
--- Erik Moeller erik_moeller@gmx.de wrote:
Ray-
Not checkboxes, which would indeed require so many that they would become rigid and/or useless. I would propose searchable write-in boxes where words and/or codes could be entered.
This sounds a lot like my proposed [[Category:Foo]] scheme, only that you want codes instead of words for the categories. My scheme has the advantage that it can be built very easily on top of the existing data structures, with no extra input fields or changes to RC required.
With this scheme, you would have
[[Category:Sex]] [[Category:Sexually explicit]] or [[Category:Religion]] or [[Category:Music]]
For *all* the very good reasons KQ has been summarizing earlier today, I think this type of categorizing is very bad. I understand the need some feel to protect sensitive eyes, but I believe this is not the proper way. I think it will introduce POV in an encyclopedia struggling to be NPOV.
As Ed (eh, Ed, where are you?) would put it, when we write an article, a good way is to write "xx said yyy about zzzz".
With categories, it will be "wikipedianUser says yyy about zzzz".
Plus, I think with filters, many many articles will be more red than blue/grey. Very bad.
and meta-categories:
[[Category:Stub]] or [[Category:Delete]] or [[Category:NPOV dispute]]
Each of these category pages would be editable, but would always automatically include a sorted list of the pages that link to the category ("what links here"). It would also be nice if a text could be configured for each category, to be displayed on pages that are part of it, e.g.
[[Category:Stub]] -> This article is a stub article
This, I agree with.
I can imagine a [[Category:Copyright]] -> This article is suspected of copyright violation
Then, when the violation is confirmed and a sysop deletes it, it would go automatically into the appropriate temporary database of copyrighted material, to be permanently emptied on the first of each month.
hum...:-)
PS: I set the "To:" to the main list. Were this ever to be implemented, it is likely to interest all the wikipedias as well.
Erik Moeller wrote:
Ray-
Not checkboxes, which would indeed require so many that they would become rigid and/or useless. I would propose searchable write-in boxes where words and/or codes could be entered.
This sounds a lot like my proposed [[Category:Foo]] scheme, only that you want codes instead of words for the categories. My scheme has the advantage that it can be built very easily on top of the existing data structures, with no extra input fields or changes to RC required.
With this scheme, you would have
[[Category:Sex]] [[Category:Sexually explicit]] or [[Category:Religion]] or [[Category:Music]]
and meta-categories:
[[Category:Stub]] or [[Category:Delete]] or [[Category:NPOV dispute]]
Each of these category pages would be editable, but would always automatically include a sorted list of the pages that link to the category ("what links here"). It would also be nice if a text could be configured for each category, to be displayed on pages that are part of it, e.g.
[[Category:Stub]] -> This article is a stub article ...
I think there is much in common between the two ideas, at least in terms of what we want the two visions to accomplish. I think there is probably a lot of room for reconciling the two. I'd need to give more thought to the way it might be done, but I can give a few preliminary observations.

1. Codes or words? Use both, but have a way of distinguishing which is being used, e.g. it is a word if the second character is in lower case. We would still need to find a way of accommodating the "NPOV" example above.

2. Does "category page" mean a whole separate page? If it is, isn't a category box the same thing, only much tinier, and without the need to write the word "category" every time? Some articles will belong to several categories.
Anyway, I'm sure there will be more to say on this. Ec
[Crossposted from <wikiEN-l> to <wikipedia-l>. Since this feature, if implemented, would apply to all wikis, replies should go to <wikipedia-l>.]
Eclecticology wrote:
Erik Moeller wrote:
[[Category:Sex]] [[Category:Sexually explicit]] or [[Category:Religion]] or [[Category:Music]]
and meta-categories: [[Category:Stub]] or [[Category:Delete]] or [[Category:NPOV dispute]]
Not "meta-categories", but "meta categories". A meta-category is a category about categories, but a meta category is a category about meta issues, which in our case means a category about Wikipedia. (Just being silly and pedantic. Ignore me.)
Each of these category pages would be editable, but would always automatically include a sorted list of the pages that link to the category ("what links here"). It would also be nice if a text could be configured for each category, to be displayed on pages that are part of it, e.g. [[Category:Stub]] -> This article is a stub article ...
I think there is much in common between the two ideas, at least in terms of what we want the two visions would accomplish. I think there is probably a lot of room for reconciling the two. I'd need to give more thought to the way it might be done. I can give a few preliminary observations.
I don't like the subject categories, but I like the meta categories. If we write code that works well for the meta categories, then we can always test the subject categories at that time.
- Codes or words? Use both, but have a way of distinguishing which is being used, e.g. it is a word if the second character is in lower case. We would still need to find a way of accommodating the "NPOV" example above.
Unless "NPOV" is the code and "Neutral point of view" is the word. ^_^ But you could always check for all-caps in "NPOV dispute". Myself, I'd still say that words are best; we can keep them short.
- Does "category page" mean a whole separate page? If it is, isn't a category box the same thing, only much tinier, and without the need to write the word "category" every time? Some articles will belong to several categories.
I'm pretty sure that "category page" means a page dedicated to that category. Thus at the URL http://en.wikipedia.org/wiki/Category:Stub one can find a page with an automatic list of all stubs, below the boilerplate text that's printed on all linked pages (say). And at http://en.wikipedia.org/w/wiki.phtml?title=Category:Stub&action=edit, one can edit this boilerplate text (but not the list).
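In sketch form (hypothetical code, following the split described above), rendering such a page is just concatenation:

    # Hypothetical sketch: a category page = the editable boilerplate text on
    # top, with the automatic, non-editable member list appended below it.
    def render_category_page(boilerplate, members):
        listing = "\n".join("* " + title for title in sorted(members))
        return boilerplate + "\n\n" + listing

    print(render_category_page("This article is a stub article ...",
                               ["Monkeypox", "Felching"]))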
Anyway, I'm sure there will be more to say on this.
-- Toby
Erik Moeller wrote:
On a wiki, we risk "flagging wars", and by defining what "can be considered offensive" we are leaving NPOV behind. Daniel Ehrenberg suggested a similar thing a few weeks ago and said that he thought the goal of getting Wikipedia into all schools would be worth ignoring the NPOV issue.
But he was talking about omitting such content, not just flagging it.
I do not agree. By implementing a "family filter" we may convince some parents to use Wikipedia -- and convince others who want their children to be raised in an open, progressive manner to ignore it.
But I didn't recommend that we implement a family filter. Google doesn't filter -- but they do allow end users to configure what sorts of results are presented.
Still, I am not fundamentally opposed to filters on Wikipedia, but I think whatever solution is used should be NPOV - i.e. allow filtering by all types of content, not just one specific one. So, for example, if I want to filter *everything BUT* the sex content, that should also be possible. At which point I refer you to my pet project, team certification :-)
*nod* Well, sure. That's why I chose the term 'content advisory', so as to indicate that we should implement a system that is flexible.
--Jimbo
Jimmy-
But I didn't recommend that we implement a family filter. Google doesn't filter -- but they do allow end users to configure what sorts of results are presented.
Hmm -- I do not really see the difference? Most filters are optional, at least for adults.
Regards,
Erik
Erik Moeller wrote:
Jimmy-
But I didn't recommend that we implement a family filter. Google doesn't filter -- but they do allow end users to configure what sorts of results are presented.
Hmm -- I do not really see the difference? Most filters are optional, at least for adults.
Well, I guess this is partly just a terminological difficulty.
I think we're both opposed to a system whereby Wikipedia is by default "cleansed" according to some ambiguous and amorphous standards. We don't want to place silly obstacles in the path of people seeing the whole encyclopedia, as it is written by the wiki process, even if some of the content might be considered by some to be inappropriate for children, or whatever.
At the same time, I think we both agree that it'd be nice if end users were able to very simply configure their "view" of the encyclopedia to take out material that they find objectionable.
--Jimbo
Erik wrote:
On a wiki, we risk "flagging wars", and by defining what "can be considered offensive" we are leaving NPOV behind.
One possibility is that each user has the option to give a page a score: 1 meaning "Doesn't need filtering" through to 10 "Nearly everyone would want to filter this". Then the overall 'score' for a page is the average of all scores users have given it. Then someone browsing the 'pedia could choose to filter at a particular score, e.g. browse at 0 (filter everything with a score of more than 0, i.e. everything!) up to 10 (filter only articles with an average score of more than 10, i.e. nothing!). That's 11 viewing configurations available to the user... I believe Google only has 3!
Because each person/ip could only cast one score per article (though of course they could change their score as the article changes) "flagging wars" would be difficult because it would require ip-hopping/multiple-log-in to try to tip the score in a particular direction.
This idea may be computationally quite feasible... we already have to store everyone who's watching an article... maybe we could store everyone who's scoring an article too. (Watchlist has a tag saying "article changed since you last scored" etc).
Obviously the idea extends to other scores too... instead of a "filter score" you could have a "quality score", and users could browse only articles that are, on average, believed to be high quality.
Obvious disadvantage: the filter is "1-dimensional", i.e. you can't filter on sex, religion, whatever... only on the score... so if religion were to get scores of about 5 and sex about 7, then to filter religion you would have to filter sex too. I don't know if it can be made to fit with the team certification idea.
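To make the arithmetic concrete, here is a hypothetical sketch (names invented; one score per user per article, with visibility decided against each reader's chosen threshold):

    # Hypothetical sketch of the scoring scheme: one vote per user per
    # article (re-scoring overwrites), overall score = plain average, and
    # readers filter at a threshold of their choosing.
    scores = {}   # (article, user) -> score in 1..10

    def rate(article, user, score):
        assert 1 <= score <= 10
        scores[(article, user)] = score   # one vote per user: no score-stuffing

    def average(article):
        votes = [s for (a, u), s in scores.items() if a == article]
        return sum(votes) / len(votes) if votes else None

    def visible(article, threshold):
        # Show the article unless its average exceeds the reader's threshold.
        # Unrated pages (average None) pass through here; how to treat them
        # is an open question (see the discussion that follows).
        avg = average(article)
        return avg is None or avg <= threshold

    rate("Felching", "alice", 7)
    rate("Felching", "bob", 9)
    print(average("Felching"))       # 8.0
    print(visible("Felching", 10))   # True: browsing at 10 filters nothing
    print(visible("Felching", 5))    # False: filtered at a stricter setting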
Anyhow these are only implementational ideas. My personal view is that any sort of filtering along these lines is going to take effort on the part of users.. perhaps we should stick to writing the encyclopedia for now!
Pete
--- Peter Bartlett pcb21@btconnect.com wrote:
One possibility is that each user has the option to give a page a score: 1 meaning "Doesn't need filtering" through to 10 "Nearly everyone would want to filter this". [...]
That sounds like a very good idea, but how would people deal with unrated pages? Let's say all of the unmodified rambot pages were unrated. Then no one with any sort of filtering could go there. If someone made a rating bot to solve this, it would seriously undermine the system.
A way to (partially) solve this could be to have a period wherein people can rate pages (and are required to in each edit) but ratings cannot yet be used for filtering.
Overall, it's a good idea but not practical. --LittleDan
Erik wrote:
On a wiki, we risk "flagging wars", and by defining what "can be considered offensive" we are leaving NPOV behind.
When I first read this, I thought I agreed, but overnight I realized that I really don't agree.
Of course we do risk 'flagging wars' -- but that's par for the course -- we risk 'edit wars' and so on all the time. The best way to minimize such wars is to approach tagging *as NPOV meta-data*.
No one can say whether or not children should be exposed to an article about 'felching' without taking the sort of stand that Wikipedia avoids as a rule. It is not for wikipedia, the reference work, to decide such matters.
If we were writing disclaimers at the top of some pages, we would require the disclaimer to be NPOV. We wouldn't say "This page is bad for children." And we wouldn't say "Anyone who doesn't let children read this is a stick in the mud." We would say "Some may consider this page inappropriate for children."
Similarly, a flag for 'possibly mature content' just means that some people may think so.
If the *consequences* of that flag were *huge*, then it'd be a problem. I.E., if people couldn't view mature content without proof of age (credit card, fax me a copy of your driver's license, etc.!) that'd be a very bad thing.
But if the consequences of the flag are mild, i.e. that people can choose, optionally, to 'Turn on Safe Search' with a click, or to turn it back off with a click, then there's no need for there to be edit wars.
--Jimbo
Jimmy-
If we were writing disclaimers at the top of some pages, we would require the disclaimer to be NPOV. We wouldn't say "This page is bad for children." And we wouldn't say "Anyone who doesn't let children read this is a stick in the mud." We would say "Some may consider this page inappropriate for children."
That's fair enough. I will add the appropriate disclaimers to all the pages about Christianity, which I consider potentially dangerous to children. Of course I will do so in an NPOV way. ;-)
Can you see where this is going? Standards for what is and isn't appropriate for children or adults vary so greatly that any offensiveness disclaimer is bound to be either POV (because we only add it to some pages) or useless (because we add it to all pages). We can't really solve this in the wiki way; *if* we are going to add disclaimers/filters, these have to be configurable. One size doesn't fit all.
Regards,
Erik
Erik Moeller wrote:
Jimmy-
If we were writing disclaimers at the top of some pages, we would require the disclaimer to be NPOV. We wouldn't say "This page is bad for children." And we wouldn't say "Anyone who doesn't let children read this is a stick in the mud." We would say "Some may consider this page inappropriate for children."
That's fair enough. I will add the appropriate disclaimers to all the pages about Christianity, which I consider potentially dangerous to children. Of course I will do so in an NPOV way. ;-)
Can you see where this is going? Standards for what is and isn't appropriate for children or adults vary so greatly that any offensiveness disclaimer is bound to be either POV (because we only add it to some pages) or useless (because we add it to all pages). We can't really solve this in the wiki way; *if* we are going to add disclaimers/filters, these have to be configurable. One size doesn't fit all.
To me, the filtering ability of coding boxes is incidental to having an indexing system that can be scaled up to an ever larger Wikipedia. The filtering possibilities would be rudimentary. There could be a place on the preferences page for a user to indicate the codes for articles not to be downloaded. Thus (without prejudice to what the codes might actually become), he could enter SEX, REL, or GEO for sex, religion, or geology. (There is a risk that an article on geology could be preaching evolution. ;-) ) In many cases where the parent would use the filter, the kid is often smarter, and can probably figure out the override long before the parent realizes. I may not always admit it to him, but I feel proud when my 13-year-old does things better than I do.
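In sketch form (codes and names purely illustrative), that preference would amount to a set of blocked codes checked before download:

    # Hypothetical sketch: the preferences page stores a set of codes the
    # user chose not to download; an article is skipped if any of its codes
    # is in that set.
    def should_download(article_codes, blocked_codes):
        return not (set(article_codes) & set(blocked_codes))

    blocked = {"SEX", "REL"}                          # from the preferences page
    print(should_download(["GEO"], blocked))          # True: downloaded
    print(should_download(["REL", "HIS"], blocked))   # False: filtered out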
Choosing whether an article will be subject to a particular filtering code is so subjective as to often be meaningless, but having this sort of thing available has a tremendous public relations value. Some porno sites put a long list of unlikely words on their home page to improve the chance that a search engine can find them. Filtering them all can be a losing battle.
Ec
Ray Saintonge wrote:
In many cases where the parent would use the filter the kid is often smarter, and can probably figure out the override long before the parent realizes. I may not always admit it to him, but I feel proud when my 13 year old does things better than I do.
At least as long as we implement something like this in the same way that Google does, there would be no 'figuring out' involved. If you want to block something, you click on the link to "Turn on Safe Search" and if you want to unblock something, you click on the link to "Turn off or modify Safe Search".
I don't think that wikipedia, per se, should assist at all with how other people might want to enforce such things.
-------------
Here's another way to look at it. I'm an adult, and my mother is too. She's from an earlier generation, and would be shocked to see an article on felching. It'd be nice to be able to sit down to wikipedia *with my own mother* and not worry about such things popping up. It's not about filtering or preventing access for another person, it's about being able to configure the site easily as an end user to account for such preferences.
If I want to learn about felching on my own, sometime when mom isn't watching, then I can. And if she wants to learn about it on her own, when I'm not there, that's fine too.
--Jimbo
Jimmy-
Ray Saintonge wrote:
In many cases where the parent would use the filter the kid is often smarter, and can probably figure out the override long before the parent realizes. I may not always admit it to him, but I feel proud when my 13 year old does things better than I do.
At least as long as we implement something like this in the same way that Google does, there would be no 'figuring out' involved. If you want to block something, you click on the link to "Turn on Safe Search" and if you want to unblock something, you click on the link to "Turn off or modify Safe Search".
Please, do not use the SafeSearch analogy. This is a completely different idea from the customizable filter system we have been discussing. SafeSearch is a POV family filter with no transparency. It's a Bad Thing.
Regards,
Erik
Erik Moeller wrote:
That's fair enough. I will add the appropriate disclaimers to all the pages about Christianity, which I consider potentially dangerous to children. Of course I will do so in an NPOV way. ;-)
Well, that would be fine, I think. If there was a meta data content advisory field, it probably would be best to flag these articles as being "About Religion; About Christianity".
Can you see where this is going? Standards for what is and isn't appropriate for children or adults vary so greatly that any offensiveness disclaimer is bound to be either POV (because we only add it to some pages) or useless (because we add it to all pages)
Well, I don't see why that has to be true, if we are thoughtful with how we implement content advisory meta data.
It's pretty clear to me that some topics, [[felching]] comes to mind, can be fairly uncontroversially described as 'possibly mature content; sexuality'. That's true even though people do differ widely as to their tolerance for such content.
--Jimbo
Jimmy Wales wrote:
Of course we do risk 'flagging wars' -- but that's par for the course -- we risk 'edit wars' and so on all the time. The best way to minimize such wars is to approach tagging *as NPOV meta-data*.
But one of the main tools we use to come to NPOV -- 'going meta' -- isn't really available here.
If we were writing disclaimers at the top of some pages, we would require the disclaimer to be NPOV. We wouldn't say "This page is bad for children." And we wouldn't say "Anyone who doesn't let children read this is a stick in the mud." We would say "Some may consider this page inappropriate for children."
'Going meta' isn't just saying 'some people think this', it's describing who thinks it. If we have to decide how many people should hold an opinion before a flag is set then I think we really will get 'wars' of back-and-forth reversions.
-M-
Matthew Woodcraft wrote:
Jimmy Wales wrote:
Of course we do risk 'flagging wars' -- but that's par for the course -- we risk 'edit wars' and so on all the time. The best way to minimize such wars is to approach tagging *as NPOV meta-data*.
But one of the main tools we use to come to NPOV -- 'going meta' -- isn't really available here.
Sure it is. We may not agree about the appropriateness of [[felching]], but we can agree that it's controversial, that it's sexual in nature, and that it would be widely regarded as a mature topic. There's probably a lot of other things we could say about it that would be relevant from a content advisory point of view, but which are also NPOV.
If we were writing disclaimers at the top of some pages, we would require the disclaimer to be NPOV. We wouldn't say "This page is bad for children." And we wouldn't say "Anyone who doesn't let children read this is a stick in the mud." We would say "Some may consider this page inappropriate for children."
'Going meta' isn't just saying 'some people think this', it's describing who thinks it. If we have to decide how many people should hold an opinion before a flag is set then I think we really will get 'wars' of back-and-forth reversions.
Then we have to think carefully about just what the flags mean, and just how we might make it possible for them to be NPOV.
Here's a set of flags that I think would work well enough for the felching article:
sexuality
mature content
slang
I don't see how those are very controversial.
--Jimbo
Jimmy Wales wrote in part:
Here's a set of flags that I think would work well enough for the felching article:
sexuality
mature content
slang
I don't see how those are very controversial.
"mature content" will definitely cause problems. You imply in the rest of what you write (not only in this post) «what some would consider to be appropriate only for the mature», but not only does "mature content" not say this literally, but also (as somebody else pointed out) that's extremely broad, and includes [[Christianity]]. So it won't be of much use.
The others, however, are clearly appropriate for [[Felching]], and they're probably sufficient for your purposes as well.
If I were writing the Wikipedia code, I wouldn't spend my time on writing support for these flags; that's better done (if at all) by another project that operates on top of Wikipedia (in much the way that the Sifter project was meant to do). But if such code is written, "mature content" will remain a problem. "sexuality" and "slang" will be fine -- but I also predict an explosion of a great deal of other, equally factual, classifications, most of which you and I would never think of.
-- Toby
Toby Bartels wrote:
Here's a set of flags that I think would work well enough for the felching article:
sexuality
mature content
slang
I don't see how those are very controversial.
"mature content" will definitely cause problems. You imply in the rest of what you write (not only in this post) «what some would consider to be appropriate only for the mature», but not only does "mature content" not say this literally, but also (as somebody else pointed out) that's extremely broad, and includes [[Christianity]]. So it won't be of much use.
Well, I don't agree. The standards for what we mean by mature content can be spelled out in sufficient detail and in an NPOV way so that controversy is minimized.
But I'm not invested in that particular phrase. Perhaps we could use 'explicit sexual content' to distinguish it from 'sexuality'.
If I were writing the Wikipedia code, I wouldn't spend my time on writing support for these flags; that's better done (if at all) by another project that operates on top of Wikipedia
But this is important for Wikipedia the website, not just for others.
Some people in this debate have taken a very POV position, i.e. that wikipedia should shove this stuff down people's throats, and if they're too prudish to deal with it, too bad ha ha. I don't agree.
--Jimbo
Jimmy Wales wrote:
Toby Bartels wrote:
"mature content" will definitely cause problems. You imply in the rest of what you write (not only in this post) «what some would consider to be appropriate only for the mature», but not only does "mature content" not say this literally, but also (as somebody else pointed out) that's extremely broad, and includes [[Christianity]]. So it won't be of much use.
Well, I don't agree. The standards for what we mean by mature content can be spelled out in sufficient detail and in an NPOV way so that controversy is minimized.
If you spell it out in such a way that it includes explicit sexual content but not explicit religious content, then how is this NPOV, given that an earlier poster (Ec?) considers the latter harmful to kids? (And if Ec is joking, I have a friend who seriously believes that about Christianity in particular.)
But I'm not invested in that particular phrase. Perhaps we could use 'explicit sexual content' to distinguish it from 'sexuality'.
If «explicit sexual content» is what you meant all along by "mature content", then let's say "explicit sexual content".
If I were writing the Wikipedia code, I wouldn't spend my time on writing support for these flags; that's better done (if at all) by another project that operates on top of Wikipedia
But this is important for Wikipedia the website, not just for others.
Is it? If we have an Edupedia website built on a Sifter model, then schools can block Wikipedia as long as they don't block Edupedia. And the Sifter model allows readers of Edupedia to edit Wikipedia (potentially seamlessly).
Some people in this debate have taken a very POV position, i.e. that wikipedia should shove this stuff down people's throats, and if they're too prudish to deal with it, too bad ha ha. I don't agree.
Although Stevertigo's rhetoric can go (IMO) over the top, I don't think that anybody actually advocates this position. We should put explicit material on pages with titles like [[Felching]], [[Nigger]], and [[Transubstantiation]] (^_^), but what else would you expect on such a page? As with any article, the first sentence should briefly provide a general context, so even if you go in not knowing what "felching" means, you see "'''Felching''' is a [[sexual practice]] in which ...", and you're warned -- nothing is shoved down your throat. ([[Felching]] actually doesn't begin that way right now, so let me fix it right up ....)
-- Toby
Toby Bartels wrote:
If you spell it out in such a way that it includes explicit sexual content but not explicit religious content, then how is this NPOV, given that an earlier poster (Ec?) considers the latter harmful to kids? (And if Ec is joking, I have a friend who seriously believes that about Christianity in particular.)
End users can adjust it however they like, so what's the problem?
If «explicit sexual content» is what you meant all along by "mature content", then let's say "explicit sexual content".
No, it isn't what I meant all along. It's by far the biggest problem category that we have right now, but of course other things can be included.
But this is important for Wikipedia the website, not just for others.
Is it? If we have an Edupedia website built on a Sifter model, then schools can block Wikipedia as long as they don't block Edupedia.
And that would be very unfortunate, I think.
--Jimbo
--- Jimmy Wales jwales@bomis.com wrote:
Toby Bartels wrote:
If we have an Edupedia website built on a Sifter model, then schools can block Wikipedia as long as they don't block Edupedia.
And that would be very unfortunate, I think.
There's one question we haven't investigated yet: would our filtering really lower the likelihood that schools will eventually block us completely?
I doubt that we could ever create a system that even comes close to preventing school kids from getting the Wikipedia content they want. At one point, a school administrator will find objectionable content in Wikipedia, will investigate the filtering option with the various categories, and will realize that kids can still get to the objectionable pages, e.g. by typing in the direct URL. Then our whole domain will be blocked.
Axel
--- Axel Boldt axelboldt@yahoo.com wrote:
I doubt that we could ever create a system that even comes close to preventing school kids from getting the Wikipedia content they want. [...] Then our whole domain will be blocked.
No, Edupedia would be on a separate domain (edupedia.org, not yet owned) to prevent just that blocking. -ld
Axel Boldt wrote:
If we have an Edupedia website built on a Sifter model, then schools can block Wikipedia as long as they don't block Edupedia.
And that would be very unfortunate, I think.
There's one question we haven't investigated yet: would our filtering really lower the likelihood that schools will eventually block us completely?
Of course not. But never underestimate the will of the zealots to try. A year or so ago the Supreme Court of Canada ruled to allow three books depicting homosexual families to be used in the schools. Just yesterday the same suburban Vancouver school board again voted to ban the same three books on the grounds that the spelling and grammar in them were not up to standards. They never stop trying. We can't hope to convince all school districts to not block us. On the other hand, the gratuitous publicity always helps our cause. We only have to insist that they spell the name correctly.
I doubt that we could ever create a system that even comes close to preventing school kids from getting the Wikipedia content they want. At one point, a school administrator will find objectionable content in Wikipedia, will investigate the filtering option with the various categories, and will realize that kids can still get to the objectionable pages, e.g. by typing in the direct URL. Then our whole domain will be blocked.
Yup! And the greater the fuss by the administrator, the more the kids will be encouraged to hack their way to the site. Nevertheless, many won't even bother with the school computers when they have more sophisticated equipment at home.
Ec
Ray Saintonge wrote:
not up to standards. They never stop trying. We can't hope to convince all school districts to not block us. On the other hand, the gratuitous publicity always helps our cause. We only have to insist that they spell the name correctly.
I am not at all sure that this sort of gratuitous publicity would help our cause. Remember that the people on the "other side" are not stupid. And it's pretty easy for an interested party to go through wikipedia and find content that would be shocking to the majority of parents in most suburban school districts.
Erik argued the other day that if anyone tries to filter wikipedia, they will just look stupid, and that our power can do a lot to discredit the idea of filtering. That argument gave me pause.
But here's the counterargument -- if Wikipedia has no protection at all for some things that the majority of people would find, not just mildly offensive, but shocking, *for children*, then we become the perfect poster-child for the pro-filtering crowd.
They can put forward the argument: "Yes, it would be wonderful if we lived in the kind of world where filtering the Internet for schools is unnecessary. But here's a good example: a site that appears to be innocent and harmless but which actually has graphic and explicit depictions of highly unusual sex practices. We even found one photo of female genitalia that was lifted directly from a porn site."
And many people will quite reasonably buy that argument. It's actually a very good argument *for* filtering in schools, for anyone other than a bunch of radical information libertarians such as many of us here.
None of the above determines what we should do, of course, but I think it does add some perspective to the debate.
Yup! And the greater the fuss by the administrator, the more the kids will be encouraged to hack their way to the site. Nevertheless, many won't even bother with the school computers when they have more sophisticated equipment at home.
That's of course true.
--Jimbo
Jimmy-
Erik argued the other day that if anyone tries to filter wikipedia, they will just look stupid, and that our power can do a lot to discredit the idea of filtering. That argument gave me pause.
But here's the counterargument -- if Wikipedia has no protection at all for some things that the majority of people would find, not just mildly offensive, but shocking, *for children*, then we become the perfect poster-child for the pro-filtering crowd.
I can see it now: "THINK OF THE CHILDREN! WE NEED FILTERS TO HIDE WIKIPEDIA, THE FREE ENCYCLOPEDIA, FROM THEM!"
Oh yeah. That sounds reaaaally dangerous. Parents will run in terror when they hear our name. Just think - our frontpage even features Martha Stewart. And monkeypox.
They can put forward the argument: "Yes, it would be wonderful if we lived in the kind of world where filtering the Internet for schools is unnecessary. But here's a good example: a site that appears to be innocent and harmless but which actually has graphic and explicit depictions of highly unusual sex practices. We even found one photo of female genetalia that was lifted directly from a porn site."
And many people will quite reasonably buy that argument.
What *is* the argument here? That all 130,000 articles of Wikipedia have to be blocked because there are a dozen that describe "highly unusual sex practices"? Do you really think that is convincing?
I believe that you overestimate the prudery of the average American because you are so used to feeling like a radical ;-). CNN just had a frontpage poll about whether the state should get involved in consensual sex among adults. 90% were against it. Sure, there's a lot of panic regarding sex and children. But often it's really just the well organized religious right screaming at the top of their lungs and the rest keeping their mouth shut. And I think if these 10-20% started screaming about the dangers of Wikipedia, the remaining 80-90% might actually have something to say.
But if we are just complacent, if we just blindly accept the so-called power of these pressure groups, then *we* contribute to giving them that power. We are among those who shut up and not among those who speak up. If you are such a radical, Jimbo, why are you so eager to give them what they want?
Have you ever actually shown your mother Wikipedia? I think she would like it. Mine certainly did.
Regards,
Erik
Erik Moeller wrote:
I can see it now: "THINK OF THE CHILDREN! WE NEED FILTERS TO HIDE WIKIPEDIA, THE FREE ENCYCLOPEDIA, FROM THEM!"
Oh yeah. That sounds reaaaally dangerous. Parents will run in terror when they hear our name. Just think - our frontpage even features Martha Stewart. And monkeypox.
But my point is, no one is going to say that, because it sounds just as stupid as you hope it does. These people aren't that stupid, not usually. And certainly ordinary people who have to make tough decisions are usually pretty thoughtful.
What *is* the argument here? That all 130,000 articles of Wikipedia have to be blocked because there are a dozen that describe "highly unusual sex practices"? Do you really think that is convincing?
No, that is *not* the argument, not even close.
The argument is "We need filtering software installed in the schools because even a superb site such as Wikipedia, which should be available to children, contains explicit sexual content about obscure sexual practices. It would be nice if such sites took a responsible approach to things, but they don't, and so we have to do it."
I think that's a very reasonable argument on behalf of filtering software, *even if I don't agree with it*. It's not that the argument is correct, it's that it isn't the screaming THINK OF THE CHILDREN lunacy that you'd have us imagine.
My point here is not about access to the schools, but is rather a simple response to your (insightful!) invitation to think about how our actions might impact the wider debates on school filtering. If we act irresponsibly and ideologically on this point, then we undermine the case against filters by showing that even something as potentially useful as wikipedia may need to be filtered.
--Jimbo
Jimmy-
What *is* the argument here? That all 130,000 articles of Wikipedia have to be blocked because there are a dozen that describe "highly unusual sex practices"? Do you really think that is convincing?
No, that is *not* the argument, not even close.
The argument is "We need filtering software installed in the schools because even a superb site such as Wikipedia, which should be available to children, contains explicit sexual content about obscure sexual practices. It would be nice if such sites took a responsible approach to things, but they don't, and so we have to do it."
Once again, do what? Filter Wikipedia entirely? Do you really think they can get away with that? And if they just want to filter the respective pages, what do we care?
Regards,
Erik
Erik Moeller wrote:
Once again, do what? Filter Wikipedia entirely? Do you really think they can get away with that? And if they just want to filter the respective pages, what do we care?
You raised the argument that by having content that they'd like to filter, but by being the sort of site that it would be embarrassing to filter, we make things more difficult for those who would want to filter.
After a lot of thought, I don't think that's right. If Wikipedia contains content that should be filtered, then the proponents of filtering can use us as an example of why filters are a Good Thing.
That's why we care, right? We don't want to be used as an example of why filters are a Good Thing.
Therefore, it seems to me, it's important that we act responsibly.
--Jimbo
Jimmy-
Erik Moeller wrote:
Once again, do what? Filter Wikipedia entirely? Do you really think they can get away with that? And if they just want to filter the respective pages, what do we care?
...
After a lot of thought, I don't think that's right. If Wikipedia contains content that should be filtered, then the proponents of filtering can use us as an example of why filters are a Good Thing.
I still don't follow your argument. Do you, or do you not think that the proponents of filtering could plausibly argue that all of Wikipedia, all 130,000 articles, need to be filtered in libraries because we have some articles about "highly unusual sex practices"?
If the proponents of filtering manage to come up with a filtering solution that only filters the articles they find so offensive, something which I doubt given the nature of Wikipedia, then all the better for them -- less work for us. And if they don't, we can say that they
1) ask us to do the impossible (a lot of purely technical objections against filtering have already been raised)
2) violate the First Amendment by hiding 129,500 perfectly valid articles from pupils because of the 500 which they consider objectionable.
Oh, and by the way -- the Yahoo! message boards contain a lot more objectionable and offensive material than Wikipedia.
Regards,
Erik
Erik Moeller wrote:
I still don't follow your argument. Do you, or do you not think that the proponents of filtering could plausibly argue that all of Wikipedia, all 130,000 articles, need to be filtered in libraries because we have some articles about "highly unusual sex practices"?
No, I don't think that. I do think that our refusal to handle this issue responsibly does mean that proponents of filtering will plausibly argue that we are a prime example of why filtering software is necessary and valuable in schools.
If the proponents of filtering manage to come up with a filtering solution that only filters the articles they find so offensive, something which I doubt given the nature of Wikipedia, then all the better for them -- less work for us. And if they don't, we can say that they 1) ask us to do the impossible (a lot of purely technical objections against filtering have already been raised) 2) violate the First Amendment by hiding 129,500 perfectly valid articles from pupils because of the 500 which they consider objectionable.
This completely ignores the most probable outcome. First, they can't come up with filtering software that does a good job, so our site gets filtered according to some very crude keyword tools. Second, that they are able to convincingly argue that this is the best that they can do, particularly since even sites that purport to be educational are so irresponsible and unhelpful.
My essential point is that our refusal to have standards is not a valid tool in the fight against censorship. Rather it plays directly into the hands of those who argue that censorship is necessary.
--Jimbo
At 01:05 PM 6/16/2003, you wrote:
No, I don't think that. I do think that our refusal to handle this issue responsibly does mean that proponents of filtering will plausibly argue that we are a prime example of why filtering software is necessary and valuable in schools. [...]

My essential point is that our refusal to have standards is not a valid tool in the fight against censorship. Rather it plays directly into the hands of those who argue that censorship is necessary.

--Jimbo
Jimbo, I'm puzzled. Where is it written that Wikipedia contains content that needs filtering? What are these mysterious standards that we refuse to implement? Who, exactly, is it that should be choosing which content on our site is "objectionable"... and objectionable to whom? How are we irresponsible?
I am of the opinion that Wikipedia does NOT contain content that needs filtering. Ideas are both MORE dangerous and LESS dangerous than most people realize. Of course, the proponents of filtering universally want to filter the less dangerous stuff and let the more dangerous stuff through. (i.e., sexuality is not dangerous, but ideas like freedom/rationality/logic ARE dangerous... at least to people who think sexuality IS dangerous....) I am also of the opinion that we DO have standards, and that those standards are good ones for an encyclopedia.... we try to ensure that our articles are truthful, non-biased, and accurate. I am also of the opinion that those people who don't like the site should not use it. I also think that those people who are unwillingly blocked from using the site need to step up and let their voices be heard; they should speak to their parents, teachers, government officials, and clergymen, and tell them why Wikipedia is important and how it shouldn't be blocked. Lastly, I find the assertion that we're being somehow irresponsible, or refusing to handle the issue responsibly, insulting. Holding the view that censorship is unnecessary or undesirable is not irresponsible.
----- Dante Alighieri dalighieri@digitalgrapefruit.com
"The darkest places in hell are reserved for those who maintain their neutrality in times of great moral crisis." -Dante Alighieri, 1265-1321
Dante Alighieri wrote:
Jimbo, I'm puzzled. Where is it written that Wikipedia contains content that needs filtering?
Right here, by me: "Wikipedia contains content that would be widely regarded as thoroughly objectionable by reasonable people, and this is a problem for the project."
What are these mysterious standards that we refuse to implement?
Well, we aren't refusing anything yet. We're having a vigorous discussion about what to do. :-)
Who, exactly, is it that should be choosing which content on our site is "objectionable"... and objectionable to whom?
End users should be able to choose the view of our content that is appropriate for their circumstances.
How are we irresponsible?
We currently provide parents and children absolutely no tools by which they might seek to provide age-appropriate views of our content.
I am of the opinion that Wikipedia does NOT contain content that needs filtering. Ideas are both MORE dangerous and LESS dangerous than most people realize. Of course, the proponents of filtering universally want to filter the less dangerous stuff and let the more dangerous stuff through. (i.e., sexuality is not dangerous, but ideas like freedom/rationality/logic ARE dangerous... at least to people who think sexuality IS dangerous....)
Believe me, I'm sympathetic to that viewpoint. But it *is* a viewpoint, a viewpoint which we should not let manifest itself in ramming our own ideas down other people's throats.
Consider this -- if I let all my personal preferences rule what goes into Wikipedia, it would be far different from what it is now. I couldn't be trusted to write articles on religion, for example, because if it was "up to me" in the relevant sense, they would all end up hostile to religion generally.
But we have a higher principle than that, the NPOV principle, and we need to apply it carefully and conscientiously *even at the level of policy*. Which means: giving our editors the means to introduce NPOV metadata, *even if we don't approve* of what some readers might do with it.
Lastly, I find the assertion that we're being somehow irresponsible, or refusing to handle the issue responsibly, insulting. Holding the view that censorship is unnecessary or undesirable is not irresponsible.
"Censorship" is a complete red herring here. I hold that censorship is unnecessary and undesirable. I do not advocate censorship or anything resembling it.
At the same time, on my home television system, I have my cable box programmed so that I don't surf across the shopping channels or the religious channels. I don't like those channels, and I don't want my TiVo to waste disk space recording them.
Since there are only 300 or so channels to choose from, it wasn't hard to customize my system for my own preferences.
Someone at TiVo might have argued thusly: "People should watch a variety of shows, and not be such prudes, or such anti-capitalists, or such anti-religionists. They should keep all channels accessible at all times. So we must not build into the system any tools to allow people to "filter". If they want filters, they can build their own device for doing it. But we aren't going to help."
That'd be silly. And it's just as silly for us to not flag content with some meta-data, even if we think people are silly (and I don't) for using it.
--Jimbo
Believe me, I'm sympathetic to that viewpoint. But it *is* a viewpoint, a viewpoint which we should not let manifest itself in ramming our own ideas down other people's throats.
This is precisely the issue - how does WP "ram anything down people's throats"? They have a choice - if schools want to use Britannica because it has content filtering (or no content needing filtering), they can. The last point raises the issue that in order for Britannica to keep all of its content in line, it needs a hierarchical command structure - entirely antithetical to WP's purported system of consensus. This is precisely the reason why I raised the issue of captainship a long time ago.
To enforce these kinds of constraints on a truly free encyclopedia is unfeasible - and therefore WP belongs in a different niche than Britannica - and it should not concern itself with competing with Britannica for its market share.
The real issue is that to some of us "radical libertarians" - this smacks of a change in direction of the WP as a whole - from open content, anyone can edit ("come on in, the water's fine..") to one where "hey, I feel like this is a success, and now I want to mass market this not just to Freewheeling Bob, but to Cautious Nancy and Self-Righteous Richard, too!" It's like watching a favorite rock band go sour after winning a Grammy.
Respectfully, Steven MacGrieves
At 01:47 PM 6/16/2003, Jimbo wrote:
Dante Alighieri wrote:
Jimbo, I'm puzzled. Where is it written that Wikipedia contains content that needs filtering?
Right here, by me: "Wikipedia contains content that would be widely regarded as thoroughly objectionable by reasonable people, and this is a problem for the project."
I hate to argue semantics (actually, no I don't, I love it, but you get my point), but that's not relevant. Saying that there is stuff that some people might find objectionable is not the same as saying there is stuff that needs filtering... oh, excuse me, "sorting". ;)
Also, I am not totally certain that I agree that there are things that "reasonable people" would consider "thoroughly objectionable" on Wikipedia.
Who, exactly, is it that should be choosing which content on our site is "objectionable"... and objectionable to whom?
End users should be able to choose the view of our content that is appropriate for their circumstances.
They are able to do so now. It's called "free will" and "choice". Anyone who doesn't want to look at an article on anal sex probably shouldn't click on a link to it. I wasn't aware that people ran into objectionable content accidentally. Call me crazy, but I don't see how people are going to be blindsided by the felching article. No one is going to go there unless they either know what it is and want to read about it, or DON'T know what it is and want to know what it is... and they shortly WILL know when they read the article.
How are we irresponsible?
We currently provide parents and children absolutely no tools by which they might seek to provide age-appropriate views of our content.
It's not our responsibility. If it were, we might be viewed as irresponsible. But, just as we are not irresponsible for not providing food to the poor in Afghanistan (since it's not our job to do so), neither are we irresponsible here. People have free will. I have no sympathy for people who get offended by what they read here. If you don't like it, DON'T READ IT. As for children, if you are SO terribly concerned about what their fragile minds might encounter on the DANGEROUS and pornographic Wikipedia, how about you sit down with your child and supervise them? Hmmm? Obviously, "you" is used here rhetorically, not to refer to Jimbo.
I am of the opinion that Wikipedia does NOT contain content that needs filtering. Ideas are both MORE dangerous and LESS dangerous than most people realize. Of course, the proponents of filtering universally want to filter the less dangerous stuff and let the more dangerous stuff through. (i.e., sexuality is not dangerous, but ideas like freedom/rationality/logic ARE dangerous... at least to people who think sexuality IS dangerous....)
Believe me, I'm sympathetic to that viewpoint. But it *is* a viewpoint, a viewpoint which we should not let manifest itself in ramming our own ideas down other people's throats.
Let me get this straight... a lack of meta-data on content is ramming our ideas down other people's throats? So... since the Wikipedia has NEVER had that meta-data, we've been ramming our ideas down people's throats for HOW long now? Wow, I'm surprised we haven't heard more complaints....
Consider this -- if I let all my personal preferences rule what goes into Wikipedia, it would be far different from what it is now. I couldn't be trusted to write articles on religion, for example, because if it was "up to me" in the relevant sense, they would all end up hostile to religion generally.
Well, in my personal opinion, that wouldn't be too terrible... I wouldn't mind seeing some hostile articles on religion... done NPOV of course. ;)
But we have a higher principle than that, the NPOV principle, and we need to apply it carefully and conscientiously *even at the level of policy*. Which means: giving our editors the means to introduce NPOV metadata, *even if we don't approve* of what some readers might do with it.
I'm sorry Jimbo, I just don't agree. I don't see why we need to bend over backwards (and regardless of how "simple" such a plan could possibly be, it would take COUNTLESS man-hours [no offense, ladies] to flag all the articles) to please people. Maybe I'm being unreasonable, but I don't see how it is OUR job to create such a system.
Lastly, I find the assertion that we're being somehow irresponsible, or refusing to handle the issue responsibly, insulting. Holding the view that censorship is unnecessary or undesirable is not irresponsible.
"Censorship" is a complete red herring here. I hold that censorship is unnecessary and undesirable. I do not advocate censorship or anything resembling it.
Call it what you want. All I know is that "offensive content" and "censorship" go hand in hand. I'm sure you've seen those CDs sold with "Offensive Lyrics" right on the cover. Any censorship there? I'll make the slippery slope argument again: a move on our part to classify THIS or THAT article as POTENTIALLY objectionable is the first step towards censorship. We would be tacitly admitting (or use some other word here) that there is content on the 'pedia that ought to be censored.
At the same time, on my home television system, I have my cable box programmed so that I don't surf across the shopping channels or the religious channels. I don't like those channels, and I don't want my TiVo to waste disk space recording them.
Since there are only 300 or so channels to choose from, it wasn't hard to customize my system for my own preferences.
Someone at TiVo might have argued thusly: "People should watch a variety of shows, and not be such prudes, or such anti-capitalists, or such anti-religionists. They should keep all channels accessible at all times. So we must not build into the system any tools to allow people to "filter". If they want filters, they can build their own device for doing it. But we aren't going to help."
This is not a good example. While TiVo is a proprietary hardware system that would be PROHIBITIVELY difficult to engage an after-market "sorter" for, Wikipedia is open-source software on the internet. Apples and oranges.
That'd be silly. And it's just as silly for us to not flag content with some meta-data, even if we think people are silly (and I don't) for using it.
It's not silly to not flag content. It's been mentioned before that it would be difficult, or even impossible, to flag content in an NPOV manner. This is still open to debate.
--Jimbo
All the rest aside, I agree that we're just having a friendly debate, and I do respect your position. Also, I think we need to get cracking on some juicy anti-religion articles... anything to piss off the squares. So... let's keep up the lively debate and remember, it's all in good fun. That being said.... *pbbbbbbbt* ;p
----- Dante Alighieri dalighieri@digitalgrapefruit.com
"The darkest places in hell are reserved for those who maintain their neutrality in times of great moral crisis." -Dante Alighieri, 1265-1321
They are able to do so now. It's called "free will" and "choice". Anyone who doesn't want to look at an article on anal sex probably shouldn't click on a link to it. I wasn't aware that people ran into objectionable content accidentally. Call me crazy, but I don't see how people are going to be blindsided by the felching article. No one is going to go there unless they either know what it is and want to read about it, or DON'T know what it is and want to know what it is... and they shortly WILL know when they read the article.
What if they press 'random page'?
At 03:36 PM 6/17/2003, you wrote:
They are able to do so now. It's called "free will" and "choice". Anyone who doesn't want to look at an article on anal sex probably shouldn't click on a link to it. I wasn't aware that people ran into objectionable content accidentally. Call me crazy, but I don't see how people are going to be blindsided by the felching article. No one is going to go there unless they either know what it is and want to read about it, or DON'T know what it is and want to know what it is... and they shortly WILL know when they read the article.
What if they press 'random page'?
Yeah, so, when I was reading Jimbo's message and formulating a response in my head, I had more ideas than I could recall when it came time to write my response. Damn you for noticing one of the things I forgot to include. ;)
Well, the statistical likelihood is small (odds are that they'll get a rambot page) but hey, that's the whole point of random.
Even if they do manage to "random" onto the felching page, there's no reason they have to read past the point where they realize that it's a topic that might offend them. Are we really becoming so skittish as a society that it would be an unimaginable catastrophe if some prude has to read something that they found offensive for five seconds?
Let's say I find televangelists offensive. If I'm channel surfing and happen to catch a few seconds of a sermon before I realize what it is and click the channel change button, am I going to get all bent out of shape? No. You know why? Cause I'm not a goddamn idiot.
----- Dante Alighieri dalighieri@digitalgrapefruit.com
"The darkest places in hell are reserved for those who maintain their neutrality in times of great moral crisis." -Dante Alighieri, 1265-1321
Dante Alighieri wrote:
Jimbo, I'm puzzled. Where is it written that Wikipedia contains content that needs filtering? What are these mysterious standards that we refuse to implement? Who, exactly, is it that should be choosing which content on our site is "objectionable"... and objectionable to whom? How are we irresponsible?
I am of the opinion that Wikipedia does NOT contain content that needs filtering. Ideas are both MORE dangerous and LESS dangerous than most people realize. Of course, the proponents of filtering universally want to filter the less dangerous stuff and let the more dangerous stuff through. (i.e., sexuality is not dangerous, but ideas like freedom/rationality/logic ARE dangerous... at least to people who think sexuality IS dangerous....) I am also of the opinion that we DO have standards, and that those standards are good ones for an encyclopedia.... we try to ensure that our articles are truthful, non-biased, and accurate. I am also of the opinion that those people who don't like the site should not use it. I also think that those people who are unwillingly blocked from using the site need to step up and let their voices be heard; they should speak to their parents, teachers, government officials, and clergymen, and tell them why Wikipedia is important and how it shouldn't be blocked. Lastly, I find the assertion that we're being somehow irresponsible, or refusing to handle the issue responsibly, insulting. Holding the view that censorship is unnecessary or undesirable is not irresponsible.
I don't even see the issue as being about whether Wikipedia contains objectionable material of any kind, sexual or otherwise. Being irresponsible is about ignoring the whole problem and just praying that it will go away. It's important to be aware of what we have, and what its possible effects might be. We need to be prepared to make changes where appropriate, but we also need to defend other content when we feel it's justified.
Ec
Jimmy-
My essential point is that our refusal to have standards is not a valid tool in the fight against censorship. Rather it plays directly into the hands of those who argue that censorship is necessary.
This is a very weak argument. Those who want censorship will argue that it is necessary with or without a Wikipedia article called "List of unusual sex practices". If they want a poster child for censorship, picking on an open encyclopedia project is not likely to work very well.
I have said that Wikipedia is important, and it is true: it becomes important the moment it is censored entirely by someone, because of all the clearly useful and educational information that is there. But before this happens, Wikipedia is just one of millions of websites, very small parts of which contain small amounts of possibly offensive content within an encyclopedic framework.
Arguing that because of these small amounts of offensive information in an open encyclopedia, "we need filters NOW!" is not the kind of argument that works well in a filtering debate. You could pick many, many other sites and many, many other equivalent examples for that, from Blogspot to Livejournal, from Slashdot to Everything2, from Google ("SafeSearch" notwithstanding) to Yahoo. Let's take a look at a random news.yahoo.com story. In the comment section we find on the first page:
----
336892 Re: Jews killed Jesus.
336891 Re: Wilbur05488 dilemma,,,seeker
336890 Re: Jews killed Jesus.
336889 Re: Why Arabs want Israel.
336888 Re: US jews HAVE NO LOYALTY TO AMERICA!!
336887 Re: Cretins, jealous of Jewish achieveme
336886 Re: Jews killed Juses.
336885 Re: If Hillary were President..
336884 Re: US jews HAVE NO LOYALTY TO AMERICA!!
336883 Cretins sure sound like Arabs/Muslims
336882 Re: LET THE KILL EACH OTHER
336881 Look at this Sentence
336880 Re: Jews killed Juses.
336879 Re: LET THE KILL EACH OTHER
...
----
Our unusual sex practice articles pale in comparison to these crude and vulgar discussions. And you don't even want to know what is on groups.yahoo.com and the like. Yet even Yahoo! or Google are not used as examples in the filtering debate. That's not because they offer primitive "family filters", it's because you don't make the case for filtering by using subtle arguments. The people who are intelligent enough to understand them are probably against filtering in the first place.
Here are the likely scenarios. For simplicity's sake I'll leave out the implementation cost of filtering in each scenario, even though it is there:
1) We implement a working filter. Schools and libraries install blacklist filters regardless because of rotten.com et al. But we're not part of the filters.
WE WIN: pupils can access parts of Wikipedia
WE LOSE: we have effectively endorsed the filtering of explicit content; other interactive sites will be expected to do the same or get blacklisted.
2) We implement a working filter. Schools and libraries install keyword filters for URLs and pages, and exempt Wikipedia from this filter:
WE WIN: control over what gets filtered
WE LOSE: see 1)
3) We implement a working filter. Schools and libraries install keyword filters for URLs and pages, with no exceptions on URLs.
WE WIN: -
WE LOSE: see 1)
4) We implement a filter that filters too much and is used repressively. Schools and libraries wholeheartedly endorse it.
WE WIN: -
WE LOSE: Wikipedia sets an example for hard control and is cited as such. Others are asked to follow our model. Legitimate content is hidden from minors.
5) We implement a filter that filters too little. Schools and libraries call for additional measures.
WE WIN: -
WE LOSE: -
6) We implement no filter at all. Schools and libraries install keyword filters for URLs and pages, with no exceptions on URLs.
WE WIN: -
WE LOSE: -
7) We implement no filter at all. Schools and libraries blacklist Wikipedia.org because of pussy pictures.
WE WIN: opportunity to promote open access, highlight the dangers of filtering, publicity
WE LOSE: access by minors
In all of the above scenarios, minors lose access to some pages. We have little to gain in that area. I personally doubt that we lose reputation because of a lack of filtering. Neither scenario 6) nor scenario 7) is particularly dangerous to us. Therefore I think that not implementing any kind of "family filter" is a reasonable choice.
Regards,
Erik
--- Jimmy Wales jwales@bomis.com wrote:
My point here is not about access to the schools, but is rather a simple response to your (insightful!) invitation to think about how our actions might impact the wider debates on school filtering. If we act irresponsibly and ideologically on this point, then we undermine the case against filters by showing that even something as potentially useful as wikipedia may need to be filtered.
I say we have a "School filter", make one for each country, and invite members of the educational system in each country to "rank" the articles, by grade (what a 12th grader can see is different than what a 6th grader can see).
===== Christopher Mahan chris_mahan@yahoo.com 818.943.1850 cell http://www.christophermahan.com/
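A rough Python sketch of how such per-country, per-grade rankings might be stored and applied (everything here -- the table, the names, the numbers -- is a hypothetical illustration, not an existing feature):

# Hypothetical illustration: each article gets, per country, the lowest
# grade that may see it; unrated articles default to visible.
GRADE_RATINGS = {
    "List of unusual sex practices": {"US": 12, "SE": 9},
    "Phonics": {"US": 1},
}

def visible(title, country, grade, default=True):
    """True if a reader of `grade` in `country` may see `title`."""
    rating = GRADE_RATINGS.get(title, {})
    if country not in rating:
        return default
    return grade >= rating[country]

assert visible("Phonics", "US", 6)
assert not visible("List of unusual sex practices", "US", 6)
assert visible("List of unusual sex practices", "SE", 9)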
Christopher Mahan wrote:
--- Jimmy Wales jwales@bomis.com wrote:
My point here is not about access to the schools, but is rather a simple response to your (insightful!) invitation to think about how our actions might impact the wider debates on school filtering. If we act irresponsibly and ideologically on this point, then we undermine the case against filters by showing that even something as potentially useful as wikipedia may need to be filtered.
I say we have a "School filter", make one for each country, and invite members of the educational system in each country to "rank" the articles, by grade (what a 12th grader can see is different than what a 6th grader can see).
===== Christopher Mahan chris_mahan@yahoo.com 818.943.1850 cell http://www.christophermahan.com/
That's interesting. It reminds me of Fred Bauder's www.internet-encyclopedia.info, where articles are written from multiple points of view, rather than a neutral point of view.
What *is* the argument here? That all 130,000 articles of Wikipedia have to be blocked because there are a dozen that describe "highly unusual sex practices"? Do you really think that is convincing?
But if we are just complacent, if we just blindly accept the so-called power of these pressure groups, then *we* contribute to giving them that power. We are among those who shut up and not among those who speak up. If you are such a radical, Jimbo, why are you so eager to give them what they want?
Have you ever actually shown your mother Wikipedia? I think she would like it. Mine certainly did.
As did mine! Erik is completely right on. Furthermore, mechanical protections we institute now - in the name of fighting ..... - will no doubt (if implemented without a due degree of forethought - hence why we're here) have unforeseen consequences - some of them influencing the way the WP works and maintains its interest for "radical libertarians." Don't fix it if it ain't broke.
P.S. I was thinking of submitting an article to Kuro5hin about this debate - because it's interesting - it raises important questions - and WP is among the shining stars born of the open techno-ethic. I'm about 80 percent done. I wanted to submit the idea here first. Any objections?
- Steven Vertigo
Axel Boldt wrote:
There's one question we haven't investigated yet: would our filtering really lower the likelihood that schools will eventually block us completely?
I have some experience with filtering software. For sites like ours, what is likely is that specific pages will be blocked automatically based on keyword analysis, but that the site as a whole will not.
I doubt that we could ever create a system that even comes close to preventing school kids from getting the Wikipedia content they want. At one point, a school administrator will find objectionable content in Wikipedia, will investigate the filtering option with the various categories, and will realize that kids can still get to the objectionable pages, e.g. by typing in the direct URL. Then our whole domain will be blocked.
Possibly, but it really depends on the software they are using.
Sometimes blocks are that crude, but more often, they are not.
--Jimbo
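The crude keyword blocking described above can be sketched in a few lines of Python (the word list is a made-up stand-in; real products use much larger lists and scoring):

# Hypothetical sketch of naive keyword blocking, the kind of crude tool
# that hits legitimate encyclopedia articles as readily as pornography.
BLOCKED_KEYWORDS = {"sex", "nude"}  # stand-in list

def page_blocked(page_text):
    """Block the page if any listed keyword appears in it."""
    return bool(set(page_text.lower().split()) & BLOCKED_KEYWORDS)

# A biology article trips the filter just like explicit material would:
assert page_blocked("Sex determination in reptiles depends on temperature")
assert not page_blocked("Knitting is a craft")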
--- Jimmy Wales jwales@bomis.com wrote:
Axel Boldt wrote:
There's one question we haven't investigated yet: would our filtering really lower the likelihood that schools will eventually block us completely?
I have some experience with filtering software. For sites like ours, what is likely is that specific pages will be blocked automatically based on keyword analysis, but that the site as a whole will not.
Ok, then let me reformulate my question: if we introduce a category based optional filtering in Wikipedia, will that in any way affect the likelihood that some or all pages of Wikipedia will eventually be hard blocked by schools? ("hard block" means: using some technology different from Wikipedia's optional filtering.)
Axel
Axel Boldt wrote:
Ok, then let me reformulate my question: if we introduce a category based optional filtering in Wikipedia, will that in any way affect the likelihood that some or all pages of Wikipedia will eventually be hard blocked by schools? ("hard block" means: using some technology different from Wikipedia's optional filtering.)
I don't know. But I do know that many people will find it useful at home, and if they don't, they won't use it. Right?
--Jimbo
Jimmy Wales wrote:
Axel Boldt wrote:
There's one question we haven't investigated yet: would our filtering really lower the likelihood that schools will eventually block us completely?
I have some experience with filtering software. For sites like ours, what is likely is that specific pages will be blocked automatically based on keyword analysis, but that the site as a whole will not.
At this point, the /pragmatic/ argument for a filtering/sorting scheme -- that we must allow some sort of filtering under our control lest an outside body filter us too much without any of our control -- seems to me to be empty. Not only has nobody blocked us, but if they do block us, then it would be on a page by page basis. Thus we run no risk that they'll block more of our site than we would have to allow them to block using our sorting scheme. Indeed, a page by page blocking on whatever grounds the filter chooses (naïve keyword analysis, sophisticated keyword analysis, human decision) is the ultimate in allowing readers to choose what they wish to read.
Of course, this doesn't affect the other arguments that have come up, such as the moral argument («We must do this to be responsible.») or the arguments for applying categorisation/sorting to other uses.
-- Toby
Jimmy Wales wrote:
Toby Bartels wrote:
If you spell it out in such a way that it includes explicit sexual content but not explicit religious content, then how is this NPOV, given an earlier poster (Ec?) who considers the latter harmful to kids. (And if Ec is joking, I have a friend who seriously believes that about Christianity in particular.)
End users can adjust it however they like, so what's the problem?
If end users have to adjust for themselves which articles are labelled "mature content", then (depending on what this means), either:
- We abandon NPOV for the label at any single given moment, and instead the next person to come along adjusts it to their own POV, without any security that this will last more than a minute; or
- The user can't use the "mature content" label without going through the whole site and labelling everything.
(The first bullet point assumes a single global labelling, while the second assumes individual labelling for each content provider.)
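To make the difference concrete, here is a rough Python sketch of the two models (all names hypothetical): a single shared label that anyone can overwrite, versus shared descriptive labels with the value judgement kept in per-user preferences:

# Hypothetical sketch. Model 1: one global, editable flag per article;
# the last editor wins, so the label reflects the most recent POV.
global_labels = {"Felching": {"mature content"}}

# Model 2: shared descriptive labels plus per-user filter preferences;
# the labels can stay factual while each reader decides what to hide.
shared_labels = {"Felching": {"sexuality", "slang"}}
user_prefs = {"alice": {"sexuality"}, "bob": set()}

def hidden_for(user, title):
    """Hide a page when its labels intersect the user's filter set."""
    return bool(shared_labels.get(title, set()) & user_prefs[user])

assert hidden_for("alice", "Felching")
assert not hidden_for("bob", "Felching")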
If «explicit sexual content» is what you meant all along by "mature content", then let's say "explicit sexual content".
No, it isn't what I meant all along. It's by far the biggest problem category that we have right now, but of course other things can be included.
If "explicit sexual content" is a single *category*, then why not just label it "explicit sexual content" and label the *other* categories *different* things? Why bring them all under "mature content"?
In responding to this, your position seems to have gotten a lot stranger. Perhaps you just forgot that the above two comments are not an objection to the scheme in general (that's elsewhere), but instead just an objection to making "mature content" one of the labels used (alongside "sex" and ... I forget the other one).
But this is important for Wikipedia the website, not just for others.
Is it? If we have an Edupedia website built on a Sifter model, then schools can block Wikipedia as long as they don't block Edupedia.
And that would be very unfortunate, I think.
Unfortunate, I suppose. But would it cause significant harm? Keeping in mind the opposition that categorising Wikipedia has aroused, which opposition is not to be found against Edupedia, that may be the optimal solution.
-- Toby
Toby Bartels wrote:
Jimmy Wales wrote:
Toby Bartels wrote:
If you spell it out in such a way that it includes explicit sexual content but not explicit religious content, then how is this NPOV, given an earlier poster (Ec?) who considers the latter harmful to kids. (And if Ec is joking, I have a friend who seriously believes that about Christianity in particular.)
End users can adjust it however they like, so what's the problem?
If end users have to adjust for themselves which articles are labelled "mature content", then (depending on what this means), either:
- We abandon NPOV for the label at any single given moment, and instead the next person to come along adjusts it to their own POV, without any security that this will last more than a minute; or
- The user can't use the "mature content" label without going through the whole site and labelling everything.
(The first bullet point assumes a single global labelling, while the second assumes individual labelling for each content provider.)
I prefer individual article labelling with the capacity for an article to have more than one label. The labels themselves would be just as editable as the articles, and just as subject to edit wars. I can live without the security of hard wired labels. Once labels were adopted, there should be no need for a mass movement to label everything immediately. The default label "unlabelled" could likely be implemented by the software, as a searchable label. Those inclined to do housework could go ahead and apply labels when they felt so inspired.
Ec
Ray Saintonge wrote:
I prefer individual article labelling with the capacity for an article to have more than one label. The labels themselves would be just as editable as the articles, and just as subject to edit wars. I can live without the security of hard wired labels. Once labels were adopted, there should be no need for a mass movement to label everything immediately. The default label "unlabelled" could likely be implemented by the software, as a searchable label. Those inclined to do housework could go ahead and apply labels when they felt so inspired.
I agree with all of this. We could, in fact, add some content categorization scheme now, without having it actually do anything on our site right now. This would be article metadata that we, or someone else, might find useful in the future, either for edupedia or wikipedia with an optional search filter or whatever.
--Jimbo
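A minimal Python sketch of that model -- multiple editable labels per article, with "unlabelled" as a searchable default -- might look like this (names are illustrative only):

# Hypothetical sketch: each article carries an editable set of labels;
# an article with no labels reports the searchable pseudo-label
# "unlabelled", so housekeepers can find and label it later.
articles = {"Felching": {"sexuality", "slang"}, "Knitting": set()}

def labels_of(title):
    return articles.get(title) or {"unlabelled"}

def search_by_label(label):
    """All articles carrying a label, including the default."""
    return sorted(t for t in articles if label in labels_of(t))

assert search_by_label("unlabelled") == ["Knitting"]
assert search_by_label("sexuality") == ["Felching"]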
Toby Bartels wrote:
If "explicit sexual content" is a single *category*, then why not just label it "explicit sexual content" and label the *other* categories *different* things? Why bring them all under "mature content"?
O.k., fine, this is a minor point. It doesn't really matter.
--Jimbo
Jimmy Wales wrote:
Toby Bartels wrote:
If you spell it out in such a way that it includes explicit sexual content but not explicit religious content, then how is this NPOV, given an earlier poster (Ec?) who considers the latter harmful to kids. (And if Ec is joking, I have a friend who seriously believes that about Christianity in particular.)
End users can adjust it however they like, so what's the problem?
I think both Erik and I had made suggestions of that sort. I am not a rabid advocate of censorship, and I don't believe that he is either. I also agree that any such censorship would be applied '''only''' at the end user level.
The visions which Erik and I were expressing were technically very different in the way they would function, but probably very similar in what they would accomplish. The capacity of these systems for censorship is secondary and incidental. They would work just as well if a person wanted to censor out articles about nuclear physics or ancient Egyptian history, or some other subject that would leave all the rest of us scratching our heads. To me the real issue is about scaling up and indexing in a project that is growing tremendously fast.
In my particular vision of coding, the end user would simply have the opportunity not to download articles coded for sex, religion, or knitting.
Ec
--- Ray Saintonge saintonge@telus.net wrote:
Jimmy Wales wrote:
Toby Bartels wrote:
If you spell it out in such a way that it includes explicit sexual content but not explicit religious content, then how is this NPOV, given an earlier poster (Ec?) who considers the latter harmful to kids. (And if Ec is joking, I have a friend who seriously believes that about Christianity in particular.)
End users can adjust it however they like, so what's the problem?
I think both Erik and I had made suggestions of that sort. I am not a rabid advocate of censorship, and I don't believe that he is either. I also agree that any such censorship would be applied '''only''' at the end user level.
The visions which Erik and I were expressing were technically very different in the way they would function, but probably very similar in what they would accomplish. The capacity of these systems for censorship is secondary and incidental. They would work just as well if a person wanted to censor out articles about nuclear physics or ancient Egyptian history, or some other subject that would leave all the rest of us scratching our heads. To me the real issue is about scaling up and indexing in a project that is growing tremendously fast.
In my particular vision of coding, the end user would simply have the opportunity not to download articles coded for sex, religion, or knitting.
Ec
I think a built-in category system would be very useful for other purposes, too. But I still think that we would benefit from having a separate domain where certain things were by-default blocked. --LDan
Daniel Ehrenberg wrote:
I think a built-in category system would be very useful for other purposes, too. But I still think that we would benefit from having a separate domain where certain things were by-default blocked.
I'm starting to come around to this conclusion, too.
Jimmy Wales wrote:
Daniel Ehrenberg wrote:
I think a built-in category system would be very useful for other purposes, too. But I still think that we would benefit from having a separate domain where certain things were by-default blocked.
I'm starting to come around to this conclusion, too.
Well, I remember someone starting a free, peer-reviewed encyclopedia. The same guy started an "all-you-can-edit"-encyclopedia just for fun, just to see how it turns out.
In the same spirit, the same guy could start a peer-reviewed sifter copy of that latter encyclopedia that was successful beyond belief. You know, just for fun. Might turn out to be the answer to all the problems that flooded the mailing lists for the past weeks. Might turn out to be a complete waste of time. Who knows.
Also, there could be categories added to that existing encyclopedia. Emphasis on *added*. You would like to categorize all mathematical topics? Go right ahead. You don't care? Just as well. Want to filter all articles that say "no-no" in your belief system? It's just a few clicks away. Otherwise, things will continue exactly as they are for you. Maybe you'll find a "categories" link on the article pages. I hope that doesn't scare you away.
What I'm trying to say here is: Upgrade the software so it *can* offer a category system, and a sifter "user rights management". Turn it on and see what happens. If it works (=acceptance, usability, etc), great, we'll keep it. If not, we'll change it. Or turn it off right away.
We'll never know for sure unless we try. And there's nothing we could lose trying. Maybe a little of our time.
But we're wikipedians. We're used to that...
Magnus
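Magnus's "turn it on and see what happens" amounts to putting the category system behind a switch. A rough sketch (hypothetical names, not actual MediaWiki code):

# Hypothetical sketch: ship categories behind a feature flag, so the
# experiment can be enabled, evaluated, and turned off again without
# touching any article text.
CATEGORIES_ENABLED = True  # flip to False to switch the experiment off

def render_article(body, categories):
    """Append a categories line only while the feature is enabled."""
    if CATEGORIES_ENABLED and categories:
        return body + "\nCategories: " + ", ".join(sorted(categories))
    return body

print(render_article("Knitting is a craft...", {"crafts", "textiles"}))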
Jimmy Wales wrote:
Toby Bartels wrote:
Here's a set of flags that I think would work well enough for the felching article:
sexuality, mature content, slang
I don't see how those are very controversial.
"mature content" will definitely cause problems. You imply in the rest of what you write (not only in this post) «what some would consider to be appropriate only for the mature», but not only does "mature content" not say this literally, but also (as somebody else pointed out) that's extremely broad, and includes [[Christianity]]. So it won't be of much use.
Well, I don't agree. The standards for what we mean by mature content can be spelled out in sufficient detail and in an NPOV way so that controversy is minimized.
But I'm not invested in that particular phrase. Perhaps we could use 'explicit sexual content' to distinguish it from 'sexuality'.
If I were writing the Wikipedia code, I wouldn't spend my time on writing support for these flags; that's better done (if at all) by another project that operates on top of Wikipedia
But this is important for Wikipedia the website, not just for others.
Some people in this debate have taken a very POV position, i.e. that wikipedia should shove this stuff down people's throats, and if they're too prudish to deal with it, too bad ha ha. I don't agree.
To be effective any kind of flagging system must avoid subjective determinations. Words like "explicit" and "mature" can lead to some serious disagreements that can't easily be resolved. The following spectrum can be more objectively determined for photographs.
1. Contains images of sex acts
2. Contains close-up images of genitals
3. Contains whole-body nudity
4. Contains partial nudity
5. Shows people in underclothes
6. Shows suggestive photos of fully clad people
There is a gradation here, and that gives a greater opportunity to choose how much is acceptable. Putting these in some coding system would allow them to be presented with only a few alphanumeric characters.
Ec
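Ray's six-point spectrum does compress into a single digit per image, with filtering reduced to a threshold comparison. A hypothetical sketch (the codes and cutoff semantics are illustrative):

# Hypothetical sketch of Ray's graded coding: lower numbers are more
# explicit, so a reader picks a cutoff and hides codes at or below it.
SPECTRUM = {
    1: "contains images of sex acts",
    2: "contains close-up images of genitals",
    3: "contains whole-body nudity",
    4: "contains partial nudity",
    5: "shows people in underclothes",
    6: "shows suggestive photos of fully clad people",
}

def show_image(code, cutoff):
    """Show an image only if it is less explicit than the cutoff."""
    return code > cutoff

# A reader who accepts anything from partial nudity (4) upward:
assert show_image(5, cutoff=3)      # underclothes: shown
assert not show_image(2, cutoff=3)  # genital close-up: hidden
print([SPECTRUM[c] for c in SPECTRUM if not show_image(c, 3)])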
At 03:14 PM 6/11/03 -0700, Ray wrote:
To be effective any kind of flagging system must avoid subjective determinations. Words like "explicit" and "mature" can lead to some serious disagreements that can't easily be resolved. The following spectrum can be more objectively determined for photographs.
- Contains images of sex acts
- Contains close-up images of genitals
- Contains whole-body nudity
- Contains partial nudity
- Shows people in underclothes
- Shows suggestive photos of fully clad people
I'm not convinced that this doesn't just move the subjectivity. Even "sex acts" is open to argument--is one woman licking another's breast a sex act, for example? (I assume we agree that a photo of a nursing infant does not contain a sex act.) Was I partially nude at the clinic this morning, in underwear, t-shirt, and a hospital gown? (Yes, it's street legal, but this is New York State, so that doesn't mean a great deal.) In particular, what are "suggestive photos"? I doubt we'd manage a more objective definition for that than Potter Stewart's famous "I know it when I see it."
--- Vicki Rosenzweig vr@redbird.org wrote:
At 03:14 PM 6/11/03 -0700, Ray wrote:
To be effective any kind of flagging system must avoid subjective determinations. Words like "explicit" and "mature" can lead to some serious disagreements that can't easily be resolved. The following spectrum can be more objectively determined for photographs.
- Contains images of sex acts
- Contains close-up images of genitals
- Contains whole-body nudity
- Contains partial nudity
- Shows people in underclothes
- Shows suggestive photos of fully clad people
I'm not convinced that this doesn't just move the subjectivity. Even "sex acts" is open to argument--is one woman licking another's breast a sex act, for example? (I assume we agree that a photo of a nursing infant does not contain a sex act.)
Personal memories of what "explicit" might be.
I breastfed for months when living in the US. In public places, I have been told several times that what I was doing was "indecent", and "could you please just go to the restrooms to do 'that'".
I assume we agree that nursing a baby while sitting on the toilet seat is certainly not a pleasant picture. Do we? Yet it was suggested, when I answered that there was no chair in the restrooms.
I was not flaunting my nudity. However, I was not hiding under a huge shawl either, as if I was doing a shameful act (and depriving the baby of air). I was discreet and it was nobody's business.
But, yes, I was told *I* could be responsible for giving "bad" thoughts (sexual thoughts) to some people.
There are no limits to what we can assume people might label "to be censored". Which is why I am against *us* defining what could be censored.
--- Ray Saintonge saintonge@telus.net wrote:
Jimmy Wales wrote:
Toby Bartels wrote:
Here's a set of flags that I think would work well enough for the felching article:
sexuality, mature content, slang
I don't see how those are very controversial.
"mature content" will definitely cause problems. You imply in the rest of what you write (not only in this post) «what some would consider to be appropriate only for the mature», but not only does "mature content" not say this literally, but also (as somebody else pointed out) that's extremely broad, and includes [[Christianity]]. So it won't be of much use.
Well, I don't agree. The standards for what we mean by mature content can be spelled out in sufficient detail and in an NPOV way so that controversy is minimized.
But I'm not invested in that particular phrase. Perhaps we could use 'explicit sexual content' to distinguish it from 'sexuality'.
If I were writing the Wikipedia code, I wouldn't spend my time on writing support for these flags; that's better done (if at all) by another project that operates on top of Wikipedia
But this is important for Wikipedia the website, not just for others.
Some people in this debate have taken a very POV position, i.e. that wikipedia should shove this stuff down people's throats, and if they're too prudish to deal with it, too bad ha ha. I don't agree.
To be effective any kind of flagging system must avoid subjective determinations. Words like "explicit" and "mature" can lead to some serious disagreements that can't easily be resolved. The following spectrum can be more objectively determined for photographs.
1. Contains images of sex acts
2. Contains close-up images of genitals
3. Contains whole-body nudity
4. Contains partial nudity
5. Shows people in underclothes
6. Shows suggestive photos of fully clad people
There is a gradation here, and that gives a greater opportunity to choose how much is acceptable. Putting these in some coding system would allow them to be presented with only a few alphanumeric characters.
Ec
Is that for the piano ankles? :-)
4. should be more precise. I think in most cultures, seeing a naked hand is fine, seeing a naked face not always so. Or naked hair. Or legs. Or tummy. (Look, it IS very hot these days here :-))
Jimmy Wales jwales@bomis.com wrote:
Toby Bartels wrote:
Here's a set of flags that I think would work well enough for the felching article:
sexuality, mature content, slang
I don't see how those are very controversial.
"mature content" will definitely cause problems. You imply in the rest of what you write (not only in this post) �what some would consider to be appropriate only for the mature�, but not only does "mature content" not say this literally, but also (as somebody else pointed out) that's extremely broad, and includes [[Christianity]]. So it won't be of much use.
Well, I don't agree. The standards for what we mean by mature content can be spelled out in sufficient detail and in an NPOV way so that controversy is minimized.
But I'm not invested in that particular phrase. Perhaps we could use 'explicit sexual content' to distinguish it from 'sexuality'. Some think that any type of 'sexuality' content should be equally be censored. Maybe that should have a seperate flag.> If I were writing the Wikipedia code, I wouldn't spend my time
on writing support for these flags; that's better done (if at all) by another project that operates on top of Wikipedia
But this is important for Wikipedia the website, not just for others.
Some people in this debate have taken a very POV position, i.e. that wikipedia should shove this stuff down people's throats, and if they're too prudish to deal with it, too bad ha ha. I don't agree.
--Jimbo
That creates a paradox. Any one scheme for censorship (or even flagging) is POV, while a lack of one is also POV. We're stuck. So let's make the choice that will broaden our audience the most: flagging built into the software.
LittleDan wrote:
That creates a paradox. Any one scheme for censorship (or even flagging) is POV, while a lack of one is also POV. We're stuck. So let's make the choice that will broaden our audience the most: flagging built into the software.
Failing to categorify content is POV??? Where's the bias?
-- Toby
Jimmy Wales wrote:
We may not agree about the appropriateness of [[felching]], but we can agree that it's controversial, that it's sexual in nature, and that it would be widely regarded as a mature topic. There's probably a lot of other things we could say about it that would be relevant from a content advisory point of view, but which are also NPOV.
I don't think the flagging wars will occur at [[felching]]. They'll occur on more borderline pages.
But there's no need to argue about it now. If someone implements flags, we'll find out. And if they're more trouble than they're worth, we can take them out again.
-M-