[Foundation-l] Statement on appropriate educational content
David Goodman
dgoodmanny at gmail.com
Sat May 8 05:06:49 UTC 2010
The only existing US law that I think Commons might possibly not be
complying with is the requirement to verify that the models in certain
pictures are not minors. To what extent these provisions might be
retroactive, I do not know; I am not a lawyer, much less a specialist
in these matters.
But I do know about matters pertaining to libraries, and there the
responsibility for filtering rests with the libraries themselves, not
with the information providers or the sites that post the information.
Most libraries deal with this by outsourcing, relying on the standards
of the filter providers. I see no reason why we should cooperate
with censorship, however well-intentioned. We should, however,
maintain our own standards. Because it is appropriate to provide users
with some general guidance about our content, keeping such images in a
collection labelled BDSM, and ensuring they have clearly descriptive
titles (a task that remains incomplete in Commons well beyond these
images), would seem to me quite adequate information about their
likely nature.
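
As a rough illustration of how a filter provider could make use of that
labelling, here is a minimal sketch that checks whether a Commons file
carries a category a library has chosen to screen. It assumes Python 3
with the requests library; the endpoint and parameters are the standard
public MediaWiki API at commons.wikimedia.org, and the screening list
itself is hypothetical and would be maintained by the filter provider,
not by us.

import requests

API = "https://commons.wikimedia.org/w/api.php"

def categories_of(file_title):
    # Return the set of category titles directly attached to a Commons
    # file page, e.g. categories_of("File:Example.jpg").
    params = {
        "action": "query",
        "prop": "categories",
        "titles": file_title,
        "cllimit": "max",
        "format": "json",
        "formatversion": "2",
    }
    page = requests.get(API, params=params).json()["query"]["pages"][0]
    return {c["title"] for c in page.get("categories", [])}

# Hypothetical screening list, chosen by the library or filter vendor.
SCREENED = {"Category:BDSM"}

def is_screened(file_title):
    # True if the file is in any category on the local screening list.
    return bool(categories_of(file_title) & SCREENED)

The point of the sketch is only that accurate categories and descriptive
titles are machine-readable; everything on the screening side stays in
the hands of the filter provider.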
David Goodman, Ph.D, M.L.S.
http://en.wikipedia.org/wiki/User_talk:DGG
On Fri, May 7, 2010 at 11:39 PM, Jayen466 <jayen466 at yahoo.com> wrote:
> One thing I wish the Board's statement had addressed is the need for some sort of content rating and filtering system that would enable parents, schools and libraries to screen out content unsuitable for minors.
>
> Anyone giving minors access to Commons presently also gives them ready access to collections of pornographic media, via categories such as
>
> http://commons.wikimedia.org/wiki/Category:BDSM
>
> (and its various subcategories).
>
> I am concerned about this, because it reflects poorly on the project. It is also against the law in parts of the world. In Germany, for example,
>
> "The spreading of pornographic content and other harmful media via the internet is a criminal offence under German jurisdiction. A pornographic content on the internet is legal only if technical measures prohibit minors from getting access to the object (AVS = Age Verification System or Adult-Check-System)."
>
> From: http://www.bundespruefstelle.de/bpjm/information-in-english.html
>
> As far as I am concerned, the community consensus model has failed us here, resulting in immature decision-making.
>
> The same thing goes for Wikipedia articles that contain pornographic material. We should have content rating categories, so schools and libraries can make Wikipedia accessible to minors without fearing that they will
>
> * lose their E-Rate funding (per the Children's Internet Protection Act, http://www.fcc.gov/cgb/consumerfacts/cipa.html -- "Schools and libraries subject to CIPA may not receive the discounts offered by the E-rate program unless they certify that they have an Internet safety policy that includes technology protection measures. The protection measures must block or filter Internet access to pictures that are: (a) obscene, (b) child pornography, or (c) harmful to minors (for computers that are accessed by minors)"), or
>
> * be found to have infringed the law if a parent, say, complains to a teacher about their child having stumbled upon our hardcore pornography on a school computer.
>
> Doing nothing to address concerns that are widespread in society is risky and foolhardy. There is also the issue of underage admins being asked to administer hardcore pornographic content, make deletion decisions about it, and so on. It doesn't look good, and it will come back to bite us sooner or later if it is not addressed.
>
> Andreas (Jayen466)
>
> On 7 May 2010 21:30, Michael Snow <wikipedia at verizon.net> wrote:
>
>> Distributing this more widely, since apparently the forwarding from
>> announce-l still has issues. The Board of Trustees has directed me to
>> release the following statement:
>>
>> The Wikimedia Foundation projects aim to bring the sum of human
>> knowledge to every person on the planet. To that end, our projects
>> contain a vast amount of material. Currently, there are more than six
>> million images and 15 million articles on the Wikimedia sites, with new
>> material continually being added.
>>
>> The vast majority of that material is entirely uncontroversial, but the
>> projects do contain material that may be inappropriate or offensive to
>> some audiences, such as children or people with religious or cultural
>> sensitivities. That is consistent with Wikimedia's goal to provide the
>> sum of all human knowledge. We do immediately remove material that is
>> illegal under U.S. law, but we do not remove material purely on the
>> grounds that it may offend.
>>
>> Having said that, the Wikimedia projects are intended to be educational
>> in nature, and there is no place in the projects for material that has
>> no educational or informational value. In saying this, we don't intend
>> to create new policy, but rather to reaffirm and support policy that
>> already exists. We encourage Wikimedia editors to scrutinize potentially
>> offensive materials with the goal of assessing their educational or
>> informational value, and to remove them from the projects if there is no
>> such value.
>>
>> --Michael Snow
>