While we are discussing child pornography and drawings of children engaged in sex acts, let's talk about external links to sites with these images.
Instead of a single image that can be presented in a context that makes it clear it is educational, a bigger concern is external links to image boards with images of a child engaged in a sexually explicit act. These appear on Wikipedia on a regular basis. When I went to image board web sites to look for inappropriate images, I felt disgusted that another Wikipedia editor would put such a link on Wikipedia. I think we need to modify our policy/guidelines dealing with images of children engaged in sexually explicit acts. They should not be permitted, in my opinion. Editors who repeatedly add them should be blocked for being disruptive.
regards,
Sydney aka FloNight
On 3 Apr 2006, at 15:19, Sydney Poore wrote:
While we are discussing child pornography and drawings of children engaged in sex acts, let's talk about external links to sites with these images.
Instead of a single image that can be presented in a context that makes it clear it is educational, a bigger concern is external links to image boards with images of a child engaged in a sexually explicit act. These appear on Wikipedia on a regular basis. When I went to image board web sites to look for inappropriate images, I felt disgusted that another Wikipedia editor would put such a link on Wikipedia. I think we need to modify our policy/guidelines dealing with images of children engaged in sexually explicit acts. They should not be permitted, in my opinion. Editors who repeatedly add them should be blocked for being disruptive.
External links are often problematic.
It seems likely that the ones you refer to do not add useful encyclopaedic background to Wikipedia. Stuff that isn't a reference needs to be looked at very carefully; there is a lot of link spam.
I don't see that this needs a specific policy.
Justinc
Sydney Poore stated for the record:
Instead of a single image that can be presented in a context that makes it clear it is educational, a bigger concern is external links to image boards with images of a child engaged in a sexually explicit act. These appear on Wikipedia on a regular basis. When I went to image board web sites to look for inappropriate images, I felt disgusted that another Wikipedia editor would put such a link on Wikipedia. I think we need to modify our policy/guidelines dealing with images of children engaged in sexually explicit acts. They should not be permitted, in my opinion. Editors who repeatedly add them should be blocked for being disruptive.
Certain sites stand out as excellent starting points for such a policy. The worst offender is something called "Google," which has innumerable links to inappropriate material.
-- Sean Barrett | Honk if you've never seen a gun sean@epoptic.org | fired from a moving vehicle.
Are you equating Google search, where someone must deliberately type in the words, to an external link to image boards on an article? Or are you talking about the Google Images site?
Sydney aka FloNight
Sean Barrett wrote:
Sydney Poore stated for the record:
Instead of a single image that can be presented in a context that makes it clear it is educational, a bigger concern is external links to image boards with images of a child engaged in a sexually explicit act. These appear on Wikipedia on a regular basis. When I went to image board web sites to look for inappropriate images, I felt disgusted that another Wikipedia editor would put such a link on Wikipedia. I think we need to modify our policy/guidelines dealing with images of children engaged in sexually explicit acts. They should not be permitted, in my opinion. Editors who repeatedly add them should be blocked for being disruptive.
Certain sites stand out as excellent starting points for such a policy. The worst offender is something called "Google," which has innumerable links to inappropriate material.
Sean Barrett | Honk if you've never seen a gun sean@epoptic.org | fired from a moving vehicle.
Sydney Poore stated for the record:
Are you equating Google search, where someone must deliberately type in the words, to an external link to image boards on an article? Or are you talking about the Google Images site?
Sydney aka FloNight
Sean Barrett wrote:
Sydney Poore stated for the record:
Instead of a single image that can be presented in a context that makes it clear it is educational, a bigger concern is external links to image boards with images of a child engaged in a sexually explicit act. These appear on Wikipedia on a regular basis. When I went to image board web sites to look for inappropriate images, I felt disgusted that another Wikipedia editor would put such a link on Wikipedia. I think we need to modify our policy/guidelines dealing with images of children engaged in sexually explicit acts. They should not be permitted, in my opinion. Editors who repeatedly add them should be blocked for being disruptive.
Certain sites stand out as excellent starting points for such a policy. The worst offender is something called "Google," which has innumerable links to inappropriate material.
You wrote, and I quote, "When I went to image board web sites to look for inappropriate images, I felt disgusted...." If you go deliberately looking for inappropriate images, you shouldn't be disappointed when you succeed.
-- Sean Barrett | Honk if you've never seen a gun sean@epoptic.org | fired from a moving vehicle.
I wasn't surprised that the images were there. I was disgusted that an editor added the links. There is a difference, I think. :-)
Recently, another editor emailed me to ask me to check out 5 external links on an article. He was at work and couldn't safely pull them up. Some of the images there were horrible. I don't think merely looking at this type of material causes someone to fall into moral decay. Personally, it saddens me to see images of young children being sexually exploited. These images are being promoted on these sites. That is very different from showing them for educational purposes.
Sydney aka FloNight
Sean Barrett wrote:
Sydney Poore stated for the record:
Are you equating Google search, where someone must deliberately type in the words, to an external link to image boards on an article? Or are you talking about the Google Images site?
Sydney aka FloNight
Sean Barrett wrote:
Sydney Poore stated for the record:
Instead of a single image that can be presented in a context that makes it clear it is educational, a bigger concern is external links to image boards with images of a child engaged in a sexually explicit act. These appear on Wikipedia on a regular basis. When I went to image board web sites to look for inappropriate images, I felt disgusted that another Wikipedia editor would put such a link on Wikipedia. I think we need to modify our policy/guidelines dealing with images of children engaged in sexually explicit acts. They should not be permitted, in my opinion. Editors who repeatedly add them should be blocked for being disruptive.
Certain sites stand out as excellent starting points for such a policy. The worst offender is something called "Google," which has innumerable links to inappropriate material.
You wrote, and I quote, "When I went to image board web sites to look for inappropriate images, I felt disgusted...." If you go deliberately looking for inappropriate images, you shouldn't be disappointed when you succeed.
Sean Barrett | Honk if you've never seen a gun sean@epoptic.org | fired from a moving vehicle.
On 4/3/06, Sydney Poore poore5@adelphia.net wrote:
I wasn't surprised that the images were there. I was disgusted that an editor added the links. There is a difference, I think. :-)
Recently, another editor emailed me to ask me to check out 5 external links on an article. He was at work and couldn't safely pull them up. Some of the images there were horrible. I don't think merely looking at this type of material causes someone to fall into moral decay. Personally, it saddens me to see images of young children being sexually exploited. These images are being promoted on these sites. That is very different from showing them for educational purposes.
Sydney aka FloNight
Images of children involved in sexually explicit acts are illegal in all the ways that come to mind. Most picture gallery links are non-encyclopedic spam. I don't see why we would need new policies to cover this.
Mgm
On 4/4/06, MacGyverMagic/Mgm macgyvermagic@gmail.com wrote:
On 4/3/06, Sydney Poore poore5@adelphia.net wrote:
I wasn't surprised that the images were there. I was disgusted that an editor added the links. There is a difference, I think. :-)
Recently, another editor emailed me to ask me to check out 5 external links on an article. He was at work and couldn't safely pull them up. Some of the images there were horrible. I don't think merely looking at this type of material causes someone to fall into moral decay. Personally, it saddens me to see images of young children being sexually exploited. These images are being promoted on these sites. That is very different from showing them for educational purposes.
Images of children involved in sexually explicit acts are illegal in all the ways that come to mind. Most picture gallery links are non-encyclopedic spam. I don't see why we would need new policies to cover this.
The link in question is to a lolicon (i.e., cartoon porn) site. It's legal in the US and Japan, and probably a number of other parts of the world.
-- Mark [[User:Carnildo]]
MacGyverMagic/Mgm wrote:
Images of children involved in sexually explicit acts are illegal in all the ways that come to mind. Most picture gallery links are non-encyclopedic spam. I don't see why we would need new policies to cover this.
I agree about not needing new policies, although I do think that some of the default assumptions may need to be moved, with respect to images which have been uploaded. I think a move of default assumptions would go a long way toward nuking a ton of fair use images that we would prefer not to have in en.wikipedia.
But the one thing that I hope can healthily come out of this discussion is a general feeling of empowerment by admins to actually *enforce* policy in the face of trolling by pedophiles and the like, or the sorts of people who think that merely by camping out and pushing POV on an article, they should have more of a say over what goes into it than admins.
--Jimbo
Jimmy Wales wrote:
I agree about not needing new policies, although I do think that some of the default assumptions may need to be moved, with respect to images which have been uploaded. I think a move of default assumptions would go a long way toward nuking a ton of fair use images that we would prefer not to have in en.wikipedia.
Yes, writing new policies only confuses people, when what is needed is the courage to enforce the ones that already exist.
But the one thing that I hope can healthily come out of this discussion is a general feeling of empowerment by admins to actually *enforce* policy in the face of trolling by pedophiles and the like, or the sorts of people who think that merely by camping out and pushing POV on an article, they should have more of a say over what goes into it than admins.
We had the same lolicon cartoon appear to illustrate that word in Wiktionary, and probably the same proportion of people who objected to it, and, as always, people who want to keep it as part of some free speech crusade. I feel on safe ground enforcing the elimination of that picture from the project, but oftentimes in other circumstances enforcement is a tough call because the amount of flak can become overwhelming. A senior administrator or bureaucrat has an extra stock of goodwill to draw upon, but that does not make it easier when there's a need to come down heavy on long-standing contributors.
BTW, congrats for having your "Everybody tells jokes, but we still need comedians" make it to Quote of the Day on Google.
Ec
Sean Barrett wrote:
Sydney Poore stated for the record:
Instead of a single image that can be presented in a context that makes it clear it is educational, a bigger concern is external links to image boards with images of a child engaged in a sexually explicit act. These appear on Wikipedia on a regular basis. When I went to image board web sites to look for inappropriate images, I felt disgusted that another Wikipedia editor would put such a link on Wikipedia. I think we need to modify our policy/guidelines dealing with images of children engaged in sexually explicit acts. They should not be permitted, in my opinion. Editors who repeatedly add them should be blocked for being disruptive.
Certain sites stand out as excellent starting points for such a policy. The worst offender is something called "Google," which has innumerable links to inappropriate material.
I am not fully persuaded by this argument. I think that editorially speaking we can and should make sensible judgments, even difficult judgments, about the usefulness and appropriateness of various links to our end users.
I do think, though, that we have had much less of a problem with inappropriate external links (of various kinds) than we have had with inappropriate images (of various kinds). To move this out of the realm of a discussion of pedophilia and 'censorship', let's consider a much simpler case of fair use images that are on the site when free alternatives would be easy to come by.
If I put an irrelevant bit of text into an article, including a link, a bit which is problematic on any sort of grounds at all, then anyone can come along and delete or change it. It takes *one person* to eliminate the problem, though of course an edit war or a long discussion might follow.
With images, though, there has grown this bizarre culture that we must not delete anything until we have a consensus to do so. This is partly because images can't be easily restored, and there is some legitimacy to that as a factor in how we do things, but I think it has gotten much worse. Wildly inappropriate images which do not even have majority support for keeping are kept in articles, even though similarly inappropriate text would be shot on sight.
--Jimbo
Jimmy Wales wrote:
With images, though, there has grown this bizarre culture that we must not delete anything until we have a consensus to do so. This is partly because images can't be easily restored, and there is some legitimacy to that as a factor in how we do things, but I think it has gotten much worse. Wildly inappropriate images which do not even have majority support for keeping are kept in articles, even though similarly inappropriate text would be shot on sight.
I think this is in fact a (if not the) major cause of this controversy. I'm not at all surprised that many users are very nervous about image deletion: we still remember when CSD I4 was added in September and as a consequence people had to go to Answers.com and other mirrors to look for copies of hastily deleted images. I don't know if any free images were actually permanently lost because of that, but it wouldn't surprise me at all either.
I've occasionally thought of setting up a bot to download and archive every file that is uploaded to Wikipedia. I think just the knowledge that permanent copies existed _somewhere_ would probably calm people down a lot, even if the copies wouldn't be easily accessible. I don't think the bandwidth and storage costs would be prohibitive. I'd have to get someone's help to set up offsite backups, though.
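For what it's worth, here is a minimal sketch of such a bot in Python, assuming the MediaWiki web API's list=allimages query and the third-party "requests" library; this is only a sketch of the idea, not an existing tool, and the archive directory name is made up:

    # Sketch: poll the wiki for recent uploads and keep local copies.
    # Assumes the MediaWiki web API and the "requests" library.
    import os
    import requests

    API = "https://en.wikipedia.org/w/api.php"
    ARCHIVE_DIR = "upload-archive"  # hypothetical local destination

    def recent_uploads(limit=50):
        """Yield (name, url) for the most recently uploaded files."""
        params = {
            "action": "query",
            "list": "allimages",
            "aisort": "timestamp",
            "aidir": "descending",
            "ailimit": limit,
            "aiprop": "url",
            "format": "json",
        }
        data = requests.get(API, params=params).json()
        for img in data["query"]["allimages"]:
            yield img["name"], img["url"]

    def archive():
        os.makedirs(ARCHIVE_DIR, exist_ok=True)
        for name, url in recent_uploads():
            dest = os.path.join(ARCHIVE_DIR, name)
            if os.path.exists(dest):
                continue  # already saved on an earlier pass
            resp = requests.get(url, timeout=60)
            resp.raise_for_status()
            with open(dest, "wb") as f:
                f.write(resp.content)

    if __name__ == "__main__":
        archive()

Run something like that from cron every few minutes and you would have the permanent copies described above; offsite replication would still be a separate step.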
How about an obligation on *any* admin who speedies *any* image to first of all save that image on their hard drive, and store a local copy of it for at least a week/two weeks in case the speedy deletion turns out to have been incorrect?
Cynical
Ilmari Karonen wrote:
Jimmy Wales wrote:
With images, though, there has grown this bizarre culture that we must not delete anything until we have a consensus to do so. This is partly because images can't be easily restored, and there is some legitimacy to that as a factor in how we do things, but I think it has gotten much worse. Wildly inappropriate images which do not even have majority support for keeping are kept in articles, even though similarly inappropriate text would be shot on sight.
I think this is in fact a (if not the) major cause of this controversy. I'm not at all surprised that many users are very nervous about image deletion: we still remember when CSD I4 was added in September and as a consequence people had to go to Answers.com and other mirrors to look for copies of hastily deleted images. I don't know if any free images were actually permanently lost because of that, but it wouldn't surprise me at all either.
I've occasionally thought of setting up a bot to download and archive every file that is uploaded to Wikipedia. I think just the knowledge that permanent copies existed _somewhere_ would probably calm people down a lot, even if the copies wouldn't be easily accessible. I don't think the bandwidth and storage costs would be prohibitive. I'd have to get someone's help to set up offsite backups, though.
David Alexander Russell wrote:
How about an obligation on *any* admin who speedies *any* image to first of all save that image on their hard drive, and store a local copy of it for at least a week/two weeks in case the speedy deletion turns out to have been incorrect?
Oh yeah, just what I need, a whole archive of mustachioed Mona Lisas and popes embracing space aliens, produced by artistes with all the Photoshop skills of the average hamster.
:-)
Stan
On 4/5/06, David Alexander Russell webmaster@davidarussell.co.uk wrote:
How about an obligation on *any* admin who speedies *any* image to first of all save that image on their hard drive, and store a local copy of it for at least a week/two weeks in case the speedy deletion turns out to have been incorrect?
I did something like that when I was speedy-deleting no-source and no-license images, and OrphanBot's been keeping a copy of every no-source and no-license image it's encountered. That means that over the past four months, I've acquired a 2 GB archive of 25,000 images (roughly 80 KB per image, on average).
-- Mark [[User:Carnildo]]
David Alexander Russell wrote:
How about an obligation on *any* admin who speedies *any* image to first of all save that image on their hard drive, and store a local copy of it for at least a week/two weeks in case the speedy deletion turns out to have been incorrect?
Sounds reasonable, provided that the image 1) is more than a day old and 2) is not a verbatim copy of an image available online on a site that is clearly not a Wikipedia mirror. (Those two exceptions may be called the RC patrol exception and the "you know where to find it" exception.)
In fact, that's pretty much what I've been doing myself, not only for images I've personally deleted but also for images I've tagged for eventual deletion.
Of course, a comprehensive archive would make this less important. In fact, we can probably make another exception for images tagged by OrphanBot, since apparently the bot already archives them.
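For concreteness, the rule above fits in a tiny predicate; this is purely a sketch, and the parameter names are made up rather than any existing tool's API:

    # Sketch of the archive-before-deleting rule; parameter names
    # are hypothetical, not an existing tool's API.
    def should_archive_before_deleting(age_days: float,
                                       verbatim_online_copy: bool) -> bool:
        """Archive unless one of the two exceptions applies."""
        if age_days <= 1.0:
            return False  # the RC patrol exception
        if verbatim_online_copy:
            return False  # the "you know where to find it" exception
        return True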
On 4/5/06, Ilmari Karonen nospam@vyznev.net wrote:
David Alexander Russell wrote:
How about an obligation on *any* admin who speedies *any* image to first of all save that image on their hard drive, and store a local copy of it for at least a week/two weeks in case the speedy deletion turns out to have been incorrect?
Instead of adding a major burden to admins deleting images (which is something that needs to be done; I'm fairly sure there are large backlogs), and slowing them down, and aggravating them, why not simply fix image deletion so that deleted images aren't permanently lost?
Speaking only for myself, if I had to archive images on my own hardware, and manually, every time I went to CSD and deleted some images (or used Gmaxwell's list, or went on New Pages patrol, or...), I'd probably simply stop deleting images, since there are many other equally worthwhile but easier things to do.
~maru
On 4/6/06, maru dubshinki marudubshinki@gmail.com wrote:
Instead of adding a major burden to admins deleting images (which is something that needs to be done; I'm fairly sure there are large backlogs), and slowing them down, and aggravating them, why not simply fix image deletion so that deleted images aren't permanently lost?
Off-the-cuff remark: Why not upload a new version of the image which is totally blank, and protect it with a note saying that if no consensus is gained to undelete the image in a month, it should then be permanently deleted?
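A sketch of how that could be scripted, assuming today's pywikibot library (an anachronism for this thread, and the exact calls should be double-checked); the file names and the blank image are placeholders:

    # Sketch: replace a disputed image with a blank placeholder and
    # protect it, pending a one-month undeletion discussion.
    # Assumes pywikibot; names are hypothetical.
    import pywikibot

    site = pywikibot.Site("en", "wikipedia")
    page = pywikibot.FilePage(site, "File:Example-disputed.jpg")

    # Overwrite the current version with a prepared blank image.
    site.upload(page, source_filename="blank.png",
                comment="Blanked pending deletion discussion; see talk page",
                ignore_warnings=True)

    # Protect so only admins can edit or re-upload during the month.
    page.protect(reason="Blanked pending deletion discussion",
                 protections={"edit": "sysop", "upload": "sysop"})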
Steve
On 4/10/06, Steve Bennett stevage@gmail.com wrote:
On 4/6/06, maru dubshinki marudubshinki@gmail.com wrote:
Instead of adding a major burden to admins deleting images (which is something that needs to be done; I'm fairly sure there are large backlogs), and slowing them down, and aggravating them, why not simply fix image deletion so that deleted images aren't permanently lost?
Off-the-cuff remark: Why not upload a new version of the image which is totally blank, and protect it with a note saying that if no consensus is gained to undelete the image in a month, it should then be permanently deleted?
Volume. Approximately 1,500 images are deleted every day. It's a slow enough process when the image is already orphaned and clearly meets the requirements for deletion. Adding "blank, protect, and check back in a month" will just make things that much slower: at that rate, roughly 45,000 blanked, protected placeholders would be awaiting their one-month check at any given time.
(speaking of images, [[Category:Orphaned fairuse images]] has a backlog of roughly 1800 items right now)
-- Mark [[User:Carnildo]]
On 4/10/06, Mark Wagner carnildo@gmail.com wrote:
Volume. Approximately 1,500 images are deleted every day. It's a slow enough process when the image is already orphaned, and clearly meets the requirements for deletion. Adding "blank, protect, and check back in a month" will just make things that much slower.
Well, obviously that could be streamlined, but since I'm not an admin, I won't push the point.
Steve
On 10/04/06, Mark Wagner carnildo@gmail.com wrote:
Off-the-cuff remark: Why not upload a new version of the image which is totally blank, and protect it with a note saying that if no consensus is gained to undelete the image in a month, it should then be permanently deleted?
Volume. Approximately 1,500 images are deleted every day. It's a slow enough process when the image is already orphaned, and clearly meets the requirements for deletion. Adding "blank, protect, and check back in a month" will just make things that much slower.
(speaking of images, [[Category:Orphaned fairuse images]] has a backlog of roughly 1800 items right now)
Of which 1600 have been marked for more than a week - admins, please!
http://tools.wikimedia.de/~gmaxwell/cgi-bin/report_orphan_fair_use.py
-- Andrew Gray andrew.gray@dunelm.org.uk
Andrew Gray wrote:
On 10/04/06, Mark Wagner carnildo@gmail.com wrote:
Off-the-cuff remark: Why not upload a new version of the image which is totally blank, and protect it with a note saying that if no consensus is gained to undelete the image in a month, it should then be permanently deleted?
Volume. Approximately 1,500 images are deleted every day. It's a slow enough process when the image is already orphaned, and clearly meets the requirements for deletion. Adding "blank, protect, and check back in a month" will just make things that much slower.
(speaking of images, [[Category:Orphaned fairuse images]] has a backlog of roughly 1800 items right now)
Of which 1600 have been marked for more than a week - admins, please!
http://tools.wikimedia.de/~gmaxwell/cgi-bin/report_orphan_fair_use.py
<groan> Those aren't going to get so much attention as long as we have thousands of bad uploads streaming in on a daily basis, and admins have to clean everything up with the equivalent of a teaspoon, because OMG we might inadvertently delete a highly valuable view up somebody's nostrils taken with a cellphone camera.
Arbcom should consider that for a new category of punishments - "clean up 1,000 image uploads".
Stan
On 4/11/06, Andrew Gray shimgray@gmail.com wrote:
On 10/04/06, Mark Wagner carnildo@gmail.com wrote:
Off-the-cuff remark: Why not upload a new version of the image which is totally blank, and protect it with a note saying that if no consensus is gained to undelete the image in a month, it should then be permanently deleted?
Volume. Approximately 1,500 images are deleted every day. It's a slow enough process when the image is already orphaned, and clearly meets the requirements for deletion. Adding "blank, protect, and check back in a month" will just make things that much slower.
(speaking of images, [[Category:Orphaned fairuse images]] has a backlog of roughly 1800 items right now)
Of which 1600 have been marked for more than a week - admins, please!
http://tools.wikimedia.de/~gmaxwell/cgi-bin/report_orphan_fair_use.py
I would, but funny things happen when I delete images. I think it is something to do with the classic skin.
-- geni
On 11 Apr 2006, at 12:42, Andrew Gray wrote:
Of which 1600 have been marked for more than a week - admins, please!
http://tools.wikimedia.de/~gmaxwell/cgi-bin/report_orphan_fair_use.py
And they are such a load of rubbish too. We need to stop this stuff from being uploaded. Maybe we should allow non-admins to delete images...
Off to delete some more.
Justinc
Justin Cormack wrote:
On 11 Apr 2006, at 12:42, Andrew Gray wrote:
Of which 1600 have been marked for more than a week - admins, please!
http://tools.wikimedia.de/~gmaxwell/cgi-bin/report_orphan_fair_use.py
And they are such a load of rubbish too. We need to stop this stuff from being uploaded. Maybe we should allow non-admins to delete images...
It's been said before, but further study of upload patterns bears it out - a large percentage of bad uploads could be prevented by disallowing image upload until a new user has been around for at least a week, and by blocking persistent copyright violators.
Many of the uploaders are so new they haven't even figured out how to get images displayed in an article. The contributions list shows edits to topic X, then upload of X-copyvio.jpg, then maybe another couple of edits, and then that's it - the image, good or bad, is an orphan. The especially sad cases are when they upload pictures of themselves, but never edit their user page.
Stan
Stan Shebs wrote:
Justin Cormack wrote:
On 11 Apr 2006, at 12:42, Andrew Gray wrote:
Of which 1600 have been marked for more than a week - admins, please!
http://tools.wikimedia.de/~gmaxwell/cgi-bin/report_orphan_fair_use.py
And they are such a load of rubbish too. We need to stop this stuff from being uploaded. Maybe we should allow non-admins to delete images...
It's been said before, but further study of upload patterns bears it out - a large percentage of bad uploads could be prevented by disallowing image upload until a new user has been around for at least a week, and by blocking persistent copyright violators.
Many of the uploaders are so new they haven't even figured out how to get images displayed in an article. The contributions list shows edits to topic X, then upload of X-copyvio.jpg, then maybe another couple of edits, and then that's it - the image, good or bad, is an orphan. The especially sad cases are when they upload pictures of themselves, but never edit their user page.
Stan
I agree. We already restrict uploading to logged-in users only, since allowing IP-only uploads was a disaster; increasing that restriction to allow only logged-in users whose accounts are not recently created, using the same criteria used in the semi-protection mechanism, seems eminently reasonable to me.
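For illustration, the proposed check is tiny; this is a sketch only, and the four-day threshold is an assumption rather than a figure anyone has quoted here:

    # Sketch of the proposed rule: allow uploads only once an account
    # is old enough to pass the semi-protection ("autoconfirmed")
    # criterion. The four-day threshold is an assumption.
    from datetime import datetime, timedelta

    AUTOCONFIRM_AGE = timedelta(days=4)  # hypothetical threshold

    def may_upload(account_created: datetime, now: datetime) -> bool:
        """True once the account has aged past the threshold."""
        return now - account_created >= AUTOCONFIRM_AGE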
-- Neil
On Wed, 26 Apr 2006 01:53:32 +0200, Neil Harris usenet@tonal.clara.co.uk wrote:
Stan Shebs wrote:
<snip>
It's been said before, but further study of upload patterns bears it out - a large percentage of bad uploads could be prevented by disallowing image upload until a new user has been around for at least a week, and by blocking persistent copyright violators.
Many of the uploaders are so new they haven't even figured out how to get images displayed in an article. The contributions list shows edits to topic X, then upload of X-copyvio.jpg, then maybe another couple of edits, and then that's it - the image, good or bad, is an orphan. The especially sad cases are when they upload pictures of themselves, but never edit their user page.
Stan
I agree. We already restrict uploading to logged-in users only, since allowing IP-only uploads was a disaster; increasing that restriction to allow only logged-in users whose accounts are not recently created, using the same criteria used in the semi-protection mechanism, seems eminently reasonable to me.
Yeah, I do think that would be a very good idea. Users who are too fresh to move pages or edit semi-protected pages really should not be uploading images. Granted, there are people who have been editing for years who probably should not be uploading images either, but all in all, brand new accounts account for the bulk of the problem images.
If we required users to "age" for a bit before they are allowed to upload images, it would have several benefits. First of all, they might just end up reading a policy page or two before they upload stuff, and secondly, they are somewhat more likely to remain active, so that they can actually go back and fix problems that are pointed out to them. Currently I'm under the impression that most of the "bad" uploads are made by "disposable" accounts, where someone registers an account purely for the ability to upload images, does a few uploads and then never logs in again. This is bad in so many ways: firstly, the images never get fixed; secondly, image taggers start getting used to uploaders being non-responsive, so they stop notifying them, and then they upset the few active uploaders who suddenly find their images deleted without a word of warning (forgetting for a moment the dire warnings on the upload page itself); and so on. I guess there is a danger of simply moving the problem to Commons if users find themselves unable to upload to Wikipedia immediately after registering, but I think it would at the very least be worth trying out for a month or two to see what happens.
On 4/27/06, Sherool jamydlan@online.no wrote:
Currently I'm under the impression that most of the "bad" uploads are made by "disposable" accounts, where someone registers an account purely for the ability to upload images, does a few uploads and then never logs in again.
That is one major group. The other is individuals who stay around but upload insane numbers of problem images.
-- geni
Yeah, but if we can slow one group down, all the better. At least with the long-time Wikipedian uploading a ton of improperly tagged images, you have someone you can open a dialogue with. With the majority of images not tagged, or tagged incorrectly, it seems the people never come back or change the image info at all.
Image copyright is one of the hardest areas for some people to get, especially people who are new to the entire concept of US copyright law; a little time to read policy and the reasons behind it can't be a bad thing.
On 4/27/06, geni geniice@gmail.com wrote:
On 4/27/06, Sherool jamydlan@online.no wrote:
Currently I'm under the impression that most of the "bad" uploads are made by "disposable" accounts, where someone registers an account purely for the ability to upload images, does a few uploads and then never logs in again.
That is one major group. The other is individuals who stay around but upload insane numbers of problem images.
-- geni
On 4/5/06, Ilmari Karonen nospam@vyznev.net wrote:
Jimmy Wales wrote:
With images, though, there has grown this bizarre culture that we must not delete anything until we have a consensus to do so. This is partly because images can't be easily restored, and there is some legitimacy to that as a factor in how we do things, but I think it has gotten much worse. Wildly inappropriate images which do not even have majority support for keeping are kept in articles, even though similarly inappropriate text would be shot on sight.
I think this is in fact a (if not the) major cause of this controversy. I'm not at all surprised that many users are very nervous about image deletion: we still remember when CSD I4 was added in September and as a consequence people had to go to Answers.com and other mirrors to look for copies of hastily deleted images. I don't know if any free images were actually permanently lost because of that, but it wouldn't surprise me at all either.
If I read Jimbo correctly, he's just saying that while the debate is happening, the image should not be linked to from any articles. Not necessarily that the image should be "permanently" deleted. I'm basing this on "...are kept in articles..."
Steve
Sounds good to me. :-)
Sydney aka FloNight
Steve Bennett wrote:
On 4/5/06, Ilmari Karonen nospam@vyznev.net wrote:
Jimmy Wales wrote:
With images, though, there has grown this bizarre culture that we must not delete anything until we have a consensus to do so. This is partly because images can't be easily restored, and there is some legitimacy to that as a factor in how we do things, but I think it has gotten much worse. Wildly inappropriate images which do not even have majority support for keeping are kept in articles, even though similarly inappropriate text would be shot on sight.
On 4/3/06, Sydney Poore poore5@adelphia.net wrote:
While we are discussing child pornography and drawings of children engaged in sex acts, let's talk about external links to sites with these images.
Instead of a single image that can be presented in a context that makes it clear it is educational, a bigger concern is external links to image boards with images of a child engaged in a sexually explicit act. These appear on Wikipedia on a regular basis. When I went to image board web sites to look for inappropriate images, I felt disgusted that another Wikipedia editor would put such a link on Wikipedia. I think we need to modify our policy/guidelines dealing with images of children engaged in sexually explicit acts. They should not be permitted, in my opinion. Editors who repeatedly add them should be blocked for being disruptive.
What articles are you referring to?
-- Mark [[User:Carnildo]]
Mark,
The external link on the Lolicon article labeled "Renchan Lolicon Community" leads to hundreds of lolicon images of children engaged in sex acts.
Sydney
Mark Wagner wrote:
On 4/3/06, Sydney Poore poore5@adelphia.net wrote:
While we are discussing child pornography and drawings of children engaged in sex acts, let's talk about external links to sites with these images.
Instead of a single image that can be presented in a context that makes it clear it is educational, a bigger concern is external links to image boards with images of a child engaged in a sexually explicit act. These appear on Wikipedia on a regular basis. When I went to image board web sites to look for inappropriate images, I felt disgusted that another Wikipedia editor would put such a link on Wikipedia. I think we need to modify our policy/guidelines dealing with images of children engaged in sexually explicit acts. They should not be permitted, in my opinion. Editors who repeatedly add them should be blocked for being disruptive.
What articles are you referring to?
-- Mark [[User:Carnildo]]
On 4/4/06, Sydney Poore poore5@adelphia.net wrote:
Mark,
The external link on the Lolicon article labeled "Renchan Lolicon Community" leads to hundreds of lolicon images of children engaged in sex acts.
Sydney
Those at least appear to be legal in Japan, and they do illustrate the article. That's slightly less of a problem than photographs, though not very much less.
Mgm
MacGyverMagic/Mgm wrote:
Those at least appear to be legal in Japan, and they do illustrate the article. That's slightly less of a problem than photographs, though not very much less.
I am on a plane at the moment (to Japan, ironically enough), and so I can't look at that link until I land. So I can only offer a few general thoughts about this sort of thing.
One way that we have sensibly handled difficult external links in the past is to print them, but as text, not as links, along with appropriate warnings. This could be the right answer in some cases.
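In wikitext a bare URL is automatically turned into a clickable link, so printing it as inert text takes explicit markup. A small illustration, with example.com standing in for any real site:

    A live link:    http://example.com
    Text only:      <nowiki>http://example.com</nowiki>
    With warning:   '''Do not open this casually:''' <nowiki>http://example.com</nowiki>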
Consider GNAA's [[Last Measure]] as an example. There is absolutely no way we should ever have an actual link that sets off this monstrosity (*) for an end user, but we might merely print the link with sufficient warning to the end user about what will happen to them if they are foolish or experimental enough to paste it into their browser.
Why? This helps ensure that no user ever absent-mindedly clicks on this thing by accident.
--Jimbo
(*) In case you don't know about it, Last Measure takes advantage of Evil Javascript Tricks to pop up dozens or hundreds of windows on your computer, full of various sorts of not-safe-for-anyone images, along with a loud voice saying 'Hey everybody, I'm looking at gay porn!' You have to kill your browser or perhaps even reboot your computer to make it stop. It is evil.