The copyright policy adopted by the contributors to Wikipedia is more restrictive than strictly required by law. This is a good thing for two primary reasons: first, we have a commitment to Free Content even where the law would not object to us distributing content with outrageous restrictions; and second, a restrictive policy removes grey areas which make enforcement difficult.
The existence of a strict policy encourages people to create free images where they are possible and has greatly increased the number of free images available to the world. An easy example is the articles on automobile models on enwiki... in the last year substantial progress has been made in creating and using free images and in many places this progress was directly driven by editors refusing to accept unfree images in the articles.
However, there are a number of cases such as the recent thread on the image of Treanna where common sense would tell us to permit the image... that it represents no compromise of our goals and no legal threat. There have been quite a few examples that I've run into...
The challenge is that if we permit this sort of decision making we find that for almost any image there is someone who will find inclusion reasonable. This is substantiated by the fact that we delete over 30,000 images per month on enwiki... after all, at least the *uploader* thought it was reasonable to include the image. A widespread permission to heed the 'common sense' of individual contributors in these matters would simply result in chaos and likely a massive regression in the overall freeness of our content.
I would like to propose a solution:
We should appoint or elect someone to make exceptions to our policy.
Ideally this person would carry a strong commitment to keeping our content maximally free... but my view is that even if we appointed someone with poor judgement, the bad calls of one person are highly preferable to the bad calls of all our contributors.
Much like the arbcom acts as a consensus tool to help us achieve consensus on bans and other such methods, a person in this position would help us achieve consensus for exceptions to our image use policy. They would not exist to make determinations on matters of policy, but only to permit things which are legally permissible, obviously non-harmful to our goals, but clearly against our policy.
In this manner the strict 'bright lines' policy can remain, preserving the sanity of those who work to keep our content free, but we do not suffer the harm of rejecting material which would be permitted by common sense.
Thoughts?
On 7/27/06, Gregory Maxwell gmaxwell@gmail.com wrote:
However, there are a number of cases such as the recent thread on the image of Treanna where common sense would tell us to permit the image... that it represents no compromise of our goals and no legal threat. There have been quite a few examples that I've run into...
Where?
On 7/27/06, geni geniice@gmail.com wrote:
On 7/27/06, Gregory Maxwell gmaxwell@gmail.com wrote:
However, there are a number of cases such as the recent thread on the image of Treanna where common sense would tell us to permit the image... that it represents no compromise of our goals and no legal threat. There have been quite a few examples that I've run into...
Where?
Wikipedia-l
Steve
On 7/27/06, Gregory Maxwell gmaxwell@gmail.com wrote:
Much like the arbcom acts as a consensus tool to help us achieve consensus on bans and other such methods, a person in this position would help us achieve consensus for exceptions to our image use policy.
The problem you describe can be generalized to all policymaking. Changing policy on Wikipedia is a slow and often intensely frustrating process. A tiny, petulant minority can resist positive change through sheer repetition. Often the process is cut short by bold admins trying to do "the right thing" -- which then tends to lead to an all-out escalation. Months after things have cooled down a bit, a solution may emerge from the ashes. It may not be the best solution; it may simply be that everyone is tired of hearing about the problem, and will accept whatever compromise proposal comes along.
Elected or appointed expert committees are one way to deal with this problem, and they may often work quite well. However, if we accept an expert committee as a method to achieve consensus about policy, we should equally consider direct voting on certain policy amendments. Certainly, direct voting is a more participatory model than the election of a decision-making body. And it seems as likely to lead to an accepted outcome.
"But," I hear some people say, "you can't let ordinary people vote on these complex issues. They do not understand! That's why we need to have smart people to make these decisions for us. Through enlightened, reasoned debate, surely they will find the solution that is best for us all." I'm not sure that's true. There are good votes and there are bad votes. A good vote is one where voters are presented with a concise summary of the different arguments that have come up in a discussion that preceded the vote, where the _options_ in the vote have been developed through consensus, and where there is a strong culture that pressures voters to read and understand all arguments before voting. A bad vote is one that is done ad hoc, out of process, with poor methodology and no clear prerequisites.
In my view, establishing clear ground rules for votes to change policy is a better way to deal with the problem than delegation of authority. It allows for community consensus processes (and indeed requires them to be tried first), brings out as many arguments and solutions as possible, and enables everyone to share the responsibility, credit and blame for the result.
Erik
On 7/27/06, Erik Moeller eloquence@gmail.com wrote: [snip, out of order]
In my view, establishing clear ground rules for votes to change policy is a better way to deal with the problem than delegation of authority. It allows for community consensus processes (and indeed requires them to be tried first), brings out as many arguments and solutions as possible, and enables everyone to share the responsibility, credit and blame for the result.
The current image use policy reflects the position of the active editing base and is the result of a fairly strong consensus, not merely the mob rule of a majority-wins vote. There is no substantial desire to change our policy.
The challenge is that only a bright line policy can protect us against a "slow and often intensely frustrating process" for each of tens of thousands of images per month. But a bright line policy will exclude things which common sense would permit. I'd like to discuss ways we can accept such exceptions without breaking the well functioning policy and without creating a slow and intensely frustrating process.
Voting is currently used as part of our image process... but because it's such a niche area, deletions tend to be virtually uncontested or the vote turns into a contest of who can inject more friends.
Elected or appointed expert committees are one way to deal with this problem, and they may often work quite well. However, if we accept an expert committee as a method to achieve consensus about policy
[snip]
I have not proposed an expert based solution. This is a role which practically any established Wikipedian would be qualified for.
"But," I hear some people say, "you can't let ordinary people vote on these complex issues. They do not understand! That's why we need to have smart people to make these decisions for us. Through enlightened, reasoned debate, surely they will find the solution that is best for us all." I'm not sure that's true.
I'm glad you're not sure it's true, because I haven't suggested it... and I haven't seen anyone else here suggest it.
There isn't any group of people appointed to be experts on enwiki, and as far as I'm aware there has been no serious advocacy of such a system any time recently.
There are good votes and there are bad votes. A good vote is one where voters are presented with a concise summary of the different arguments that have come up in a discussion that preceded the vote, where the _options_ in the vote have been developed through consensus, and where there is a strong culture that pressures voters to read and understand all arguments before voting. A bad vote is one that is done ad hoc, out of process, with poor methodology and no clear prerequisites.
It's not clear that the facts support your position. Perhaps on dewiki? Experimentation on Enwiki has demonstrated that the majority of the participants in at least some of our voting process do not read evidence presented preceding or included in the debate (measured by placing external links in intros and in individual votes). Instead it would appear that many voters make their decision based on initial impression and a passing glance at the standing votes.
While it may be true that acceptable results can be achieved through good votes, where the participants consist of informed parties, it is not clear if such votes ever actually happen on enwiki.
I'm not sure if you're aware of [[Dunbar's number]]....
Experience on Enwiki seems to support that and related research.
On single articles (over 99% of which have fewer than that Dunbar upper bound on community size) decisions are often quickly made and supported via consensus which is so clear that a poll is unneeded.
These decisions are not merely tolerated with indifference but supported.
In many voting areas and on many policy pages, where we have substantially more than Dunbar's number of participants, new proposals are treated with either icy indifference or extreme factionalization. It sometimes appears that the vote has become nothing more than a contest to measure which subgroup can draw in the most members. This is also supported by the overwhelming clustering that happens with some votes.
I believe that without the control feedback of a functioning social group our users are too unwilling to engage in the mixture of compromise and consideration which are required to have a 'good vote' and instead their behavior appears to be determined more by a desire to assert their authority (by fighting against something rather than working with it).
This matter appears to be made substantially worse because partisan social groups form (i.e. inclusionist vs deletionists) which include participating in votes as part of their social activity. So rather than having a big social group which includes people of all perspectives encouraging their members to work for the good of the project, we get fights between smaller social groups which include more myopic subsets who socially reinforce unthinking 'mob like' behavior. :( ... then we reward this behavior by allowing the count of people to impact our decisions.
On 7/27/06, Gregory Maxwell gmaxwell@gmail.com wrote:
I have not proposed an expert based solution. This is a role which practically any established Wikipedian would be qualified for.
Then I misunderstood your proposal. What would this role be advertised as? Wouldn't candidates who are experts in copyright law be favored for it, naturally? If not, wouldn't there be a great risk that the exceptions made by a single person or small group are actually in violation of the law?
It's not clear that the facts support your position. Perhaps on dewiki? Experimentation on Enwiki has demonstrated that the majority of the participants in at least some of our voting process do not read evidence presented preceding or included in the debate (measured by placing external links in intros and in individual votes). Instead it would appear that many voters make their decision based on initial impression and a passing glance at the standing votes.
I have not seen many votes or polls where the vote was conducted according to a defined process, where the arguments have to be fully and carefully summarized long before any voting begins. Rather, in most cases, it seems to be a mixed process of voting and arguing, where arguments often have a hard time reaching visibility, and early voters ignore later arguments -- making the results difficult to interpret.
It seems more sensible to split the process into first arguing and then, if no consensus can be found, voting. Then you also avoid the headcount on processes like AfD and instead have it as a pure discussion, with a vote only when necessary.
Take a look at the polls and surveys in http://en.wikipedia.org/wiki/Wikipedia:Current_surveys where, at best, you will find a brief intro summarizing the _issue_, but hardly ever a summary of the arguments for and against each position. Most of these polls are started in moments of frustration, with little planning and inconsistent prerequisites.
While it may be true that acceptable results can be achieved through good votes, where the participants consist of informed parties, it is not clear if such votes ever actually happen on enwiki.
That seems to be a matter of defining policy in such a way that only good (i.e. well-organized) votes are allowed to go through. The de.wiki Meinungsbilder do indeed seem to be more well thought-out in this regard: http://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder
Take this example: http://de.wikipedia.org/wiki/Wikipedia:Meinungsbilder/Unver%C3%B6ffentlichte...
It's about whether unpublished movies should have articles. Note the extensive background information provided, the clearly defined voting criteria, the rules on (re)moving votes, and the summaries of each proposal. It still has room for improvement, but is pretty close to what I'm thinking of. This kind of process can be followed both on a small and a large scale -- depending on the complexity of the problem.
I'm not sure if you're aware of [[Dunbar's number]]....
Dunbar's number is the exact reason I propose voting and summarizing/refactoring as a last resort in large scale decision-making. Note that both arguments and votes can be anonymized.
I believe that without the control feedback of a functioning social group our users are too unwilling to engage in the mixture of compromise and consideration which are required to have a 'good vote' and instead their behavior appears to be determined more by a desire to assert their authority (by fighting against something rather than working with it).
That's why trying to work out a mutually agreeable solution in a social group should always precede a vote. When the group gets too large and factions form, a vote may be the best way to find acquiescence to a particular solution.
Erik
We can't rely on the common sense of uploaders. Many editors, not necessarily new ones, have no clue whatsoever about copyright, as is shown by the vast number of copyvio images circulating the net.
Recently there was a question about someone who wanted to post another image in an article on anime animation which already included multiple images. Any more would not illustrate the article any better so including it would be a bad idea.
To actually have smart/good decisions made and enacted, we need to enlighten as many people as possible about the existence and finer details of copyright and fair use.
Take for example [[Nicole Kidman]]. One fair use image to illustrate who we're talking about is okay; having an entire gallery isn't. As for maps: don't use Google Earth or commercial maps; there's an entire team working on free alternatives.
We shouldn't rely on the completely clueless when determining rules and actions with regard to copyright.
Mgm
On 7/27/06, MacGyverMagic/Mgm macgyvermagic@gmail.com wrote:
We shouldn't rely on the completely clueless when determining rules and actions with regard to copyright.
We shouldn't? Then why do we give everyone an upload button and encourage them to use it?
On 7/28/06, Mark Wagner carnildo@gmail.com wrote:
We shouldn't? Then why do we give everyone an upload button and encourage them to use it?
-- Mark
Technically we didn't, but in any case we were young and innocent.
On 7/28/06, Mark Wagner carnildo@gmail.com wrote:
On 7/27/06, MacGyverMagic/Mgm macgyvermagic@gmail.com wrote:
We shouldn't rely on the completely clueless when determining rules and actions with regard to copyright.
We shouldn't? Then why do we give everyone an upload button and encourage them to use it?
+1 insightful
Steve
On 7/28/06, Steve Bennett stevagewp@gmail.com wrote:
On 7/28/06, Mark Wagner carnildo@gmail.com wrote:
On 7/27/06, MacGyverMagic/Mgm macgyvermagic@gmail.com wrote:
We shouldn't rely on the completely clueless when determining rules
and
actions with regard to copyright.
We shouldn't? Then why do we give everyone an upload button and encourage them to use it?
+1 insightful
Steve
I'm still not sure why that is. Personally, I think doing that is a bad idea.
Mgm
On 7/28/06, MacGyverMagic/Mgm macgyvermagic@gmail.com wrote:
On 7/28/06, Steve Bennett stevagewp@gmail.com wrote:
On 7/28/06, Mark Wagner carnildo@gmail.com wrote:
On 7/27/06, MacGyverMagic/Mgm macgyvermagic@gmail.com wrote:
We shouldn't rely on the completely clueless when determining rules
and
actions with regard to copyright.
We shouldn't? Then why do we give everyone an upload button and encourage them to use it?
+1 insightful
I'm still not sure why that is. Personally, I think doing that is a bad idea.
I can't speak for the rest of you, but if Wikipedia didn't permit editing by anons I never would have started, and if I had to go ask permission to upload I probably never would have uploaded...
Actually, the friction of figuring out the uploading procedure (tagging, commons, policy, etc) resulted in me asking someone else to upload my first image submission ([[Celesta]]). At the time I had many hundreds of edits as an anon, I had an account I had recently opened, and I was already operating another MediaWiki wiki, so I understood uploading technically. What I wasn't comfortable with was policy.
I am strongly in support of taking action to reduce the number of outrageous copyright violations that we get... But I think it would be self-defeating to implement any policy which discourages real contribution. There must be some solution which simultaneously curtails copyright violation (or at least its impact) and encourages contribution.
Our challenge is to find that solution.
On 7/28/06, Gregory Maxwell gmaxwell@gmail.com wrote:
I am strongly in support of taking action to reduce the number of outrageous copyright violations that we get... But I think it would be self-defeating to implement any policy which discourages real contribution. There must be some solution which simultaneously curtails copyright violation (or at least its impact) and encourages contribution.
How about a middle ground:
1) By default, any logged in user can upload any image they themselves have produced. (Some way of discouraging them from uploading any other images needs to be found.)
2) "Confirmed" users, preferably those who have uploaded 3 or 4 of their own images, can upload copyrighted images for fair use.
Steve
On 7/28/06, Steve Bennett stevagewp@gmail.com wrote:
How about a middle ground:
1) By default, any logged in user can upload any image they themselves have produced. (Some way of discouraging them from uploading any other images needs to be found.)
2) "Confirmed" users, preferably those who have uploaded 3 or 4 of their own images, can upload copyrighted images for fair use.
...err, I clearly left out public domain/free/reusable images, which would go in bucket 2).
Steve
On 7/28/06, Steve Bennett stevagewp@gmail.com wrote:
On 7/28/06, Gregory Maxwell gmaxwell@gmail.com wrote:
I am strongly in support of taking action to reduce the number of outrageous copyright violations that we get... But I think it would be self-defeating to implement any policy which discourages real contribution. There must be some solution which simultaneously curtails copyright violation (or at least its impact) and encourages contribution.
How about a middle ground:
1) By default, any logged in user can upload any image they themselves have produced. (Some way of discouraging them from uploading any other images needs to be found.)
2) "Confirmed" users, preferably those who have uploaded 3 or 4 of their own images, can upload copyrighted images for fair use.
The general idea is acceptable to me... but will it be sufficient? We get a lot of "self made public domain" movie screenshots (for example http://en.wikipedia.org/wiki/Image:Ep3skies.jpg).
It would seem that uploaders can't even figure out that they aren't the copyright holder... so a soft security limitation on uploading possibly questionable content may not be as effective as we'd like...
But would it be effective enough?
On 7/28/06, Gregory Maxwell gmaxwell@gmail.com wrote:
The general idea is acceptable to me... but will it be sufficient? We get a lot of "self made public domain" movie screenshots (for example http://en.wikipedia.org/wiki/Image:Ep3skies.jpg).
It would seem that uploaders can't even figure out that they aren't the copyright holder... so a soft security limitation on uploading possibly questionable content may not be as effective as we'd like...
But would it be effective enough?
I admit I don't do a lot of dealing with newbies uploading bad images. But given that most of them are new, naive and not malicious, there must be ways of shepherding them to give us the results we want. A series of questions, wizard style:
a) Is this a photo, a diagram, or something else like a logo or screenshot?
b) If it's a photo, did you take this with your own camera?
c) If not, do you know where it came from? Where?
d) Do you know what kind of licence they have? Is it "non-commercial", "public domain", etc.?
e) Do you think we can use it under "fair use" provisions? Please explain.
etc. I would steer clear of telling them the consequences of "the wrong choice". Best that they simply give us as much information as possible, and let them know their uploads will be reviewed by others for suitability.
Maybe rather than preventing newbies from uploading, we should simply force newbies to follow the wizard each time, and permission to bypass the wizard would be a privilege to obtain.
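Steve's wizard amounts to a small decision function: walk the answers and suggest a provisional tag for human review. A minimal sketch, assuming invented answer keys and tag names (nothing here is actual MediaWiki behaviour):

```python
# Hypothetical sketch of the upload wizard: map the uploader's answers
# to a provisional tag that reviewers can act on later. All keys and
# tag names are illustrative, not real MediaWiki tags.

def suggest_tag(answers):
    """answers: dict of wizard responses; returns a provisional tag."""
    if answers.get("own_camera"):
        return "self-made"                 # b) taken with own camera
    if not answers.get("source_known"):
        return "no-source"                 # c) origin unknown
    licence = answers.get("licence", "")
    if licence in ("public domain", "GFDL", "CC-BY"):
        return "free-licence"              # d) a licence we can reuse
    if answers.get("fair_use_rationale"):
        return "fair-use-candidate"        # e) queue rationale for review
    return "quarantine"                    # default: hold for human review
```

The point of the sketch is that the wizard never refuses outright; everything that isn't clearly free just lands in a review bucket, matching Steve's "let them know their uploads will be reviewed" approach.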
Steve
On 7/28/06, Steve Bennett stevagewp@gmail.com wrote:
I admit I don't do a lot of dealing with newbies uploading bad images. But given that most of them are new, naive and not malicious, there must be ways of shepherding them to give us the results we want. A series of questions, wizard style:
a) Is this a photo, a diagram, or something else like a logo or screenshot?
b) If it's a photo, did you take this with your own camera?
c) If not, do you know where it came from? Where?
d) Do you know what kind of licence they have? Is it "non-commercial", "public domain", etc.?
e) Do you think we can use it under "fair use" provisions? Please explain.
etc.
I'm working on something like this. It's already up to ten questions just to decide if an image is under a free license, a non-free license, or is self-created. You can't just ask "do you think we can use it under 'fair use' provisions" -- the answer will always be "yes". The set of questions for determining fair use is going to need to be fairly large.
I would steer clear of telling them the consequences of "the wrong choice". Best that they simply give us as much information as possible, and let them know their uploads will be reviewed by others for suitability.
Are you offering to do the reviewing? There are over two thousand images uploaded each day, about half of which are uploaded by very new users.
On 7/29/06, Mark Wagner carnildo@gmail.com wrote:
I'm working on something like this. It's already up to ten questions just to decide if an image is under a free license, a non-free license, or is self-created. You can't just ask "do you think we can use it under 'fair use' provisions" -- the answer will always be "yes". The set of questions for determining fair use is going to need to be fairly large.
My presumption was actually to do something like this:
1. Do you think it can be used under fair use? Why?
2. User writes some half-arsed response.
3. Image is stored in quarantine.
4. Periodically, an administrator goes through, checks if any have decent rationales, and deletes the rest.
That is, kind of like a police officer dutifully noting down "So, you're an extra terrestrial here to save humanity by removing gold necklaces from "earthlings" necks. That's great, thanks".
I suppose it would be kind of bad faith, but how else to resolve the problem?
Are you offering to do the reviewing? There are over two thousand images uploaded each day, about half of which are uploaded by very new users.
Once the images are filtered appropriately, spend as much time reviewing them as is appropriate. In "problem areas", delete slabs without looking if needed.
Steve
Steve Bennett wrote:
On 7/29/06, Mark Wagner carnildo@gmail.com wrote:
Are you offering to do the reviewing? There are over two thousand images uploaded each day, about half of which are uploaded by very new users.
Once the images are filtered appropriately, spend as much time reviewing them as is appropriate. In "problem areas", delete slabs without looking if needed.
Every experienced editor should volunteer to process 500 new uploads, it's most enlightening. For instance, it's not that common to have even 2-3 images that can be handled as a group, they're all different in completely random ways. I've never been able to get my time much below 100 images/hour, and that's only at times when the servers are responsive.
Stan
Steve Bennett wrote:
On 7/28/06, Gregory Maxwell gmaxwell@gmail.com wrote:
I am strongly in support of taking action to reduce the number of outrageous copyright violations that we get... But I think it would be self-defeating to implement any policy which discourages real contribution. There must be some solution which simultaneously curtails copyright violation (or at least its impact) and encourages contribution.
How about a middle ground:
1) By default, any logged in user can upload any image they themselves have produced. (Some way of discouraging them from uploading any other images needs to be found.)
2) "Confirmed" users, preferably those who have uploaded 3 or 4 of their own images, can upload copyrighted images for fair use.
I hate to spoil your fun, but what is going to stop people from just claiming "Yes, I made this" on every random image they get from Flickr/Photobucket/Google images?
On 7/29/06, Alphax (Wikipedia email) alphasigmax@gmail.com wrote:
I hate to spoil your fun, but what is going to stop people from just claiming "Yes, I made this" on every random image they get from Flickr/Photobucket/Google images?
It is pretty clear that our copyvios come because strangers to our project don't know any better... It was free for them to obtain from some site, so they can't see a reason we can't have the picture too.
Unfortunately it seems that when the site tells them no (as happens with the prohibited options in the drop-down), a great many people have no reservations about lying to the site in order to get it not to say their images will be deleted. :( I don't think this is because people are dishonest at heart, but rather because people are conditioned to press buttons until the computer does whatever they want.
That said, I think "saying no" would cause a real reduction in bad images... but if we aren't careful how we say no we will encourage people to twiddle the knobs until they've left misleading metadata. I'd rather we have more violations which are tagged somewhat correctly than fewer violations but with them tagged as free content.
On 7/29/06, Gregory Maxwell gmaxwell@gmail.com wrote:
That said, I think "saying no" would cause a real reduction in bad images... but if we aren't careful how we say no we will encourage people to twiddle the knobs until they've left misleading metadata. I'd rather we have more violations which are tagged somewhat correctly than fewer violations but with them tagged as free content.
Maybe if there weren't any knobs at all it'd be better.
1) Any logged in user can upload an image, if they include text which explains why they think the image fits within the policy. A tutorial can help guide them as to what the policy is, but ultimately anything can be uploaded with explanatory text.
2) Only after you've convinced us that you know what you're talking about and aren't lying about everything, then you get a pulldown menu of choices.
This strategy would probably work best if there were a way to lock an image so that it can't appear in any article. Who would be able to add or remove the lock, and whether or not the lock would be on or off by default (personally I'd say probably off) could be tweaked to find the best solution. Blatant violations would of course be deleted - the lock would be for situations where there's just not enough explanation, or the explanation hasn't been checked.
Anthony
On 7/29/06, Anthony wikilegal@inbox.org wrote:
This strategy would probably work best if there were a way to lock an image so that it can't appear in any article.
There is, but using it to list several thousand problem images would kill the servers.
geni wrote:
On 7/29/06, Anthony wikilegal@inbox.org wrote:
This strategy would probably work best if there were a way to lock an image so that it can't appear in any article.
There is, but using it to list several thousand problem images would kill the servers.
I assume you mean the bad image list? Thanks to robchurch, there is now an extension hook for it, so more efficient methods could be added quite easily.
In fact, both this proposal and the current bad image list could be replaced by a single boolean field in the image table, toggleable by admins.
The only real issue, as Anthony notes, is when and how to grant users the right to bypass the quarantine when uploading.
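A rough sketch of the boolean-flag idea, using an in-memory SQLite table. The table and column names here are invented for illustration; the real MediaWiki schema differs:

```python
import sqlite3

# Sketch of the proposed quarantine flag: one boolean column on the image
# table, toggled by admins, checked before an image may appear in an
# article. "image"/"img_quarantined" are illustrative names only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE image (img_name TEXT PRIMARY KEY, "
             "img_quarantined INTEGER NOT NULL DEFAULT 0)")
conn.execute("INSERT INTO image VALUES ('Ep3skies.jpg', 0)")

def set_quarantine(name, flag):
    # Admin action: quarantine or release an image.
    conn.execute("UPDATE image SET img_quarantined = ? WHERE img_name = ?",
                 (1 if flag else 0, name))

def may_render(name):
    # The parser would check this before embedding the image in a page.
    row = conn.execute("SELECT img_quarantined FROM image WHERE img_name = ?",
                       (name,)).fetchone()
    return row is not None and row[0] == 0

set_quarantine("Ep3skies.jpg", True)
print(may_render("Ep3skies.jpg"))  # False while quarantined
```

Releasing an image is just toggling the flag back, which is what makes this cheaper than maintaining a huge bad-image-list page.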
Ilmari Karonen wrote:
geni wrote:
On 7/29/06, Anthony wikilegal@inbox.org wrote:
This strategy would probably work best if there were a way to lock an image so that it can't appear in any article.
There is, but using it to list several thousand problem images would kill the servers.
In fact, both this proposal and the current bad image list could be replaced by a single boolean field in the image table, toggleable by admins.
Actually, thinking about it a bit more, anyone who can bypass the quarantine when uploading should presumably also be able to release an existing image from the quarantine, assuming it hasn't been explicitly protected.
Also, we'd need to decide how to handle images on Commons.
Ilmari Karonen wrote:
Ilmari Karonen wrote:
geni wrote:
On 7/29/06, Anthony wikilegal@inbox.org wrote:
This strategy would probably work best if there were a way to lock an image so that it can't appear in any article.
There is, but using it to list several thousand problem images would kill the servers.
In fact, both this proposal and the current bad image list could be replaced by a single boolean field in the image table, toggleable by admins.
Actually, thinking about it a bit more, anyone who can bypass the quarantine when uploading should presumably also be able to release an existing image from the quarantine, assuming it hasn't been explicitly protected.
Also, we'd need to decide how to handle images on Commons.
Cooperate with Commons?
Alphax (Wikipedia email) wrote:
Ilmari Karonen wrote:
Also, we'd need to decide how to handle images on Commons.
Cooperate with Commons?
...yes, of course. Sorry, I meant in the technical sense. The Commons images don't exist in the same database as the local ones, and I'm not really very familiar with the way they're handled. The current bad image list circumvents that issue by simply doing hash lookups on the image names without actually checking the image tables at all.
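The name-based lookup described here amounts to a set membership test, which is why it works the same for local and Commons images. A minimal sketch (the list contents are illustrative):

```python
# Sketch of the bad image list as a plain set of titles: blocking an
# image is a hash lookup on its name, with no database join and no need
# to know which wiki's image table the file actually lives in.
BAD_IMAGE_LIST = {"Ep3skies.jpg", "Some_shock_image.jpg"}

def is_blocked(image_name):
    # Set membership: an O(1) hash lookup per image render.
    return image_name in BAD_IMAGE_LIST
```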
On Sat, 29 Jul 2006 14:25:22 +0200, Anthony wikilegal@inbox.org wrote:
On 7/29/06, Gregory Maxwell gmaxwell@gmail.com wrote:
That said, I think "saying no" would cause a real reduction in bad images... but if we aren't careful how we say no we will encourage people to twiddle the knobs until they've left misleading metadata. I'd rather we have more violations which are tagged somewhat correctly than fewer violations but with them tagged as free content.
Maybe if there weren't any knobs at all it'd be better.
- Any logged in user can upload an image, if they include text which explains why they think the image fits within the policy. A tutorial can help guide them as to what the policy is, but ultimately anything can be uploaded with explanatory text.
- Only after you've convinced us that you know what you're talking about and aren't lying about everything, then you get a pulldown menu of choices.
This strategy would probably work best if there were a way to lock an image so that it can't appear in any article. Who would be able to add or remove the lock, and whether or not the lock would be on or off by default (personally I'd say probably off) could be tweaked to find the best solution. Blatant violations would of course be deleted - the lock would be for situations where there's just not enough explanation, or the explanation hasn't been checked.
Hmm, this gave me an idea, maybe not a good idea, but I'll throw it out there anyway:
Remove the dropdown list entirely. Rather than a copyright tag, include boilerplate text that gives detailed step-by-step instructions on what needs to be added to the image page for it to be kept. This boilerplate also categorises the image as [[Category:Incomplete uploads as of XXX]] (replace XXX with the date). The "successful upload" message should also be changed to make it clear that unless the uploader goes to the image page, follows the instructions, and includes all the required information, the upload is considered "incomplete", the image cannot be used, and it will be deleted after a week unless the process is completed by adding the proper source and license data (yes, the downside is that even people who know what they are doing would have to do this for each upload).
Then have a bot summarily remove all "incomplete uploads" from articles and remind the uploader after a couple of days (kinda like what OrphanBot does now), and make images with the default boilerplate still intact after a week a speedy deletion criterion.
This would force the uploader to read a bit and work a little to find the correct tag rather than just picking something at random from a dropdown list. The problem is though that a lot of people are remarkably resistant to taking any notice of instructions and warnings. They just want to upload cool images, and get very upset when we remove and delete them.
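The bot pass described above might look roughly like this — given an article's wikitext and the set of image names still tagged as incomplete, strip their inline uses. The regex and all names are illustrative only, and the matching is deliberately naive (simple, non-nested image links):

```python
import re

def remove_incomplete_images(wikitext, incomplete):
    """Strip [[Image:...]] links whose target is still tagged incomplete.

    Deliberately naive: handles simple, non-nested image links only,
    which is enough to illustrate the bot pass sketched above.
    """
    def repl(match):
        # Normalise spaces to underscores the way image titles are stored.
        name = match.group(1).strip().replace(" ", "_")
        return "" if name in incomplete else match.group(0)

    return re.sub(r"\[\[Image:([^\]|]+)(?:\|[^\]]*)?\]\]", repl, wikitext)

incomplete = {"Cool_pic.jpg"}
text = "Intro. [[Image:Cool_pic.jpg|thumb|A cool pic]] More text. [[Image:Ok.jpg]]"
print(remove_incomplete_images(text, incomplete))
# -> "Intro.  More text. [[Image:Ok.jpg]]"
```

A real bot (like OrphanBot) would additionally leave a talk-page note for the uploader and re-check the category before each pass, but the core operation is just this substitution.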
On 7/29/06, Sherool jamydlan@online.no wrote:
The problem is though that a lot of people are remarkably resistant to taking any notice of instructions and warnings. They just want to upload cool images, and get very upset when we remove and delete them.
Which is why we need to remove the upload link. If they need to hunt around a bit before they can upload images, there's a chance they'll run into the instructions.
On 7/31/06, Mark Wagner carnildo@gmail.com wrote:
On 7/29/06, Sherool jamydlan@online.no wrote:
The problem is though that a lot of people are remarkably resistant to taking any notice of instructions and warnings. They just want to upload cool images, and get very upset when we remove and delete them.
Which is why we need to remove the upload link. If they need to hunt around a bit before they can upload images, there's a chance they'll run into the instructions.
Simply making the upload link a link to upload instructions, and asking users to upload by forming an image redlink, would actually do a lot to help. A lot of images get uploaded without ever being linked in, and where an image is used is very helpful in figuring out its status (for fair use images, at least).
On 7/29/06, Alphax (Wikipedia email) alphasigmax@gmail.com wrote:
I hate to spoil your fun, but what is going to stop people from just claiming "Yes, I made this" on every random image they get from Flickr/Photobucket/Google images?
I would be surprised if changing from a "stick a template on a page and hope no one notices" model to a "tell us honestly - did you make this? we're going to check!" model made no difference. People who aren't there for malicious reasons are going to dislike blatantly lying when asked "did you take this photo with your own camera?"
So: physically what stops them? Nothing. Morally what stops them? Having to blatantly lie.
Steve
On 7/29/06, Alphax (Wikipedia email) alphasigmax@gmail.com wrote:
Steve Bennett wrote:
On 7/28/06, Gregory Maxwell gmaxwell@gmail.com wrote:
I am strongly in support of taking action to reduce the amount of outrageous copyright violations that we get... But I think it would be self-defeating to implement any policy which discourages real contribution. There must be some solution which simultaneously curtails copyright violation (or at least its impact) and encourages contribution.
How about a middle ground:
1) By default, any logged in user can upload any image they themselves have produced. (Some way of discouraging them from uploading any other images needs to be found.)
2) "Confirmed" users, preferably those who have uploaded 3 or 4 of their own images, can upload copyrighted images for fair use.
I hate to spoil your fun, but what is going to stop people from just claiming "Yes, I made this" on every random image they get from Flickr/Photobucket/Google images?
If you upload enough random images from Flickr/Photobucket/Google, you're bound to get caught. And once you get caught blatantly lying, all your images should become highly suspect.
Of course, "fair use images" don't really require you to trust anyone, so if anything *they* should be the ones that any user can upload.
Ultimately I think the best "solution" is the same one that's used for text. Assume good faith, require references, and get rid of the policy violations as much as possible.
Anthony
On 7/27/06, Gregory Maxwell gmaxwell@gmail.com wrote:
On 7/27/06, Erik Moeller eloquence@gmail.com wrote: [snip, out of order]
In my view, establishing clear ground rules for votes to change policy is a better way to deal with the problem than delegation of authority. It allows for community consensus processes (and indeed requires them to be tried first), brings out as many arguments and solutions as possible, and enables everyone to share the responsibility, credit and blame for the result.
The current image use policy reflects the position of the active editing base and is the result of a fairly strong consensus, not merely the mob rule of a majority-wins vote. There is no substantial desire to change our policy.
The challenge is that only a bright line policy can protect us against a "slow and often intensely frustrating process" for each of tens of thousands of images per month. But a bright line policy will exclude things which common sense would permit. I'd like to discuss ways we can accept such exceptions without breaking the well functioning policy and without creating a slow and intensely frustrating process.
Where is it that you see the current image use policy as ambiguous? I'd actually say it's fairly clear.
I could see the benefit of having a relatively small number of people decide what the policy says about a particular image, as it's much more efficient than voting, but I wouldn't want to give them the power to make exceptions.
Policies should reflect common sense. If common sense tells us to permit an image but policies tell us not to, then the policies should be fixed.
http://en.wikipedia.org/w/index.php?title=Wikipedia:Fair_use&diff=234059...
Looks like policy was fine until September 2005. It should probably be noted that the explanation of the policy change ("Because "fair use" images are only not copyright infringement on Wikipedia when used for strictly encyclopedic reasons, their use in other contexts on Wikipedia is most likely copyright infringement.") is blatantly incorrect.
Anthony
On 7/27/06, Gregory Maxwell gmaxwell@gmail.com wrote:
Thoughts?
I think the general intent is a good one, though I don't think that the Wikipedia system of government would allow any individual to easily sit in such a position with the exception of Jimbo.
There may be other ways to achieve the same ends, though. Perhaps we could draw up special rules for exceptions on copyright issues -- they would be VERY hard to satisfy and would be designed with the few known cases of "common sense" exceptions in mind (and could be amended later if others popped up). Then it could just become part of a deletion review or image policy or what have you.
Personally I think there is a lot of room for a little common-sense flexibility in images -- people get all bent out of shape over the hypothetical possibility that the German government could decide to enforce all copyrights on Nazi photographs that defaulted to them, even though there is no reason to think that they ever will and it is questionable whether they have the legal standing to do that even if they wanted to. Ditto with people worrying that just because you can copyright a lighting arrangement scheme, suddenly all photographs of a building with that scheme become derivative works and copyright infringement. In both cases the legal danger is entirely hypothetical, and dubious at that -- they are based on stretched and literal interpretations of complicated aspects of copyright law, and nobody has ever been taken to court over them. And yet they are used to delete widely circulated historical content (the Nazi images) or even user-produced free content (in the lighting and architecture cases).
I don't know the best way around it -- having a class of images that says "this image is hypothetically copyrighted but Jesus Christ people it's not ever going to make trouble for anyone, even commercial reusers" is, I think, likely to be misapplied unless it is closely watched. I don't know if that should be done by one individual, by a committee, by anyone who feels up to it, etc.
FF
On 31/07/06, Fastfission fastfission@gmail.com wrote:
Personally I think there is a lot of room for a little common-sense flexibility in images -- people get all bent out of shape over the hypothetical possibility that the German government could decide to enforce all copyrights on Nazi photographs that defaulted to them, even though there is no reason to think that they ever will and it is questionable whether they have the legal standing to do that even if they wanted to. Ditto with people worrying that just because you can copyright a lighting arrangement scheme, suddenly all photographs of a building with that scheme become derivative works and copyright infringement. In both cases the legal danger is entirely hypothetical, and dubious at that -- they are based on stretched and literal interpretations of complicated aspects of copyright law, and nobody has ever been taken to court over them. And yet they are used to delete widely circulated historical content (the Nazi images) or even user-produced free content (in the lighting and architecture cases).
I entirely agree with you on the Nazi copyright issue (though I think the images' copyrights defaulted to Bavaria, which further complicates matters), but buildings are a problem for our contributors. A well-known example is the Eiffel Tower, whose operators claim copyright over the particular way they arrange the lights and have been known to chase down people who publish night-time pictures of the tower.
On 7/31/06, Oldak Quill oldakquill@gmail.com wrote:
I entirely agree with you on the Nazi copyright issue (though I think the images' copyrights defaulted to Bavaria, which further complicates matters), but buildings are a problem for our contributors. A well-known example is the Eiffel Tower, whose operators claim copyright over the particular way they arrange the lights and have been known to chase down people who publish night-time pictures of the tower.
Do we have any evidence that this has ever been ruled as valid under U.S. copyright law? I know nothing of French copyright law and do not claim to, but this strikes me as something that a court would knock down in five minutes in the United States.
FF
On 7/31/06, Fastfission fastfission@gmail.com wrote:
In both cases the legal danger is entirely hypothetical and dubious at that -- they are based on stretched and literal interpretations of complicated aspects of the copyright law and nobody has ever been taken to court over them.
Did you just volunteer to be a test case?
geni wrote:
On 7/31/06, Fastfission fastfission@gmail.com wrote:
In both cases the legal danger is entirely hypothetical and dubious at that -- they are based on stretched and literal interpretations of complicated aspects of the copyright law and nobody has ever been taken to court over them.
Did you just volunteer to be a test case?
Anyone volunteering for a test case would want to view the merits of the specific case before proceeding.
Ec
On 7/31/06, geni geniice@gmail.com wrote:
On 7/31/06, Fastfission fastfission@gmail.com wrote:
In both cases the legal danger is entirely hypothetical and dubious at that -- they are based on stretched and literal interpretations of complicated aspects of the copyright law and nobody has ever been taken to court over them.
Did you just volunteer to be a test case?
Before anyone could be a test case we'd have to have someone taking them to court over the copyrights for Nazi posters. Which to my knowledge has not happened. Which is my point.
FF
Fastfission wrote:
On 7/31/06, geni geniice@gmail.com wrote:
On 7/31/06, Fastfission fastfission@gmail.com wrote:
In both cases the legal danger is entirely hypothetical and dubious at that -- they are based on stretched and literal interpretations of complicated aspects of the copyright law and nobody has ever been taken to court over them.
Did you just volunteer to be a test case?
Before anyone could be a test case we'd have to have someone taking them to court over the copyrights for Nazi posters. Which to my knowledge has not happened. Which is my point.
AFAIK the copyrights were forfeited at the end of the war.
On 8/3/06, Alphax (Wikipedia email) alphasigmax@gmail.com wrote:
AFAIK the copyrights were forfeited at the end of the war.
No, the government inherited the rights. The State of Bavaria claims the rights to Mein Kampf in all languages but English and Dutch, for example, and they occasionally sue to prevent it being published (see the [[Mein Kampf]] article). But I've never heard of them doing anything like that with photographs (and their suits seem to be based more on a desire not to spread the information than on a concern with copyright infringement).
Again, to my knowledge nobody has ever been sued over Nazi photograph copyrights in the post-WWII era. I use it as an example because a lot of text has been spilled about this issue on the wiki a while back. In situations like this I think it would be easy to have highly specialized categories of "super-low-risk fair use" claims, like one for Nazi photographs.
FF