Daniel Mayer (maveric149) wrote:
Toby Bartels wrote:
Certainly we can't blame the Grimm brothers for not copylefting their work. But we can observe the consequences of their work's public domain status and compare this to what might have happened had it been copyleft instead.
Even if they had, the situation for their original work would be the same: it would have fallen into the public domain by the time Disney got their hands on it.
Well, if we're going to consider what it might have been like if copyleft had been thought of back in the Grimms' day, then we might as well consider what it might have been like if copyrights had lasted as long then as they do now. If so, then the standard 7th edition could still have been under copyright when Disney made his movie. ^_^
Of course, the point is not what the Grimms ''should'' have done; they had no choice in the matter. The point is what would have happened under various circumstances. Such hypothesising won't be very historical.
The Disney example isn't ideal; the WikiNews example is better. But for that example, you just state, as if you know it for a fact, that the material will be used just as much if it's copyleft as if it isn't. The Disney example is a clear case where it would have been used less. The fact remains, however, that until you demonstrate that copyleft does ''not'' decrease usage, your argument is not valid.
Hm. Proving a negative. Interesting logic. Prove to me that UFOs do not exist! ;)
You smile, but this would be an absolutely legitimate demand if I were to propose a policy that would only be a good policy if UFOs did not exist.
Keep in mind, you're the one that's proposing a stricter policy. So it's no surprise that I'm asking you to justify assumptions that go into making that policy!
Of course, you don't really have to ''prove'' anything absolutely; in this analogy, it'd be enough to explain why UFOs are implausible. And it's quite possible that I could do that, if I had to.
Let's go through the argument carefully:
- There are two free possibilities: copyleft (A) and non-copyleft (B).
- Under possibility (A), essentially all of the derivative works can be fed back into the original programme.
- Under possibility (B), some of the derivative works can be fed back, while some of them will become proprietary and thus cannot.
- Conclusion: More derivative works will be available for positive feedback under (A) than under (B).
Yes, possibility B creates more forks. Exactly my point - thank you for proving it for me. :)
Aargh!!! No, I did ''not'' prove it for you!
I went through the argument carefully, so that anybody could see that in fact the conclusion remains ''unproven''. I mean, the conclusion definitely does not follow from the argument that I wrote above!
There is a difference between «greater proportion» (proven) and «greater amount» (unproven): a greater ''proportion'' of the derivative works can be fed back under (A), but if (B) generates more derivative works overall, then the absolute ''amount'' available for feedback could still be greater under (B). Ignoring this difference leads to a host of statistical fallacies.
However, all of this may be moot now:
Under possibility B the number of content forks (not just mirrors) is greatly increased, thus update energy is diluted.
Now, ''this'' is a new argument. This is a premise that I never stated, and I never noticed before that you were claiming it to be true. So now maybe this will fill in the logical gap! ^_^
So, let me see if I have this straight. First, you're saying that there will be more derivative works with (B) than with (A). I have to agree that I find this quite plausible. ^_^
Then, this introduces a new factor, which is dilution of effort; presumably lowering the average quality of each fork (even ''before'' we consider whether they can be recombined). I don't think that I agree with this, since more derivative works may be spread more widely, causing more people to work on improvement in the first place. This is particularly likely in the WikiNews example, since any hesitation on the part of news organisations to redistribute GNU FDL or CC-by-sa stories translates directly into a less widespread audience. I expect that the greatest positive feedback from WikiNews will come not from how news redistributors edit our dispatches but from readers flocking back to WikiNews through an attribution link. After all, once the newspaper prints it, it's no longer news. ^_^
Under possibility A update energy is concentrated on a smaller number of forks, and when there are forks any modifications made to them can be used by all the others. This reduces duplicated effort and gives more time to create more content (mirrors and abridged versions will also propagate all over the place).
OK, this is a very important point. Again, I don't believe that this would apply to WikiNews; I doubt that WikiNews could develop from reincorporating forks, since the lifetime of a news article would be so short. Instead, WikiNews would benefit from wide distribution by attracting new editors. Forks would still be a bad thing, but any fork of WikiNews would fork only the editing community, not very much of the content. A fork could be brought back in, with only a few days' delay to wait for the old articles to become obsolete.
In fact, this reminds me of what you wrote earlier in your post, which I skipped earlier in this reply:
When content goes into the public domain anybody can do whatever they want with it. This is especially fine by me for reference works, since by the time their copyright expires they are already old and probably way out of date. I see nothing wrong with that. I just want to make sure that before that happens the reference content Wikimedia creates is protected by copyleft. Eventually all Wikimedia content will also enter the public domain, but so long as derivative works are being made, then the most recent and useful versions of that content will be copyleft.
In the case of Wikipedia, there are several forks (all defunct, I believe, except for EL) whose content remains up to date, and our copyleft licence ensures that none of these will have to remain forked permanently for copyright reasons. However, after 95 years (or even the 14 years of the first US statute), a new fork of the (by then PD) 2004 material in 2099 (or even 2018) would be hopelessly out of date and thus nothing to worry about.
With WikiNews, however, this deadline would arrive even faster. Just as 95 years is an insanely long time to protect an encyclopaedia article, so even 14 years would be an insanely long time to protect a news story. One day might be enough, and one week would surely suffice; so it matters less if the protection dwindles to zero. Any proprietary fork would have only a week's worth of useful material.
Your original conclusion would then follow from this premise:
- Just as many derivative works will be made under (A) as under (B).
If I indicated that (which I'm pretty sure I did not), then I was mistaken.
Don't worry, I didn't think that you had indicated this. This premise would fill the logical gap in the bullet-point argument, so I mentioned it in order to clarify the nature of that gap. I'm glad that you don't actually believe this, because neither do I! ^_^
But now you've given more detailed reasoning in another direction, so if you agree that the bullet-point argument is insufficient by itself, then we can set the above mistaken premise aside.
----
So let me summarise the ideas at hand.
First of all, free noncopyleft documents (like CC-by and PD documents) will be more widely distributed and will generate more derivative works. On the one hand, this will generate more public interest in the project (especially if there is attribution, as a CC-by licence would enforce), causing more people to get involved in it. But on the other hand, these derivative works may fork the project, and such a fork might get saddled with a nonfree licence, making the fork permanent.
Thus the danger of copyleft is that people may redistribute fewer documents, depressing the growth in Wikimedia popularity and new editors, not to mention the value that seeing the documents brings to the public. But at the same time, the value of copyleft is that any forks can definitely be brought back to the original project if wanted, and if nothing else, they will necessarily remain free for future users.
So there is a tension between wanting wide distribution to a point, so that many people get to use the work, some of whom will contribute, and wanting to keep things close enough to home to prevent forking and the saddling of our free content with proprietary crap. This is reflected in the tension over the pragmatic case for copyleft.
Now, assuming that I understand your position correctly, you believe that the tension should always resolve in the same way. It's always better that the material stay cohesive, with contributors working under a single free licence right here on Wikimedia.
That's certainly a reasonable pragmatic position for Wikipedia; there is little point in having encyclopaedia articles mirrored, since people only look in an encyclopaedia on special occasions. When they want to search for information, they'll use Google and we're much better off if they hit Wikipedia consistently. (Here I'm ignoring the people without access to the Internet who will eventually look stuff up in printed excerpts from Wikipedia, because that situation isn't conducive to forking in the first place.)
But the situation with WikiNews will probably be very different. First of all, many people consistently look for news every day, and they often want to get their news from a local source (like a newspaper or radio broadcast) that covers local events. So WikiNews will be useful to them only if local media picks it up. But also, if it gets redistributed ''too'' far away, to the point that a new project with forked content appears, then even if this fork uses proprietary copyright content, those copyrights won't prevent combining the forked project with WikiNews, should its editors agree -- only the archives would have to remain separate.
Would WikiNews be so different from Wikipedia that the tension should be resolved in a different way? Since WikiNews is now only an idea, not yet begun even in the sense of developing policies and a userbase, I don't think that any of us is in much of a position to judge. It should be the Wikimedia Foundation's policy that, whatever may come up in the future (like WikiNews), our goal shall be to develop content for free use by others. But it should not be the Wikimedia Foundation's policy that copyleft shall always be the best way to realise that goal. We may yet find ourselves in a situation where it isn't.
In any specific case, there will still be a lot of precedent to overcome before Wikimedia puts out anything that doesn't have a copyleft licence. But if there's a reason good enough to overcome Wikimedians' objections, then it shouldn't have to go through a by-law change as well.
By-laws can be changed by a simple majority of the trustees. If in the future we find that a particular project, such as Wikinews, isn't doing so well, and we suspect that the cause is its copyleft license, then a change can be made to make an exception.
This is why getting the right policy now is not absolutely necessary. Still, I believe that getting the right policy now is a good idea. Otherwise, we'll just have to go through this whole discussion again at the trustees' meeting. ^_^
Perhaps the discussion will be easier to have in that future potentiality, when there will be more concrete issues at hand. Maybe we should just retain the ambiguous wording that Anthère first asked about???
-- Toby