I mean [[Digital Universe]], which will be WP in Technicolor without errors or trolls, or something? I realise userboxes must be more significant ...
Charles
Personally, I'm looking forward to the arbcom elections; the discussion around here will hopefully get more interesting.
Spangineer
On 1/8/06, charles matthews charles.r.matthews@ntlworld.com wrote:
I mean [[Digital Universe]], which will be WP in Technicolor without errors or trolls, or something? I realise userboxes must be more significant ...
Charles
WikiEN-l mailing list WikiEN-l@Wikipedia.org To unsubscribe from this mailing list, visit: http://mail.wikipedia.org/mailman/listinfo/wikien-l
-- Nathaniel C. Sheetz http://www.personal.psu.edu/ncs124
On 1/8/06, charles matthews charles.r.matthews@ntlworld.com wrote:
I mean [[Digital Universe]], which will be WP in Technicolor without errors or trolls, or something? I realise userboxes must be more significant ...
Charles
What to say. They have $10 million in funding. They would be hard pushed not to produce something at least slightly worthwhile with that. The user editing system sounds rather like the one Encarta are using (indeed the whole project sounds rather similar to Encarta). We shall wait and see.
-- geni
On 1/8/06, charles matthews charles.r.matthews@ntlworld.com wrote:
I mean [[Digital Universe]], which will be WP in Technicolor without errors or trolls, or something? I realise userboxes must be more significant ...
I suspect that it is seldom debated for much the same reason that Nupedia is seldom debated. There is as yet very little to discuss. I wish the project well; it looks as if, in the encyclopedia part, Larry is trying to draw lessons from both Nupedia and Wikipedia.
Some time ago I contributed briefly to a project called h2g2, which (I seem to recall) had a similar two-tier editorial process and a hierarchy of editors. At the time it was hosted by the BBC. The slowness of the bureaucracy led to my drifting off out of boredom. Meanwhile, the success of Wikipedia shows that there is a lot of mileage in the concept, and I hope we'll see many alternatives on the theme.
I think the consensus was, "It'll be humorous and perhaps a little sad to see how little they end up with after blowing $10 million," or maybe that was just my opinion on it.
FF
On 1/8/06, charles matthews charles.r.matthews@ntlworld.com wrote:
I mean [[Digital Universe]], which will be WP in Technicolor without errors or trolls, or something? I realise userboxes must be more significant ...
Charles
On 1/8/06, Fastfission fastfission@gmail.com wrote:
I think the consensus was, "It'll be humorous and perhaps a little sad to see how little they end up with after blowing $10 million," or maybe that was just my opinion on it.
FF
oh I don't know. They could currently afford to spend $11.02 per wikipedia article.
-- geni
geni wrote:
On 1/8/06, Fastfission fastfission@gmail.com wrote:
I think the consensus was, "It'll be humorous and perhaps a little sad to see how little they end up with after blowing $10 million," or maybe that was just my opinion on it.
FF
oh I don't know. They could currently afford to spend $11.02 per wikipedia article.
Which for the average expert is going to mean 5-10 minutes per article, only enough time for the most minimal fact-checking, let alone making it a "featured article". Even if you pinch down and say that a good reference needs only 100K articles, you're still only getting an hour or so on each.
Reading between the lines, I think what they're really hoping is to leverage the $10M for infrastructure, with most expert content creators being "paid" in the form of bylines. For many experts, this will be sufficient draw; they will be able to justify it as part of work, it will be a way to get ahead, etc. DU is also going to be able to advertise itself as a cretin-free zone, which starts to look pretty good after an hour of RC patrol here.
The most concerning aspect for us is the potential to drain away activity in science and technical areas, leaving WP to evolve into the "free encyclopedia of garage bands and manga trivia". (You laugh, but consider vandals gradually blanking math articles that are not on any active editor's watchlist, and not noticed in RC because of the continuous flood of recategorization edits.)
Stan
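The budget figures traded above can be sanity-checked with some quick arithmetic. This is only a back-of-envelope sketch: the article count and the $100/hour expert rate are assumptions inferred from the thread, not figures anyone stated outright.

```python
# Back-of-envelope check of the thread's budget arithmetic.
# Assumptions (not stated in the thread): roughly 907,441 English
# Wikipedia articles at the time, and an expert rate of ~$100/hour.

budget = 10_000_000      # DU's reported funding, in dollars
articles = 907_441       # assumed en.wikipedia article count, Jan 2006
expert_rate = 100        # assumed expert cost, dollars per hour

# geni's figure: dollars available per existing Wikipedia article
per_article = budget / articles
minutes_each = per_article / expert_rate * 60
print(f"${per_article:.2f} per article, about {minutes_each:.0f} minutes of expert time")

# Stan's "pinch down" case: a 100K-article reference work
per_article_100k = budget / 100_000
hours_each = per_article_100k / expert_rate
print(f"${per_article_100k:.2f} per article, about {hours_each:.1f} hour(s) each")
```

At these assumed numbers the figures in the thread line up: about $11.02 and 5-10 minutes per article across all of Wikipedia, or roughly an hour each for a 100K-article reference.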
On 1/9/06, Stan Shebs shebs@apple.com wrote:
The most concerning aspect for us is the potential to drain away activity in science and technical areas, leaving WP to evolve into the "free encyclopedia of garage bands and manga trivia". (You laugh, but consider vandals gradually blanking math articles that are not on any active editor's watchlist, and not noticed in RC because of the continuous flood of recategorization edits.)
Stan
Doubtful. They are looking to recruit PhD plus. A lot of our science articles are written by students and people with BScs.
-- geni
geni wrote:
On 1/9/06, Stan Shebs shebs@apple.com wrote:
The most concerning aspect for us is the potential to drain away activity in science and technical areas, leaving WP to evolve into the "free encyclopedia of garage bands and manga trivia". (You laugh, but consider vandals gradually blanking math articles that are not on any active editor's watchlist, and not noticed in RC because of the continuous flood of recategorization edits.)
Stan
Doubtful. They are looking to recruit PhD plus. A lot of our science articles are written by students and people with BScs.
Somewhere in the verbiage they mentioned the PhDs as leaders accepting contributions. Students (including a professor's own students) and talented amateurs could find themselves more motivated to tweak a high-quality high-reputation article rather than to try to turn an almost-English stub into something minimally coherent.
DU has a lot of hurdles, but they know all about WP and are seeking to capitalize on our weakest points. We shouldn't get too complacent.
Stan
Stan Shebs wrote
DU has a lot of hurdles, but they know all about WP and are seeking to capitalize on our weakest points. We shouldn't get too complacent.
Agreed. But I don't mind the division of labour too much if we accept the bright grad students with time on their hands, who IMO are just the people to push our coverage off the well-marked paths onto more current areas, and leave them the post-docs with tenure worries.
Charles
On Mon, 2006-01-09 at 06:25 -0800, Stan Shebs wrote:
Reading between the lines, I think what they're really hoping is to leverage the $10M for infrastructure, with most expert content creators being "paid" in the form of bylines. For many experts, this will be sufficient draw; they will be able to justify it as part of work, it will be a way to get ahead, etc.
This doesn't happen. Academics and so forth are busy and expect to be paid for this type of thing. Bylines in an encyclopaedia don't cut it any more.
Will be interesting.
Justinc
Justin Cormack wrote:
On Mon, 2006-01-09 at 06:25 -0800, Stan Shebs wrote:
Reading between the lines, I think what they're really hoping is to leverage the $10M for infrastructure, with most expert content creators being "paid" in the form of bylines. For many experts, this will be sufficient draw; they will be able to justify it as part of work, it will be a way to get ahead, etc.
This doesn't happen. Academics and so forth are busy and expect to be paid for this type of thing. Bylines in an encyclopaedia don't cut it any more.
Unless things have changed a lot since I was in academia, there are lots of things one doesn't get paid for, such as peer review of papers. Computing Reviews didn't pay me for the paper and book reviews I wrote for them. For that matter, academics rarely get paid for publishing papers in the first place; they even pay to get them published (the "page charges").
Most likely, DU will have an (unpublicized) sliding scale, where the big names or otherwise desirable contributors can get cash, and the smaller names will have to settle for less. By buying the prestigious, they're hoping to gain instant credibility, not unlike journals hoping to increase their prestige by recruiting big names for their editorial board.
WP's best response is to continue with our program of increasing the scholarly rigor (source citation, etc.), and highlight how we can do all the good things that experts do, but without being handicapped by the gamesmanship and careerism that afflict the usual institutions. Sort of a return to the best traditions of pure scholarship, if you will.
Stan
On 1/9/06, Stan Shebs shebs@apple.com wrote:
geni wrote:
On 1/8/06, Fastfission fastfission@gmail.com wrote:
I think the consensus was, "It'll be humorous and perhaps a little sad to see how little they end up with after blowing $10 million," or maybe that was just my opinion on it.
FF
oh I don't know. They could currently afford to spend $11.02 per wikipedia article.
Which for the average expert is going to mean 5-10 minutes per article, only enough time for the most minimal fact-checking, let alone making it a "featured article". Even if you pinch down and say that a good reference needs only 100K articles, you're still only getting an hour or so on each.
If the source text written for free is well referenced an hour wouldn't be too bad. Of course, what people really seem to be missing is that the $10 million is merely the seed money. *If* they can get some good content created by that (even just 2000 articles would probably do it), there should be no problem raising more money (through donations, sales, services, etc.).
What did Wikipedia start with, half a million and one paid expert? Then Larry was fired for lack of funding. I'm sure in his opinion, if Wikipedia could have afforded to pay 10 times as many experts for twice as long, they wouldn't have the credibility problem they have today.
Anthony
Anthony DiPierro wrote:
If the source text written for free is well referenced an hour wouldn't be too bad. Of course, what people really seem to be missing is that the $10 million is merely the seed money. *If* they can get some good content created by that (even just 2000 articles would probably do it), there should be no problem raising more money (through donations, sales, services, etc.).
This is possible, but I think still a bit of a stretch. Apart from starting with no articles, they have a number of disadvantages:
- They're a commercial organization rather than a non-profit, which tends to make people less willing to donate and volunteer
- They are known to have $10m, which makes people less likely to donate or work for free
- They appear to be charging for user accounts, which will drastically reduce the number of people who create them
- The end result appears to be under a murky and possibly proprietary license, which will not encourage people to work on it for free
Not necessarily fatal flaws, but I'd say it's a long-shot that they will be a serious competitor to Wikipedia anytime in the near future.
-Mark
On 1/9/06, Delirium delirium@hackish.org wrote:
Anthony DiPierro wrote:
If the source text written for free is well referenced an hour wouldn't be too bad. Of course, what people really seem to be missing is that the $10 million is merely the seed money. *If* they can get some good content created by that (even just 2000 articles would probably do it), there should be no problem raising more money (through donations, sales, services, etc.).
This is possible, but I think still a bit of a stretch. Apart from starting with no articles, they have a number of disadvantages:
I'm not sure what you're comparing them to, but I'll comment on your points as is.
- They're a commercial organization rather than a non-profit, which tends to make people less willing to donate and volunteer
Digital Universe Foundation is a non-profit organization incorporated in Nevada. See for yourself at https://esos.state.nv.us/SOSServices/AnonymousAccess/CorpSearch/CorpSearch.a.... There is a for-profit which runs the ISP, but supposedly this is set up in a way so that most of the profits from the for-profit go to the non-profit. I'm not sure exactly what the arrangement is, though.
- They are known to have $10m, which makes people less likely to donate or work for free
Small-time people, maybe, but the larger grant-makers they are probably targeting are actually more likely to give to a company which isn't hanging on by a thread.
For donations of time, I'm not sure I agree, but you might be right. I think it depends more on how they spend their money than how much of it there is, though.
- They appear to be charging for user accounts, which will drastically reduce the number of people who create them
I have an account and didn't pay anything. AFAIK you only have to pay if you subscribe to the ISP.
- The end result appears to be under a murky and possibly proprietary license, which will not encourage people to work on it for free
Seems to me they will release *some* things under proprietary licenses and some things under free licenses. I certainly agree it will be tough to get people to contribute to those parts under the proprietary licenses for free, though if it's set up right maybe not too hard (people will donate time to proprietary non-profit projects in some circumstances).
What's murky, it seems, is exactly what they're going to do. I'm sure it'll be clear once they start publicizing this.
Not necessarily fatal flaws, but I'd say it's a long-shot that they will be a serious competitor to Wikipedia anytime in the near future.
-Mark
Well, you seem to have been misinformed on all your points. But I see them filling a different niche from Wikipedia anyway.
I have my doubts as to whether or not DU will be successful. It's really a matter of how well it's managed. I think the idea is a good one, though. (I've personally watched a company I've co-founded, with a similar idea, waste millions of dollars and go out of business, during the dot-com days. Actually when I showed this to one of the other co-founders he asked me if this was our old CEO, who botched the thing up back then, trying the idea again. It isn't.)
Anthony
On 9 Jan 2006, at 19:34, Anthony DiPierro wrote:
Well, you seem to have been misinformed on all your points. But I see them filling a different niche from Wikipedia anyway.
I have my doubts as to whether or not DU will be successful. It's really a matter of how well it's managed. I think the idea is a good one, though. (I've personally watched a company I've co-founded, with a similar idea, waste millions of dollars and go out of business, during the dot-com days. Actually when I showed this to one of the other co-founders he asked me if this was our old CEO, who botched the thing up back then, trying the idea again. It isn't.)
I just hope they feel that they can use some of what we have done (Commons, perhaps, as a minimum) and don't try to remake the world from scratch. And, realising this, use a compatible license.
Justinc
On 1/9/06, Justin Cormack justin@specialbusservice.com wrote:
On 9 Jan 2006, at 19:34, Anthony DiPierro wrote:
Well, you seem to have been misinformed on all your points. But I see them filling a different niche from Wikipedia anyway.
I have my doubts as to whether or not DU will be successful. It's really a matter of how well it's managed. I think the idea is a good one, though. (I've personally watched a company I've co-founded, with a similar idea, waste millions of dollars and go out of business, during the dot-com days. Actually when I showed this to one of the other co-founders he asked me if this was our old CEO, who botched the thing up back then, trying the idea again. It isn't.)
I just hope they feel that they can use some of what we have done (Commons, perhaps, as a minimum) and don't try to remake the world from scratch. And, realising this, use a compatible license.
Justinc
Is there a license that isn't compatible with the stuff on commons?
On 10 Jan 2006, at 00:44, Anthony DiPierro wrote:
Is there a license that isn't compatible with the stuff on commons?
Well 2 ways they could go:
1. Not use commons as they are not interested
2. Not produce media that we could use on commons (proprietary).
I can't see any license reason why they can't use what's on Commons; it's pretty free.
Justinc
On 1/9/06, Justin Cormack justin@specialbusservice.com wrote:
On 10 Jan 2006, at 00:44, Anthony DiPierro wrote:
Is there a license that isn't compatible with the stuff on commons?
Well 2 ways they could go:
1. Not use commons as they are not interested
2. Not produce media that we could use on commons (proprietary).
I can't see any license reason why they can't use what's on Commons; it's pretty free.
Justinc
Yeah, I hope they keep the vast majority of their content non-proprietary (I'd say all of it, but I've kind of given up on that one). Otherwise there's not much of a point. There's plenty of no-cost proprietary information already out there. I guess most of it isn't ad-free, but I don't see that as *that* big of a draw.
Using the GFDLed stuff is more problematic. Using it was, in hindsight, probably the biggest mistake made by Wikipedia. It might be worth starting over from scratch just to get rid of the ties to the GFDL. Of course one can always hope that the FSF is finally going to fix that license, but we've been asking for years and it hasn't happened yet.
Anthony
On 10 Jan 2006, at 01:27, Anthony DiPierro wrote:
Using the GFDLed stuff is more problematic. Using it was, in hindsight, probably the biggest mistake made by Wikipedia. It might be worth starting over from scratch just to get rid of the ties to the GFDL. Of course one can always hope that the FSF is finally going to fix that license, but we've been asking for years and it hasn't happened yet.
Out of interest, what's your problem (and fix)?
Justinc
On 1/9/06, Justin Cormack justin@specialbusservice.com wrote:
On 10 Jan 2006, at 01:27, Anthony DiPierro wrote:
Using the GFDLed stuff is more problematic. Using it was, in hindsight, probably the biggest mistake made by Wikipedia. It might be worth starting over from scratch just to get rid of the ties to the GFDL. Of course one can always hope that the FSF is finally going to fix that license, but we've been asking for years and it hasn't happened yet.
Out of interest, what's your problem (and fix)?
Not Anthony, but thought I'd put my oar in. I'd rather the GFDL had been something like the GPL, but for documents. Easy to understand, not very burdensome requirements. Instead you get all the complexity about invariant sections, about author attribution, about prohibiting DRM, about all kinds of things.
Part of the problem was that the FSF seem to have been only thinking of conventional books when they invented the GFDL. Many of its requirements (e.g. having to include the whole license with any re-use, rather than simply a pointer to it) make sense with a book but not if you want to re-use just a tiny portion of the work.
Bear in mind that Wikipedia is in actuality rather far from complete compliance with the GFDL itself, even though we officially 'turn off' parts of the GFDL for Wikipedia. I'm not the person to go to about this, but others on this list have studied this in more depth.
The Creative Commons people got a lot more right, although people have issues with parts of it, e.g. no requirement for the original source to be available.
-Matt
On 1/9/06, Matt Brown morven@gmail.com wrote:
On 1/9/06, Justin Cormack justin@specialbusservice.com wrote:
On 10 Jan 2006, at 01:27, Anthony DiPierro wrote:
Using the GFDLed stuff is more problematic. Using it was, in hindsight, probably the biggest mistake made by Wikipedia. It might be worth starting over from scratch just to get rid of the ties to the GFDL. Of course one can always hope that the FSF is finally going to fix that license, but we've been asking for years and it hasn't happened yet.
Out of interest, what's your problem (and fix)?
Not Anthony, but thought I'd put my oar in. I'd rather the GFDL had been something like the GPL, but for documents. Easy to understand, not very burdensome requirements. Instead you get all the complexity about invariant sections, about author attribution, about prohibiting DRM, about all kinds of things.
Actually I don't see any reason why using the GPL unmodified wouldn't have been a much better solution than using the GFDL. Sure, the GPL was designed for software, but the GFDL was designed for software documentation. The GPL defines "Program" as "any program or other work which contains a notice placed by the copyright holder saying it may be distributed under the terms of this General Public License" and "source code" as "the preferred form of the work for making modifications to it", so it *can* be applied to Wikipedia AFAICT.
Part of the problem was that the FSF seem to have been only thinking of conventional books when they invented the GFDL. Many of its requirements (e.g. having to include the whole license with any re-use, rather than simply a pointer to it) make sense with a book but not if you want to re-use just a tiny portion of the work.
I believe the GPL imposes this requirement too, though. The best solution (again, in hindsight) would have been a custom license.
Bear in mind that Wikipedia is in actuality rather far from complete compliance with the GFDL itself, even though we officially 'turn off' parts of the GFDL for Wikipedia. I'm not the person to go to about this, but others on this list have studied this in more depth.
The Creative Commons people got a lot more right, although people have issues with parts of it, e.g. no requirement for the original source to be available.
-Matt
I doubt the GFDL accomplishes much wrt making the original source available either, though (at least in terms of Wikipedia). The GFDL lists HTML as an example of a "transparent file format", which probably means there is no requirement to distribute the original wikitext.
Anthony
On 1/9/06, Justin Cormack justin@specialbusservice.com wrote:
On 10 Jan 2006, at 01:27, Anthony DiPierro wrote:
Using the GFDLed stuff is more problematic. Using it was, in hindsight, probably the biggest mistake made by Wikipedia. It might be worth starting over from scratch just to get rid of the ties to the GFDL. Of course one can always hope that the FSF is finally going to fix that license, but we've been asking for years and it hasn't happened yet.
Out of interest, what's your problem (and fix)?
Justinc
My problem is pretty much everything in the GFDL that isn't in CC-by-SA. See Section 4 of the GFDL - everything there except the requirement to attribute authors (and personally I don't even like that requirement, which is arguably too onerous for a wiki, but it's a compromise to at least keep in the spirit of the GFDL).
The perfect fix (that is at all possible) would be to create a new version of the GFDL which says something to the effect of "You may opt to apply the terms of the Creative Commons Attribution-ShareAlike 2.0 instead of this license." Of course, it's probably more realistic to expect more gradual changes.
Anthony
--- Anthony DiPierro wikilegal@inbox.org wrote:
Using the GFDLed stuff is more problematic. Using it was, in hindsight, probably the biggest mistake made by Wikipedia. It might be worth starting over from scratch just to get rid of the ties to the GFDL. Of course one can always hope that the FSF is finally going to fix that license, but we've been asking for years and it hasn't happened yet.
What requests has Wikipedia made to the FSF to improve the GFDL to date? Has there been any response? Has this been documented somewhere?
-- Matt
Wikipedia: http://en.wikipedia.org/wiki/User:Matt_Crypto Blog: http://cipher-text.blogspot.com
charles matthews wrote:
I mean [[Digital Universe]], which will be WP in Technicolor without errors or trolls, or something? I realise userboxes must be more significant ...
Well, it's hard to tell what it'll turn out to be until something starts happening.
So long as their license is compatible with the GFDL (which if they bootstrap from Wikipedia it'll have to be), I see no problems with it. If they produce useful improvements in content quality on some articles that they get reviewed by experts, we can fold in those changes to Wikipedia, too.
-Mark
On 1/8/06, Delirium delirium@hackish.org wrote:
So long as their license is compatible with the GFDL (which if they bootstrap from Wikipedia it'll have to be), I see no problems with it. If they produce useful improvements in content quality on some articles that they get reviewed by experts, we can fold in those changes to Wikipedia, too.
I recall hearing they were planning to use a combination of all-rights-reserved full copyright and an as-yet-undetermined Creative Commons license (probably one incompatible with the GFDL).
On 1/9/06, Wikiacc wikiacc@gmail.com wrote:
On 1/8/06, Delirium delirium@hackish.org wrote:
So long as their license is compatible with the GFDL (which if they bootstrap from Wikipedia it'll have to be), I see no problems with it. If they produce useful improvements in content quality on some articles that they get reviewed by experts, we can fold in those changes to Wikipedia, too.
I recall hearing they were planning to use a combination of all-rights-reserved full copyright and an as-yet-undetermined Creative Commons license (probably one incompatible with the GFDL).
According to Larry Sanger, they will not use Wikipedia content to "bootstrap" their content.
-Andrew (User:Fuzheado)