On 30 June 2011 17:00, Alec Conroy alecmconroy@gmail.com wrote:
[a git-like distributed wikisphere]
It's not my idea; I believe it's been independently suggested at least five different times that I know of. But it's a HUGE step that would require a big, bold push from developers and thus potentially a large initial commitment from the foundation to spur development of such a thing. That commitment might not be huge in terms of resources-- a few professional lead developer-coordinators, perhaps. But it would require some courage, leadership, and a vision to rally volunteer developers around. If you visibly agree to it being built, an amorphous 'they' will likely show up to actually build it for you, free of charge. It will radically change things for everyone the instant such a tool is actually created.
Adapting MediaWiki to git has been tried a few times. I suspect the problem is that the software deeply assumes a database behind it, not a version-controlled file tree. Wrong model for an easy fix to MediaWiki itself.
Pouring en:wp's entire history into git is feasible (Greg Maxwell posted about doing it, IIRC).
svnwiki exists - a wiki engine which uses files version-controlled by Subversion. Perhaps something like that - articles as files in a git repository, read by the new parser when that's done.
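To make that more concrete, here is a rough sketch of what 'articles as files in a git repository' might look like: one wikitext file per article, one commit per edit. The repository layout, file naming and helper functions are all hypothetical, and it assumes git is installed.

import os
import subprocess

REPO = "wiki-repo"  # hypothetical local path for the article repository

def git(*args):
    # Run a git command inside the article repository and return its output.
    return subprocess.check_output(["git", "-C", REPO] + list(args), text=True)

def init_repo():
    # Create the repository and give it a committer identity to commit with.
    os.makedirs(REPO, exist_ok=True)
    subprocess.check_call(["git", "init", "-q", REPO])
    git("config", "user.name", "WikiStorage")
    git("config", "user.email", "storage@wiki.invalid")

def article_path(title):
    # One wikitext file per article; spaces become underscores, as on-wiki.
    return title.replace(" ", "_") + ".wiki"

def save_revision(title, wikitext, editor, summary):
    # One edit becomes one commit, with the editor recorded as the author.
    with open(os.path.join(REPO, article_path(title)), "w", encoding="utf-8") as f:
        f.write(wikitext)
    git("add", article_path(title))
    git("commit", "-q", "-m", summary,
        "--author", "%s <%s@wiki.invalid>" % (editor, editor))

def history(title):
    # The article's edit history is just the file's commit log.
    return git("log", "--oneline", "--", article_path(title))

if __name__ == "__main__":
    init_repo()
    save_revision("Gilgamesh", "'''Gilgamesh''' was a king of Uruk.\n",
                  "ExampleEditor", "Start the article")
    save_revision("Gilgamesh", "'''Gilgamesh''' was a legendary king of Uruk.\n",
                  "AnotherEditor", "Add 'legendary'")
    print(history("Gilgamesh"))

The point is that page history, attribution and (eventually) branching and merging would come for free from the version control layer, leaving the wiki engine to worry only about parsing and display.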
Such a wiki is inevitable, I just hope we can be the ones to develop it.
Someone else could actually do it without our weight of organisational inertia and NIH. We need competitors.
Further to your idea: people developing little specialist wikis along these lines, and said wikis being mergeable. This makes such wikis easier to start, without having to start yet another wiki-based general encyclopedia that directly competes with Wikipedia. Disruptive innovation starts in niches, not in a position where it'll just end up a bug on Wikipedia's windscreen.
- d.
On Thu, Jun 30, 2011 at 10:35 AM, David Gerard dgerard@gmail.com wrote:
Adapting MediaWiki to git has been tried a few times. I suspect the problem is that the software deeply assumes a database behind it, not a version-controlled file tree. Wrong model for an easy fix to MediaWiki itself.
Yeah, I don't mean it 'quite' that literally. I agree that literally incorporating git would be unlikely to be a simple solution-- I mostly mean 'incorporate the lessons of git'. A lot of those lessons could be implemented with existing software if the culture changes. Sandbox articles, for example, would be trivial to implement-- just stop deleting them. New project creation is about to get a LOT easier with the Incubator extension. We don't yet have the technology to do distributed hosting, but we could do some sort of 'donated funds' hosting for specialist wikis that we might not feel comfy actually using our regular donations for. (Or we could decide the cost of such specialist wikis is negligible and just host them ourselves with our own donations... case by case.)
The biggest barriers are cultural, not technological. We all 'grew up' hearing that "forks are evil", but funny story: it turns out forks not only aren't evil, they're a huge advantage in collaborative document creation. Forks are only evil because we haven't learned the lessons of git yet, so we don't have a 'merge' yet.
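To make that concrete, here is a minimal sketch of what a 'merge' of two forked articles could look like: a plain three-way text merge against their common base revision. It assumes git is installed and uses its merge-file command; the article text below is invented purely for illustration.

import os
import subprocess
import tempfile

def merge_forks(base, ours, theirs):
    # Three-way merge of two forked versions against their common ancestor,
    # using git's merge-file on temporary files. Returns (text, conflicts).
    paths = []
    for text in (ours, base, theirs):  # the argument order git merge-file expects
        fd, path = tempfile.mkstemp(suffix=".wiki")
        with os.fdopen(fd, "w", encoding="utf-8") as f:
            f.write(text)
        paths.append(path)
    try:
        result = subprocess.run(["git", "merge-file", "--stdout"] + paths,
                                capture_output=True, text=True)
        # merge-file's exit status is the number of conflicts (negative on error)
        return result.stdout, max(result.returncode, 0)
    finally:
        for path in paths:
            os.remove(path)

if __name__ == "__main__":
    base   = "Gilgamesh was a king of Uruk.\nHe appears in Sumerian poems.\n"
    fork_a = "Gilgamesh was a legendary king of Uruk.\nHe appears in Sumerian poems.\n"
    fork_b = "Gilgamesh was a king of Uruk.\nHe appears in five Sumerian poems.\n"
    merged, conflicts = merge_forks(base, fork_a, fork_b)
    print(merged)                   # both forks' changes, combined
    print("conflicts:", conflicts)  # 0 here; real forks won't always merge cleanly

A real wiki merge would also have to cope with conflicts and with edits that share no common ancestor, but the basic operation really is that simple.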
Alec
On 06/30/2011 07:35 PM, David Gerard wrote:
Further to your idea: people developing little specialist wikis along these lines, and said wikis being mergeable. This makes such wikis easier to start, without having to start yet another wiki-based general encyclopedia that directly competes with Wikipedia. Disruptive innovation starts in niches, not in a position where it'll just end up a bug on Wikipedia's windscreen.
Some things I believe could be easily programmed:
* Ability to surf through multiple wikis. For example, you could be reading an article on a specialist wiki such as http://memory-alpha.org/wiki/Darmok_%28episode%29 ; upon clicking the link Gilgamesh, you would be taken to http://en.wikipedia.org/wiki/Gilgamesh ; when you further browse Wikipedia and click on Star Trek you would go not to Wikipedia's article but back to http://memory-alpha.org/wiki/Star_Trek .
** This could be more easily applied to multilingual wikis. For example, you could select which languages you know; and when you click on a link, you would be taken not to the current language but to the best article available in any of your languages.
* Ability to view diffs between two articles on two wikis. I believe this would be very easy to do (see the sketch after this list).
* Ability to edit from diff (when you view a diff, you could select which differences you want to insert into the article, and which you want to discard). This could be very useful even within a single wiki.
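As a rough illustration of the cross-wiki diff point above, here is a sketch that fetches the current raw wikitext of the same title from two MediaWiki wikis and produces a unified diff. The ?action=raw URL pattern is standard MediaWiki; the second wiki's address below is only a placeholder.

import difflib
import urllib.parse
import urllib.request

def fetch_wikitext(script_path, title):
    # Fetch the current raw wikitext of an article from a MediaWiki install.
    # A descriptive User-Agent is polite (and required by Wikimedia's servers).
    url = "%s/index.php?title=%s&action=raw" % (script_path,
                                                urllib.parse.quote(title))
    req = urllib.request.Request(url,
                                 headers={"User-Agent": "cross-wiki-diff-sketch/0.1"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

def cross_wiki_diff(wiki_a, wiki_b, title):
    # Unified diff of the same article title as it stands on two wikis.
    text_a = fetch_wikitext(wiki_a, title).splitlines()
    text_b = fetch_wikitext(wiki_b, title).splitlines()
    return "\n".join(difflib.unified_diff(text_a, text_b,
                                          fromfile=wiki_a, tofile=wiki_b,
                                          lineterm=""))

if __name__ == "__main__":
    # en.wikipedia's script path really is /w; the second wiki is a placeholder.
    print(cross_wiki_diff("https://en.wikipedia.org/w",
                          "https://specialist-wiki.example.org/w",
                          "Gilgamesh"))

Turning such a diff into an editable merge view (the third point above) is mostly a user interface problem on top of the same data.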
On 1 July 2011 07:58, Nikola Smolenski smolensk@eunet.rs wrote:
On 06/30/2011 07:35 PM, David Gerard wrote:
Further to your idea: people developing little specialist wikis along these lines, and said wikis being mergeable. This makes such wikis
Some things I believe could be easily programmed:
Per HaeB's link, this is a perennial proposal. People like the idea, but in eighteen years - back as far as the Interpedia proposal, before wikis existed - no-one has made one that works. Why not? What's failing to go on here?
- d.
On 07/01/2011 09:15 AM, David Gerard wrote:
On 1 July 2011 07:58, Nikola Smolenski smolensk@eunet.rs wrote:
On 06/30/2011 07:35 PM, David Gerard wrote:
Further to your idea: people developing little specialist wikis along these lines, and said wikis being mergeable. This makes such wikis
Some things I believe could be easily programmed:
Per HaeB's link, this is a perennial proposal. People like the idea, but in eighteen years - back as far as the Interpedia proposal, before wikis existed - no-one has made one that works. Why not? What's failing to go on here?
Per HaeB's link, IMO no proposal was specific enough, and no proposal was actually done.
On Fri, Jul 1, 2011 at 12:21 AM, Nikola Smolenski smolensk@eunet.rs wrote:
On 07/01/2011 09:15 AM, David Gerard wrote:
Per HaeB's link, this is a perennial proposal. People like the idea, but in eighteen years - back as far as the Interpedia proposal, before wikis existed - no-one has made one that works. Why not? What's failing to go on here?
Per HaeB's link, IMO no proposal was specific enough, and no proposal was actually done.
I don't know why it took so long, but here's my guess. It hasn't worked for the past 18 years because prior to wikipedia, nobody ever got anything like this to work. It took a Jimmy to look at the patent absurdity of 'anyone can edit' encyclopedias and somehow see that it was working in an amazing and world-changing way.
Making just one Wikipedia was crazy enough in 2002-- distributed revision control was only developed years later.
We've only had git for 6 years, and for at least the first 2 years, you'd still talk to people who would swear on intuition that git couldn't work on sheer principle. It was pure insanity-- and kinda like wikipedia, it took one of those handy charismatic genius community-builders to believe in such a silly system. I was a skeptic of both wikipedia and git the first time I heard them described (inaccurately).
At the end of the day, I think the only reason it hasn't happened yet is really simply that nobody has gotten it together and decided to do it, and it's the sort of thing that no for-profit entity can really do, since it's not easy to profit off of. But also, we're the most natural 'end users' of this tool. To the extent that we dominate the wiki field, other wikis may be looking to us to develop this kind of thing, since we have far greater resources and a far greater need. We're the first group in history that really, really needs this tool enough to have a reason of our own to build it. We're the "Revision Control applied to Documents" people-- it's natural we should be the ones to do it.
But honestly, we haven't been ready for that kind of action in the past. Before the fundraising went nuclear, we didn't necessarily have a choice, and we definitely didn't want to mess with our brand before our organizational architecture was stabilized. And we're still not _quite_ ready to launch a major development initiative-- but plans for ramping up innovation and development are in progress, and new innovation is on the horizon. So I think the stars have finally aligned where the organization that needs the tool most could finally actually get it built, if it decides to.
-- If the board issued a statement saying it wanted such a "new model" wiki, announcing small symbolic prizes to the participants who show up to build it, and most importantly, if the WMF promised to let people use it once it's built-- I bet it would get built. The proposal's come up over and over and over for 18 years. There's no shortage of excitement about the ideas.
Unless there's some secret theoretical flaw I don't know about, I think it would just be a matter of how many geek-hours it would take to create it, and whether that's a reasonable use of resources at this stage in our evolution. I've mostly heard 'it's difficult to do' but never 'that software can't be made and here's why'. But obviously, this discussion goes back 18 years, so I haven't read all the threads :)
Alec
On 1 July 2011 09:27, Alec Conroy alecmconroy@gmail.com wrote:
On Fri, Jul 1, 2011 at 12:21 AM, Nikola Smolenski smolensk@eunet.rs wrote:
On 07/01/2011 09:15 AM, David Gerard wrote:
Per HaeB's link, this is a perennial proposal. People like the idea, but in eighteen years - back as far as the Interpedia proposal, before wikis existed - no-one has made one that works. Why not? What's failing to go on here?
Per HaeB's link, IMO no proposal was specific enough, and no proposal was actually done.
I don't know why it took so long, but here's my guess. It hasn't worked for the past 18 years because prior to wikipedia, nobody ever got anything like this to work. It took a Jimmy to look at the patent absurdity of 'anyone can edit' encyclopedias and somehow see that it was working in an amazing and world-changing way.
The fact that Github's git-backed wikis haven't been seized upon suggests to me that there's no demand for a distributed wiki system amongst the *readers*.
It's like the perennial proposal for multiple article versions on Wikipedia for each point of view. This solves a problem for the *writers*, but makes one for the *readers*. They seem to want one source with one article on a topic, else they'd just hit the top ten links in Google instead of going to Wikipedia. (Wikinfo has tried implementing this. Its readership is negligible compared to Wikipedia, but its writers enjoy it.)
Why do people want ten Wikipedias to look up instead of one? They observably don't - they want a source they can quickly look up something in that they can reasonably trust to be useful. They only go to multiple sources if that one starts sucking.
A distributed wiki proposal needs to clearly solve a problem the readers have.
There are several such perennial proposals that are ignored because they are actually about solving problems for the writers, and not solving problems for the readers.
- d.
Why do people want ten Wikipedias to look up instead of one?
Why would people want millions of computers instead of just eight? Why would we want terabytes of memory when we could have just 640 kilobytes? When I go to the library, why are there a gazillion books, instead of just "the best book"?
Why do people want simple english wikipedia, when they could just have english? Why do some people want a "Scientific Point of View" at Rationalwiki or a Conservative Point of View at Conservapedia or a sympathetic point of view? Just because Neutral Point of View is, in my eyes "the best", that doesn't mean we don't want to read other points of view.
A better question is-- why do we want there to be ten Wikipedias instead of just one? And the answer is-- to recruit their users and to copy their best ideas. :) The more people familiar with wikitext, wikis, the wiki concept, and the wikimovement, the better.
A distributed wiki proposal needs to clearly solve a problem the readers have.
There are a lot of real problems readers have that this could help with. Technologically the most obvious is the "Offline Reader" problem. If offline readers would, for example, prefer to be offline editors-- they'll need this.
The problem I'm most excited about is the "Wikipedia Article Quality Problem"-- we all know that Wikipedia's a pot-luck. Sometimes an article is wonderful, sometimes it's horrible. Most importantly, many of our articles aren't necessarily getting better with time, and in some instances they seem to be eroding away as editors realize that there's "no point" to putting too much polish on an article that's under such active development. In this light, distributed development would be a little like Flagged Revisions, only far, far more powerful.
Another problem readers have is the "Openness Problem"-- namely, our readers would love to be editors if they had a way that worked for them. For whatever reason, these people can't handle the full-blown Wikipedia experience. We're not sure whether it's the massively-peer-reviewed culture or the highly-templated wikitext code that's the larger barrier, but the fact is, some readers just can't handle Wikipedia, so instead they're choosing not to participate at all, even though they have the inclination to.
Lessons from distributed revision control suggest that a small "sandbox space" feeder project would be more inviting. It would let experts who know how to write in a pre-wiki style come as they are, and help contribute without having to change themselves or their writing style. On Wikipedia itself, the usual process would then take over and use good parts from these articles to make an even better Wikipedia article.
Suppose "I" am a prestigious, recently-retired science professor willing to spend time working on one of our projects. Imagine all the BS I will have to put up with. Just think about all the hoops I will have to jump through-- learn wikitext, learn wikiculture, learn the rules, handle being insulted, handle having people who know less than I do incorrectly delete my correct contributions. "I"'ve spent a lifetime in one kind of culture writing one sort of way, and I've been getting a lot of respect for it. If I want to help here, "I" have to learn computer code, learn a new culture, learn a very new and different writing style with lots of different rules, and I have to deal with being disrespected by people who know less about the subject than I do. (Again, I am none of these things myself, but I'm sure such people exist.)
The "Lesson of Git" is that we really should be letting these people just "come as they are" and start writing for us, with the knowledge that some of it will be good, some of it will be bad, but as long as it's all written in good faith, the net effect on our project will be positive.
This solves a problem for the *writers*, but makes one for the *readers*.
This should not be confused with Wikipedia article forks; that's a whole different question that bears only the tiniest of similarities. The reader experience, for true readers and only readers, wouldn't be different at all. They go to wikipedia, there's an article in the same place it's always been, and the article is our editors' version, just like it is now.
The editor experience for most editors wouldn't necessarily be different unless they wanted it to be. -- Lastly, I don't see this new model just "taking over" Wikipedia-- Wikipedias might always use the current model, and that'd be fine. Either way, we could use satellite 'specialist' projects that used the new model. It'd be up to the Wikipedia community to figure out how best to use that information-- but it would help us to have it.
Alec
In order to have many articles on the same topic, you must have a way for readers to have those articles ranked. That way, the reader would instantly see the article he most trusts, with no extra effort on his part.
I don't know whether trust needs to be formalized for a small group of developers working on a project, but it is necessary for a project like Wikipedia, where there are thousands of contributors.
Google found a trust metric to rank the internet: it ranked pages by having sites trust sites (links).
We need to study and formalize a trust metric (with people trusting people) for that kind of revolution, a distributed Wikipedia, to take place.
*None of the previous proposals tried to cooperate with someone who is working on trust metrics.*
I think that the best way to go forward is to create a distributed wikipedia and let it be a test bed for a few trust metrics.
I am not a developer but I recently started working on creating such a trust metric: http://opensociety.referata.com/wiki/Main_Page .
Here is another more mature effort on the study of trust metrics. http://www.trustlet.org/wiki
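As a toy illustration of the 'people trusting people' idea, in the spirit of the PageRank analogy above, here is a minimal sketch of iterative trust propagation over a directed graph of editors. The editors, trust links and numbers are invented, and this is not code from either of the projects linked above.

def trust_scores(trust_links, damping=0.85, iterations=50):
    # Iteratively propagate trust over a directed "people trusting people"
    # graph, PageRank-style. trust_links maps each editor to the editors they
    # trust. (Simplified: editors who trust nobody simply absorb their share.)
    editors = set(trust_links)
    for trusted in trust_links.values():
        editors.update(trusted)
    score = {e: 1.0 / len(editors) for e in editors}

    for _ in range(iterations):
        new_score = {e: (1.0 - damping) / len(editors) for e in editors}
        for editor, trusted in trust_links.items():
            if not trusted:
                continue
            share = damping * score[editor] / len(trusted)
            for other in trusted:
                new_score[other] += share
        score = new_score
    return score

if __name__ == "__main__":
    # Invented example: Alice and Bob trust each other; both trust Carol.
    links = {"Alice": ["Bob", "Carol"],
             "Bob": ["Alice", "Carol"],
             "Carol": []}
    for editor, value in sorted(trust_scores(links).items(),
                                key=lambda item: -item[1]):
        print("%-6s %.3f" % (editor, value))

Scores like these could then be used to rank competing versions of an article by the trust placed in their authors, which is the ranking problem described at the start of this message.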
2011/7/2 David Gerard dgerard@gmail.com
On 1 July 2011 09:27, Alec Conroy alecmconroy@gmail.com wrote:
On Fri, Jul 1, 2011 at 12:21 AM, Nikola Smolenski smolensk@eunet.rs wrote:
On 07/01/2011 09:15 AM, David Gerard wrote:
Per HaeB's link, this is a perennial proposal. People like the idea, but in eighteen years - back as far as the Interpedia proposal, before wikis existed - no-one has made one that works. Why not? What's failing to go on here?
Per HaeB's link, IMO no proposal was specific enough, and no proposal was actually done.
I don't know why it took so long, but here's my guess. It hasn't worked for the past 18 years because prior to wikipedia, nobody ever got anything like this to work. It took a Jimmy to look at the patent absurdity of 'anyone can edit' encyclopedias and somehow see that it was working in an amazing and world-changing way.
The fact that Github's git-backed wikis haven't been seized upon suggests to me that there's no demand for a distributed wiki system amongst the *readers*.
It's like the perennial proposal for multiple article versions on Wikipedia for each point of view. This solves a problem for the *writers*, but makes one for the *readers*. They seem to want one source with one article on a topic, else they'd just hit the top ten links in Google instead of going to Wikipedia. (Wikinfo has tried implementing this. Its readership is negligible compared to Wikipedia, but its writers enjoy it.)
Why do people want ten Wikipedias to look up instead of one? They observably don't - they want a source they can quickly look up something in that they can reasonably trust to be useful. They only go to multiple sources if that one starts sucking.
A distributed wiki proposal needs to clearly solve a problem the readers have.
There are several such perennial proposals that are ignored because they are actually about solving problems for the writers, and not solving problems for the readers.
- d.
On Sat, Jul 2, 2011 at 5:30 AM, David Gerard dgerard@gmail.com wrote:
It's like the perennial proposal for multiple article versions on Wikipedia for each point of view. This solves a problem for the *writers*, but makes one for the *readers*. They seem to want one source with one article on a topic, else they'd just hit the top ten links in Google instead of going to Wikipedia. (Wikinfo has tried implementing this. Its readership is negligible compared to Wikipedia, but its writers enjoy it.)
Why do people want ten Wikipedias to look up instead of one? They observably don't - they want a source they can quickly look up something in that they can reasonably trust to be useful. They only go to multiple sources if that one starts sucking.
As a reader, this is exactly my subconscious opinion. I'm glad you nailed our subconscious thoughts. ;-)
On 1 July 2011 07:58, Nikola Smolenski smolensk@eunet.rs wrote:
On 06/30/2011 07:35 PM, David Gerard wrote:
Further to your idea: people developing little specialist wikis along these lines, and said wikis being mergeable. This makes such wikis easier to start, without having to start yet another wiki-based general encyclopedia that directly competes with Wikipedia. Disruptive innovation starts in niches, not in a position where it'll just end up a bug on Wikipedia's windscreen.
Some things I believe could be easily programmed:
- Ability to surf through multiple wikis. For example, you could be reading an article on a specialist wiki such as http://memory-alpha.org/wiki/Darmok_%28episode%29 ; upon clicking the link Gilgamesh, you would be taken to http://en.wikipedia.org/wiki/Gilgamesh ; when you further browse Wikipedia and click on Star Trek you would go not to Wikipedia's article but back to http://memory-alpha.org/wiki/Star_Trek .
Already in the inventory. In practice, on Wikipedia we normally assume that when you click an inline link you go to the Wikipedia article on that subject. It's pretty common, though, for non-Wikipedia wikis to inline-link to Wikipedia for more general background.
On 07/01/2011 04:42 PM, geni wrote:
On 1 July 2011 07:58, Nikola Smolenski smolensk@eunet.rs wrote:
- Ability to surf through multiple wikis. For example, you could be reading an article on a specialist wiki such as http://memory-alpha.org/wiki/Darmok_%28episode%29 ; upon clicking the link Gilgamesh, you would be taken to http://en.wikipedia.org/wiki/Gilgamesh ; when you further browse Wikipedia and click on Star Trek you would go not to Wikipedia's article but back to http://memory-alpha.org/wiki/Star_Trek .
Already in the inventory. In practice, on Wikipedia we normally assume that when you click an inline link you go to the Wikipedia article on that subject. It's pretty common, though, for non-Wikipedia wikis to inline-link to Wikipedia for more general background.
Ah, but you don't return when you click on a link that exists both on Wikipedia and another wiki.
On Fri, Jul 1, 2011 at 7:44 AM, Nikola Smolenski smolensk@eunet.rs wrote:
Ah, but you don't return when you click on a link that exists both on Wikipedia and another wiki.
Not only that, but you miss out on a huge set of features. You can't have shared user account names across wikis, you can't use templates across wikis. Userspace isn't unified-- and most of all, it's very difficult for a "normal person" to start a wiki, even if they have their own time and money they want to donate to the fledgling project.
Not being part of Wikimedia is an unnecessary barrier to communication that makes it hard for everyone to get things done. We can remove these barriers through technological innovation, but also just by being more open to third-party projects that are 'not inconsistent' with our mission.