2009/5/31 Anthony <wikimail(a)inbox.org>:
On Sun, May 31, 2009 at 12:37 PM, Thomas Dalton
<thomas.dalton(a)gmail.com> wrote:
2009/5/31 Anthony <wikimail(a)inbox.org>:
If you watched the Wave presentation you'll see that there is quite a bit of edit conflict handling already built in (they showed three people editing the same page simultaneously).
I did watch it. That was live; they could see each other editing and
avoid each other. There was no conflict. You are talking about people
without live internet connections.
Watch it again then. There was at least one conflict, and they even pointed
it out and mentioned conflict resolution.
Clearly, if two people edit exactly the same text, one of the edits is going to fail. But that's not the common situation.
Edit conflicts with live editing aren't an issue; manual resolution is trivial. Edit conflicts with significant delays are a much bigger problem and require automated merging, which isn't always possible and is often very difficult.
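To make that concrete, here is a rough Python sketch of a diff3-style three-way merge (purely illustrative, not how MediaWiki or Wave actually do it): two sets of edits against a common base revision can be combined automatically only when they touch disjoint parts of the text, and as soon as they touch the same region the only safe option is to hand the conflict back to a human.

import difflib

def changed_spans(base, edited):
    """Hunks (start, end, replacement) that turn the base lines into the edited lines."""
    hunks = []
    for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(None, base, edited).get_opcodes():
        if tag != "equal":
            hunks.append((i1, i2, edited[j1:j2]))
    return hunks

def touches(a, b):
    """Conservative test: do two hunks affect the same region of the base?"""
    a1, a2, _ = a
    b1, b2, _ = b
    if a1 == a2 or b1 == b2:                # at least one is a pure insertion
        return not (a2 < b1 or b2 < a1)     # treat touching an edit as a conflict
    return a1 < b2 and b1 < a2              # ordinary interval overlap

def merge3(base, ours, theirs):
    """Merge two edits of `base`; return the merged lines, or None on conflict."""
    ours_hunks, theirs_hunks = changed_spans(base, ours), changed_spans(base, theirs)
    for a in ours_hunks:
        for b in theirs_hunks:
            if touches(a, b):
                return None                 # overlapping edits: resolve by hand
    merged, pos = [], 0
    for i1, i2, replacement in sorted(ours_hunks + theirs_hunks):
        merged.extend(base[pos:i1])
        merged.extend(replacement)
        pos = i2
    merged.extend(base[pos:])
    return merged

base   = ["The cat sat on the mat.", "It was a sunny day."]
ours   = ["The cat sat on the mat.", "It was a rainy day."]   # second line edited
theirs = ["The dog sat on the mat.", "It was a sunny day."]   # first line edited
print(merge3(base, ours, theirs))   # both edits apply cleanly
# Had both sides edited the same line, merge3 would return None and a human
# would have to pick a version -- that is exactly the delayed-edit problem.

Even this toy version has to give up as soon as the edits get near each other; doing better than that (moved text, two rewrites of the same sentence) is where automated merging gets genuinely hard.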
All people reading Wikipedia need is a plain HTML file per article, nothing more.
Easier said than done, though. The static HTML Wikipedia dumps haven't been updated since June 2008. With Wave, updates are instantaneous.
MediaWiki updates are instantaneous. Where is the improvement in
people reading Wikipedia via Wave?
One advantage, the one we were talking about, is that you don't need to be
connected to the Internet while you're reading it.
Yes, you could in theory implement such a thing without Wave, but unless the WMF starts offering free live feeds (even for intermittent connections), it's not going to be updated like Wave is. Maybe the WMF won't support access to Wikipedia articles through Wave, but then a fork will, and the WMF's goal ("to empower and engage people around the world to collect and develop educational content under a free license <http://en.wikipedia.org/wiki/en:free_content> or in the public domain, and to disseminate it effectively and globally") is still met, even if their particular implementation of it isn't.
We already have dumps (the latest dump of all enwiki primary content finished a couple of hours ago and is 4.8 gig); all we would need to do is make incremental dumps available so people don't have to download the whole thing repeatedly. That would be pretty easy to program compared to rewriting the whole of MediaWiki to function via Waves.
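To give an idea of how little is involved, here is a rough Python sketch of the incremental part, assuming each dump snapshot can be read as (title, latest_revision_id, text) records. In reality the dumps are multi-gigabyte XML files you would stream with a SAX-style parser rather than load into memory, but the logic is the same.

def incremental_dump(previous, current):
    """Return (changed_pages, deleted_titles) between two dump snapshots."""
    prev_revs = {title: rev_id for title, rev_id, _ in previous}
    changed, seen = [], set()
    for title, rev_id, text in current:
        seen.add(title)
        if prev_revs.get(title) != rev_id:   # new page, or a newer top revision
            changed.append((title, rev_id, text))
    deleted = [title for title in prev_revs if title not in seen]
    return changed, deleted

def apply_increment(local_pages, changed, deleted):
    """Bring a local {title: text} copy up to date from an incremental dump."""
    for title, _, text in changed:
        local_pages[title] = text
    for title in deleted:
        local_pages.pop(title, None)

The hard part is the infrastructure for producing and hosting such files regularly, not the logic.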