[Foundation-l] what do we do in the event the Foundation fails? - Re: Policy governance ends
saintonge at telus.net
Wed Apr 18 19:44:55 UTC 2007
David Gerard wrote:
>On 18/04/07, Platonides <Platonides at gmail.com> wrote:
>>en.wikipedia would survive. I'm more afraid of the little wikis. There
>>are dumps, right, but how many copies of them are out there?
>>And not so many people will download the full dump and share it if
>>something breaks here. Jeff Merkey, and some statistics studies would,
>>but ihaveanemptydomainandwanttoleechcontent.com won't.
>If the need for archived dumps is popularised - and made easy - it
>might be more plausible.
>Outboard 400GB USB disk: £70. Plug that in, BitTorrent dumps to it,
>have something which updates the dumps when there's a new one ...
>there should be enough people who can afford that to do it.
>At the moment I'm quite worried about Commons. The image dump is HUGE
>and I don't think there's been a good dump of it for a while. And
>300-400GB of images is quite a lot for people to keep around casually.
Can it be segmented so that no one person needs to protect the whole
thing? Each person archiving would only need to keep downloading the
same predetermined part of the database.
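One way such segmentation could work (a sketch only; the shard count and file names are hypothetical, and nothing like this exists on the dump servers): hash each file name to a shard number, so every volunteer independently computes the same assignment and can keep re-downloading the same predetermined slice.

```python
import hashlib

NUM_SHARDS = 40  # hypothetical: ~10GB per volunteer for a 400GB image dump


def shard_for(filename: str) -> int:
    """Deterministically map a file to a shard by hashing its name.

    Every archiver computes the same assignment independently, so the
    volunteer who signed up for shard 7 always gets the same
    predetermined part of the dump, with no central coordinator.
    """
    digest = hashlib.sha1(filename.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_SHARDS


# A volunteer responsible for shard 7 filters the file list to their slice:
files = ["Example.jpg", "Another_image.png", "Map_of_London.svg"]
my_shard = 7
mine = [f for f in files if shard_for(f) == my_shard]
```

Because the assignment is a pure function of the file name, new volunteers only need to agree on the hash function and the shard count to join in.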
>>I was going to suggest we could publish some user db data with yet
>>another hashing layer, but we can't. We sure have too many users with a
>>But don't worry. The users table is so small that Brion can save it under
>>his pillow ;-)
>Let's assume Brion is 100% trustworthy with personal information (I
>think we can actually assume this is true). The WMF is eaten by
>invading Martian badgers.
>* How much work to verify any person is who they say they are?
>* Who does he give the results to?
>In the second case, a public list of "this old-userID is this OpenID"
>would be something that wouldn't violate a confidence - if each person
>made the match by logging in and submitting the OpenID they wanted to
>correlate and publicise for credit.
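The opt-in correlation list described above could be sketched roughly like this (the function name, signature, and data shapes are all hypothetical, purely to illustrate the idea that only a user who proves ownership of the old account can publish the link):

```python
def record_correlation(mapping: dict, old_user_id: int, openid_url: str,
                       session_authenticated: bool) -> None:
    """Add an entry to the public old-userID -> OpenID list.

    Only the user themselves can create the link: they must be logged in
    to the old account (session_authenticated) and explicitly submit the
    OpenID they want published for credit. No private data is revealed,
    because each entry is volunteered by its own subject.
    """
    if not session_authenticated:
        raise PermissionError("must prove ownership of the old account")
    mapping[old_user_id] = openid_url


# A user logs in to their old account and submits the OpenID to publicise:
public_list = {}
record_correlation(public_list, 12345,
                   "https://openid.example/alice", True)
```

Since every entry is self-submitted, publishing the resulting list breaches no confidence.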
Sometimes I feel that people go overboard about privacy. One potential
problem that could come out of having all this information fried could
be an inability to trace things back to the original free licence in a
copyright dispute. I would hate to see a situation at some time in
the future where someone claims copyright because nobody is able to
establish that the work was based on a free licence.
>>If WMF defrauds its userbase, there will be forks anyway, as there were
>>in the past, like the [[Enciclopedia Libre]], which is still being
>>developed in parallel.
>>If data is destroyed by a natural disaster, WMF will need to hold firm
>>to avoid that dispersion.
>I think multiple forks would actually be the best way to preserve
>things. One will accumulate more weight.
>Wiki encyclopedias aren't going away. They're the only reasonable
>model for the 21st century, in fact ... and we did that. We just have
>to make sure our work is preserved in case of disaster.
There is a risk that some of us tend to view Wikimedia as some kind of
Borg. While I am clearly in the inclusionist part of the thought
spectrum, I also believe that other sites have their own functions and
should be encouraged to adopt some kind of open editing model as well.
At the very least these other sites are not bound by the NPOV
principle. This could lead to other sites that have completely opposing
points of view, and that would be healthy. We sometimes tend to focus
too much on sites like Wikipedia Review, where the bias is more than
evident. Encouraging other wikis, particularly specialized ones with
which we can have compatible sharing arrangements, is good for everyone.
For example, a wiki that focuses on a single city, and brings together
issues of interest to people in that city can go into much greater depth
about topics whose notability might be questioned in Wikipedia. The
other benefit is that it helps relieve server load, and passes some
costs on to people running smaller projects.