Domas Mituzas wrote:
>> Domas seems to be experimenting with thttpd for that server, so I'll let him
>> reconfigure it when he's awake again.
>
>
> Thanks for all the remarks :) After I woke up I fixed some stuff.
>
> * recompiled thttpd to handle .gz files differently (well, you edit a .txt
> file, but you recompile afterwards ;-)
> * did some web design stuff
> * added md5 sums
> * increased visibility of text :)
>
Thanks, Domas! Can you please add the total size of cur/history for all
wikipedias (like the figures printed in red at http://download.wikimedia.org/)?
Greetings,
Jakob
Hi,
How about a workshop on analysing and visualising wikis at the Wikimania
conference? Erik could talk about Wikistat, I could talk about "Measuring
Wikipedia" (I'll publish the paper I wrote for the ISSI conference [1]), and
you could present your tools?
Jakob
[1] http://www.umu.se/inforsk/ISSI2005/ISSI2005Program.htm
Hello, there!
As I am just a newbie to Wikimedia, I'd like to ask a question.
Wikipedia has a few sister projects (e.g. Wikiquote, Wikinews, Wikibooks).
Are they based on the same MediaWiki engine, each with its own database,
or is there something else?
Thanks in advance,
Nicolay
> Domas seems to be experimenting with thttpd for that server, so I'll let him reconfigure it when he's awake again.
Thanks for all the remarks :) After I woke up I fixed some stuff.
* recompiled thttpd to handle .gz files differently (well, you edit a .txt file, but you recompile afterwards ;-)
* did some web design stuff
* added md5 sums
* increased visibility of text :)
Cheers,
Domas
Just a reminder of work in progress and general background, for those
who might be commenting without being aware of present work...
First, in MediaWiki 1.5 we've made a major schema change, intended to
reduce the number of changes to data rows that have to be made and to
slim down the amount of data that has to be pulled per-row when scanning
non-bulk-text metadata.
Specifically, the 'cur' and 'old' tables are being split into 'page',
'revision', and 'text'. Lists of pages won't be trudging through large
page text fields, and operations like renames of heavily-edited pages
won't have to touch 15000 records. This will also give us the potential
to move the bulk text to a separate replicated object store to keep the
core metadata DBs relatively small and limber (and cacheable).
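For anyone who hasn't looked at the new tables yet, here is a rough sketch of
what the split means for a query (simplified, with placeholder connection
details): fetching an article's current text now follows page -> revision ->
text instead of reading one wide 'cur' row, so listings and metadata scans
never touch the bulk text.

<?php
// Rough sketch, placeholder credentials: fetch the current text of
// [[Sandbox]] under the 1.5 page/revision/text split.
$db = mysql_connect( 'localhost', 'wikiuser', 'secret' );
mysql_select_db( 'wikidb', $db );

$res = mysql_query(
    "SELECT old_text
       FROM page
       JOIN revision ON rev_id = page_latest
       JOIN text     ON old_id = rev_text_id
      WHERE page_namespace = 0
        AND page_title     = 'Sandbox'", $db );

$row = mysql_fetch_row( $res );
// Only this last step touches the bulk text table.
echo $row[0];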
Talk to icee in #mediawiki if interested in the object store; he's
working on a prototype for us, to use for image uploads and potentially
bulk text storage.
Second, remember that each wiki's database is independent. It's very
likely that at some point we'll want to split out some of the larger
wikis to separate master servers; aside from localizing disk and cache
utilization, this could provide some fault isolation in that a failure
in one master would not affect the wikis running off the other master.
Third, we're expecting to have at least two additional data centers soon, in
Europe and the US. Initially these are probably going to be squid proxies,
since that's easy for us to do (we have a small offsite squid farm in France
currently, in addition to the squids in the main cluster in Florida). But
local web server boxen pulling from local slaved databases, at least for
read-only requests, is something we're likely to see, to move more of the
load off of the central location.
Finally, people constantly bring up the 'PostgreSQL cures cancer'
bugaboo. 1.4 has experimental PostgreSQL support, which I'd like to see
as a first-class supported configuration for the 1.5 release. This is
only going to happen, though, if people pitch in to help with testing, bug
fixing, and of course benchmarking and failure-mode testing against MySQL!
If you ever want Wikimedia to consider switching, the
software needs to be available to make it work and it needs to be
demonstrated as a legitimate improvement with a feasible conversion.
Domas is the PostgreSQL partisan on our team and wrote the existing
PostgreSQL support. If you'd like to help you should probably track him
down; in #mediawiki you'll usually find him as 'dammit'.
-- brion vibber (brion @ pobox.com)
Hi,
a journalist from a German magazine about technology emailed me and
asked for graphics. They are writing an article about wikipedia and so
far, they don't have enough images to make the article nice. Yes, they
already know the IBM history flow project and they already have pictures
of the server racks in Florida. And they will show the wikipedia globe
and so on. Maybe one or two graphs from Erik Zachte's great scripts.
He had in mind something like a graphical map of all Wikipedia articles
and their connections via links, something like this
http://research.lumeta.com/ches/map/gallery/wired.gif or
http://research.lumeta.com/ches/map/gallery/isp-ss.gif
I have no idea how to make such a picture out of Wikipedia articles, but I
would welcome any kind of feedback.
Mathias Schindler
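One possible approach, sketched below purely as an illustration (the file
layout and names are assumptions, not an existing tool): extract the
[[internal links]] from article wikitext and emit a Graphviz .dot file, which
neato or a similar layout tool can then turn into a map. For anything close
to the full article set this produces far too many edges to draw directly, so
you would probably want to sample or restrict it to one category first.

<?php
// Illustrative sketch: read wikitext from one file per article, extract
// [[internal links]], and write a Graphviz .dot file. The 'articles/'
// layout is an assumption for the example.
$articles = glob( 'articles/*.txt' );
$out = fopen( 'linkmap.dot', 'w' );
fwrite( $out, "digraph wikipedia {\n" );

foreach ( $articles as $file ) {
    $from = basename( $file, '.txt' );
    $text = file_get_contents( $file );
    // Match [[Target]] and [[Target|label]]; ignores images, interwiki, etc.
    preg_match_all( '/\[\[([^\]|#]+)/', $text, $m );
    foreach ( array_unique( $m[1] ) as $to ) {
        fwrite( $out, "  \"$from\" -> \"" . trim( $to ) . "\";\n" );
    }
}

fwrite( $out, "}\n" );
fclose( $out );
// Then, for example:  neato -Tpng linkmap.dot > linkmap.png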
Hi,
This is not so common, so I think it is worthwhile to copy it here. ;oP
(from the French "Bistro").
Yann
Some time ago, we were all grumbling about the slowness and outages of
Wikipedia. Since then, it seems to be working wonderfully (at least for me).
So for once I want to congratulate all the administrators (the technical
ones, that is) who work in the shadows and are only noticed when things
don't work. Bravo and keep it up. Gadjou 5 Apr 2005 at 15:55 (CEST)
--
http://www.non-violence.org/ | Collaborative site on non-violence
http://www.forget-me.net/ | Alternatives on the Net
http://fr.wikipedia.org/ | The free encyclopedia
http://www.forget-me.net/pro/ | Linux training and services
Hi,
My team is using MediaWiki to create a wiki for our needs.
I am trying to create a set of pages in it which need to be linked through
some identifiers (I won't be able to use templates in this case).
So, when a user navigates from one internal page to another, one or more
parameters need to be passed to the other page.
I guess my question is how to send parameters to an internal page if it's
not a template.
Is there any methodology to implement that?
Please advise.
Thanks,
Suresh Kadirvel
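Stock MediaWiki has no mechanism for passing parameters between ordinary
pages, so anything along these lines means writing a small extension. Purely
as a hypothetical sketch (the tag name and behaviour are invented, not an
existing feature): a tag-extension hook could read a value from the request
URL and render it, so one page could link to
index.php?title=Other_Page&ticket=1234 and the target page could display the
ticket value. Note that the parser cache will store whatever value it
rendered first, so caching would have to be disabled for such pages.

<?php
// Hypothetical extension sketch, not a stock MediaWiki feature.
// A page containing <showparam name="ticket"></showparam> would display
// the value of the 'ticket' URL parameter.
$wgExtensionFunctions[] = 'wfSetupShowParam';

function wfSetupShowParam() {
    global $wgParser;
    $wgParser->setHook( 'showparam', 'wfRenderShowParam' );
}

function wfRenderShowParam( $input, $args ) {
    global $wgRequest;
    // Which URL parameter to show is taken from the tag's name="..." attribute.
    $name  = isset( $args['name'] ) ? $args['name'] : 'id';
    $value = $wgRequest->getVal( $name, '' );
    return htmlspecialchars( $value );
}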
From the Dept. of Dreams:
What I would really like is to be able to "subscribe" to a specific
Wikipedia article and be notified **in a manner of my choosing** when
updates occur: via Bloglines or another RSS/news reader, via email, or as an
IM / IRC message, etc...
I realise I can create a WP account and then my own custom watch list,
but quite frankly this ain't cutting the mustard for me (Dept. of
Dreams, right ;)
I have more than enough passwords, accounts and pages to check on a
regular basis, and I would generally like to have less not more. To
put it bluntly, I don't want to have to login and check a wikipedia
account...
Is there a way to do this?
Sort of a ....
WP watch list to *anything*
convertor / service....?
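Nothing like this is built in, but a do-it-yourself bridge can be small. A
rough sketch follows (the article URL, address, and interval are
placeholders, and it assumes the action=raw page-source fetch is available):
poll the article, hash it, and mail yourself when the hash changes.

<?php
// Minimal "tell me when this article changes" poller; run from the command
// line. URL, address, and interval below are placeholders.
$url  = 'http://en.wikipedia.org/w/index.php?title=Sandbox&action=raw';
$to   = 'me@example.org';
$last = '';

while ( true ) {
    $text = file_get_contents( $url );
    if ( $text === false ) {   // fetch failed; try again later
        sleep( 600 );
        continue;
    }
    $hash = md5( $text );
    if ( $last !== '' && $hash !== $last ) {
        mail( $to, 'Watched Wikipedia article changed',
              "The watched article changed:\n$url\n" );
    }
    $last = $hash;
    sleep( 3600 );   // once an hour; please don't hammer the servers
}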
Hello,
I have set up MediaWiki to manage documentation on development rules and
source code management tool usage at my firm.
All users find it very useful, simple to use, and easy for making
modifications to articles.
We now have a new office in another country. We have a VPN between the two
locations, but the bandwidth is too small. The users in the new office find
the wiki tiresome to use (20 seconds to display a page, versus instant
display in the "old" office).
I would like to have two instances of MediaWiki on two servers, to
synchronize them, and to allow users to make modifications on both instances.
For the PHP scripts and uploaded files I have a solution, but I don't have
one for the database.
We have a server A (located in the old office) and a server B (located in
the new office); both of them are complete web servers (L.A.M.P.).
Users from the old office are connected to A, users from the new one to B.
Users make modifications on A, and they are transferred to B ("simple"
master/slave replication in MySQL).
But if users modify something on B ... it's not transferred (MySQL doesn't
have master/master replication).
Is there a way in MediaWiki to redirect MySQL queries according to their
type: SELECTs on B (slave) and INSERT/UPDATE/DELETEs on the master (A)?
(MySQL replication is used to push modifications from A to B.)
Perhaps the solution implemented for Wikipedia is the one I need.
Could you explain it?
Best regards,
Laurent
PS1 : I tried a solution with Turck MMCache and MediaWiki on B connecting to
the database on A. The time to display a page is 10 seconds.
PS2 : Sorry for the long mail ...
PS3 : Soon ;)
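For what it's worth, MediaWiki's load balancer can already split reads from
writes via $wgDBservers: the first entry is the master and receives all
writes, the others serve reads in proportion to their 'load' weight. A rough
sketch of what that might look like in LocalSettings.php on the office-B
server follows (hostnames and credentials are placeholders). The caveats:
writes from office B still cross the slow VPN to A, and a user on B who has
just edited may not see the change until replication from A catches up.

<?php
# Sketch for the office-B server; hostnames and credentials are placeholders.
$wgDBservers = array(
    array(
        'host'     => 'server-a.example.com', // master in the old office: all writes
        'dbname'   => 'wikidb',
        'user'     => 'wikiuser',
        'password' => 'secret',
        'type'     => 'mysql',
        'load'     => 0,   // weight 0: don't send ordinary reads over the VPN
    ),
    array(
        'host'     => 'localhost',            // local slave in the new office: reads
        'dbname'   => 'wikidb',
        'user'     => 'wikiuser',
        'password' => 'secret',
        'type'     => 'mysql',
        'load'     => 1,
    ),
);

This is roughly what Wikipedia itself does, with several read slaves hanging
off one master.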