Hi Folks,
I'm interested in using MediaWiki 1.4.5 for an upcoming project at work, so
I've been poking around for a little while now, with varying degrees of
success.
I'm looking for a way to insert category tags into whatever the user enters
when editing a page. I've been focusing on EditPage.php, and tried appending
the text ("[[category:test]]") to EditPage->textbox1 just before the
Article::insertNewArticle call, but it doesn't seem to work. Instead of a
link to my test category, I get a broken link to a "category:test" page.
I've also fiddled with changing Article::getContent, but it seems like I
shouldn't have to touch that code for this. I've tried $wgHooks, too, but
still no luck.
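In case it helps, here's the kind of hook-based approach I've been trying
(a minimal sketch only -- the 'ArticleSave' hook name and its signature are
my guesses from the hooks documentation, so they may not match 1.4.5 exactly):

    # In LocalSettings.php -- sketch only, not tested code.
    $wgHooks['ArticleSave'][] = 'appendTestCategory';

    function appendTestCategory( &$article, &$user, &$text,
                                 &$summary, $minor, $watch, $section ) {
        # Append the category tag to the wikitext before it is parsed
        # and saved, so the category link tables get rebuilt with it.
        if ( strpos( $text, '[[Category:Test]]' ) === false ) {
            $text .= "\n[[Category:Test]]";
        }
        return true; # returning true lets the save proceed normally
    }

Is something along those lines the right direction, or is there a better
place to hook in?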
I'd appreciate any help floating around out there. Thanks,
ry
--
Laissez Faire Economics is the theory that if
each acts like a vulture, all will end as doves.
Brion:
> We'll see how it goes this weekend...
Just a reminder: could we have a fresh dump before then?
It will be the last opportunity to run wikistats for quite a while.
Thanks, Erik Zachte
Just a heads-up for those not following CVS: I'm fixing up an alternate
schema upgrader which will do the major schema updates and UTF-8
conversion for the 1.5 upgrade, and should be relatively friendly to
replication.
One of the many tricky things about MySQL server replication is that the
update log is serialized. When you're working on one server, you can be
making updates to many databases and tables simultaneously (so long as
they don't interfere with each other); the update log used for
replication reorders the queries into a logically equivalent stream, one
after the other.
That makes your life much simpler if your job is writing replication
support for MySQL. ;) However, it makes your life more complicated if
you're trying to use replicated servers for load balancing of read-only
queries with time-sensitive data (that's us!).
The worst case is when something like a major database schema change
happens -- you're interleaving small, fast changes (edits, saved
preferences, new accounts...) with large, slow changes (change the
layout of a million old-page records). On the single server things may
mostly work, but when the replication gets to that million-record
honker, nothing else gets applied on the slave server until it's done
with that one. As a result, the slave servers' view of the database falls
further and further behind the current state of the master. Weird things
happen on the site, with out-of-date or missing pages.
We get that to a degree when certain mass nasties happen (like page-move
attacks on massively-linked pages), but doing database maintenance
while keeping the wikis online would be particularly ugly.
I've been putting together an alternate 1.5 schema upgrader which also
handles the UTF-8 conversion (which we need anyway) and applies the
updates in smaller chunks. That is, instead of "copy these fields from a
million 'old' records into the 'revision' table" in one shot, it chunks
them in, say, 100 records at a time.
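Schematically, the chunked copy looks something like this (illustrative
only -- the real upgrader in CVS does the full old-to-revision field
mapping plus the UTF-8 conversion, and the connection details here are
placeholders):

    <?php
    # Illustrative sketch of the chunking idea, not the actual upgrader.
    $db = mysql_connect( 'localhost', 'wikiadmin', 'secret' );
    mysql_select_db( 'wikidb', $db );

    $max = (int) mysql_result(
        mysql_query( "SELECT MAX(old_id) FROM old", $db ), 0 );
    $chunk = 100;

    for ( $start = 0; $start < $max; $start += $chunk ) {
        $end = $start + $chunk;
        # Each chunk is its own statement in the update log, so the
        # slaves can apply ordinary edits between chunks instead of
        # stalling for hours behind one giant million-row query.
        mysql_query(
            "INSERT INTO revision
                    (rev_id, rev_comment, rev_user,
                     rev_user_text, rev_timestamp)
             SELECT old_id, old_comment, old_user,
                    old_user_text, old_timestamp
               FROM old
              WHERE old_id > $start AND old_id <= $end", $db );
    }
    ?>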
This is probably slower 'in total' (though since we have to do UTF-8
conversion on several large wikis anyway, it's necessary to a degree),
but the main benefit is that it should allow the replication stream to
be interleaved -- a chunk of upgrade, a few edits, a chunk of upgrade, a
few edits. The slaves should stay up to date during the process, always
applying small, recent updates.
So instead of turning everything off indefinitely while tables are
churned around, we should be able to upgrade a wiki at a time, while
keeping the rest of the wikis online and editable as they wait their turn.
For instance while en.wikipedia.org is in read-only for conversion,
people should still be able to edit Wikinews or Wiktionary, or another
language Wikipedia.
In theory. :)
We'll see how it goes this weekend...
-- brion vibber (brion @ pobox.com)
Hello, sorry to insist:
I'm a French Wikipedia contributor who works especially on linking
orphaned pages, but I have a problem with how often the orphaned-pages
list is updated:
http://fr.wikipedia.org/wiki/Special:Lonelypages
I wonder whether it would be possible to update it weekly or every two
weeks, even if only for the French orphaned-pages list. That would make
my linking work much easier.
Thanks.
I tested Special:Import on test.leuksman.com yesterday. While it
certainly has improved, I sure hope that it will NOT be enabled on the
Wikimedia servers in its current state.
It is very easy (for admins) to falsify (with the file upload feature)
or just mess up article histories and user contributions, and since
imports are not logged, there's no way to keep track of it.
I don't fear that we have many rogue admins who would try to abuse the
feature, but you don't have to be rogue to make a mess. Since the
contributors (authors) of an imported article are treated as if they
were local, you can create a real mess of user contributions.
On no.wikipedia and nn.wikipedia, for example, there are two different
active users called "Jhs". If I were to import an article contributed
by "Jhs" from no.wikipedia to nn.wikipedia, it would show up in Jhs's
user contributions on nn.wikipedia, even though that user never wrote
anything like it.
In its current state I would say that for license compliance it is also
inferior to manual import, since it does not point to the source of the
article in any way, just to a non-existent user (as far as I can
understand). And since the default is to export (and thus import) only
the last revision, only one contributor is attributed.
--Guttorm Flatabø
(user:dittaeva)
I wonder if people can update me/us on how the integration with the
Amsterdam cluster is going. Are there any lessons? Are we still
committed to our multiple-data-center strategy? Is there anything new
which we have learned?
I am working with Yahoo to finalize the details of the South Korean
facility... what have we learned which might help with this?
What might we ask of a large potential donor next?
--Jimbo
I added a parsertest keyword to MediaZilla to encourage the writing of
parser tests. If you feel that there's a parsing bug that should have
a parser test, please add the keyword so that people who want to write
those tests know about it.
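For anyone who hasn't written one, a test case in
maintenance/parserTests.txt looks roughly like this (the expected output
below is from memory, so check it against an actual test run before
committing):

    !! test
    Simple paragraph
    !! input
    Hello world
    !! result
    <p>Hello world
    </p>
    !! end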
Hello everybody,
I just installed MediaWiki and the German Wikipedia database on my PC.
It looks like it works fine, but the images are missing. In the spot
where an image should be, it says "missing image" and the image's name.
Does anybody know where to obtain the images, or whether they are
somehow inside the tables and I made a mistake with my inserts?
Thank you for any hint,
merlin