Rowan Collins wrote:
On Sun, 02 Jan 2005 22:09:32 -0600, James Birkholz <j.birchwood@verizon.net> wrote:
- I'll want to merge the registration for the wiki portion with the
non-wiki existing site.
You're not the first to consider this, but I can't remember off-hand anyone reporting that they achieved it successfully (or how). You could try searching the archives (Google site:mail.wikimedia.org + your query), but I guess the easiest approach is to hack up includes/SpecialUserlogin.php to access the extra database tables (or whatever) that the other parts of the site use. I think there may be an authentication plugin system "on the way" which could be useful in this regard, but don't quote me on that.
Unified login will be extremely hard, as it requires changes to lots of functions.
In the future we should have a way for scripts to 'talk' to each other and synchronize login information. I'm working on something for my own separate project, which won't be done for a while.
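For what it's worth, the heart of any SpecialUserlogin.php hack is just a check against the existing site's user table. The fragment below is only a sketch of that idea, not working MediaWiki code; the table and column names (site_users, user_name, user_password) are made up, and it assumes the old site stores MD5 hashes, so adapt it to whatever your site really does.

<?php
# Hypothetical check against the existing (non-wiki) site's user table.
# Table/column names and the md5() hashing are assumptions -- adjust them.
function externalSiteAuthenticate( $username, $password ) {
    $db = mysql_connect( 'localhost', 'site_user', 'site_pass' );
    mysql_select_db( 'existing_site', $db );

    $name = mysql_real_escape_string( $username, $db );
    $res  = mysql_query(
        "SELECT user_password FROM site_users WHERE user_name = '$name'", $db );

    if ( $row = mysql_fetch_assoc( $res ) ) {
        return ( $row['user_password'] == md5( $password ) );
    }
    return false;
}
?>

You'd still have to splice something like that into the login code and keep the two user tables from drifting apart, which is where it gets painful.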
- What is the best way to modify the page templates? For example, if I
wanted to add a "Donations" link like Wikipedia's? Or links to the non-wiki sections? Or display the time in Timbuctoo in the bottom left corner? Or have a second image randomly selected that morphs out of the main logo image? Obviously I'm looking for general pointers, not specific solutions here. (That may come later :] )
Well, there are several solutions:
- the PHPTAL templates (*.pt) are theoretically there to make things
like this easy, but in practice everyone who tries seems to just break their wiki :(
- as of 1.4, you can add things to the sidebar by editing the
$wgNavigationLinks variable in LocalSettings.php (you may have to manually copy the default section from includes/DefaultSettings.php, I'm not sure); a rough sketch follows below
- to be really adventurous, you can write a Skin.php sub-class of your
own; in fact, 1.4 includes a version of MonoBook written entirely in PHP, which may in fact be *easier* to hack up than the PHPTAL one.
We should really make it easier to modify templates without breaking everything.
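In the meantime, here is roughly what the LocalSettings.php route looks like. I'm going from memory, so copy the real $wgNavigationLinks block out of includes/DefaultSettings.php rather than trusting the exact keys below. Each 'text' and 'href' entry names a MediaWiki: interface message (label and target URL respectively), not a literal string; 'sitesupport', as far as I know, is exactly the "Donations" link you see on Wikipedia.

<?php
# LocalSettings.php -- sketch only; start from the real default array
# in includes/DefaultSettings.php and append your own entries.
$wgNavigationLinks = array(
    array( 'text' => 'mainpage',      'href' => 'mainpage' ),
    array( 'text' => 'recentchanges', 'href' => 'recentchanges-url' ),
    array( 'text' => 'randompage',    'href' => 'randompage-url' ),
    array( 'text' => 'help',          'href' => 'helppage' ),
    # 'sitesupport' is the "Donations" link on Wikipedia:
    array( 'text' => 'sitesupport',   'href' => 'sitesupport-url' ),
    # Custom entry (hypothetical): the label comes from [[MediaWiki:Genealogy]]
    # and the target from [[MediaWiki:Genealogy-url]], which you create on the wiki.
    array( 'text' => 'genealogy',     'href' => 'genealogy-url' ),
);
?>

For things like the Timbuctoo clock or a second logo you're into skin territory, which is where the pure-PHP MonoBook mentioned above starts to look attractive.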
- I've been transferring articles by hand from Wikipedia to my GenWiki,
but am wondering if there's a better way. I've looked at Wikipedia's data dump download instruction pages, but frankly they don't make it seem too easy, especially since I'm using a remote web server that I don't admin. Is this the case? [...] I doubt that it exists, but what would best serve my needs would be some spider that, given a start page, crawls my GenWiki looking for empty articles and then goes to Wikipedia (using approved behavior of one page per second, etc.) and copies the edit box material and pastes it back into my GenWiki.
Well, I can point you to several things that might make this job a little easier:
- Special:Export can be used to grab an XMLified version of a page,
optionally including history; unfortunately, the equivalent Special:Import has never quite been finished.
- You can get the raw wikitext of a page by using
"pagename?action=raw" / "?title=pagename&action=raw" (a quick sketch follows below)
- Best of all, there is a bot framework, written in Python,
specifically for interacting with Wikipedia (and, by extension, any MediaWiki site), so hopefully you needn't write a whole spider from scratch: see http://pywikipediabot.sourceforge.net/
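Just to illustrate the action=raw route for the odd page or two: the page names below are only examples, it assumes your PHP has allow_url_fopen enabled, and note the warning in the next paragraph before doing anything in bulk.

<?php
# Sketch: fetch the raw wikitext of a few pages via action=raw.
# Be polite -- no more than one request per second.
$pages = array( 'Genealogy', 'Family_tree' );
foreach ( $pages as $title ) {
    $url  = 'http://en.wikipedia.org/w/index.php?title='
          . urlencode( $title ) . '&action=raw';
    $text = file_get_contents( $url );

    $fp = fopen( $title . '.wiki', 'w' );
    fwrite( $fp, $text );
    fclose( $fp );

    sleep( 1 );
}
?>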
Wikipedia specifically recommends NOT spidering the site. I would grab the dump, install it into a local MySQL database, then just grab the articles you want. Or you can try loading the entire database into your web server via phpMyAdmin. But that may take a while... or overload your server... or just eat up all your disk quota.
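If you do get the dump into MySQL, pulling a single article back out is just a query against the page-text table. The sketch below assumes the old 'cur' table layout (cur_namespace, cur_title, cur_text); double-check the table names against the SQL file you actually download, and the connection details are obviously made up.

<?php
# Sketch: pull one article's wikitext out of a locally imported dump.
$db = mysql_connect( 'localhost', 'wikiuser', 'wikipass' );
mysql_select_db( 'wikipedia_dump', $db );

# Titles are stored with underscores rather than spaces.
$title = mysql_real_escape_string( 'Family_tree', $db );
$res = mysql_query(
    "SELECT cur_text FROM cur
     WHERE cur_namespace = 0 AND cur_title = '$title'", $db );

if ( $row = mysql_fetch_assoc( $res ) ) {
    echo $row['cur_text'];
}
?>

From there you could paste the text into GenWiki by hand, or feed it to the pywikipedia framework Rowan mentioned.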
HTH, and good luck!