2010/8/6 Nikola Smolenski <smolensk@eunet.rs>:
> I have recently fixed some of its shortcomings (see https://bugzilla.wikimedia.org/show_bug.cgi?id=15607#c41 and onward) and I believe that the extension is now ready for practical use. Could anyone interested review the code and see if it is technically good enough, and if there is anything else that needs to be done? And if it is good enough and there is nothing critical missing, what would be the steps to test it in the real world and actually deploy it?
When it has passed review, I suppose we could set up a few wikis at prototype.wikimedia.org and test it there. It was originally set up for the usability initiative, but it seems to be used for increasingly varied testing purposes now, and we have no other proper place for this.
> - Reading the links from a foreign database instead of via the API. Is there
> a "proper" way to do this?
Yes. You can obtain a connection to a foreign DB with wfGetDB( DB_SLAVE, array(), 'wikiID' );. Both the wiki ID and the API URL can now be obtained from the interwiki table (that is, the fields exist, although we're not actually populating them yet), as part of Peter's work on the iwtransclusion branch for GSoC. How have you been obtaining API URLs anyway?
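As a rough sketch (it assumes a MediaWiki environment, that the foreign wiki lives on the same database cluster, and a hypothetical wiki ID 'remotewiki' and interwiki prefix 'remote'):

```php
// Connect to the foreign wiki's DB; 'remotewiki' is a placeholder wiki ID.
$foreignDbr = wfGetDB( DB_SLAVE, array(), 'remotewiki' );

// Read link rows directly from the foreign templatelinks table.
$res = $foreignDbr->select(
	'templatelinks',
	array( 'tl_namespace', 'tl_title' ),
	array( 'tl_from' => $pageId ),
	__METHOD__
);

// The wiki ID and API URL come from the *local* interwiki table
// (iw_wikiid and iw_api), once those fields are populated.
$localDbr = wfGetDB( DB_SLAVE );
$iw = $localDbr->selectRow(
	'interwiki',
	array( 'iw_wikiid', 'iw_api' ),
	array( 'iw_prefix' => 'remote' ),
	__METHOD__
);
```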
> - Updating links on dependent wikis when deleting, undeleting etc. articles on
> the central wiki.
This is something the iwtransclusion branch also has to address. The current plan is to do that using a global templatelinks table.
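To make that concrete, here is a sketch of how a global templatelinks table might be queried; the table name and gtl_* columns are hypothetical, not a settled schema:

```php
// On a change to a page on the central wiki, find every page on every
// wiki that transcludes it. Table/column names are hypothetical.
$dbr = wfGetDB( DB_SLAVE ); // the shared/global database
$res = $dbr->select(
	'globaltemplatelinks',
	array( 'gtl_from_wiki', 'gtl_from_page' ),
	array(
		'gtl_to_wiki'  => wfWikiID(),
		'gtl_to_title' => $title->getDBkey(),
	),
	__METHOD__
);
foreach ( $res as $row ) {
	// Queue a purge/update of the dependent page on its home wiki.
}
```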
> - Purging the pages on dependent wikis via the API, or a smarter way of purging
> without the API. Why does purging via the API require login, when purging the usual way does not?
That's a bug in the API purge module; I'll fix it later today. Again, the iwtransclusion branch also needs to take care of purging. There's no code for it yet, but I don't think it involves more than updating page_touched in the remote DB and notifying Squid.
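A minimal sketch of that non-API purge, again assuming a MediaWiki environment with the remote wiki on the same cluster ('remotewiki' and $remoteUrl are placeholders):

```php
// Bump page_touched on the remote wiki so its parser cache entry
// is invalidated on the next view.
$dbw = wfGetDB( DB_MASTER, array(), 'remotewiki' );
$dbw->update(
	'page',
	array( 'page_touched' => $dbw->timestamp() ),
	array(
		'page_namespace' => $title->getNamespace(),
		'page_title'     => $title->getDBkey(),
	),
	__METHOD__
);

// Notify Squid so the cached copy of the remote page is dropped.
SquidUpdate::purge( array( $remoteUrl ) );
```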
Roan Kattouw (Catrope)