There's no need to change index.php; that is what LocalSettings.php is
for. (Nor do we need more than one of it, as all site-specific setup
logic happens under it and its includes.)
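For what it's worth, a single LocalSettings.php could branch on the requested host. This is only a sketch of the idea; the /conf path and per-site file names are invented for illustration, not MediaWiki defaults:

```php
<?php
# Hypothetical per-host dispatch from a single LocalSettings.php.
# The /conf directory and its file naming are assumptions.
$site = isset( $_SERVER['HTTP_HOST'] )
      ? strtolower( $_SERVER['HTTP_HOST'] )
      : 'default';
$conf = "/conf/$site.php";           # e.g. /conf/wikia.example.org.php
if ( file_exists( $conf ) ) {
    require_once $conf;              # sets $wgSitename, $wgDBname, etc.
} else {
    require_once '/conf/default.php';
}
```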
Maybe I am confused, but as I understood it, the Wikipedia site itself is
running with an alternative configuration, described here:
http://wp.wikidev.net/Wiki_farm
and using this additional configuration
http://wikimedia.org/conf/
and presumably a modified index.php or LocalSettings.php that initializes
and uses the SiteConfiguration object. The 1.3.8 download does not
ordinarily make use of the SiteConfiguration object.
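As far as I can tell from the pages linked above (not from the 1.3.8 tarball), the idea is a settings map keyed per wiki, roughly like this; the wiki keys and exact method names here are my guesses:

```php
<?php
# Hypothetical sketch of SiteConfiguration usage, inferred from the
# wiki-farm pages above; the real class may differ in detail.
$wgConf = new SiteConfiguration;
$wgConf->settings = array(
    'wgSitename' => array(
        'enwiki' => 'Wikipedia',
        'dewiki' => 'Wikipedia',
    ),
    'wgLanguageCode' => array(
        'enwiki' => 'en',
        'dewiki' => 'de',
    ),
);
# Per request, the setup code asks for the value for the current wiki:
$wgSitename = $wgConf->get( 'wgSitename', 'enwiki' );
```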
But, I think I didn't do a good job of explaining the difference between
running multiple sites all with the same policy versus running multiple
sites with different policies.
As a very typical use case, imagine trying to power two (or more) MediaWiki
instances on the same server that are branded completely differently. One
way to model this is for each of them to have its own skin, which can be
worked on separately and independently by web design specialists. Creating
a custom PHPTal skin requires extending the SkinPHPTal class. If you are
managing custom code for two different installations, the two really should
live in their own versioned repositories, not intermingled.
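Concretely, each project's repository would carry something like the following. SkinPHPTal is the real base class; the subclass and template names are made up for illustration:

```php
<?php
# Hypothetical per-project skin, kept in that project's own repository.
# SkinBrandA and the BrandA template are invented names.
require_once 'SkinPHPTal.php';

class SkinBrandA extends SkinPHPTal {
    function initPage( &$out ) {
        SkinPHPTal::initPage( $out );
        $this->skinname = 'branda';
        $this->template = 'BrandA';   # points at this project's template
    }
}
```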
What's the alternative? Hacking Skin.php or SkinPHPTal.php directly in
the single, server-wide software_home?
Beyond that, you may want to extend different MediaWiki instances (special
pages, extensions, etc.) differently for each project (not in a server-wide
way). So some special pages may exist on wiki A but not on wiki B, while you
would still like wikis A and B to run off the same set of source.
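In a per-instance LocalSettings.php that difference is just an include; wiki B's file simply omits the line. The extension name and path here are hypothetical:

```php
<?php
# Wiki A's LocalSettings.php pulls in a special page that wiki B never sees.
# ProjectReports and its path are invented for illustration.
$IP = '/usr/share/mediawiki';   # shared core, one copy on the server
require_once "$IP/extensions/ProjectReports/SpecialProjectReports.php";
```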
To top it off, once you begin this kind of development, it's really nice to
be able to test it, stage it, and deploy it in separate environments.
This is the reasoning that led me to set up a separate, minimal footprint
and include path on a per-instance basis, ''not'' a per-server basis.
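The setup I ended up with amounts to a tiny per-instance stub that prepends its own directory to the include path before handing off to the shared core. All the paths below are examples, not a prescription:

```php
<?php
# Hypothetical per-instance entry stub: local overrides win,
# the shared, read-only core is the fallback.
$instanceDir = '/var/wiki/instanceA';
ini_set( 'include_path',
    "$instanceDir/local" . PATH_SEPARATOR .   # instance-specific code
    '/usr/share/mediawiki' );                 # shared core, one copy
require_once 'index.php';   # resolved via the include path above
```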
Do other developers building wikis on the MediaWiki software typically
version their entire wiki codebase themselves? How do they easily manage
and track the patches they apply to MediaWiki if they don't cleanly
separate out their changes?
Perhaps one of the reasons the Wikipedia team doesn't encounter this
difficulty in practice is that whenever Wikipedia needs a new feature it
ends up in the core, and furthermore, it is usually applicable to all the
sites that Wikipedia powers.
I guess it just seems a bit weird to have people tell
us why they can't
use our software to run multiple sites off one set of source when
running multiple sites off one set of source is exactly what we do day
after day.
Is this really such a weird requirement? I am fairly sure there is no easy
way to manage the collaborative development of multiple custom wikis without
giving each of them its own "private" space. The challenge is doing so
without having to replicate the core code for each instance.
Thanks for responding. Please let me know if you think I am being thick.
Best Regards,
Jonah