On 17/10/12 10:55, Johannes Weberhofer wrote:
Platonides,
thank you very much for your review! I'll incorporate some changes in the next release.
I'm glad you liked it :)
The makealias.sh script looks quite pretty, but it contains:
Alias /wiki/common $TARGETDIR/webroot/common
We don't have a common folder (we do have skins/common but that's included in skins alias)
Hmm... I don't know why I added that. I'm quite sure there was some reason at the time; I've removed it, as it's definitely no longer needed.
It may have been in the root folder a long time ago.
You are mapping both script paths and documents inside /wiki What would happen if I wanted to create an article named 'skins'?
All pages start with an uppercase letter, so there should be no problem related to that.
Unless you have $wgCapitalLinks = false; (yes, I'm playing devil's advocate)
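To illustrate the collision (a sketch — the alias target path below is an example, not necessarily what makealias.sh actually emits):

```apache
# Hypothetical alias generated for the skins directory, under the
# same /wiki prefix that also serves article URLs:
Alias /wiki/skins /usr/share/mediawiki/skins

# With $wgCapitalLinks = false; an article literally named "skins"
# would be requested as /wiki/skins -- Apache answers from the
# alias instead, and the article becomes unreachable.
```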
How would you forbid robots accessing history urls but allow normal pages?
I'd never thought about that; I've seen that '<meta name="robots" content="noindex,nofollow" />' is included in the HTML, so robots shouldn't index older versions.
Only after they have downloaded the page and made you render it. Whereas if it's forbidden in robots.txt, they don't even request it.
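This is where the split layout pays off: history and diff URLs go through the script path, while readable pages live under their own prefix, so a one-line robots.txt covers it. A sketch, assuming the classical /w (scripts) / /wiki (articles) layout — with everything mapped under /wiki there is no prefix that matches only the script URLs:

```
# robots.txt
User-agent: *
Disallow: /w/
```

Crawlers then never fetch /w/index.php?…&action=history at all, while /wiki/Some_Page stays crawlable by default.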
I'd prefer the classical /w /wiki setup.
I don't prefer either setup; for some reason I decided to use this one some years ago.
Having the articles and scripts in the same location leads to issues (even more so if they are also at the root, which thankfully you're not doing).
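For reference, a minimal sketch of the classical split (the install path is an example; the /wiki→index.php Alias relies on Apache's AcceptPathInfo handling):

```apache
# Scripts live under /w, articles under /wiki, so the two
# namespaces can never shadow each other.
Alias /w    /usr/share/mediawiki
Alias /wiki /usr/share/mediawiki/index.php
```

MediaWiki then receives the article title as path info on index.php, and nothing under /wiki/ can collide with skins/, images/, api.php, etc.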
I'm a bit intrigued on how it is finding the right LocalSettings.php, though.
Please explain this to me in more detail...
You have a MediaWiki in /var/www/foo and another in /var/www/bar; how does it choose between /var/www/foo/LocalSettings.php and /var/www/bar/LocalSettings.php? It looks like it would always load /usr/share/mediawiki/LocalSettings.php ...
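One common wiki-farm pattern that resolves this (a sketch of a possible approach, not necessarily what this package does — the /etc path and file naming are assumptions):

```php
<?php
// Shared /usr/share/mediawiki/LocalSettings.php that only dispatches
// to a per-site settings file selected by the requested hostname.
$site = preg_replace( '/[^a-z0-9.\-]/', '', strtolower( $_SERVER['SERVER_NAME'] ) );
$settings = "/etc/mediawiki/LocalSettings.$site.php";  // example location
if ( !is_readable( $settings ) ) {
	die( "This wiki is not configured: $site" );
}
require $settings;
```

Each virtual host then gets its own LocalSettings.foo.example.php while sharing a single code tree.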