If you're looking to make a straight HTML mirror, HTTrack might meet your needs: http://www.httrack.com/
I wouldn't recommend it as a publishing mechanism per se - but it'll give you a quick and dirty mirror. There may be settings to omit 'edit' and 'history' links - I'm not absolutely sure about this (it's been a while since I've used it).
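For example, a rough one-shot mirror run might look like the sketch below (the URL and output directory are placeholders, and the filter syntax may need adjusting for your install):

    # Mirror the article namespace into ./wiki-mirror (hypothetical URL).
    # The trailing scan rules try to skip edit and history pages, which a
    # default MediaWiki install serves via index.php with an action= parameter.
    httrack "http://wiki.example.com/index.php/Main_Page" \
        -O ./wiki-mirror \
        "+wiki.example.com/index.php/*" \
        "-*action=edit*" "-*action=history*"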
Good luck!
-- Jim R. Wilson (jimbojw)
On 5/3/07, johan.boye@latecoere.fr wrote:
-----Original Message----- From: mediawiki-l-bounces@lists.wikimedia.org [mailto:mediawiki-l-bounces@lists.wikimedia.org] On behalf of Rob Church Sent: Thursday, 3 May 2007 13:06 To: MediaWiki announcements and site admin list Subject: Re: [Mediawiki-l] Couple of Mediawiki questions
On 03/05/07, johan.boye@latecoere.fr wrote:
# The search doesn't work on partial (incomplete) words. I have an article containing the word "foobar": I get a result when I type "foobar", but nothing when I type "foob".
There is one exception: when I type "est", it finds some pages
with the word "test"...
Is there a way to fix it?
The default search relies upon the MyISAM full text indexing engine, and may not be entirely useful, depending on your needs. You might have more luck with a custom search backend, such as the LuceneSearch extension, hooked up to the MWSearch daemon in Subversion, or something else entirely.
I've checked this out; it looks a little complicated. Is there any option on the MyISAM side to improve it easily?
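For what it's worth, there are two MyISAM-side knobs that may help, assuming the default MySQL backend and the standard searchindex table (si_page, si_title, si_text): lowering ft_min_word_len in my.cnf so words shorter than four characters get indexed, and querying in boolean mode with a trailing '*' for prefix matches. A rough sketch (database name and credentials are placeholders):

    # In my.cnf, under [mysqld], lower the minimum indexed word length
    # (default is 4), then restart mysqld:
    #   ft_min_word_len = 3
    # Rebuild the full-text index so the new setting takes effect:
    mysql wikidb -e "REPAIR TABLE searchindex QUICK;"

    # Boolean mode allows prefix matching with a trailing '*':
    mysql wikidb -e "SELECT si_page FROM searchindex
                     WHERE MATCH(si_text) AGAINST('foob*' IN BOOLEAN MODE);"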
# Links (usually nasty file:// links) do not work with URLs
containing spaces
You need to URL-encode spaces in URLs. Use %20.
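For example, a quick way to encode an existing path from the shell (the path below is made up):

    # Replace spaces with %20 so the link survives in wiki pages.
    printf '%s\n' 'file://fileserver/Project Docs/spec v2.pdf' | sed 's/ /%20/g'
    # -> file://fileserver/Project%20Docs/spec%20v2.pdf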
Got it, thanks.
# Is there a plug-in to scan documents like .PDF or .DOC
on the fly during the upload to index them in the search engine?
Yes and no; see http://lists.wikimedia.org/pipermail/mediawiki-l/2007-April/019490.html
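(Not a ready-made plug-in, but the extraction step such a tool would need is easy to try from the shell, assuming pdftotext and antiword are installed; the file names are placeholders:)

    # Dump the plain text that an indexer would feed to the search engine.
    pdftotext manual.pdf -      # '-' sends the extracted text to stdout
    antiword report.doc         # prints the .doc contents as plain text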
Nice one. Any idea when it will be released (if it ever will be)?
Is there a way to make a full backup of the website, to make it available offline, or to export all pages to HTML to a safe place automatically?
You can write a shell script to do it. The procedure is documented at
http://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki.
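A minimal sketch of such a script, assuming a default MySQL-backed install (database name, credentials, and paths are placeholders):

    #!/bin/sh
    # Nightly wiki backup: database dump, XML page export, and uploaded files.
    STAMP=$(date +%Y%m%d)
    BACKUP_DIR=/backup/wiki/$STAMP
    mkdir -p "$BACKUP_DIR"

    # Full database dump (hypothetical credentials and database name).
    mysqldump -u wikiuser -p'secret' wikidb > "$BACKUP_DIR/wikidb.sql"

    # XML export of all pages, including history.
    php /var/www/wiki/maintenance/dumpBackup.php --full > "$BACKUP_DIR/pages.xml"

    # Uploaded files (images, PDFs, ...).
    tar czf "$BACKUP_DIR/images.tar.gz" -C /var/www/wiki images

Run from cron, that covers the automatic part; for an offline HTML copy you could still point HTTrack at the result, as Jim suggests above.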
Ok, I will check on the XML side.
Thanks very much!
"Les informations contenues dans ce message électronique peuvent être de nature confidentielles et soumises à une obligation de secret. Elles sont destinées à l'usage exclusif du réel destinataire. Si vous n'êtes pas le réel destinataire, ou si vous recevez ce message par erreur, merci de le détruire immédiatement et de le notifier à son émetteur."
"The information contained in this e-mail may be privileged and confidential. It is intended for the exclusive use of the designated recipients named above. If you are not the intended recipient or if you receive this e-mail in error, please delete it and immediately notify the sender."