As some of you might remember, I tried to write a wiki(pedia) offline
reader/editor a while ago. After some trouble with the GUI, and because
someone (rightly!) pointed out that my parser wouldn't handle Unicode,
I stopped working on it.
Lately, I had several people asking me about a stand-alone wiki(pedia)
parser. The latest related request was on the mailing list yesterday
("Java code...").
Now, I have created a (partly) working parser, written in C++. It is
based on a string object I also wrote, which natively uses 16-bit chars
and thus supports Unicode. (A function to actually import/export MySQL
Unicode remains to be written, though.)
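The actual string class is in the CVS sources; purely as an
illustration of the idea (a string built on 16-bit code units, so
basic-plane Unicode text fits one unit per character), a minimal
hypothetical sketch might look like:

```cpp
#include <string>
#include <vector>

// Hypothetical sketch of a 16-bit-char string -- NOT the real class
// from the Waikiki sources. Each element is one 16-bit code unit.
class WString {
public:
    WString() = default;

    // Build from plain ASCII, widening each byte to 16 bits.
    explicit WString(const char *ascii) {
        for (const char *p = ascii; *p; ++p)
            data_.push_back(static_cast<char16_t>(
                static_cast<unsigned char>(*p)));
    }

    std::size_t length() const { return data_.size(); }
    char16_t at(std::size_t i) const { return data_[i]; }
    void append(char16_t c) { data_.push_back(c); }

    // Narrow back to ASCII, replacing non-ASCII units with '?'.
    std::string to_ascii() const {
        std::string out;
        for (char16_t c : data_)
            out += (c < 128) ? static_cast<char>(c) : '?';
        return out;
    }

private:
    std::vector<char16_t> data_;
};
```

The same widening trick is what makes a MySQL import/export routine
necessary: on-disk text has to be converted to and from 16-bit units.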
Together with this, I started rewriting the common Wikipedia functions
(skins, languages, user management, etc.). That part is still at an
early stage, but it can already render the "Wikipedia:How to edit a
page" article in a half-complete standard skin. This includes a
function to parse the "LanguageXX.php" files, so there is no need to
rewrite all of those. (Import is a little slow, though, so that won't
be a permanent solution; it rather screams "conversion".)
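The LanguageXX.php files are essentially PHP arrays mapping message
keys to strings. Not the actual Waikiki import function, but a naive
sketch of the scanning idea (double-quoted pairs only; real files also
use single quotes, escapes, and comments) could be:

```cpp
#include <map>
#include <string>

// Naive sketch (hypothetical, not the real parser): scan a PHP
// snippet like   "mainpage" => "Main Page",   and collect the pairs.
std::map<std::string, std::string>
parse_messages(const std::string &php) {
    std::map<std::string, std::string> out;
    std::size_t pos = 0;
    // Read the next double-quoted string starting at or after p.
    auto quoted = [&](std::size_t &p) -> std::string {
        std::size_t start = php.find('"', p);
        if (start == std::string::npos) { p = start; return ""; }
        std::size_t end = php.find('"', start + 1);
        if (end == std::string::npos) { p = end; return ""; }
        p = end + 1;
        return php.substr(start + 1, end - start - 1);
    };
    while (pos != std::string::npos && pos < php.size()) {
        std::string key = quoted(pos);
        if (pos == std::string::npos) break;
        std::size_t arrow = php.find("=>", pos);
        if (arrow == std::string::npos) break;
        pos = arrow + 2;
        std::string val = quoted(pos);
        if (pos == std::string::npos) break;
        out[key] = val;
    }
    return out;
}
```

Something along these lines explains why import is slow: it re-scans
the PHP text on every load, hence the "conversion" to a native format.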
As an example, the whole thing comes as a command-line tool, which can
render wiki-style text (from a file or a pipe) into HTML (no skin or
standard skin, chosen by parameter).
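The core of such a tool is a text transform from wiki markup to HTML.
As a hypothetical sketch of a single rule (not the real parser), here
is one that turns '''bold''' markers into <b>...</b>, toggling at each
marker:

```cpp
#include <string>

// Hypothetical sketch of one wiki-to-HTML rule: each ''' marker
// toggles <b>/</b>. An unclosed marker is closed at end of input.
std::string render_bold(const std::string &wiki) {
    std::string out;
    bool open = false;
    std::size_t i = 0;
    while (i < wiki.size()) {
        if (wiki.compare(i, 3, "'''") == 0) {
            out += open ? "</b>" : "<b>";
            open = !open;
            i += 3;
        } else {
            out += wiki[i++];
        }
    }
    if (open) out += "</b>";  // close an unbalanced marker
    return out;
}
```

The real parser chains many such rules (links, headings, lists) and,
per the above, works on 16-bit strings rather than std::string.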
I have committed the sources (hereby under GPL) to the Wikipedia CVS,
into a new module called "Waikiki" (seemed like the natural extension
of wiki to me ;-)
This could be the basis for several wiki(pedia) software projects:
* Offline editor
* Reader, to be distributed on CD/DVD
* Wiki(pedia) Apache module
So, fire up those compilers and go to work! ;-)
Magnus