For proof of principle (and just for fun, of course!) I took a few hours to
write a simple single-pass parser for Wikipedia pages in C++, using sql++
(the "official" MySQL API).
It supports external and internal links (with a "does this topic exist?"
check), special line beginnings (:, *, #, and leading space), wiki-style bold
and italics, ---- horizontal rules, etc.
No ==headings== yet (and no beheadings either ;), and no namespaces, but
those are simple to implement.
Called from the shell (as I haven't connected it to Apache yet), it
renders the Main Page in 0:00.04 (without caching ;)
Should I continue to work on this, and eventually add it to the CVS, or is
it just a waste of time?
I will, of course, continue to work on the PHP script, no doubt about that!
Magnus
Magnus Manske wrote:
> Called from the shell (as I haven't connected it to Apache yet), it
> renders the Main Page in 0:00.04 (without caching ;)
Is there any way to compare that fairly to the performance of the PHP
script? One thing to keep in mind is that if/when APC is working
properly, the overhead of interpreting the PHP goes away.
--Jimbo