Looks like a lot of fun :-)
On 1 March 2010 11:10, Domas Mituzas midom.lists@gmail.com wrote: ...
Even if it wasn't, hotspots like the parser could still be compiled with HipHop and turned into a PECL extension.
HipHop provides a major boost for actual MediaWiki initialization too: while Zend has to reinitialize objects and data on every request, having all of that in the core process image is quite efficient.
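To illustrate the point (the variable name and values below are made up), think of the kind of setup code MediaWiki runs before serving anything:

<?php
// Under Zend this runs again for every request (an opcode cache
// skips the re-parse, but the code itself still executes); in a
// HipHop-compiled binary the resulting data can simply live in
// the process image.
$wgExampleConfig = array(
    'sitename'   => 'ExampleWiki',
    'namespaces' => range( 0, 15 ),
    'magicWords' => array( 'REDIRECT', 'NOTOC', 'FORCETOC' ),
);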
One other nice thing about HipHop is that the compiler output is relatively readable compared to most compilers.
That especially helps with debugging :)
Meaning that if you need to optimize some particular function, it's easy to take the generated .cpp output and replace the generated code with something more native to C++ that doesn't lose speed by having to manipulate everything as a PHP object.
Well, that is not entirely true: if it manipulated everything as a PHP object (zval), it would be as slow and inefficient as PHP. The major performance benefit here is that it does strict type inference, and falls back to Variant only when it cannot come up with a decent type. And yes, one can find offending code that causes the expensive paths. I don't see manual C++ code optimizations as the way to go, though, because they'd be overwritten by the next code build.
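For illustration, a minimal sketch (both functions are invented) of code where strict inference can succeed, next to code that forces the Variant fallback:

<?php
// Monomorphic: $total and $i are always ints, so the compiler can
// emit native integer arithmetic instead of Variant operations.
function sumTo( $n ) {
    $total = 0;
    for ( $i = 0; $i < $n; $i++ ) {
        $total += $i;
    }
    return $total;
}

// Polymorphic: $id is sometimes a string and sometimes an int, so
// inference cannot settle on one decent type and has to fall back
// to Variant for it (and for the function's return value).
function getId( $useName ) {
    if ( $useName ) {
        $id = 'anonymous';
    } else {
        $id = 42;
    }
    return $id;
}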
This smells like something that could benefit from metadata, e.g.:
/* [return integer] */ function getApparatusId( $obj ) { /* body */ }
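A slightly fuller sketch of that hypothetical annotation style (the tags are invented; no compiler actually consumes them):

/* [return integer] [param $obj object] */
function getApparatusId( $obj ) {
    return $obj->id; // hypothetical body: always yields an int
}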
- - -
User question follows:
What can we expect? Will future versions of MediaWiki be "HipHop compatible"? Will there be a compatible fork or snapshot? The whole experiment looks like it will help profile and enhance the engine; will it produce a MediaWiki.tar.gz file that we (the users) will be able to install on our intranets?
Maybe a blog article about your findings would be nice. It could help people "write fast PHP code". And it will scare little children and PHP programmers with a C++ background.
-- End of Message.