This is exactly what I was working on: http://www.theregister.co.uk/2010/02/02/facebook_hiphop_unveiled/
Compiling PHP to C++.
mike
http://www.phpcompiler.org/lists/phc-general/2009-February/000894.html
On Wed, Feb 3, 2010 at 12:10 AM, Domas Mituzas midom.lists@gmail.com wrote:
Hi,
This is exactly what I was working on :
Where can we read more about your work?
Domas
foundation-l mailing list foundation-l@lists.wikimedia.org Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
That is old news. I have stopped working on it, not because I don't think it was a good idea, but because phc was not working; its developers have cancelled the project. I was looking into Roadsend recently, but this new HipHop PHP compiler is exactly what we need.
When I find time, I will try compiling my test cases with it.
This could be a real boost for Wikipedia! A native optimized binary version of the bloated PHP code, amazing!
mike
On Wed, Feb 3, 2010 at 12:17 AM, Domas Mituzas midom.lists@gmail.com wrote:
Hi!
http://www.phpcompiler.org/lists/phc-general/2009-February/000894.html
Great progress!
Domas
Hi!
That is old news. I have stopped working on it, not because I don't think it was a good idea, but because phc was not working; its developers have cancelled the project.
Poor you. You wasted your bandwidth on a project that was doomed :(
I was looking into Roadsend recently,
I was too! But they're rewriting everything anyway ;-)
but this new HipHop PHP compiler is exactly what we need.
I wouldn't be so sure about the "exactly" part.
When I find time, I will try compiling my test cases with it.
Great!
This could be a real boost for Wikipedia!
At FB it shaves off about 50% of execution time. I'm not sure how you quantify "real boost".
A native optimized binary version of the bloated PHP code, amazing!
Why would you call the code 'bloated'? It is relatively clean code that lots of people would envy (except a few guys who'd say they can build much better software, though they wouldn't have anything to show).
Do note that this is a translation of dynamic-language code; it doesn't make it as efficient as code written for native compilation.
Domas
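For context on that last point: a PHP-to-C++ translator still has to keep values dynamically typed, so each variable becomes a tagged box and each operator a runtime dispatch. A minimal sketch, purely illustrative and not HipHop's actual runtime (all names here are made up):

```cpp
#include <cassert>
#include <variant>

// Purely illustrative sketch -- NOT HipHop's actual runtime. A PHP-to-C++
// translator cannot know types statically, so every variable becomes a
// tagged box and every operator a runtime dispatch on the tags.
// (Strings, arrays, etc. are omitted to keep the sketch short.)
using Variant = std::variant<long, double>;

// PHP's '+' on two boxed values: branch on the runtime tags instead of
// emitting a single native add instruction.
Variant php_add(const Variant& a, const Variant& b) {
    if (std::holds_alternative<long>(a) && std::holds_alternative<long>(b))
        return std::get<long>(a) + std::get<long>(b);
    auto as_double = [](const Variant& v) {
        return std::holds_alternative<long>(v)
                   ? static_cast<double>(std::get<long>(v))
                   : std::get<double>(v);
    };
    return as_double(a) + as_double(b);  // mixed operands fall back to double
}
```

A hand-written C++ function adding two `long`s compiles to a single add instruction; the boxed version pays a branch and unboxing on every operation, which is roughly why translated PHP can't match code written for native compilation.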
On 3 February 2010 00:15, Domas Mituzas midom.lists@gmail.com wrote:
Do note that this is a translation of dynamic-language code; it doesn't make it as efficient as code written for native compilation.
OTOH, given that wikitext is defined as "what the parser does", it's the only current realistic prospect for something faster ...
Wonder how it does at parser-function-laden nested templates.
- d.
David Gerard wrote:
On 3 February 2010 00:15, Domas Mituzas midom.lists@gmail.com wrote:
Do note that this is a translation of dynamic-language code; it doesn't make it as efficient as code written for native compilation.
OTOH, given that wikitext is defined as "what the parser does", it's the only current realistic prospect for something faster ...
Wonder how it does at parser-function-laden nested templates.
The template part of the parser is defined by:
http://www.mediawiki.org/wiki/Preprocessor_ABNF
Not by "what the parser does". It can easily be ported from PHP to C++, since that was a design goal when I rewrote it for MW 1.12. In fact, the Preprocessor_Hash implementation was meant as a model for a C++ port, not as a permanent and useful part of MediaWiki.
-- Tim Starling
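As a rough illustration of how mechanically the double-brace expansion can be expressed in C++, here is a toy recursive expander. It is only a sketch under strong simplifications: it handles plain `{{name}}` inclusions from an in-memory map, ignores template parameters, parser functions, and the disambiguation rules of the actual Preprocessor ABNF, and every name in it is invented for the example.

```cpp
#include <cassert>
#include <map>
#include <string>

// Toy sketch only -- not the real MediaWiki preprocessor. Expands
// {{name}} placeholders from a map, recursing so a template body may
// itself contain templates. Unknown names are left as literal text.
std::string expand(const std::string& text,
                   const std::map<std::string, std::string>& templates) {
    std::string out;
    size_t i = 0;
    while (i < text.size()) {
        size_t open = text.find("{{", i);
        if (open == std::string::npos) { out += text.substr(i); break; }
        out += text.substr(i, open - i);
        size_t close = text.find("}}", open);
        if (close == std::string::npos) { out += text.substr(open); break; }
        std::string name = text.substr(open + 2, close - open - 2);
        auto it = templates.find(name);
        if (it != templates.end())
            out += expand(it->second, templates);  // recurse into the body
        else
            out += text.substr(open, close - open + 2);  // keep unknown as-is
        i = close + 2;
    }
    return out;
}
```

The recursion is the part that matters for "parser-function-laden nested templates": each inclusion re-enters the expander on the template body, so deeply nested pages multiply the work regardless of the host language.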