<!-- Served in 0.003 secs. -->
root@flack:/hiphop/web/phase3/includes# ab -n 100 -c 1 'http://dom.as:8085/phase3/api.php?action=query&prop=info&titles=Main...'
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking dom.as (be patient).....done

Server Software:
Server Hostname:        dom.as
Server Port:            8085

Document Path:          /phase3/api.php?action=query&prop=info&titles=Main%20Page
Document Length:        991 bytes

Concurrency Level:      1
Time taken for tests:   0.389 seconds
Complete requests:      100
Failed requests:        0
Write errors:           0
Total transferred:      116600 bytes
HTML transferred:       99100 bytes
Requests per second:    256.87 [#/sec] (mean)
Time per request:       3.893 [ms] (mean)
Time per request:       3.893 [ms] (mean, across all concurrent requests)
Transfer rate:          292.49 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       0
Processing:     3    4   0.2      4       4
Waiting:        2    4   0.4      4       4
Total:          3    4   0.2      4       4

Percentage of the requests served within a certain time (ms)
  50%      4
  66%      4
  75%      4
  80%      4
  90%      4
  95%      4
  98%      4
  99%      4
 100%      4 (longest request)
For those of us not familiar with MediaWiki benchmarking, what kind of times were you getting without hiphop?
On 27 February 2010 19:58, Thomas Dalton thomas.dalton@gmail.com wrote:
For those of us not familiar with MediaWiki benchmarking, what kind of times were you getting without hiphop?
The parser typically takes 2-10 seconds on an uncached en:wp page, so speeding that process up 1000x is, um, HOLY CRAP!
(Hosting providers sell network bandwidth and disk space; I can see them putting resources into Hiphopifying the common PHP crapware just to use 1/1000 the CPU.)
Domas, how much hacking did you have to do to MediaWiki to get it to compile in Hiphop?
- d.
2010/2/27 David Gerard dgerard@gmail.com:
The parser typically takes 2-10 seconds on an uncached en:wp page, so speeding that process up 1000x is, um, HOLY CRAP!
You're comparing apples with oranges. Domas was testing a simple API page info query, which is much more lightweight than a full-blown parse involving enwiki's crazy templates.
Roan Kattouw (Catrope)
On 28 February 2010 00:30, Roan Kattouw roan.kattouw@gmail.com wrote:
2010/2/27 David Gerard dgerard@gmail.com:
The parser typically takes 2-10 seconds on an uncached en:wp page, so speeding that process up 1000x is, um, HOLY CRAP!
You're comparing apples with oranges. Domas was testing a simple API page info query, which is much more lightweight than a full-blown parse involving enwiki's crazy templates.
So I saw from Domas's followup :-)
Nevertheless - a process isn't the same process when it's going at 10x the speed. This'll be interesting.
(I'm sure the complexity of templates will go up to compensate, unless Tim's parser functions reaper is set down to match, muwahaha.)
- d.
On 27 February 2010 16:41, David Gerard dgerard@gmail.com wrote:
(I'm sure the complexity of templates will go up to compensate, unless Tim's parser functions reaper is set down to match, muwahaha.)
Speeding up parsing will reveal a new bottleneck for the devs to fight the enwiki community over, don't worry about that.
Nevertheless - a process isn't the same process when it's going at 10x the speed. This'll be interesting.
not 10x. I did concurrent benchmarks for API requests (e.g. opensearch) on modern boxes, and saw:
HipHop: Requests per second:    1975.39 [#/sec] (mean)
Zend:   Requests per second:    371.29 [#/sec] (mean)
these numbers seriously kick ass. I still can't believe I observe 2000 mediawiki requests/s from a single box ;-)
Domas
On 28 February 2010 21:33, Domas Mituzas midom.lists@gmail.com wrote:
these numbers seriously kick ass. I still can't believe I observe 2000 mediawiki requests/s from a single box ;-)
So ... how restricted is HipHop PHP, and what are the hotspots in MediaWiki that would most benefit from it?
- d.
On Sun, Feb 28, 2010 at 21:39, David Gerard dgerard@gmail.com wrote:
On 28 February 2010 21:33, Domas Mituzas midom.lists@gmail.com wrote:
these numbers seriously kick ass. I still can't believe I observe 2000 mediawiki requests/s from a single box ;-)
So ... how restricted is HipHop PHP, and what are the hotspots in MediaWiki that would most benefit from it?
Most of the code in MediaWiki works just fine with it (since most of it is mundane) but things like dynamically including certain files, declaring classes, eval() and so on are all out.
It should be possible to replace all that at the cost of code that's a bit more verbose.
Even if it wasn't, hotspots like the parser could still be compiled with hiphop and turned into a PECL extension.
One other nice thing about hiphop is that the compiler output is relatively readable compared to most compilers. Meaning that if you need to optimize some particular function it's easy to take the generated .cpp output and replace the generated code with something more native to C++ that doesn't lose speed because it needs to manipulate everything as a php object.
Howdy,
Most of the code in MediaWiki works just fine with it (since most of it is mundane) but things like dynamically including certain files, declaring classes, eval() and so on are all out.
There are two types of includes in MediaWiki: ones I fixed for AutoLoader and ones I didn't - HPHP has all classes loaded, so AutoLoader is redundant. Generally, every include that just defines classes/functions is fine with HPHP; it is just some of MediaWiki's startup logic (Setup/WebStart) that depends on files being included in a certain order, so we have to make sure HipHop understands those includes. There was also some different behavior with file including - in Zend you can say require("File.php") and it will try the current script's directory, but if you do require("../File.php") it will resolve the path against the current working directory instead.
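For illustration, a minimal sketch of the two include styles in play (the file name here is just an example):

// $IP-based absolute include - resolves the same way everywhere,
// which is what MediaWiki relies on today.
require_once "$IP/includes/WebStart.php";

// Relative includes - in Zend, a bare "WebStart.php" is tried against the
// include path and the calling script's directory, while "../WebStart.php"
// skips the include path and is resolved against the working directory.
require_once 'WebStart.php';
require_once '../includes/WebStart.php';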
We don't have any eval() at the moment - and actually there's a mode where eval() works, people are just too scared of it. We had some double class definitions (depending on whether certain components are available), as well as double function definitions (ProfilerStub vs Profiler).
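The pattern in question looks roughly like this (a simplified sketch; the config variable is made up, not MediaWiki's actual switch):

// Under Zend only one of the two files is ever loaded, so the duplicate
// definitions never clash; a static compiler wants both visible up front.
if ( $wgEnableProfiling ) {
    require_once "$IP/includes/Profiler.php";     // real profiler
} else {
    require_once "$IP/includes/ProfilerStub.php"; // no-op stand-ins
}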
One of the major problems is simply the still-incomplete function set that we'd need:
* session - though we could sure work around it by setting up our own Session abstraction, team at facebook is already busy implementing full support
* xdiff, mhash - the only two calls to it are from DiffHistoryBlob - so getting the feature to work is mandatory for production, not needed for testing :)
* tidy - have to call the binary now
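For the tidy case, "calling the binary" amounts to something like the following sketch (the flags and fallback behaviour are illustrative, not the actual MediaWiki wrapper):

// Pipe the HTML through the external tidy executable instead of ext/tidy.
function tidyViaBinary( $html ) {
    $spec = array( 0 => array( 'pipe', 'r' ), 1 => array( 'pipe', 'w' ) );
    $proc = proc_open( 'tidy -quiet -utf8 -asxhtml', $spec, $pipes );
    if ( !is_resource( $proc ) ) {
        return $html; // tidy missing: return the input untouched
    }
    fwrite( $pipes[0], $html );
    fclose( $pipes[0] );
    $out = stream_get_contents( $pipes[1] );
    fclose( $pipes[1] );
    proc_close( $proc );
    return $out !== '' ? $out : $html;
}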
function_exists() is somewhat crippled, as far as I understand, so I had to work around certain issues there. There're some other crippled functions, which we hit through the testing...
It is quite fun to hit all the various edge cases in the PHP language (e.g. interfaces may have constants) which are broken in hiphop. The good thing is having developers carefully reading/looking at those. Some things are still broken, some can be worked around in MediaWiki.
Some of the crashes I hit are quite difficult to reproduce - it is easier to bypass that code for now and come up with good reproduction cases later.
Even if it wasn't, hotspots like the parser could still be compiled with hiphop and turned into a PECL extension.
hiphop provides a major boost for actual mediawiki initialization too - while Zend has to reinitialize objects and data all the time, having all that in the core process image is quite efficient.
One other nice thing about hiphop is that the compiler output is relatively readable compared to most compilers. Meaning that if you
That especially helps with debugging :)
need to optimize some particular function it's easy to take the generated .cpp output and replace the generated code with something more native to C++ that doesn't lose speed because it needs to manipulate everything as a php object.
Well, that is not entirely true - if it manipulated everything as a PHP object (zval), it would be as slow and inefficient as PHP. The major cost benefit here is that it does strict type inference and falls back to Variant only when it cannot come up with a decent type. And yes, one can find the offending code that causes the expensive paths. I don't see manual C++ code optimizations as the way to go, though - they'd be overwritten by the next code build.
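As a rough illustration of the kind of code that does or does not keep that type inference happy (hypothetical functions, not HipHop output):

// $sum only ever holds integers, so it can compile down to a plain C++ int.
function sumLengths( array $titles ) {
    $sum = 0;
    foreach ( $titles as $t ) {
        $sum += strlen( $t );
    }
    return $sum;
}

// $result is a string on one path and boolean false on the other, so the
// compiler has to fall back to the generic Variant type for it.
function titleWithoutColon( $title ) {
    $result = ( strpos( $title, ':' ) === false ) ? $title : false;
    return $result;
}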
Anyway, there are lots of interesting problems after we get mediawiki working on it - that is, how would we deploy it, how would we maintain it, etc. Building on a single box takes around 10 minutes, and the image has to be replaced by shutting down the old one and starting the new one, not just overwriting the files.
Domas
Looks like a lot of fun :-)
On 1 March 2010 11:10, Domas Mituzas midom.lists@gmail.com wrote: ...
Well, that is not entirely true - if it manipulated everything as a PHP object (zval), it would be as slow and inefficient as PHP. The major cost benefit here is that it does strict type inference and falls back to Variant only when it cannot come up with a decent type. And yes, one can find the offending code that causes the expensive paths. I don't see manual C++ code optimizations as the way to go, though - they'd be overwritten by the next code build.
This smells like something that can benefit from metadata.
/* [return integer] */
function getApparatusId($obj){
    //body
}
- - -
User question follows:
What can we expect? Will future versions of MediaWiki be "hiphop compatible"? Will there be a compatible fork or snapshot? The whole experiment looks like it will help to profile and enhance the engine; will it generate a MediaWiki.tar.gz file that we (the users) will be able to install on our intranets?
Maybe a blog article about your findings would be nice. It may help people "write fast PHP code". And it will scare little children and PHP programmers with a C++ background.
-- End of Message.
Howdy,
Looks like a lot of fun :-)
Fun enough to spend my evenings and weekends on it :)
This smells like something that can benefit from metadata. /* [return integer] */ function getApparatusId($obj){ //body }
Indeed - type hints can be quite useful, though hiphop is smart enough to figure out from the code that it will return an integer :)
It is quite interesting to see the enhancements to PHP that have been inside facebook and now are all released - XHP evolves PHP syntax to fit the web world ( http://www.facebook.com/notes/facebook-engineering/xhp-a-new-way-to-write-ph... ), the XBOX thing allows background/async execution of work without standing in the way of page rendering, etc.
What can we expect? Will future versions of MediaWiki be "hiphop compatible"? Will there be a compatible fork or snapshot? The whole experiment looks like it will help to profile and enhance the engine; will it generate a MediaWiki.tar.gz file that we (the users) will be able to install on our intranets?
Well, the build itself is quite portable (you'd have to have a single binary and LocalSettings.php ;-)
Still, the decision to merge certain changes into MediaWiki codebase (e.g. relative includes, rather than $IP-based absolute ones) would be quite invasive. Also, we'd have to enforce stricter policy on how some of the dynamic PHP features are used.
I have to deal here with three teams (wikimedia ops, mediawiki development community and hiphop developers) to make stuff possible. Do note, getting it to work for MediaWiki is quite a simple task compared to getting it to work in the Wikimedia operations environment.
What I'd like to see though as the final result is MediaWiki that works fine with both Zend and HPHP, and Wikimedia using the latter. Unfortunately, I will not be able to visit the Berlin developer meeting to present this work to other developers, so I will try to get some separate discussions going. You know, most of the work will be coming up with solutions that are acceptable to Tim :-)
Maybe a blog article about your findings would be nice. It may help people "write fast PHP code". And it will scare little children and PHP programmers with a C++ background.
My findings are hectic at the moment, and I don't want to talk too much about them until I get a decently working mediawiki. BTW, Main_Page and Special:BlankPage were both served in ~12ms. Now I have to get complex parser test cases to work, and such.
Domas
On Mon, Mar 1, 2010 at 13:35, Domas Mituzas midom.lists@gmail.com wrote:
Still, the decision to merge certain changes into MediaWiki codebase (e.g. relative includes, rather than $IP-based absolute ones) would be quite invasive. Also, we'd have to enforce stricter policy on how some of the dynamic PHP features are used.
I might be revealing my lack of knowledge about PHP here but why is that invasive and why do we use $IP in includes in the first place? I did some tests here:
Which show that as long as you set_include_path() with $IP/includes/ at the front, PHP will make exactly the same stat(), read() etc. calls with relative paths that it does with absolute paths.
Maybe that's only on recent versions; I tested on PHP 5.2.
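Roughly what that test boils down to (a sketch; the file name is just an example):

// With $IP/includes at the front of the include path, the relative form
// resolves to the same file - and produces the same stat()/read() calls -
// as the absolute form MediaWiki uses today.
set_include_path( "$IP/includes" . PATH_SEPARATOR . get_include_path() );

require_once 'Hooks.php';                  // relative, found via the include path
// require_once "$IP/includes/Hooks.php";  // the current absolute style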
The point of $IP is that you can use multisite environments by just having index.php and LocalSettings.php (and skin crap) in the per-vhost directory, and have extensions and other stuff centralized, so you can update an extension once and all the wikis automatically have it. However, the installer could be patched to resolve $IP automatically if the user wishes to run a HipHop environment.
Marco
Marco Schuster schrieb:
The point of $IP is that you can use multisite environments by just having index.php and LocalSettings.php (and skin crap) in the per-vhost directory, and have extensions and other stuff centralized, so you can update an extension once and all the wikis automatically have it.
That's a silly multi-host setup. Much easier to have a single copy of everything and just use conditionals in LocalSettings, based on hostname or path.
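For instance, a minimal sketch of that kind of conditional LocalSettings.php (the wiki names and database names are made up):

// One shared LocalSettings.php: pick per-wiki settings from the hostname.
switch ( $_SERVER['SERVER_NAME'] ) {
    case 'one.example.org':
        $wgSitename = 'Wiki One';
        $wgDBname   = 'wiki_one';
        break;
    case 'two.example.org':
        $wgSitename = 'Wiki Two';
        $wgDBname   = 'wiki_two';
        break;
    default:
        die( 'Unrecognised host.' );
}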
-- daniel
On Mon, Mar 1, 2010 at 3:26 PM, Daniel Kinzler daniel@brightbyte.de wrote:
That's a silly multi-host setup. Much easier to have a single copy of everything and just use conditionals in LocalSettings, based on hostname or path.
Downside of this: as a provider, *you* must make the change, not the customer, as it is one central file.
Marco
On Mon, Mar 1, 2010 at 10:10, Domas Mituzas midom.lists@gmail.com wrote:
Well, that is not entirely true - if it manipulated everything as a PHP object (zval), it would be as slow and inefficient as PHP. The major cost benefit here is that it does strict type inference and falls back to Variant only when it cannot come up with a decent type. And yes, one can find the offending code that causes the expensive paths. I don't see manual C++ code optimizations as the way to go, though - they'd be overwritten by the next code build.
The case I had in mind is when you have, say, a function in the parser that takes a $string and munges it. If that turns out to be a bottleneck, you could just get a char* out of that $string and munge it at the C level instead of calling the PHP wrappers for things like explode() and other PHP string/array munging.
That's some future project once it's working and those bottlenecks are found, though; I was just pleasantly surprised that hphp makes this relatively easy.
One large practical upshot of this, though, is that hacky things like the parser (which are the way they are because that's how you optimize this sort of thing in PHP) could be written in some babytalk version of PHP that produces a real parse tree. It would be slower in pure PHP, but maybe hphp's speed could make up for it.
Then you could take that component & compile it to C++ (maybe with some manual munging) and make libmediawiki-parse++, which would be quite awesome :)
-----Original Message-----
From: wikitech-l-bounces@lists.wikimedia.org [mailto:wikitech-l-bounces@lists.wikimedia.org] On Behalf Of Ævar Arnfjörð Bjarmason
Sent: 01 March 2010 13:34
To: Wikimedia developers
Subject: Re: [Wikitech-l] hiphop progress
The case I had in mind is when you have, say, a function in the parser that takes a $string and munges it. If that turns out to be a bottleneck, you could just get a char* out of that $string and munge it at the C level instead of calling the PHP wrappers for things like explode() and other PHP string/array munging.
That's some future project once it's working and those bottlenecks are found, though; I was just pleasantly surprised that hphp makes this relatively easy.
I would think that getting hiphop to compile out regular expressions from preg_*() calls to C++ (like re2c) would be the idea.
Jared
-----Original Message-----
From: wikitech-l-bounces@lists.wikimedia.org [mailto:wikitech-l-bounces@lists.wikimedia.org] On Behalf Of Domas Mituzas
Sent: 01 March 2010 10:11
To: Wikimedia developers
Subject: [Wikitech-l] hiphop progress
- xdiff, mhash - the only two calls to it are from DiffHistoryBlob - so getting the feature to work is mandatory for production, not needed for testing :)
Mhash has been obsoleted by the hash extension, and HipHop has the hash extension (looking at the src).
I think mhash has been implemented as a wrapper around the hash extension for a while. (http://svn.php.net/viewvc?view=revision&revision=269961)
assert(hash('adler32', 'foo', true) === mhash(MHASH_ADLER32, 'foo'));
Jared
Domas Mituzas wrote:
Jared,
assert(hash('adler32', 'foo', true) === mhash(MHASH_ADLER32, 'foo'));
Thanks! Would get to that eventually, I guess. Still, there's xdiff and a few other things.
xdiff is only needed for recompression. For page views, there is a pure-PHP port of the "patch" part.
-- Tim Starling
On Sun, Feb 28, 2010 at 4:33 PM, Domas Mituzas midom.lists@gmail.com wrote:
these numbers seriously kick ass. I still can't believe I observe 2000 mediawiki requests/s from a single box ;-)
Great job Domas. It'll be exciting to see the final product.
On 02/28/2010 01:33 PM, Domas Mituzas wrote:
these numbers seriously kick ass. I still can't believe I observe 2000 mediawiki requests/s from a single box ;-)
Bravo! That's fantastic. Thanks for both the work and the testing.
William
On Sun, Feb 28, 2010 at 21:33, Domas Mituzas midom.lists@gmail.com wrote:
these numbers seriously kick ass. I still can't believe I observe 2000 mediawiki requests/s from a single box ;-)
Awesome. I did some tryouts with hiphop too before you started overtaking me.
Is this work on SVN yet? Maybe it would be nice to create a branch for it so that other people can poke it?
Hi!
For those of us not familiar with MediaWiki benchmarking, what kind of times were you getting without hiphop?
Zend: <!-- Served in 0.011 secs. -->
Domas, how much hacking did you have to do to MediaWiki to get it to compile in Hiphop?
Lots. I'm trying to get basic functionality/prototypes to work. Some changes had to be done to HipHop itself, some had to be done to generated code, some had to be done to MediaWiki.
MediaWiki's "run wherever I can" dynamic adaptation to any environment isn't too helpful sometimes...
Domas
P.S. Zend:
Concurrency Level:      1
Time taken for tests:   1.444158 seconds
Complete requests:      100
Failed requests:        0
Write errors:           0
Total transferred:      138020 bytes
HTML transferred:       109600 bytes
Requests per second:    69.24 [#/sec] (mean)
Time per request:       14.442 [ms] (mean)
Time per request:       14.442 [ms] (mean, across all concurrent requests)
Transfer rate:          92.79 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       0
Processing:    14   14   0.0     14      14
Waiting:       10   12   1.7     14      14
Total:         14   14   0.0     14      14
WARNING: The median and mean for the waiting time are not within a normal deviation
        These results are probably not that reliable.

Percentage of the requests served within a certain time (ms)
  50%     14
  66%     14
  75%     14
  80%     14
  90%     14
  95%     14
  98%     14
  99%     14
 100%     14 (longest request)