https://twitter.com/hassankhosseini/status/370212655996235776 - he says that this was produced in collaboration with Wikimedian John Vandenberg (CC'ed), who might be able to provide more information on how the numbers were generated.
On Wed, Aug 21, 2013 at 9:02 AM, Randall Farmer randall@wawd.com wrote:
Sadly they're moderating comments. I tweeted at the author, with links to WMF Ganglia as backup, and he definitely doesn't believe me; maybe something from a WMFer would help, if anyone thinks it's worth correcting: https://twitter.com/hassankhosseini/status/370090365354655744
Going by the Ganglia pages, actual Wikipedia has more than twice as much *RAM* as their scenario has *disk*. Pretty fun. (If you're curious, Ganglia's front page says it's tracking 14,744 cores on 988 hosts and 40T of RAM. Their scenario has <20T of disk. There may be additional capacity not in that Ganglia setup, though it seemed to cover the obvious stuff.)
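If anyone wants to sanity-check that ratio, here's a quick back-of-the-envelope sketch using only the figures quoted above (40T of RAM from the Ganglia front page, and <20T of disk in their scenario; the 20 is just an upper bound):

    # Rough comparison: Ganglia-reported RAM vs. the scenario's disk.
    ganglia_ram_tb = 40     # RAM tracked by WMF Ganglia, per its front page
    scenario_disk_tb = 20   # their scenario has *less than* this much disk
    ratio = ganglia_ram_tb / scenario_disk_tb
    print(f"RAM is at least {ratio:.1f}x the scenario's disk")  # >= 2.0x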
On Wed, Aug 21, 2013 at 8:23 AM, David Gerard dgerard@gmail.com wrote:
On 21 August 2013 16:12, hoo hoo@online.de wrote:
Am I wrong or did they actually calculate that for labs only (which would be rather funny)? At least they link to
https://wikitech.wikimedia.org/wiki/Special:Ask/-5B-5BResource-20Type::insta...... ("[...] that run on up to 385 instances [...]"), which AFAIK doesn't have any production servers.
Heh. Please do post a comment of correction and post it here too, so it doesn't just vanish ;-)
- d.
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l