It's not split up (sharded) across servers, at least as far as the page and revision tables go. There is one active master at any given time that handles all writes; the current host has 160GB of memory and 10 physical cores (20 with hyperthreading). The actual revision *content* for all projects is indeed split up across several servers, in an 'external storage' cluster. The current server configuration is available at https://noc.wikimedia.org/conf/highlight.php?file=db-eqiad.php
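To make the layout concrete, here is a minimal sketch (in Python, since the actual db-eqiad.php config is PHP) of the idea described above: one active master takes all writes to page/revision metadata, reads spread across replicas, and revision text lives on a separate external storage cluster. All class and function names here are illustrative assumptions, not MediaWiki's real API.

```python
# Illustrative model only: single-master writes, replica reads,
# and revision content on a separate "external storage" cluster.

class Cluster:
    def __init__(self, master, replicas):
        self.master = master
        self.replicas = list(replicas)

    def route(self, is_write):
        # All writes go to the single active master; reads rotate
        # across replicas (simple round-robin for the sketch).
        if is_write:
            return self.master
        host = self.replicas.pop(0)
        self.replicas.append(host)
        return host

# Core cluster holds page/revision rows; external storage holds blobs.
core = Cluster("db-master", ["db-replica1", "db-replica2"])
external = Cluster("es-master", ["es-replica1"])

def save_edit(page_title, text):
    # Revision *content* is written to external storage; only the
    # metadata (page and revision rows) hits the core master.
    return {
        "metadata_host": core.route(is_write=True),
        "content_host": external.route(is_write=True),
    }

def read_page(page_title):
    # Reads can be served by any replica in the core cluster.
    return core.route(is_write=False)
```

In this toy model, every `save_edit` lands on the two masters while successive `read_page` calls alternate between replicas, which is roughly why a single write master can keep up: it only handles the write traffic, not the far larger read load.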
You can get basic specs as well as load information on these servers by looking up each one in Grafana; here's db1067 (the current enwiki master) as an example: https://grafana.wikimedia.org/d/000000607/cluster-overview?orgId=1&var-d...
Ariel
On Wed, Nov 28, 2018 at 4:34 PM Hershel Robinson hershelsr@gmail.com wrote:
I've heard that Wikimedia splits their enwiki database up among more than one server; is that how they're able to handle several page saves per second on the master?
That is correct. See here https://meta.wikimedia.org/wiki/Wikimedia_servers for more details.
-- http://civihosting.com/ Simply the best in shared hosting
MediaWiki-l mailing list
To unsubscribe, go to: https://lists.wikimedia.org/mailman/listinfo/mediawiki-l