Lee Daniel Crocker wrote:
I agree; I don't think the parser is a big
issue, although a somewhat
snappier response would be nice. In hindsight, storing the wikitext in
a database was a mistake. There's already a wonderful piece of software
highly optimized and scalable for storing randomly accessed variable-
sized chunks of text with lots of tools for backup, replication, and so
on; it's called a file system. Storing the wikitext itself in something
like Reiserfs would probably speed it up, and also speed up access to
the rest of the metadata in the database which would become much
smaller.
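The idea above can be sketched as a minimal filesystem-backed text store: blobs go on disk under content-hash paths (the access pattern a filesystem is optimized for), and the database would keep only metadata. The class, paths, and fan-out scheme here are purely illustrative, not MediaWiki's actual layout.

```python
import hashlib
from pathlib import Path

class FileTextStore:
    """Store variable-sized text blobs as files, keyed by an opaque string."""

    def __init__(self, root):
        self.root = Path(root)

    def _path(self, key: str) -> Path:
        # Hash the key and fan out two directory levels so no single
        # directory grows huge (a common trick; ReiserFS handled many
        # small files well even without it).
        h = hashlib.sha1(key.encode("utf-8")).hexdigest()
        return self.root / h[:2] / h[2:4] / h

    def put(self, key: str, text: str) -> None:
        p = self._path(key)
        p.parent.mkdir(parents=True, exist_ok=True)
        p.write_text(text, encoding="utf-8")

    def get(self, key: str) -> str:
        return self._path(key).read_text(encoding="utf-8")
```

With the bulk text out on disk like this, the database rows shrink to metadata plus a key, which is the effect Lee is pointing at.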
That's what ExternalStore is for. Moving the bulk out of the database,
or at least to a different database, is a pressing need. We need to
separate bulk, rarely accessed data from hot data, so that we can save
the highly redundant storage on the DB master for hot data. Domas has
been working on it. We're running out of disk space on Ariel again, and
another compression round is obviously only a stopgap solution.
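The hot/cold split Tim describes can be sketched roughly as follows: the hot database row keeps only a small pointer string, and the bulk text lives in a separate cold store addressed by that pointer. The `DB://cluster/id` address shape mirrors MediaWiki's ExternalStore convention; the class and method names are illustrative, not the real implementation.

```python
class ExternalBlobStore:
    """Toy cold store: blobs live in named clusters, hot rows keep only
    a short address string like 'DB://cluster1/7'."""

    def __init__(self):
        # In production each cluster would be a separate database server;
        # here a dict stands in for one.
        self.clusters = {"cluster1": {}}
        self.next_id = 1

    def store(self, cluster: str, text: str) -> str:
        blob_id = self.next_id
        self.next_id += 1
        self.clusters[cluster][blob_id] = text
        # This short address is all the hot (master) table retains.
        return f"DB://{cluster}/{blob_id}"

    def fetch(self, address: str) -> str:
        scheme, rest = address.split("://", 1)
        cluster, blob_id = rest.split("/", 1)
        return self.clusters[cluster][int(blob_id)]
```

The payoff is the one Tim names: the highly redundant (replicated, backed-up) storage on the DB master is spent only on hot data, while rarely accessed revision text sits on cheaper, separately scaled machines.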
-- Tim Starling