Ivan Krstic wrote:
It's very arguable what exactly is meant by "more professional".
Generally speaking, speed is MySQL's forte, and is also what MediaWiki
has a great need for (particularly with Wikipedia). The magazine
articles you've read were probably referring to some of the more
"enterprise" features (for a long time MySQL did not support
transactions, and its replication was dubious at best), which MediaWiki
doesn't really need. I don't personally think there is much benefit in
letting the code use Postgres, but if you're willing to put in the
effort, then hey, great, send a patch.
As for the MediaWiki table structure, while I'm sure we can help with
problems you might encounter, it might be easier for you to temporarily
install MySQL on one of your machines. MySQL binaries are available for
a slew of operating systems and architectures, so it shouldn't take long.

I'm really pleased that Bernhard has stepped forward to offer to work on
this. Thank you, Bernhard!
I think the question of which database is best for Wikipedia is very
much open; it's entirely possible that a database that is faster under
a simple load, thanks to simple, uncomplicated code, may be slower under
a complex load because it lacks the code to resolve complex concurrency
issues.
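
As a rough sketch of what that concurrency-resolving code buys you,
consider a hit-counter update under row-level locking (InnoDB or
PostgreSQL); the page_hits table and its columns are invented here for
illustration, not taken from the actual MediaWiki schema:

  -- One session increments one row's counter. Under row-level locking,
  -- a second session updating a *different* row proceeds in parallel;
  -- only a session touching the same row waits for the COMMIT. A
  -- table-locking engine such as MyISAM would serialise both writers.
  BEGIN;
  SELECT hit_count FROM page_hits WHERE hit_id = 42 FOR UPDATE;
  UPDATE page_hits SET hit_count = hit_count + 1 WHERE hit_id = 42;
  COMMIT;
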
I don't know whether that concern is borne out in practice; what I do
know is that it would be very useful to be able to run MediaWiki on
either, as this will:
* increase choice, particularly in the future
* reduce dependence on the SQL idioms of any given DBMS: standards are
good! (see the dialect sketch after this list)
* enable benchmarking to see which database is most suitable for our
current and predicted future loads.
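
To make the idiom point concrete, here is one hypothetical table (the
names are invented for illustration, not the real MediaWiki schema)
spelled once in MySQL's dialect and once in PostgreSQL's; even a
definition this small diverges on identifier quoting, auto-incrementing
keys and storage-engine selection:

  -- MySQL dialect: backtick quoting, AUTO_INCREMENT, explicit table type.
  CREATE TABLE `page_hits` (
    `hit_id`    INT UNSIGNED NOT NULL AUTO_INCREMENT,
    `hit_title` VARCHAR(255) NOT NULL,
    `hit_count` INT UNSIGNED NOT NULL DEFAULT 0,
    PRIMARY KEY (`hit_id`)
  ) TYPE=InnoDB;

  -- PostgreSQL dialect: the same table with a sequence-backed key and
  -- plain identifiers; there is no UNSIGNED integer type.
  CREATE TABLE page_hits (
    hit_id    SERIAL PRIMARY KEY,
    hit_title VARCHAR(255) NOT NULL,
    hit_count INTEGER NOT NULL DEFAULT 0
  );

A database abstraction layer in the code can hide most of this, but only
if we stop writing MySQL-only SQL directly in the callers.
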
Certainly, in the near future, the Wikipedia DB will have "enterprise
class" needs for resilience and scalability.
In particular, thorough support for replication and snapshotting is
likely to be important.
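
As a reference point for what that means on the MySQL side, the built-in
replication setup looks roughly like the following (the host name and
credentials are placeholders, and the master additionally needs binary
logging, log-bin, enabled in its my.cnf):

  -- On the master: an account the slaves may replicate through.
  GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%' IDENTIFIED BY 'secret';

  -- On each slave: point it at the master and start pulling updates.
  CHANGE MASTER TO
    MASTER_HOST     = 'db-master.example.org',
    MASTER_USER     = 'repl',
    MASTER_PASSWORD = 'secret';
  START SLAVE;

Comparing this against whatever PostgreSQL offers for the same job is
exactly the sort of evaluation that supporting both would make possible.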