Brion Vibber wrote:
That's why I keep hoping that the people who pop up and say "hey, this would be a lot better with postgresql" can quantify the claim by making the software work with postgresql and actually comparing performance with the same operations on the same dataset on the same machine -- things we can't do until the software is able to run with postgresql.
Well, there are two ways, as usual. The fast way is to use the scripts that ship with PostgreSQL to convert the tables; there are even tools to convert queries. But that gives a result you don't want to test performance with. The second way is to set up the tables, views, and indices as Wikipedia requires them, to get the best performance.
I think there has to be a step in the middle, the one Lee mentioned: all SQL commands in the wiki PHP code should be replaced by functions, included from sql.inc or the like. Then a change of database affects one file instead of nearly all of them. I took a quick look at the SQL tables last weekend, just for fun, and at the different types MySQL and PostgreSQL use. Unfortunately it is very hard to work out how the tables are used when the queries are spread around so much.
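To make the idea concrete, here is a minimal sketch (in Python, for illustration only; the real code would live in the PHP sql.inc) of how routing every query through one module lets a database switch touch a single file. The function names and dialect details are my own assumptions, not anything from the current wiki code:

```python
# Hypothetical abstraction layer: all SQL is built here, so swapping
# MySQL for PostgreSQL means changing only this module.

def quote_identifier(name, backend):
    """Quote a table/column name in the given backend's dialect."""
    if backend == "mysql":
        return f"`{name}`"    # MySQL quotes identifiers with backticks
    return f'"{name}"'        # PostgreSQL uses double quotes

def select_by_title(table, title_column, backend):
    """Build a parameterised SELECT that works on either backend."""
    return (f"SELECT * FROM {quote_identifier(table, backend)} "
            f"WHERE {quote_identifier(title_column, backend)} = %s")

print(select_by_title("cur", "cur_title", "mysql"))
# SELECT * FROM `cur` WHERE `cur_title` = %s
```

The point is only the shape: callers never write SQL directly, so dialect differences (identifier quoting, type names, LIMIT syntax) are confined to one place.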
But I will start porting the tables as soon as I find some spare time. Lee, is there a location where I can get the MySQL statements you isolated?
Smurf