On Sat, Nov 16, 2002 at 06:08:00PM +0100, Erik Moeller wrote:
>> You see? No need for a separate table of dangling links, if you make
>> sure that every article comes into existence. Subqueries are really
>> nice.

> That's a bad idea, because often you will have articles with many links
> to non-existent pages, which would all have to be created. This would
> blow the cur table out of proportion, with no real advantage.
I don't think so. With the right data definitions, the number of
entries in the cur table would no more than double, and the amount of
space taken up would definitely NOT double; I doubt it would add even
10 megs to the table size.
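To make the subquery point concrete, here is a minimal sketch using
SQLite from Python (just as a stand-in that anyone can run; the real
cur/links schema has many more columns, and the table and column names
here are assumptions, not the actual wiki schema):

```python
import sqlite3

# Hypothetical, simplified schema for illustration only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE cur   (cur_title TEXT PRIMARY KEY);
    CREATE TABLE links (l_from TEXT, l_to TEXT);
""")
con.executemany("INSERT INTO cur VALUES (?)",
                [("Main_Page",), ("MySQL",)])
con.executemany("INSERT INTO links VALUES (?, ?)",
                [("Main_Page", "MySQL"),
                 ("Main_Page", "PostgreSQL"),   # dangling
                 ("MySQL", "Oracle")])          # dangling

# One correlated subquery finds every dangling link, with no separate
# brokenlinks table and no stub rows created in cur.
dangling = [row[0] for row in con.execute("""
    SELECT DISTINCT l_to FROM links
    WHERE NOT EXISTS (SELECT 1 FROM cur WHERE cur_title = l_to)
    ORDER BY l_to
""")]
print(dangling)   # ['Oracle', 'PostgreSQL']
```

Note that this is exactly the kind of query MySQL (as of 3.23) can't
do, which is part of the argument for Postgres below.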
Note that my impression is that the main reason the
site is so slow is
simply that MySQL doesn't handle very large tables very well (maybe
subqueries would help here, I don't know). That's why we should try to
avoid making them bigger while we still use it.
Postgres, on the other hand, does have mechanisms for dealing with
large tables. First, you create an index, or several indexes. Then you
run the vacuum optimizer program every night. It just gets faster and
faster. This is the advantage of using a real database.
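The routine sketched above would look roughly like this (again using
SQLite from Python so it's runnable anywhere; on Postgres the
equivalent is CREATE INDEX plus a nightly VACUUM ANALYZE from cron,
and all table/index names here are assumed for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE cur (cur_title TEXT, cur_text TEXT)")
con.executemany("INSERT INTO cur VALUES (?, ?)",
                [(f"Page_{i}", "...") for i in range(1000)])

# One index (or several) on the columns the wiki looks up by.
con.execute("CREATE INDEX cur_title_idx ON cur (cur_title)")
# Refresh planner statistics -- the SQLite analogue of the nightly
# "VACUUM ANALYZE cur" you would run on Postgres.
con.execute("ANALYZE")

# Title lookups now use the index instead of a full table scan.
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM cur WHERE cur_title = 'Page_42'"
).fetchall()
print(plan)   # plan shows the lookup going through cur_title_idx
```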
If we port this to Postgres, and then expand yet again, we'll be in good
shape if some kind soul wants to donate an Oracle license, because we
won't have to worry about MySQLisms or PostgreSQLisms in doing the port
to Oracle.
Jonathan
--
Geek House Productions, Ltd.
Providing Unix & Internet Contracting and Consulting,
QA Testing, Technical Documentation, Systems Design & Implementation,
General Programming, E-commerce, Web & Mail Services since 1998
Phone: 604-435-1205
Email: djw@reactor-core.org
Webpage: http://reactor-core.org
Address: 2459 E 41st Ave, Vancouver, BC V5R2W2