Hello Steve,
Steve Bennett wrote:
On 8/2/06, Julien Lemoine speedblue@happycoders.org wrote:
First of all, I grabbed the whole content of the English and French Wikipedias in June 2006. Then everything is compiled into a finite state machine (about 200 MB for the English Wikipedia).
How long does the compiling take? How close to a real-time update could it be? This seems to be a recurring problem with search and similar tools at Wikipedia: the results are always at best a few days out of date, and sometimes months. And then there are the problems with the toolserver...
On my computer (Pentium D930, SATA HD, 1 GB RAM), it took about half an hour for the 1.2M English articles (I can give you the exact processing time if you want). This time includes reading all the files from disk.
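For readers unfamiliar with the approach: the email does not show the implementation, but the general idea behind such a compile-once, query-many design can be sketched as follows. This is a minimal illustration, assuming a simple trie-style automaton over article titles (the real system likely uses a more compact minimized automaton to reach ~200 MB):

```python
# Hypothetical sketch, not Julien's actual code: compile a set of
# article titles into a trie-like automaton once, then answer prefix
# queries against it entirely in memory.

class TrieNode:
    def __init__(self):
        self.children = {}    # char -> TrieNode
        self.is_title = False

def compile_titles(titles):
    """One-time 'compile' step: build the automaton from all titles."""
    root = TrieNode()
    for title in titles:
        node = root
        for ch in title:
            node = node.children.setdefault(ch, TrieNode())
        node.is_title = True
    return root

def suggest(root, prefix, limit=10):
    """Walk the automaton to the prefix node, then enumerate completions."""
    node = root
    for ch in prefix:
        node = node.children.get(ch)
        if node is None:
            return []
    results = []
    stack = [(node, prefix)]
    while stack and len(results) < limit:
        n, word = stack.pop()
        if n.is_title:
            results.append(word)
        # Push children in reverse order so completions pop alphabetically.
        for ch in sorted(n.children, reverse=True):
            stack.append((n.children[ch], word + ch))
    return results

root = compile_titles(["France", "Franche-Comté", "Frankfurt", "Paris"])
print(suggest(root, "Fran"))  # → ['France', 'Franche-Comté', 'Frankfurt']
```

The trade-off Steve asks about falls out of this design: queries are fast because the automaton sits in memory, but any edit to the wiki requires re-running the half-hour compile step, so results lag behind the live site.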
Best regards,
Julien Lemoine