----- Original Message -----
From: "Steve Summit" <scs@eskimo.com>
To: wikitech-l@wikimedia.org
> Try running the command
>
>     ulimit -a
Doesn't look like it. Here's the output:
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
stack size              (kbytes, -s) unlimited
cpu time               (seconds, -t) unlimited
max user processes              (-u) unlimited
virtual memory          (kbytes, -v) unlimited
So CPU time isn't limited on this account, which means the script should run fine. But there could still be some bot on the server killing the process.
I guess I'll have to ask the ISP why it happens. I can't think of any reason.
Mike O
On 10/6/06, Mike O mikeo@operamail.com wrote:
> I guess I'll have to ask the ISP why it happens. I can't think of any reason.
I don't know this script at all, but another general solution to this class of problem is to split the script up so it indexes, say, 1000 articles per call. Then you could call it multiple times with a command line parameter, like "rebuildall 0", then "rebuildall 1000" etc.
Just in case that hadn't occurred to you :)
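In case it helps, here's a rough sketch of that batching idea as a wrapper script. The script name, the numeric start-offset argument, and the TOTAL article count are all assumptions for illustration; you'd adapt them to however the real rebuild script takes its range.

```shell
#!/bin/sh
# Sketch: run the indexer in chunks of BATCH articles so that no single
# invocation runs long enough to get killed. The "rebuildall <offset>"
# calling convention is hypothetical, per the suggestion above.
BATCH=1000
TOTAL=5000   # placeholder: total number of articles on your wiki

offset=0
while [ "$offset" -lt "$TOTAL" ]; do
    echo "would run: php rebuildall.php $offset"
    # php maintenance/rebuildall.php "$offset"   # real call, once the script accepts an offset
    offset=$((offset + BATCH))
done
```

Each run starts fresh, so even if one chunk dies you only lose that chunk and can resume from its offset.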
Steve