On Mon, Aug 12, 2013 at 3:11 PM, rupert THURNER <rupert.thurner@gmail.com> wrote:
On Mon, Aug 12, 2013 at 6:27 AM, rupert THURNER <rupert.thurner@gmail.com> wrote:
Faidon, can you attach the trace to the Bugzilla ticket, please?
On Mon, Aug 12, 2013 at 3:58 AM, Faidon Liambotis <faidon@wikimedia.org> wrote:
Hi,
On Sun, Aug 11, 2013 at 12:51:15PM +0200, rupert THURNER wrote:
As Chad points out, it's being served now
(it's plural: robots.txt)
Many thanks for getting it up quickly last time! Unfortunately, https://git.wikimedia.org is unresponsive again.
Thanks for the report! I just restarted it again. The root cause was the same; unfortunately, it's not just zip files that kill it. Googlebot asking for every file/revision is more than enough.
Until we have a better solution (and monitoring!) in place, I changed robots.txt to Disallow /. This means no search indexing for now, unfortunately.
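For reference, the stanza that blocks all compliant crawlers is just:

    # robots.txt: tell every crawler to index nothing
    User-agent: *
    Disallow: /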
It's dead again. Would somebody be so kind as to trace this? As stated in https://bugzilla.wikimedia.org/show_bug.cgi?id=51769, one might do the following _before_ restarting it (see the session sketched after the list):
- jps -l to find out the process id
- strace to see whether it makes excessive calls into the operating system
- jstack to dump the thread stacks
- kill -QUIT <pid> to print the stack trace
- jmap -heap <pid> to find memory usage
- jmap -histo:live <pid> | head to find excessively used classes
- if you have a UI, you might try jconsole or http://visualvm.java.net as well
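A minimal sketch of such a session, assuming the Java service behind git.wikimedia.org runs under a dedicated user (called gitblit here as a guess), with <pid> being whatever jps reports:

    # list JVM process ids and their main classes
    sudo -u gitblit jps -l
    # count system calls for a few seconds, then stop with Ctrl-C
    sudo strace -c -p <pid>
    # dump all thread stacks to a file
    sudo -u gitblit jstack <pid> > /tmp/threads.txt
    # same thread dump, written to the JVM's stdout log instead
    sudo kill -QUIT <pid>
    # heap configuration and current usage
    sudo -u gitblit jmap -heap <pid>
    # the most numerous live classes on the heap
    sudo -u gitblit jmap -histo:live <pid> | head

Note that jstack and jmap have to run as the same user as the target JVM (or as root), hence the sudo -u.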
As I already said a couple of mails earlier, I'd volunteer to do it as well.
None of this trace info would be useful; we know what's killing it. The fix disallowing all indexing wasn't puppetized, so Puppet reverted it:
https://gerrit.wikimedia.org/r/#/c/78919/
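(Puppetizing such a fix essentially means managing robots.txt as a Puppet file resource, roughly along these lines; the path and parameters below are guesses for illustration, not the actual change:)

    # Hypothetical sketch only; see the Gerrit change above for the real one.
    file { '/var/lib/gitblit/robots.txt':
      ensure  => file,
      owner   => 'root',
      group   => 'root',
      mode    => '0444',
      content => "User-agent: *\nDisallow: /\n",
    }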
-Chad