rupert THURNER wrote:
https://git.wikimedia.org/ seems to be dead.
Yup. It keeps happening: https://bugzilla.wikimedia.org/51769.
MZMcBride
On Sat, Aug 10, 2013 at 4:49 PM, MZMcBride z@mzmcbride.com wrote:
rupert THURNER wrote:
https://git.wikimedia.org/ seems to be dead.
Yup. It keeps happening: https://bugzilla.wikimedia.org/51769.
would it be possible to help debug this?
rupert.
Ahha -
looks like the robots.txt isn't being served - so googlebot is grabbing things from the zip files. The error log shows: "client denied by server configuration: /var/www/robots.txt"
sadly too jetlagged to keep looking at this :(
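For anyone else checking, a quick way to see what crawlers actually get for robots.txt is to request it directly; this is only a sketch against the public URL, nothing server-side is assumed:

  # Fetch just the response headers for robots.txt.
  curl -sI https://git.wikimedia.org/robots.txt
  # A 200 with a text/plain body means crawlers can read the rules;
  # a 403 here matches the "client denied by server configuration" error above.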
On Sun, Aug 11, 2013 at 4:52 AM, rupert THURNER rupert.thurner@gmail.com wrote:
On Sat, Aug 10, 2013 at 4:49 PM, MZMcBride z@mzmcbride.com wrote:
rupert THURNER wrote:
https://git.wikimedia.org/ seems to be dead.
Yup. It keeps happening: https://bugzilla.wikimedia.org/51769.
would it be possible to help debug this?
rupert.
On Sat, Aug 10, 2013 at 5:44 PM, Leslie Carr lcarr@wikimedia.org wrote:
Ahha -
looks like the robots.txt isn't being served - so googlebot is grabbing things from the zip files. The error log shows: "client denied by server configuration: /var/www/robots.txt"
sadly too jetlagged to keep looking at this :(
We fixed this Thursday or Friday? https://git.wikimedia.org/robots.txt WFM.
-Chad
On Sun, Aug 11, 2013 at 10:44 AM, Leslie Carr lcarr@wikimedia.org wrote:
looks like the robots.txt isn't being served - so googlebot is grabbing things from the zip files. The error log shows: "client denied by server configuration: /var/www/robots.txt"
sadly too jetlagged to keep looking at this :(
make sure you look at robot.txt and not Robot.txt,
As Chad points out, it's being served now
On Sun, Aug 11, 2013 at 2:25 AM, K. Peachey p858snake@gmail.com wrote:
On Sun, Aug 11, 2013 at 10:44 AM, Leslie Carr lcarr@wikimedia.org wrote:
looks like the robots.txt isn't being served - so googlebot is grabbing things from the zip files. The error log shows: "client denied by server configuration: /var/www/robots.txt"
sadly too jetlagged to keep looking at this :(
make sure you look at robot.txt and not Robot.txt,
As Chad points out, it's being served now
it's plural (robots.txt)
On Sun, Aug 11, 2013 at 4:27 AM, Jeremy Baron jeremy@tuxmachine.com wrote:
On Sun, Aug 11, 2013 at 2:25 AM, K. Peachey p858snake@gmail.com wrote:
On Sun, Aug 11, 2013 at 10:44 AM, Leslie Carr lcarr@wikimedia.org wrote:
looks like the robots.txt isn't being served - so googlebot is grabbing things from the zip files. The error log shows: "client denied by server configuration: /var/www/robots.txt"
sadly too jetlagged to keep looking at this :(
make sure you look at robot.txt and not Robot.txt,
As Chad points out, it's being served now
it's plural (robots.txt)
many thanks for getting it up quickly last time! unfortunately https://git.wikimedia.org is unresponsive again.
rupert
Hi,
On Sun, Aug 11, 2013 at 12:51:15PM +0200, rupert THURNER wrote:
As Chad points out, it's being served now
it's plural (robots.txt)
many thanks for getting it up quickly last time! unfortunately https://git.wikimedia.org is unresponsive again.
Thanks for the report! I just restarted it again. The root cause was the same; unfortunately, it's not just zip files that kill it: googlebot asking for every file/revision is more than enough.
Until we have a better solution (and monitoring!) in place, I changed robots.txt to Disallow /. This means no search indexing for now, unfortunately.
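Concretely, the stopgap amounts to a /var/www/robots.txt along these lines; the first stanza is what the blanket block means, and the commented-out variant is only a sketch of a later, more targeted rule, since the exact Gitblit download paths would need checking:

  # Stopgap: keep all crawlers out of everything.
  User-agent: *
  Disallow: /
  #
  # Possible later refinement: only block the expensive archive/zip
  # downloads (paths below are assumptions, not verified against Gitblit).
  # User-agent: *
  # Disallow: /zip/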
Faidon
Faidon, can you attach the trace to the bugzilla ticket please?
On Mon, Aug 12, 2013 at 3:58 AM, Faidon Liambotis faidon@wikimedia.org wrote:
Hi,
On Sun, Aug 11, 2013 at 12:51:15PM +0200, rupert THURNER wrote:
As Chad points out, it's being served now
it's plural (robots.txt)
many thanks for getting it up quickly last time! unfortunately https://git.wikimedia.org is unresponsive again.
Thanks for the report! I just restarted it again. The root cause was the same; unfortunately, it's not just zip files that kill it: googlebot asking for every file/revision is more than enough.
Until we have a better solution (and monitoring!) in place, I changed robots.txt to Disallow /. This means no search indexing for now, unfortunately.
Faidon
On Mon, Aug 12, 2013 at 6:27 AM, rupert THURNER rupert.thurner@gmail.com wrote:
Faidon, can you attach the trace to the bugzilla ticket please?
It's dead again. Would somebody be so kind as to trace this? As stated in https://bugzilla.wikimedia.org/show_bug.cgi?id=51769, one might do the following _before_ restarting it:
* jps -l to find out the process id
* strace to see if it excessively calls into the operating system
* jstack
* kill -QUIT <p> to print the stack trace
* jmap -heap <p> to find memory usage
* jmap -histo:live <p> | head to find excessively used classes
* if you have a UI, you might try jconsole or http://visualvm.java.net as well
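If it helps, here is a rough sketch of bundling those into a single capture script to run before the restart; how the Gitblit process is found and where the output goes are assumptions, so adjust to however the service actually runs:

  #!/bin/bash
  # Collect JVM diagnostics from the (assumed) Gitblit java process before restarting it.
  set -eu
  out=/var/tmp/gitblit-diag-$(date +%Y%m%dT%H%M%S)
  mkdir -p "$out"

  # List java processes and pick the Gitblit one; assumes its main class name contains "gitblit".
  jps -l | tee "$out/jps.txt"
  pid=$(jps -l | grep -i gitblit | awk '{print $1}' | head -n 1)

  # 30 seconds of system-call counts, to see if it is hammering the OS.
  timeout 30 strace -c -p "$pid" 2> "$out/strace-summary.txt" || true

  # Thread dump: jstack to a file, plus SIGQUIT so it also lands in the JVM's own stdout/log.
  jstack "$pid" > "$out/jstack.txt"
  kill -QUIT "$pid"

  # Heap usage and the most heavily used classes.
  jmap -heap "$pid" > "$out/jmap-heap.txt"
  jmap -histo:live "$pid" | head -n 30 > "$out/jmap-histo.txt"

  echo "diagnostics written to $out"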
As I already asked a couple of mails earlier, I'd volunteer to do it as well.
rupert
On Mon, Aug 12, 2013 at 3:11 PM, rupert THURNER rupert.thurner@gmail.com wrote:
It's dead again. Would somebody be so kind as to trace this? As stated in https://bugzilla.wikimedia.org/show_bug.cgi?id=51769, one might do the following _before_ restarting it:
- jps -l to find out the process id
- strace to see if it excessively calls into the operating system
- jstack
- kill -QUIT <p> to print the stack trace
- jmap -heap <p> to find memory usage
- jmap -histo:live <p> | head to find excessively used classes
- if you have a UI, you might try jconsole or http://visualvm.java.net as well
As I already asked a couple of mails earlier, I'd volunteer to do it as well.
None of this trace info would be useful. We know what's killing it. The fix for disallowing all indexing wasn't puppetized, so puppet reverted it.
https://gerrit.wikimedia.org/r/#/c/78919/
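(For anyone following along: "puppetized" just means the file is managed from the puppet repo, so a puppet run can no longer overwrite it. The actual change is the Gerrit link above; purely as an illustration, with the path, ownership and content assumed rather than taken from that change, such a resource might look like this:)

  # Illustrative sketch only; see the Gerrit change above for what was actually merged.
  file { '/var/www/robots.txt':
    ensure  => present,
    owner   => 'root',
    group   => 'root',
    mode    => '0444',
    content => "User-agent: *\nDisallow: /\n",
  }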
-Chad
On Tue, Aug 13, 2013 at 12:12 AM, Chad innocentkiller@gmail.com wrote:
None of this trace info would be useful. We know what's killing it. The fix for disallowing all indexing wasn't puppetized, so puppet reverted it.
Chad, could you please take the traces anyway, so we can have a look at them? An internet-facing app should _not_ die like this ...
rupert.