Hi,
My extension External Data uses Http::get() to retrieve the contents of a URL. Unfortunately, it turns out that this function fails on some URLs, returning an HTTP return code of "0". Here's an example of a URL for which it doesn't work:
http://mouse.brain-map.org/GeneExpression/Hippocampal+region/1.xml
Http::get() uses PHP's cURL library when possible (and on my server, it's possible), and I assume that's what's causing the problem: when I replace the call to Http::get() with a call to file_get_contents(), it works fine. Does anyone know what the exact problem is? (It might be related to the fact that the page is in XML format, although other XML pages work.) Should I just use file_get_contents() instead?
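To illustrate, here's roughly what the two calls look like (a minimal sketch; Http::get()'s exact parameter list depends on the MediaWiki version):

  $url = 'http://mouse.brain-map.org/GeneExpression/Hippocampal+region/1.xml';

  // MediaWiki's HTTP helper; it prefers cURL when the extension is available.
  $contents = Http::get( $url );
  if ( $contents === false ) {
      // Http::get() returns false on failure, which is what I'm seeing here.
      wfDebug( "External Data: could not fetch $url\n" );
  }

  // Plain PHP alternative, which retrieves this URL without trouble.
  $contents = file_get_contents( $url );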
Thanks, Yaron
Yaron Koren wrote:
> Hi,
> My extension External Data uses Http::get() to retrieve the contents of a URL. Unfortunately, it turns out that this function fails on some URLs, returning an HTTP return code of "0". Here's an example of a URL for which it doesn't work:
> http://mouse.brain-map.org/GeneExpression/Hippocampal+region/1.xml
> Http::get() uses PHP's cURL library when possible (and on my server, it's possible), and I assume that's what's causing the problem: when I replace the call to Http::get() with a call to file_get_contents(), it works fine. Does anyone know what the exact problem is? (It might be related to the fact that the page is in XML format, although other XML pages work.) Should I just use file_get_contents() instead?
> Thanks, Yaron
Try increasing $wgHTTPTimeout. It's a big file, so it's probably timing out before it's fully transferred, as happened here. file_get_contents() doesn't have a timeout unless one is specifically set with a stream context.
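For example, something like this (a rough sketch; the default value and your setup may differ by MediaWiki version):

  // In LocalSettings.php: give MediaWiki's HTTP fetches more time.
  $wgHTTPTimeout = 120; // seconds; the stock default is fairly low

  // By contrast, file_get_contents() only observes a timeout when you
  // pass one in explicitly through a stream context:
  $context = stream_context_create( array(
      'http' => array( 'timeout' => 120 ),
  ) );
  $contents = file_get_contents( $url, false, $context );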
Thanks, that was it!
-Yaron