agent dale cooper wrote:
It sounds like you've isolated the problem to within a couple of hundred lines of code. Maybe you should spend less time searching the web for someone with your exact problem, and more time reading that code.
=) I'd agree with ya, if I wasn't so much of a PHP newbie... I'd consider myself more of a Perl and Bash type coder, but I definitely understand where you're coming from with your suggestion.
Just pretend it's perl, it's pretty much the same for these purposes.
It sounds like MediaWiki is the culprit, and MW's HTTP fetch function is somehow stripping the search results.
Well, Http::get() is only 68 lines. It has two branches: one of which uses file_get_contents(), which should emit errors if display_errors is on, and the other uses curl_exec(), which has two error branches that return false silently:
if ( curl_getinfo( $c, CURLINFO_HTTP_CODE ) != 200 ) {

if ( curl_errno( $c ) != CURLE_OK ) {
You should determine which one of these MediaWiki is using, and either enable display_errors, or add debugging statements to the two curl error branches.
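For instance, a quick way to surface those silent failures is to log from each branch before it returns. This is only a sketch against the two conditions quoted above, not the actual Http::get() source; the surrounding code, variable names, and return handling in your MediaWiki version may differ:

```php
// Sketch only: add logging inside the two silent error branches of Http::get().
// error_log() writes to PHP's configured error log regardless of display_errors.
$status = curl_getinfo( $c, CURLINFO_HTTP_CODE );
if ( $status != 200 ) {
	error_log( "Http::get: unexpected HTTP status $status" );
	// ...original branch returns false here...
}
if ( curl_errno( $c ) != CURLE_OK ) {
	// curl_error() gives the human-readable reason for the failure.
	error_log( "Http::get: curl error: " . curl_error( $c ) );
	// ...original branch returns false here...
}
```

Then reproduce the failing search and check the PHP error log to see which branch is being hit and why.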
Or, again, you could use tcpdump, which would probably reveal the problem without your having to deal with the source code at all.
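If you go the tcpdump route, something like the following, run as root on the web server, would show whether the search results ever arrive on the wire. The interface name and the remote search host are placeholders; substitute your own:

```shell
# Capture full packets (-s 0) on the server's interface and print payloads
# as ASCII (-A) so the HTTP request and response bodies are readable.
# 'eth0' and 'search.example.com' are placeholders for your setup.
tcpdump -i eth0 -s 0 -A 'tcp port 80 and host search.example.com'
```

If the response body on the wire contains the results but MediaWiki's output doesn't, the stripping is happening in PHP; if the body is already empty on the wire, the problem is on the remote end.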
-- Tim Starling