Hello,
I want my job to be re-executed after it fails. Is my solution of pushing a clone of the job onto the job queue viable? It feels a bit dirty to me.
class CreateTicketJob extends Job {
	[...]
	public function run() {
		[...]
		if ( /* job failed due to not connecting to external resource */ ) {
			$job = new CreateTicketJob( $this->getTitle(), $this->params );
			return JobQueueGroup::singleton()->push( $job );
		}
		return true;
	}
	[...]
}
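To make it feel a bit less dirty I have also thought about capping the number of clones with a retry counter in the params. Roughly like this (untested sketch; the 'retries' key is just something I made up, not an existing parameter):

	public function run() {
		[...]
		if ( /* job failed due to not connecting to external resource */ ) {
			// 'retries' is a made-up param, only there to stop endless re-pushing
			$retries = isset( $this->params['retries'] ) ? $this->params['retries'] : 0;
			if ( $retries < 3 ) {
				$params = $this->params;
				$params['retries'] = $retries + 1;
				$job = new CreateTicketJob( $this->getTitle(), $params );
				JobQueueGroup::singleton()->push( $job );
			}
		}
		return true;
	}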
Any help is appreciated.
Sincerely,
Henning
On 02.07.2015 at 14:51, Henning Vreyborg wrote:
Hi there,
it seems that failed jobs will never be executed again.
My extension creates jobs that talk to an external server via SOAP. This connection might fail (e.g. during server maintenance), which means my job fails and needs to be re-executed later. On my wiki I am running with $wgJobRunRate = 0, and my extension's job type is listed in $wgJobTypesExcludedFromDefaultQueue. To run my jobs I have a cron job that calls "php runJobs.php --maxjobs 10 --type CreateTicket". My job implements allowRetries() with "return true".
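For reference, the relevant bits of my setup look roughly like this (class and settings trimmed down to the essentials):

	// CreateTicketJob.php (trimmed)
	class CreateTicketJob extends Job {
		public function __construct( $title, $params ) {
			parent::__construct( 'CreateTicket', $title, $params );
		}

		public function allowRetries() {
			return true;
		}

		public function run() {
			// [...] talks to the external server via SOAP, returns false when the call fails
		}
	}

	// LocalSettings.php
	$wgJobRunRate = 0;
	$wgJobClasses['CreateTicket'] = 'CreateTicketJob';
	$wgJobTypesExcludedFromDefaultQueue[] = 'CreateTicket';

	// cron entry
	// php maintenance/runJobs.php --maxjobs 10 --type CreateTicket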
From my observations, the "job_token" and "job_token_timestamp" fields in the job table are used as a lock so that no job gets executed twice. The problem is that these fields are never cleared when a job fails (run() returns false), so the job runner never picks the job up again.
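To check this I looked at the job table directly, roughly like this (quick and dirty, read from a slave; the field list may not be exhaustive). The failed CreateTicket jobs keep their token and timestamp:

	$dbr = wfGetDB( DB_SLAVE );
	$res = $dbr->select(
		'job',
		array( 'job_id', 'job_attempts', 'job_token', 'job_token_timestamp' ),
		array( 'job_cmd' => 'CreateTicket', "job_token != ''" )
	);
	foreach ( $res as $row ) {
		// these rows still hold a job_token long after run() returned false
		echo $row->job_id . ' ' . $row->job_attempts . ' ' . $row->job_token_timestamp . "\n";
	}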
Is this intended behaviour or is this a bug? Does anyone have an idea how to solve this issue?
Thank you.

Regards,
Henning Vreyborg