On Sun, Apr 7, 2013 at 2:43 PM, swuensch <swuensch@gmail.com> wrote:
As the dispatch lag is getting bigger again (nearly one day), should Wikidata maybe stop all bot work for one day and discuss stricter rules for bots? Or will there be a software solution to handle all edits on Wikidata?

That is not at all a workable solution. What about the bots that are currently updating sitelinks for moved pages? Or ones that are adding in missing interwikis that were not fully in the circle?

We expect to deploy phase 2, yet most of the data hasn't even been imported, simply because we don't have enough bots running fast enough. My bot alone already has over 300k claims queued and waiting to go, but it can't edit any faster than 1 claim/sec (bandwidth, etc.).
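For scale, a quick back-of-the-envelope calculation using the figures above (the only added number is the 86,400 seconds in a day):

```python
# Illustrative estimate: how long would one bot take to drain its queue?
queued_claims = 300_000      # claims queued, per the figure above
claims_per_second = 1        # effective edit rate

seconds = queued_claims / claims_per_second
days = seconds / 86_400      # seconds per day

print(f"{days:.1f} days to drain the queue")  # roughly 3.5 days
```

So even at a sustained 1 claim/sec, a single bot's existing backlog alone takes several days to clear.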

The solution here has to be on the software side. No single bot is the cause (though some *cough* have made it worse), and shutting them down is simply not an option.