Hi,
I have an idea for a new service we could implement on the tools project that would greatly reduce system resource usage. I would like some feedback.
Imagine a daemon similar to inetd.
It would watch the recent changes of ALL wikis we have at Wikimedia, and users could subscribe (using a web browser or some terminal interface) to this service, so that on certain events (page X was modified), this bot dispatcher would do something (submit their bot on the grid / send a signal / send a TCP packet somewhere / insert data into redis, etc.).
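To make the idea concrete, here is a minimal in-process sketch of such a dispatcher: subscriptions pair a (wiki, title pattern) filter with an action to run. All names (`Dispatcher`, `subscribe`, `dispatch`) are illustrative, not an existing API, and the action is just a callable standing in for "submit a grid job / send a signal / push to redis".

```python
import fnmatch

class Dispatcher:
    def __init__(self):
        # Each subscription pairs a (wiki, title glob) filter with an action.
        self.subscriptions = []

    def subscribe(self, wiki, title_pattern, action):
        """Register an action to run when a matching page changes."""
        self.subscriptions.append((wiki, title_pattern, action))

    def dispatch(self, change):
        """Feed one recent-changes event to all matching subscribers.

        `change` is a dict like {"wiki": "enwiki", "title": "Talk:Foo"}.
        Returns how many subscribers fired.
        """
        fired = 0
        for wiki, pattern, action in self.subscriptions:
            if change["wiki"] == wiki and fnmatch.fnmatch(change["title"], pattern):
                action(change)
                fired += 1
        return fired
```

A bot operator would then register something like `d.subscribe("enwiki", "Talk:*", queue_for_archiving)` once, instead of writing their own wiki-watcher loop.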
This way bot designers could very easily hook their bots to certain events without having to write their own "wiki-watchers". This would be extremely useful, not just for bots that should be triggered on an event (someone edits some page) but also for bots that run periodically.
For example: the archiving bot currently works by checking ALL pages that carry the archiving template, no matter whether those talk pages have been dead for years. With such a dispatcher, every time a talk page is modified, a script or process could add it to some queue (redis-like; the dispatcher could even handle this as a built-in event so that no process would need to be launched), and the archiving bot would then only check the pages that are active, instead of thousands of dead pages.
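The archiving scenario above could look like this sketch: an event handler queues only the talk pages that actually changed, and the periodic bot run drains that queue. A redis list would serve as the real queue; a deque stands in here so the sketch is self-contained, and all names are hypothetical.

```python
from collections import deque

# Stand-in for a redis list shared between dispatcher and bot.
archive_queue = deque()

def on_talk_page_edit(change):
    """Dispatcher event handler: queue a changed talk page (once)."""
    title = change["title"]
    if title not in archive_queue:
        archive_queue.append(title)

def run_archiving_bot():
    """Periodic bot run: only visit pages that actually changed,
    instead of scanning every page carrying the archive template."""
    processed = []
    while archive_queue:
        processed.append(archive_queue.popleft())  # archive this page
    return processed
```

The point of the design is that the expensive scan disappears: a quiet talk page never enters the queue, so the bot never touches it.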
This way we could schedule bots very efficiently and save a ton of system resources (CPU / memory / IO / network / even load on the production servers). It would also make it far easier for bot operators to create new tasks and bots, as they would not need to program "wiki-watchers" themselves.
What do you think about it?