Cough cough. I really think that the preferable way is structured talk pages. Archiving talk pages on the MW side... is not something anybody wants to happen.
On Sun, Sep 6, 2015 at 9:10 AM, Tim Landscheidt tim@tim-landscheidt.de wrote:
Hi,
on Wikipedia, talk page sections are archived by various bots run by various people on various hosts. Usually, they use a template that adds a category to the talk page, periodically parse the template to get its parameters (where to archive, how old a section must be, etc.), parse the whole page with its various forms of user signatures to determine each section's last modification, and then "move" each section to the archive page (if it fits the parameters).
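For illustration, the "newest signature timestamp per section" step typically looks something like this (a rough Python sketch, not any particular bot's code; it assumes the common English-language signature format "12:34, 5 September 2015 (UTC)"):

#!/usr/bin/env python3
# Rough sketch: find the newest signature timestamp in a section and
# decide whether the section is old enough to archive.
import re
from datetime import datetime, timedelta, timezone

MONTHS = {m: i for i, m in enumerate(
    ['January', 'February', 'March', 'April', 'May', 'June', 'July',
     'August', 'September', 'October', 'November', 'December'], start=1)}

SIG_TS = re.compile(
    r'(\d{2}):(\d{2}), (\d{1,2}) (' + '|'.join(MONTHS) + r') (\d{4}) \(UTC\)')

def last_modified(section_text):
    """Return the newest signature timestamp in a section, or None."""
    newest = None
    for hh, mm, day, month, year in SIG_TS.findall(section_text):
        ts = datetime(int(year), MONTHS[month], int(day),
                      int(hh), int(mm), tzinfo=timezone.utc)
        if newest is None or ts > newest:
            newest = ts
    return newest

def is_stale(section_text, max_age_days):
    """True if the section's last signature is older than max_age_days."""
    ts = last_modified(section_text)
    return ts is not None and \
        datetime.now(timezone.utc) - ts > timedelta(days=max_age_days)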
I think a preferable way would be an extension that adds a tag (or something similar) that marks the page as archivable and puts the parameters in page_props, perhaps adds a function to a save hook to check if a section has been modified and maintains a list of last modifications of sections in page_props based on that, and then, as a cron job/job queue task, iterates over all archivable pages and "moves" sections to be archived to their archive pages.
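The "iterate over all archivable pages" part could then lean on the existing list=pageswithprop API module rather than category scans. A minimal Python sketch, assuming the extension stores a page prop named "archivable" (that prop name is made up here):

#!/usr/bin/env python3
# Minimal sketch of the cron/job-queue side: enumerate every page that
# carries a hypothetical "archivable" page prop via list=pageswithprop.
import requests

API = 'https://en.wikipedia.org/w/api.php'  # any MediaWiki API endpoint

def archivable_pages():
    """Yield (pageid, title, prop value) for pages with the prop set."""
    params = {'action': 'query', 'list': 'pageswithprop',
              'pwppropname': 'archivable',   # hypothetical prop name
              'pwpprop': 'ids|title|value', 'pwplimit': 'max',
              'format': 'json'}
    session = requests.Session()
    while True:
        data = session.get(API, params=params).json()
        for page in data['query']['pageswithprop']:
            yield page['pageid'], page['title'], page.get('value')
        if 'continue' not in data:
            break
        params.update(data['continue'])   # follow API continuation

for pageid, title, value in archivable_pages():
    print(pageid, title, value)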
Bots performing the last part sometimes run into the problem that the contents of either the page being archived or the archive page change at the very moment the bot is moving sections around, and they (hopefully all of them) implement rollback mechanisms (which, on the other hand, will never succeed in /all/ cases).
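For a single page, action=edit already offers compare-and-swap semantics via basetimestamp (the API answers with an "editconflict" error if someone saved in between), roughly like the sketch below; but that covers only one page at a time, not the two-page move:

#!/usr/bin/env python3
# Sketch of per-page edit-conflict detection with action=edit's
# basetimestamp parameter.  This protects ONE page; it does not make
# the two-page "move a section" operation atomic.
import requests

API = 'https://example.org/w/api.php'   # placeholder wiki

def edit_with_cas(session, title, transform, token):
    """Read a page, apply transform(text), and save; the save fails
    with an 'editconflict' error if the page changed since we read it.
    token is an edit token, e.g. from action=query&meta=tokens."""
    data = session.get(API, params={
        'action': 'query', 'prop': 'revisions', 'titles': title,
        'rvprop': 'content|timestamp', 'format': 'json'}).json()
    page = next(iter(data['query']['pages'].values()))
    rev = page['revisions'][0]
    result = session.post(API, data={
        'action': 'edit', 'title': title,
        'text': transform(rev['*']),
        'basetimestamp': rev['timestamp'],   # compare-and-swap anchor
        'token': token, 'format': 'json'}).json()
    if result.get('error', {}).get('code') == 'editconflict':
        raise RuntimeError('lost the race; caller must retry/roll back')
    return result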
Is it possible for a MediaWiki maintenance script/job queue task to edit two pages in a (database-/MediaWiki-level) transaction, or to temporarily lock a page against edits by another process (short of protecting and unprotecting it)? Is there a code snippet/extension that already uses such a pattern?
Tim