Hi, I am working on a dependency-injected way to provide Page objects for my bot tests. I used snake-guice (https://code.google.com/p/snake-guice/) to inject a Page class provider into my bots. This is not online yet; I'll commit soon. Unfortunately the snake-guice documentation is not up to date or complete, so it's better to check the snake-guice example apps for a better understanding.
The idea is: write your code as if you did not care whether the page content is provided by a regular Wikipage object or by a stub/test object; at startup the application is wired up and the type of object to provide is decided. You just assume the page getter will return some page content, without caring whether it comes from a piped command or from the original page.
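To illustrate the idea, here is a minimal dependency-injection sketch in plain Python. It does not use the actual snake-guice API, and all class names (WikiPageProvider, StubPageProvider, Bot) are hypothetical: the point is only that the bot depends on a provider interface and the concrete provider is chosen at wiring time.

```python
class WikiPageProvider:
    """Production provider: would fetch page text from the wiki."""
    def get(self, title):
        # Placeholder for a real wiki fetch.
        return "live wiki text of %s" % title


class StubPageProvider:
    """Test provider: returns canned text, no network access."""
    def __init__(self, pages):
        self.pages = pages

    def get(self, title):
        return self.pages[title]


class Bot:
    """The bot only knows the provider interface, not the concrete class."""
    def __init__(self, provider):
        self.provider = provider

    def run(self, title):
        text = self.provider.get(title)
        return text.upper()  # stand-in for the bot's real page processing


# At startup the application is wired up: swap providers without touching Bot.
bot = Bot(StubPageProvider({"Sandbox": "hello"}))
print(bot.run("Sandbox"))  # -> HELLO
```

With snake-guice the same wiring would live in a module's `configure` method instead of being done by hand, but the bot code stays identical either way.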
For example, in the pipe case we could decide that we are working on a page through some pipeable scripts: page_getter_script [pagename] | harvest_template [pagename] | ... | remove_duplicate [pagename] | page_writer_script [pagename]
The page_getter_script page provider would be a regular Wikipage factory, while harvest_template's would be a StandardInputPage class. Wikipage and StandardInputPage would be injected by WikipagePipeInModule, harvest_template and remove_duplicate by PipeInOutModule, and page_writer by some other WikipageWriterModule.
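A rough sketch of the two page flavours in the pipe scenario might look like the following. The class and parameter names are assumptions for illustration (they are not existing pywikibot classes): the first script in the pipe reads from the wiki, the middle scripts read the previous script's output from standard input.

```python
import io
import sys


class Wikipage:
    """Used by page_getter_script: the regular wiki-backed page."""
    def __init__(self, title):
        self.title = title

    def get(self):
        # Placeholder for a real wiki fetch.
        return "text fetched from the wiki for %s" % self.title


class StandardInputPage:
    """Used by the middle scripts: page text arrives on stdin via the pipe."""
    def __init__(self, title, stream=None):
        self.title = title
        # Default to sys.stdin; a stream can be passed in for testing.
        self.stream = stream if stream is not None else sys.stdin

    def get(self):
        return self.stream.read()


# In a test we feed the "pipe" with an in-memory stream instead of stdin.
piped = StandardInputPage("Example", io.StringIO("{{Infobox}} piped text"))
print(piped.get())  # -> {{Infobox}} piped text
```

Both classes expose the same `get()` method, so harvest_template can be handed either one by its module without knowing which end of the pipe it sits on.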
I'll post some code showing an example using my TestModules when they are ready.
2014-07-20 12:25 GMT+02:00 Ricordisamoa ricordisamoa@openmailbox.org:
I hereby propose to create a common way for bot classes to 'pipe' the changed text to other classes without saving it directly to the wiki. So, botops could do something like: python pwb.py pipe [generator] scripts/harvest_template [options] + scripts/weblinkchecker [options] + scripts/replace [options] to make custom replacements, check external links and enrich Wikidata with a single edit (except for the Wikidata import process).
Pywikipedia-l mailing list Pywikipedia-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/pywikipedia-l