Hello again
A question about pagegenerators.PreloadingGenerator, wikipedia.getall
and related:
Would it be possible to include a VersionHistory preload? On the API side,
the call now used to fetch page contents also returns at least the most
recent VersionHistory entry. This would be very useful for me!
(I can write the code needed, if such a feature would be welcome...)
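For illustration, a minimal sketch of what such a preload request could look like. The helper name and the exact rvprop set are assumptions for this sketch, not part of pywikipedia; the parameter names follow the MediaWiki API ('action=query', 'prop=revisions', 'rvprop').

```python
# Hypothetical helper: build the MediaWiki API query used to preload page
# text, optionally extended with the revision metadata a VersionHistory
# entry is built from. Illustrative only, not the pywikipedia implementation.

def preload_params(titles, with_history=False):
    rvprop = ['content', 'timestamp']
    if with_history:
        # also request the fields a version-history entry needs
        rvprop += ['ids', 'user', 'comment']
    return {
        'action': 'query',
        'prop': 'revisions',
        'rvprop': '|'.join(rvprop),
        'titles': '|'.join(titles),
    }
```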
Thanks and greetings!
DrTrigon
Hello all!
This is just a detail, and maybe it is preparation for upcoming commits,
but it seems to me that 'get_redirect' in 'extract_templates_and_params'
is useless and can be removed.
Greetings
DrTrigon
Hello
A question about wikipedia.get and wikipedia._getEditPage:
Would it be possible to add an option 'expandtemplates=False'
and the code:
if expandtemplates: params[u'rvexpandtemplates'] = u''
to enable template expansion on pages when/if requested.
I can also provide a patch if this suggestion is welcome.
Thanks and greetings!
DrTrigon
Does it mean that trunk users may now use asLink instead of page.aslink(),
or is this only a formal parameter?
2010/9/12 <xqt(a)svn.wikimedia.org>
> -----------
> wikipedia.Page(asLink=False) is implemented for easier merging to rewrite
> branch
>
Hi folks,
as requested by tracker items #3062725 and #2744607, I've implemented a new -cleanup option for interwiki.py. It works like -force but only removes interwiki links to non-existent or empty pages, without touching other problematic links such as disambiguation status or namespace mismatches. This should prevent a lot of trouble over 'controversial' changes (I hope so).
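The filtering rule described above can be sketched as follows. The Link stand-in and the helper name are illustrative assumptions; the real interwiki.py data model is different.

```python
from collections import namedtuple

# Stand-in for an interwiki link carrying the two properties -cleanup checks.
Link = namedtuple('Link', 'title exists is_empty')

def cleanup_links(links):
    """Drop only links to non-existent or empty pages; leave other
    problematic links (disambig status, namespace mismatch) untouched."""
    return [link for link in links
            if link.exists and not link.is_empty]
```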
Greetings
xqt
Hello everybody!
Could it be that in 'userlib.py', on lines 220 and 231 (as of rev.
8522), the variable names 'addr' and 'address' should match, and that
this is an error?
So the code:
------------------------------------------------------------------------
    def sendMailOld(self, subject = u'', text = u'', ccMe = False):
        addr = self.site().put_address('Special:EmailUser')
        predata = {
            "wpSubject" : subject,
            "wpText" : text,
            'wpSend' : "Send",
            'wpCCMe' : '0',
        }
        if ccMe:
            predata['wpCCMe'] = '1'
        predata['wpEditToken'] = self.site().getToken()
        response, data = self.site().postForm(address, predata, sysop = False)
        if data:
------------------------------------------------------------------------
should be
------------------------------------------------------------------------
    def sendMailOld(self, subject = u'', text = u'', ccMe = False):
        addr = self.site().put_address('Special:EmailUser')
        predata = {
            "wpSubject" : subject,
            "wpText" : text,
            'wpSend' : "Send",
            'wpCCMe' : '0',
        }
        if ccMe:
            predata['wpCCMe'] = '1'
        predata['wpEditToken'] = self.site().getToken()
        response, data = self.site().postForm(addr, predata, sysop = False)
        if data:
------------------------------------------------------------------------
Thanks and greetings!
Dr. Trigon
Hello
Maybe this is a stupid question, but what about basing
'page.isRedirectPage()' on 'getVersionHistory(revCount=1)'
instead of 'get()'? This could be faster for big
pages...
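The idea in a hedged sketch: test the latest revision's text for redirect markup instead of fetching the whole page. The regex below covers only the canonical #REDIRECT keyword; localized redirect magic words (e.g. '#WEITERLEITUNG') would need the site's alias list, and the helper name is hypothetical.

```python
import re

# Matches the canonical MediaWiki redirect markup at the start of a page.
# Localized redirect aliases are deliberately not covered in this sketch.
REDIRECT_RE = re.compile(r'^\s*#redirect\s*\[\[', re.IGNORECASE)

def looks_like_redirect(latest_revision_text):
    """Decide redirect status from the (typically tiny) latest revision."""
    return bool(REDIRECT_RE.match(latest_revision_text))
```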
Thanks and Greetings!
DrTrigon
I've recently noticed that category.py has a problem with big
categories: it goes down while fetching categories or just before
saving the results.
Today I tried to listify Category:Wikiprojekty at pl.wiki with the recurse
option and got something like this:
Getting [[Kategoria:WikiProjekt Pallotyni]]...
Getting [[Kategoria:User projekt pallotyni]]...
Getting [[Kategoria:Wikiprojekt Chopin]]...
Getting [[Kategoria:Wikiprojekty]]...
Creating page [[Wikipedysta:AlohaBOT/wiki]] via API
HTTPError: 500 Internal Server Error
WARNING: Could not open 'http://pl.wikipedia.org/w/api.php'.
Maybe the server is down. Retrying in 1 minutes...
HTTPError: 500 Internal Server Error
WARNING: Could not open 'http://pl.wikipedia.org/w/api.php'.
Maybe the server is down. Retrying in 2 minutes...
patrol
Thank you, thank you! :-)
2010/9/8 <xqt(a)svn.wikimedia.org>
> Revision: 8504
> site generator for special:wantedcategories
> + return
> "%s?title=%s:wantedcategories&limit=%d&useskin=monobook&uselang=en" %
> (self.path(code), self.special_namespace_url(code), limit)
>
With this, will it work for all wikis? In huwiki, the text matched by
> + '<li><a href=".+?" class="new" title="(?P<title>.+?)
> \(page does not exist\)">.+?</a> .+?\)</li>')
>
is "(megíratlan szócikk)" (Hungarian for "unwritten article") instead of "(page does not exist)".
I already wrote my version, but this one is much better.
--
Bináris
2010/9/8 <xqt(a)svn.wikimedia.org>
> Revision: 8499
> pywikipediaDir = "c:\\Projects\\Personal\\wiki\\pywikipedia"
>
>
I have never used this script, but is this really OK for everyone?