Happy Monday,
There are strange people who make links like this (a kind of URL-encoding?):
[[Második világháború#Partrasz.C3.A1ll.C3.A1s Szic.C3.ADli.C3.A1ban
.28Huskey hadm.C5.B1velet.29|Huskey hadműveletben]]
So the section title must have been copied from the URL.
Do we have a ready tool to fix these?
--
Bináris
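These anchors look like MediaWiki's old "dot-encoding" of section titles: the same bytes as percent-encoding, but with '.' in place of '%'. A minimal sketch of decoding them back to plain titles, assuming every '.XX' hex pair really is an escape (a literal dot followed by two hex digits would be decoded too):

```python
import re
import urllib.parse

def decode_section_anchor(anchor):
    """Decode a dot-encoded MediaWiki section anchor back to plain text."""
    # Turn each ".XX" hex escape into "%XX", then percent-decode as UTF-8.
    percent_encoded = re.sub(r'\.([0-9A-Fa-f]{2})', r'%\1', anchor)
    return urllib.parse.unquote(percent_encoded)

print(decode_section_anchor(
    'Partrasz.C3.A1ll.C3.A1s Szic.C3.ADli.C3.A1ban .28Huskey hadm.C5.B1velet.29'))
# → Partraszállás Szicíliában (Huskey hadművelet)
```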
Hello all
From one of my assignments as a bot operator I have some code which
does template parsing and general text parsing (e.g. Image/File tags).
It is not using regex and thus able to correctly parse nested
templates and other such nasty things. I have written those as library
classes and written tests for them which cover almost all of the code.
I would now really like to contribute that code back to the community.
Would you be interested in adding this code to the pywikibot
framework? If yes, can I send the code to someone for code review or
how do you usually operate?
Greetings
Hannes
PS: wiki userpage is http://en.wikipedia.org/wiki/User:Hannes_R%C3%B6st
Hello! Is there a bot or extension that can generate stubs of template
descriptions from the template wikitext? It should be pretty simple: the bot
would just grab all template parameters {{{PARAMETER}}} and list those
parameters.
If no such extension exists, could anybody tell me how to programmatically get
the parameters of a given template?
Sincerely yours,
-----
Yury Katkov
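For the second question, a minimal sketch of pulling parameter names out of a template's wikitext: a plain regex pass that collects every {{{name}}} / {{{name|default}}} occurrence without evaluating parser functions (so conditional parameters are listed too):

```python
import re

def template_parameters(wikitext):
    """Return the distinct parameter names used in a template's wikitext,
    in first-seen order."""
    # A name runs from "{{{" up to the first "|", "{" or "}".
    names = re.findall(r'\{\{\{([^{}|]+)', wikitext)
    seen = []
    for name in names:
        name = name.strip()
        if name not in seen:
            seen.append(name)
    return seen

print(template_parameters(
    '{{#if:{{{image|}}}|[[File:{{{image}}}|{{{size|200px}}}]]}}'))
# → ['image', 'size']
```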
Hi folks,
Is there a tool to hunt for overmagnified pictures and restore them to a
sensible size? I mean when a JPG is originally 150 px wide but is rendered at
300 px in the article, which is a useless enlargement and results in an ugly
picture.
Or do we have any part of this, e.g. determining the size of a picture?
I think JPG is the main concern; GIFs are perhaps less affected, and SVG is
of course problem-free.
--
Bináris
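A sketch of one part of this: spotting, in an article's wikitext, images whose forced width exceeds the source file's real width. The original widths are assumed to come from elsewhere (e.g. the MediaWiki API, action=query&prop=imageinfo&iiprop=size); here they are just passed in as a dict, and localized namespace aliases (Kép:, Fájl:, ...) would need to be added for huwiki:

```python
import re

def overmagnified(wikitext, original_widths):
    """Return (name, original_px, rendered_px) for each image link that
    forces a width larger than the source file's width."""
    hits = []
    # Matches [[File:Name.jpg|...|300px|...]] style links.
    for match in re.finditer(r'\[\[(?:File|Image):([^|\]]+)[^\]]*?\|(\d+)px',
                             wikitext):
        name, rendered = match.group(1).strip(), int(match.group(2))
        original = original_widths.get(name)
        if original is not None and rendered > original:
            hits.append((name, original, rendered))
    return hits

print(overmagnified('[[File:Small.jpg|thumb|300px|caption]]',
                    {'Small.jpg': 150}))
# → [('Small.jpg', 150, 300)]
```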
I just wanted to put() a simple page on a MediaWiki 1.16
instance, where I have to use screen scraping (use_api=False).
There is something strange however:
There is an API call invoked by _getBlocked:
/w/api.php?action=query&format=json&meta=userinfo&uiprop=blockinfo
Here's my backtrace:
File "pywikipedia/wikipedia.py", line 693, in get
expandtemplates = expandtemplates)
File "pywikipedia/wikipedia.py", line 743, in _getEditPage
return self._getEditPageOld(get_redirect, throttle, sysop, oldid, change_edit_time)
File "pywikipedia/wikipedia.py", line 854, in _getEditPageOld
text = self.site().getUrl(path, sysop = sysop)
File "pywikipedia/wikipedia.py", line 5881, in getUrl
self._getUserDataOld(text, sysop = sysop)
File "pywikipedia/wikipedia.py", line 6016, in _getUserDataOld
blocked = self._getBlock(sysop = sysop)
File "pywikipedia/wikipedia.py", line 5424, in _getBlock
data = query.GetData(params, self)
File "pywikipedia/query.py", line 146, in GetData
jsontext = site.getUrl( path, retry=True, sysop=sysop, data=data)
getUrl(), which is also called for API requests, seems always
to call _getUserDataOld(text) where text is ... API output,
so it tries to do strange things with that and gives warnings
like
Note: this language does not allow global bots.
WARNING: Token not found on wikipedia:pl. You will not be able to edit any page.
which is nonsense since the analyzed text is not HTML - only API output.
If getUrl() is supposed to be a low-level call, why call _getUserDataOld()
there?
http://www.mediawiki.org/wiki/Special:Code/pywikipedia/7461
has introduced this call there.
It's easily reproducible with this:
import wikipedia
import config
config.use_api = False
wikipedia.verbose = True
s = wikipedia.getSite("pl", "wikipedia")
p = wikipedia.Page(s, u"User:Saper")
c = p.get()
c += "<!-- test -->"
p.put(c, u"Testing wiki", botflag=False)
//Saper
Having problems with pageimport.py on a third-party wiki (WikiQueer). Anyone else having issues with that script?
I'm calling it from a script I'm playing around with, but no luck. It doesn't error out, but nothing is imported and it reports that the import failed.
Here's the "test" script I'm working from:
import wikipedia as pywikibot
from pageimport import *

def main():
    wanted_category_title = "Apple"
    enwiki_site = pywikibot.getSite()
    importerbot = Importer(enwiki_site)  # Initializing
    importerbot.Import(wanted_category_title, project='wikipedia', prompt=True)

try:
    main()
finally:
    pywikibot.stopme()
On a related note, the ultimate goal is to import pages for "Wanted Categories" from English Wikipedia into the third-party wiki. Any ideas, tips or existing code to that end would also be appreciated.
Thanks!
-greg aka varnent
-------
Gregory Varnum
Lead, Aequalitas Project
Lead Administrator, WikiQueer
Founding Principal, VarnEnt
@GregVarnum
fb.com/GregVarnum
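For the export half of that workflow, a sketch of the request body Special:Export expects; the parameter names (pages, curonly, templates) are the standard Special:Export form fields. POSTing this body to e.g. https://en.wikipedia.org/wiki/Special:Export yields export XML that Special:Import on the target wiki accepts:

```python
import urllib.parse

def export_request_body(titles):
    """Build the POST body for MediaWiki's Special:Export entry point."""
    return urllib.parse.urlencode({
        'pages': '\n'.join(titles),  # one title per line
        'curonly': '1',              # current revision only
        'templates': '1',            # also export transcluded templates
    })

print(export_request_body(['Apple', 'Banana']))
```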
Hi folks,
is there a piece of code in pywiki that easily determines whether the current
time zone is on summer or winter time? (Last Sunday of March, 3 o'clock --
last Sunday of October, 2 o'clock)
--
Bináris
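Not pywiki-specific, but the standard library already answers this through the C library's timezone database, which encodes the last-Sunday-of-March/October rule for European zones, so nothing needs to be hardcoded; a minimal sketch:

```python
import time

def is_dst_now():
    """True if the local time zone is currently on daylight-saving
    ("summer") time; tm_isdst is 1 during DST, 0 otherwise (-1 if unknown)."""
    return time.localtime().tm_isdst > 0

print(is_dst_now())
```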
User:Avocato
;Preferred git username: avocato
;Preferred shell username: avocato
;Did you previously have SVN access?: Yes
;Email address: avocatomail(a)gmail.com
I already have commit access (to Gerrit and Wikimedia Labs);
now I am requesting pywikipediabot commit access.
--
*User:Avocato--*
Hi,
I worked a lot on a script that created a page under my bot's user page;
finally, after testing, I renamed
it <http://hu.wikipedia.org/w/index.php?title=Wikip%C3%A9dia:Elavult_m%C3%A1svi…> to
the project namespace. At the next run the bot worked for 8 minutes (hard to
wait if you want to work with the result), and finally stopped at the last
moment instead of saving the page.
Sorry, I forgot to copy the screen; the essence is that it claimed my page
to be *locked* (which I took to mean protected). I didn't understand it, and I
spent a lot of time investigating and working out what had happened. The
page was not protected, of course.
The result of my investigation is that when the bot finds its own name in
the last edit comment (which was the case here because of the renaming), it
assumes that the last edit was a revert of its previous edit (why?), and
refuses to save with a fake and misleading "page locked" exception.
The code in wikipedia.py (line 1970) is:
elif self.comment() and username in self.comment():
    raise LockedPage(
        u'Not allowed to edit %s because last edit maybe reverted'
        % self.title(asLink=True))
Of course, this kind of message in the argument of *raise* never appears
on the user's screen; it stays hidden in the source code. For some reason,
Python does not show the message for these exceptions.
This was my first experience with this kind of error, so at first I was
frightened and confused; finally I found out that I should use the
force=True parameter, and after waiting another 8 minutes I could save
the page. I solved the case but lost 15-20 minutes.
Is there really a serious reason to assume that finding the bot's name in
the last edit comment indicates a revert, and to prevent the bot from
saving the result of its work?
Bináris.unhappy.hu <http://xn--binris-rta.unhappy.hu>
Is there someone who has, or is willing to share or make, a bot that will
place a coordinates template (in decimal format rather than DMS) at the top
or bottom of articles that do not have such a template?
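Converting DMS values to the decimal form such a bot would write is the easy part; a minimal sketch (the template name and argument layout on the target wiki are left out, since they vary per project):

```python
def dms_to_decimal(degrees, minutes, seconds, direction):
    """Convert degrees/minutes/seconds plus a compass letter (N/S/E/W)
    to a signed decimal coordinate."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if direction in ('S', 'W') else value

print(dms_to_decimal(47, 29, 52, 'N'))  # Budapest latitude, roughly
print(dms_to_decimal(19, 2, 23, 'E'))
```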