Hello! Is there a bot or extension that can generate stubs of template
descriptions from template wikitext? It should be pretty simple: the bot would
just grab all template parameters {{{PARAMETER}}} and list those
parameters.
If no such extension exists, could anybody tell me how to programmatically get
the parameters of a given template?
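Until such a tool exists, a rough sketch of the extraction in Python (it only grabs the names, ignoring any default after a pipe and any nesting):

```python
import re

wikitext = "{{{name}}} was born in {{{birthplace|unknown}}} on {{{birthdate}}}."
# match the name part of {{{param}}} or {{{param|default}}}
params = sorted(set(re.findall(r"\{\{\{([^|{}]+)", wikitext)))
print(params)  # ['birthdate', 'birthplace', 'name']
```

From that list a stub description (one row per parameter) is easy to emit.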
Sincerely yours,
-----
Yury Katkov
Hi folks,
Is there a tool to hunt for overmagnified pictures and fix them? I mean
cases when a JPG is originally 150 px wide but is rendered at 300 px in the
article, which is a useless enlargement and results in an ugly picture.
Or do we have any part of it, e.g. determining the size of a picture?
I think JPG is the main subject; GIFs are perhaps less involved, and SVG is
of course problem-free.
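As a rough sketch (all names here are hypothetical), one could scan the wikitext for |NNNpx image links and compare the requested size against known native widths:

```python
import re

def oversized_links(wikitext, native_widths):
    """Hypothetical helper: yield (filename, requested_px) for image links
    rendered wider than the file's native width."""
    pattern = r"\[\[(?:File|Image):([^|\]]+)\|[^\]]*?(\d+)px"
    for name, px in re.findall(pattern, wikitext):
        native = native_widths.get(name.strip())
        if native is not None and int(px) > native:
            yield name.strip(), int(px)

widths = {"Example.jpg": 150}  # native width, e.g. from the API imageinfo
text = "[[File:Example.jpg|thumb|300px|An enlarged photo]]"
print(list(oversized_links(text, widths)))  # [('Example.jpg', 300)]
```

The native width itself can be asked from the API (imageinfo), so no download of the file is needed.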
--
Bináris
I just wanted to put() a simple page on a MediaWiki 1.16
instance, where I have to use screen scraping (use_api=False).
There is something strange, however:
There is an API call invoked by _getBlocked:
/w/api.php?action=query&format=json&meta=userinfo&uiprop=blockinfo
Here's my backtrace:
  File "pywikipedia/wikipedia.py", line 693, in get
    expandtemplates = expandtemplates)
  File "pywikipedia/wikipedia.py", line 743, in _getEditPage
    return self._getEditPageOld(get_redirect, throttle, sysop, oldid, change_edit_time)
  File "pywikipedia/wikipedia.py", line 854, in _getEditPageOld
    text = self.site().getUrl(path, sysop = sysop)
  File "pywikipedia/wikipedia.py", line 5881, in getUrl
    self._getUserDataOld(text, sysop = sysop)
  File "pywikipedia/wikipedia.py", line 6016, in _getUserDataOld
    blocked = self._getBlock(sysop = sysop)
  File "pywikipedia/wikipedia.py", line 5424, in _getBlock
    data = query.GetData(params, self)
  File "pywikipedia/query.py", line 146, in GetData
    jsontext = site.getUrl( path, retry=True, sysop=sysop, data=data)
getUrl(), which is also called from the API code, seems always
to call _getUserDataOld(text), where text is ... API output,
so it tries to do strange things with that and gives warnings
like
Note: this language does not allow global bots.
WARNING: Token not found on wikipedia:pl. You will not be able to edit any page.
which is nonsense, since the analyzed text is not HTML - only API output.
If getUrl() is supposed to be a low-level call, why call _getUserDataOld()
there?
http://www.mediawiki.org/wiki/Special:Code/pywikipedia/7461
has introduced this call there.
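A simple guard along these lines would avoid scraping user data out of API responses (a hypothetical sketch, not the actual pywikipedia code):

```python
def should_scrape_user_data(path):
    """Hypothetical guard: screen-scrape user data only from HTML pages,
    never from api.php (JSON) responses."""
    return "api.php" not in path

print(should_scrape_user_data("/w/api.php?action=query"))          # False
print(should_scrape_user_data("/w/index.php?title=X&action=edit")) # True
```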
It's easily reproducible with this:
import wikipedia
import config
config.use_api = False
wikipedia.verbose = True
s = wikipedia.getSite("pl", "wikipedia")
p = wikipedia.Page(s, u"User:Saper")
c = p.get()
c += "<!-- test -->"
p.put(c, u"Testing wiki", botflag=False)
//Saper
Having problems with pageimport.py on a third-party wiki (WikiQueer). Anyone else having issues with that script?
I'm calling it from a script I'm playing around with - but no luck. It doesn't error out, but it doesn't import anything and reports that the import failed.
Here's the "test" script I'm working from:
import wikipedia as pywikibot
from pageimport import *

def main():
    wanted_category_title = "Apple"
    enwiki_site = pywikibot.getSite()
    importerbot = Importer(enwiki_site)  # Initializing
    importerbot.Import(wanted_category_title, project='wikipedia', prompt=True)

try:
    main()
finally:
    pywikibot.stopme()
On a related note, the ultimate goal is to import pages for "Wanted Categories" from English Wikipedia into the third-party wiki. Any ideas, tips or existing code to that end would also be appreciated.
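For the "Wanted Categories" part, the MediaWiki API exposes Special:WantedCategories via action=query&list=querypage&qppage=Wantedcategories; a sketch of parsing its JSON output (the sample response below is made up for illustration):

```python
import json

# made-up sample of a querypage API response for qppage=Wantedcategories
sample = '''
{"query": {"querypage": {"name": "Wantedcategories",
  "results": [{"value": "12", "title": "Category:Apple"},
              {"value": "7",  "title": "Category:Banana"}]}}}
'''

def wanted_categories(raw):
    """Extract the wanted-category titles from a querypage JSON response."""
    data = json.loads(raw)
    return [r["title"] for r in data["query"]["querypage"]["results"]]

print(wanted_categories(sample))  # ['Category:Apple', 'Category:Banana']
```

Each title could then be fed to the Importer as in the test script above.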
Thanks!
-greg aka varnent
-------
Gregory Varnum
Lead, Aequalitas Project
Lead Administrator, WikiQueer
Founding Principal, VarnEnt
@GregVarnum
fb.com/GregVarnum
Hi folks,
is there a piece of code in pywiki that easily determines whether the current
time zone is on summer or winter time? (Last Sunday of March, 3 o'clock --
last Sunday of October, 2 o'clock)
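For what it's worth, the standard library can both report whether DST is currently in effect (time.localtime().tm_isdst) and compute the EU switch dates directly; a small sketch:

```python
import calendar
import datetime
import time

def last_sunday(year, month):
    """Date of the last Sunday of a month (the EU switches DST on the
    last Sundays of March and October)."""
    last_day = calendar.monthrange(year, month)[1]
    d = datetime.date(year, month, last_day)
    return d - datetime.timedelta(days=(d.weekday() - 6) % 7)

print(last_sunday(2013, 3))   # 2013-03-31
print(last_sunday(2013, 10))  # 2013-10-27

# whether the machine's local zone is currently on summer time:
print(time.localtime().tm_isdst == 1)
```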
--
Bináris
User:Avocato
;Preferred git username: avocato
;Preferred shell username: avocato
;Did you previously have SVN access?: Yes
;Email address: avocatomail(a)gmail.com
I already have commit access (to Gerrit and Wikimedia Labs);
now I request pywikipediabot commit access.
--
*User:Avocato--*
Hi,
I worked a lot on a script that created a page under my bot's user page;
finally, after tests, I renamed
it <http://hu.wikipedia.org/w/index.php?title=Wikip%C3%A9dia:Elavult_m%C3%A1svi…> to
the project namespace. At the next run the bot worked for 8 minutes (hard to
wait if you want to work with the result), and finally stopped at the last
moment instead of saving the page.
Sorry, I forgot to copy the screen; the essence is that it claimed my page
to be *locked* (which I took to mean protected). I didn't understand, and I
spent a lot of time investigating and understanding what happened. The
page was not protected, of course.
The result of my investigation is that when the bot finds its own name in
the last edit comment (which was the case because of the renaming), it supposes
that the last edit was a revert of its previous edit (why?), and refuses to
save with a fake and misleading "page locked" exception.
The code in wikipedia.py is:
elif self.comment() and username in self.comment():
    raise LockedPage(
        u'Not allowed to edit %s because last edit maybe reverted'
        % self.title(asLink=True))
(line 1970)
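The heuristic boils down to a plain substring test, which is exactly why a page move triggers it: the mover's name lands in the edit summary. A sketch (names made up):

```python
def looks_like_revert(last_comment, username):
    """The heuristic quoted above: treat any edit summary that merely
    mentions the bot's name as a possible revert of the bot's edit."""
    return bool(last_comment) and username in last_comment

# a genuine revert summary matches, as intended...
print(looks_like_revert(u'Reverted edits by BinBot', u'BinBot'))   # True
# ...but so does a page move, a false positive:
print(looks_like_revert(u'BinBot moved page X to Y', u'BinBot'))   # True
print(looks_like_revert(u'typo fix', u'BinBot'))                   # False
```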
Of course, this type of message in the argument of *raise* won't appear
on the user's screen; it stays hidden in the source code. For some reason,
Python does not show the message for these user exceptions.
This was my first experience with this kind of error, so at first I was
frightened and confused; finally I found out that I should use the
force=True parameter, and so after waiting another 8 minutes I could save
the page. I solved the case but lost 15-20 minutes.
Is there really a serious reason to suppose that finding the name of
the bot in the last edit comment marks a revert, and to prevent the bot from
saving the result of its work?
Bináris.unhappy.hu <http://xn--binris-rta.unhappy.hu>
Is there someone who has, or is willing to share/make, a bot that will place
a coordinates template (in decimal format rather than DMS) at the top or
bottom of articles that do not have such a template?
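For the DMS-to-decimal part at least, the conversion itself is simple (a sketch):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to decimal degrees;
    S and W hemispheres are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

print(round(dms_to_decimal(47, 29, 52, "N"), 4))  # 47.4978
print(dms_to_decimal(12, 30, 0, "W"))             # -12.5
```

The harder part is deciding where the coordinates come from and which template to insert, which depends on the target wiki.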
I am preparing an easy way to create a DataPage from a Page object. This makes it possible to integrate Wikidata into interwiki and run it in just the same way as with other pages. More follows asap...
xqt
----- Original message -----
From: Amir Ladsgroup
Sent: 17.01.2013 17:27
To: jan.dudik(a)volny.cz; Pywikipedia discussion list
Subject: Re: [Pywikipedia-l] Wikidata & interwiki behavior
On Thu, Jan 17, 2013 at 11:04 AM, Jan Dudík <jan.dudik(a)gmail.com> wrote:
There might be a possibility to solve the conflict as
is done in the current interwiki.py, preferring the Wikidata version?
It's a very interesting idea.
Could someone tell me how we can do it?
About the other things, I think we can simply make a list of the articles in each case by using the Toolserver.
--
Amir
Maybe we should define default behavior for interwiki bots with live
Wikidata, because right now hu.wiki is ignored, incorrect links are not
repaired, and new articles from other wikis are missing on Wikidata (and
hu.wiki).
1) starting page has interwiki & no conflict found & a Wikidata
backlink exists (meaning exactly one, without conflict; in other cases
go to 3))
2) starting page has interwiki & no conflict found & no Wikidata
backlink exists
3) starting page has interwiki & conflict found
4) starting page has no interwiki & a Wikidata backlink exists
5) starting page has no interwiki & no Wikidata backlink exists
For cases 1) and 4) the bot should update Wikidata and the sites without
Wikidata, and remove interwiki on the Wikidata-enabled sites (actually hu)
For case 2) it should create a Wikidata item and remove/update the other sites
What about case 3)? There might be a possibility to solve the conflict as
is done in the current interwiki.py, preferring the Wikidata version?
And what about case 5)? Should the bot create a Wikidata item or not?
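The five cases could be encoded as a small decision function (a sketch only; here "backlinks" stands for the number of Wikidata items linking back to the starting page, and more than one backlink is treated as a conflict):

```python
def decide(has_interwiki, conflict, backlinks):
    """Map the page state to one of the five cases listed above."""
    if has_interwiki and not conflict:
        if backlinks == 1:
            return 1  # update Wikidata and the non-Wikidata sites
        if backlinks == 0:
            return 2  # create a Wikidata item, remove/update other sites
        conflict = True  # several backlinks: treat as a conflict
    if has_interwiki and conflict:
        return 3  # open question: prefer the Wikidata version?
    if backlinks >= 1:
        return 4  # no interwiki, but a backlink: update from Wikidata
    return 5      # no interwiki, no backlink: create an item or not?

print([decide(True, False, 1), decide(True, False, 0),
       decide(True, True, 1), decide(False, False, 1),
       decide(False, False, 0)])  # [1, 2, 3, 4, 5]
```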
JAnD
2013/1/17 Bináris <wikiposta(a)gmail.com>:
> Here is how to detect presence of Wikidata from API:
> https://www.wikidata.org/w/index.php?title=Wikidata:Project_chat&oldid=4469…
>
> --
> Bináris
> _______________________________________________
> Pywikipedia-l mailing list
> Pywikipedia-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/pywikipedia-l
>
--
Ing. Jan Dudík