Hello! Is there a bot or extension that can generate stubs of template
descriptions from a template's wikitext? It should be pretty simple: the bot
would just grab all template parameters {{{PARAMETER}}} and list those
parameters.
If no such extension exists, could anybody tell me how to programmatically get
the parameters of a given template?
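In the meantime, here is a minimal sketch of the extraction step, assuming plain
template wikitext as input (parameter references look like {{{name}}} or
{{{name|default}}}); the function name is just an illustration:

```python
import re

def template_params(wikitext):
    """Collect parameter names referenced as {{{name}}} or {{{name|default}}}."""
    # Capture everything after '{{{' up to a '|' (default value) or '}' (end)
    return sorted(set(re.findall(r"\{\{\{([^{}|]+)", wikitext)))

template_params(u"{{{1}}} foo {{{name|}}} {{{name|x}}}")  # ['1', 'name']
```

This ignores parameters generated dynamically (e.g. via {{#if:}} constructs),
so it only covers the simple case described above.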
Sincerely yours,
-----
Yury Katkov
I just wanted to put() a simple page on a MediaWiki 1.16
instance, where I have to use screen scraping (use_api=False).
There is something strange however:
There is an API call invoked by _getBlock:
/w/api.php?action=query&format=json&meta=userinfo&uiprop=blockinfo
Here's my backtrace:
File "pywikipedia/wikipedia.py", line 693, in get
expandtemplates = expandtemplates)
File "pywikipedia/wikipedia.py", line 743, in _getEditPage
return self._getEditPageOld(get_redirect, throttle, sysop, oldid, change_edit_time)
File "pywikipedia/wikipedia.py", line 854, in _getEditPageOld
text = self.site().getUrl(path, sysop = sysop)
File "pywikipedia/wikipedia.py", line 5881, in getUrl
self._getUserDataOld(text, sysop = sysop)
File "pywikipedia/wikipedia.py", line 6016, in _getUserDataOld
blocked = self._getBlock(sysop = sysop)
File "pywikipedia/wikipedia.py", line 5424, in _getBlock
data = query.GetData(params, self)
File "pywikipedia/query.py", line 146, in GetData
jsontext = site.getUrl( path, retry=True, sysop=sysop, data=data)
getUrl(), which is also called by the API code, seems to always call
_getUserDataOld(text), where text is ... API output, so it tries to do
strange things with it and gives warnings like
Note: this language does not allow global bots.
WARNING: Token not found on wikipedia:pl. You will not be able to edit any page.
which is nonsense since the analyzed text is not HTML - only API output.
If getUrl() is supposed to be a low-level call, why call _getUserDataOld()
there?
http://www.mediawiki.org/wiki/Special:Code/pywikipedia/7461
has introduced this call there.
It's easily reproducible with this:
import wikipedia
import config
config.use_api = False
wikipedia.verbose = True
s = wikipedia.getSite("pl", "wikipedia")
p = wikipedia.Page(s, u"User:Saper")
c = p.get()
c += "<!-- test -->"
p.put(c, u"Testing wiki", botflag=False)
//Saper
Having problems with pageimport.py on a third-party wiki (WikiQueer). Is anyone else having issues with that script?
I'm calling it from a script I'm playing around with, but no luck. It doesn't error out, but it doesn't import anything and reports that the import failed.
Here's the "test" script I'm working from:
import wikipedia as pywikibot
from pageimport import *

def main():
    wanted_category_title = "Apple"
    enwiki_site = pywikibot.getSite()
    importerbot = Importer(enwiki_site)  # Initializing
    importerbot.Import(wanted_category_title, project='wikipedia', prompt=True)

try:
    main()
finally:
    pywikibot.stopme()
On a related note, the ultimate goal is to import pages for "Wanted Categories" from English Wikipedia into the third-party wiki. Any ideas, tips or existing code to that end would also be appreciated.
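For the "wanted categories" half, one possible starting point (a sketch, not
tested against WikiQueer; the endpoint URL, limit, and function names are my
assumptions): Special:WantedCategories is exposed through the MediaWiki API as
list=querypage, so the titles can be fetched with any HTTP client and then fed
to the importer. Python 3 stdlib is used here for brevity.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://en.wikipedia.org/w/api.php"  # source wiki (assumption)

def wanted_category_titles(data):
    # Pull the page titles out of a list=querypage API response
    return [row["title"] for row in data["query"]["querypage"]["results"]]

def fetch_wanted_categories(limit=10):
    # Special:WantedCategories is exposed via list=querypage
    params = urlencode({
        "action": "query", "format": "json",
        "list": "querypage", "qppage": "Wantedcategories",
        "qplimit": limit,
    })
    return wanted_category_titles(json.load(urlopen(API + "?" + params)))
```

Each returned title could then be passed to importerbot.Import() in a loop.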
Thanks!
-greg aka varnent
-------
Gregory Varnum
Lead, Aequalitas Project
Lead Administrator, WikiQueer
Founding Principal, VarnEnt
@GregVarnum
fb.com/GregVarnum
Hi folks,
is there a piece of code in pywiki that easily determines whether the current
time zone is on summer or winter time? (Last Sunday of March, 3 o'clock --
last Sunday of October, 2 o'clock)
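If nothing exists in pywiki itself, the Python standard library already exposes
this for the machine's configured time zone; a minimal sketch:

```python
import time

def is_summer_time():
    # tm_isdst is 1 while daylight saving (summer) time is in effect,
    # 0 during winter time, and -1 when the platform cannot tell
    return time.localtime().tm_isdst == 1
```

Note this answers the question for the host's local zone; the March/October
rule above is the EU rule, which the C library applies itself when the tz
database is installed, so there is no need to hard-code the transition dates.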
--
Bináris
Hi,
I worked a lot on a script that created a page under my bot's user page;
finally, after tests, I renamed
it<http://hu.wikipedia.org/w/index.php?title=Wikip%C3%A9dia:Elavult_m%C3%A1svi…>to
the project namespace. At the next run the bot worked for 8 minutes (hard to
wait if you want to work with the result), and finally stopped at the last
moment instead of saving the page.
Sorry, I forgot to copy the screen; the essence is that it claimed my page
to be *locked* (which I took to mean protected). I didn't understand it, and I
spent a lot of time investigating and understanding what happened. The
page was not protected, of course.
The result of my investigation is that when the bot finds its own name in
the last edit comment (which was the case here because of the renaming), it
assumes that the last edit was a revert of its previous edit (why?), and refuses
to save with a fake and misleading "page locked" exception.
The code in wikipedia.py is:
elif self.comment() and username in self.comment():
    raise LockedPage(
        u'Not allowed to edit %s because last edit maybe reverted'
        % self.title(asLink=True))
(line 1970)
Of course, this type of message in the argument of *raise* doesn't appear
on the user's screen; it is hidden in the source code. For some reason,
Python does not show the message in user exceptions.
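For what it's worth, a message passed to raise does normally reach the caller
through str() of the exception, so the hiding is more likely in how the
framework catches and reports it. A stand-alone check (the LockedPage class
here is a stand-in, not pywikipedia's actual class):

```python
class LockedPage(Exception):
    # Stand-in for pywikipedia's LockedPage (assumption about its shape)
    pass

try:
    raise LockedPage(u'Not allowed to edit [[Foo]] because last edit maybe reverted')
except LockedPage as e:
    message = str(e)  # carries the full text passed to raise
```

An uncaught raise would also print this text in the traceback, so a bare
"page locked" on screen suggests the framework swallows the exception text.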
This was my first experience with this kind of error, so at first I was
frightened and confused; finally I found out that I should use a
force=True parameter, and so, after waiting another 8 minutes, I could save
the page. I solved the case but lost 15-20 minutes.
Is there really a serious reason to assume that finding the bot's name in
the last edit comment marks a revert, and to prevent the bot from
saving the result of its work?
Bináris.unhappy.hu <http://xn--binris-rta.unhappy.hu>
Dear all
I posted to this mailing list in January with a library that I
wanted to contribute to the codebase. This is part of an effort on my
side to refactor code that accumulated over various bot-operator tasks
and make it available to the community. The main part of the code
deals with spellchecking using hunspell
(http://hunspell.sourceforge.net/) instead of the list-based approach
currently used in spellcheck.py. The second part is an interactive
robot to do revision review (Sichten, i.e. flagged revisions) in the German
Wikipedia. There are some API functions that use the "undo" functions of the
action=edit command, and an API function that uses the action=review
command.
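For contrast, the list-based approach amounts to membership tests against a
known-words set, which is what a dictionary engine like hunspell replaces with
real morphology and suggestions. A toy sketch (the word list here is made up):

```python
# Toy known-words list; spellcheck.py ships a much larger one
KNOWN_WORDS = {"the", "wiki", "bot", "page", "edit"}

def misspelled(words):
    # Flag anything not in the list; no stemming, no suggestions
    return [w for w in words if w.lower() not in KNOWN_WORDS]

misspelled(["The", "bot", "edidet", "page"])  # ['edidet']
```

The list-based check cannot recognize inflected forms it has never seen,
which is exactly the gap a hunspell backend closes.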
So I wanted to ask whether somebody has had time to look at the
code I submitted here
http://sourceforge.net/tracker/?func=detail&aid=3479070&group_id=93107&atid…
(I uploaded a new file "(moved testSamples)"; please use this one to test,
the other one seems corrupt and cannot be deleted any more either).
Also: is there a code-review process that I can go through, or what do you
suggest is the best way to get the code into trunk (if at all)? Would
it be easier if I talked directly to one of you?
What are the criteria for getting SVN commit access? I was just wondering
what the general rules are.
Greetings
Hannes
Hello all,
As you might know, the Wikimedia Foundation (WMF) has moved most
MediaWiki (MW)-related repositories from SVN version control to git +
gerrit. As a consequence, the WMF also wants to stop running their SVN
server - which is the server we are using.
Now the question is: where do we want to move to, and what version
control system (vcs) do we want to use? Do we find that the WMF
gerrit-based system is user-friendly and easy enough? Do we care about
having svn-based access?
I think there are a few options we can consider:
1) go with the gerrit flow: convert the repository to git and host the
repositories with the WMF. This has the advantage of having the
repository in a practical place (with all the other MW related
repositories).
2) move to github: convert to git, and host the repository at github.
This has the advantage of the user-friendliness of github, but also
gives us SVN access. We can always easily move to WMF-based hosting
once we feel it is user-friendly enough: the github repository will
then just mirror the WMF-hosted repository.
3) move to another SVN host. This is easier (we don't need to convert
any repository), but it also means that it will be hard to move to
WMF-based hosting when we want. In addition, we don't get the nice
things git gives us: easy branching and easy patch submission ('pull
requests').
Personally, I am in favor of option (2): gerrit is clearly useful for
managing a project of the size of MW, but I think it is probably
overkill for something the size of pywikipedia. Github has an (imo)
much clearer interface than gerrit, and has tons of information for
new git users. Last, but not least, github has svn support, which
makes it even easier to switch, for both contributors and users.
I welcome your opinions!
Merlijn
Hi folks,
I was the guy who caused this trouble 4 weeks ago in r10769, together with implementing reading of the last edit comment (which, I found, doesn't work in some circumstances anyway).
I prefer not to announce changes via pywikipedia-l(a)lists.wikimedia.org, since pywikipedia-svn(a)lists.wikimedia.org already does that, and I cannot find any reason to do it twice. Perhaps this change was significant enough that I should have discussed it beforehand; but there are other changes that alter the behaviour or structure much more. And we have a VCS to revert my stuff.
Anyway, the intention of that edit was to enable bots to respect reverts and prevent edit wars with human users like [1]. This was a very old request (from 2009) on my en talk page [2], and it could easily be implemented via the Page.comment() method. It was also proposed by another bot owner who had problems with reverted bot edits until his bot was able to respect reverts. OK, this does not prevent edit wars between different bots, and it can also stop a bot's work on a false positive. In most cases it goes wrong for log pages, and those edits must be forced, as in r10789 and r10790.
I cannot implement such behaviour via the last user, because it is the last user who reverts the previous bot edit, and that is only noticed in the edit comment. But we could match the MediaWiki message for "Undid revision $rev by $user" more closely if that behaviour is useful, and in addition introduce a global option like -forceedit to override this restriction. (I prefer to use the config variable and make it optional, like some variables in the rewrite branch, r10676-r10678.)
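That comment-based check could be sketched like this; the wording matched is
the English default of MediaWiki's undo summary quoted above, and the exact
text is wiki- and language-dependent (an assumption for anything but enwiki):

```python
def is_undo_of(comment, username):
    # True when the last edit summary looks like MediaWiki's default
    # undo summary and names the given user; far narrower than the
    # current "username appears anywhere in the comment" test
    c = comment or u""
    return c.startswith(u"Undid revision") and username in c

is_undo_of(u"Undid revision 123 by ExampleBot", u"ExampleBot")  # True
is_undo_of(u"moved page; thanks ExampleBot", u"ExampleBot")     # False
```

A robust version would compare against the wiki's localized "undo-summary"
message rather than a hard-coded English string.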
The exception from this edit block isn't silent unless you wrap Page.put() in try/except; otherwise the raised error displays the given hint. Anyway, we could introduce another exception class for different upper-level exception handling.
Finally, Bináris, I am really sorry if I wasted your time with that stuff. Please do not hesitate to contact me if something is strange with the bot. Maybe in 36% of cases it's caused by me [3] ;)
And last but not least,
greetings and a Happy New Year to all of you
xqt
[1] http://en.wikipedia.org/w/index.php?title=Yan_Sun&action=history
[2] http://en.wikipedia.org/wiki/User_talk:Xqt#Xqbot_is_edit_warring
[3] http://commons.wikimedia.org/wiki/File:%C3%9Cbertragung_nach_Datum.PNG
----- Original Message ----
From: Bináris <wikiposta(a)gmail.com>
To: Pywikipedia discussion list <pywikipedia-l(a)lists.wikimedia.org>
Date: 30.12.2012 23:31
Subject: [Pywikipedia-l] Thank you for the bad joke :-(
Finally the bug is solved: you can get and set items on Wikidata via PWB.
First you must define a wikidataPage; these are the methods you can
use. I'll try to expand it and add more methods, but I don't know which ones
are needed, and your help will be very useful.
It supports the same interface as Page, with the following added methods:
setitem : set item(s) on a page
getentity : get item(s) of a page
These examples worked for me:
site = wikipedia.getSite('wikidata', fam='wikidata')
page = wikipedia.wikidataPage(site, "Helium")
text = page.getentity()
page.setitem(summary=u"BOT: TESTING FOO",
             items={'type': u'item', 'label': 'fa', 'value': 'OK'})
page.setitem(summary=u"BOT: TESTING GOO",
             items={'type': u'description', 'language': 'en', 'value': 'OK'})
page.setitem(summary=u"BOT: TESTING BOO",
             items={'type': u'sitelink', 'site': 'de', 'title': 'BAR'})
Cheers!
--
Amir
Hi everyone
I added Wikidata edits to PWB. For now it's basic and you can only change,
add, or remove labels, but I plan to improve it.
You can add the wikidata family to your user-config.py and run similar code:
import wikipedia

site = wikipedia.getSite('wikidata', fam='wikidata')
page = wikipedia.Page(site, "Q321")
page.put(u"", u"Bot: testing", wikidata=True,
         labelwikidata=u"no", valuewikidata=u"Test FOO")
I ran that and it
worked<http://wikidata-test-repo.wikimedia.de/w/index.php?title=Q321&curid=9133&di…>,
but if there are any bugs, feel free to tell me.
Cheers
--
Amir