There are strange people who make such links (kind of URL-encoded?):
[[Második világháború#Partrasz.C3.A1ll.C3.A1s Szic.C3.ADli.C3.A1ban
.28Huskey hadm.C5.B1velet.29|Huskey hadműveletben]]
So the section title must have been copied from the URL.
Do we have a ready tool to fix these?
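The encoding itself seems easy enough to reverse: these legacy MediaWiki anchors are just percent-encoding with '.' used in place of '%'. A rough sketch of a fixer (the function name is mine, not an existing pywikibot tool):

```python
import re
from urllib.parse import unquote

def decode_wiki_anchor(anchor):
    """Decode a legacy MediaWiki section anchor, which is UTF-8
    percent-encoding with '.' used instead of '%' (e.g. '.C3.A1' -> 'á')."""
    # Turn '.C3'-style escapes back into '%C3', then percent-decode.
    return unquote(re.sub(r'\.([0-9A-F]{2})', r'%\1', anchor))

decode_wiki_anchor('Partrasz.C3.A1ll.C3.A1s Szic.C3.ADli.C3.A1ban')
# 'Partraszállás Szicíliában'
```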
From one of my assignments as a bot operator I have some code which
does template parsing and general text parsing (e.g. Image/File tags).
It does not use regexes and is thus able to correctly parse nested
templates and other such nasty things. I have written these as library
classes, with tests that cover almost all of the code.
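For illustration, the core trick is plain bracket-depth tracking instead of regex; a much simplified sketch of the idea (not the actual library code):

```python
def find_templates(text):
    """Return the outermost {{...}} template spans in wikitext,
    tracking nesting depth so nested templates are handled correctly
    (something a plain regex cannot do)."""
    templates, stack = [], []
    i = 0
    while i < len(text) - 1:
        if text[i:i + 2] == '{{':
            stack.append(i)          # remember where this template opened
            i += 2
        elif text[i:i + 2] == '}}' and stack:
            start = stack.pop()
            if not stack:            # depth back to zero: outermost template
                templates.append(text[start:i + 2])
            i += 2
        else:
            i += 1
    return templates

find_templates('{{Infobox|image={{Nested|x}}|name=Y}}')
# ['{{Infobox|image={{Nested|x}}|name=Y}}']
```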
I would now really like to contribute that code back to the community.
Would you be interested in adding this code to the pywikibot
framework? If yes, can I send the code to someone for code review or
how do you usually operate?
PS: My wiki userpage is http://en.wikipedia.org/wiki/User:Hannes_R%C3%B6st
We are renaming pywikipedia-l, pywikipedia-announce, pywikipedia-bugs to
pywikibot, pywikibot-announce, and pywikibot-bugs respectively. This is
planned to happen during the mailing list maintenance window on
2015-06-02, 17:00 to 19:00 UTC.
The task in Phabricator is T100707
<https://phabricator.wikimedia.org/T100707>. Feel free to comment.
At the Lyon Hackathon, updates to the tarball releases were turned off,
and Pywikibot 2.0 release candidate (RC) 1 was published on PyPI.
The tarball releases are currently locked to the git revision before
the Lyon Hackathon. This was done to prevent large code merges
affecting users of the tarballs and Wikimedia labs shared pywikibot.
There was one large change merged for RC 1, and there are a few more
large changes which will be merged before the final pywikibot 2.0.
I suggest bot operators avoid the unstable master branch: instead of
updating regularly using git, use the labs shared version or the pip
package. We'll announce each new release candidate for people to test.
Anyone wanting only the library, without any of the traditional
scripts, can now use:
$ sudo pip install --pre pywikibot
The --pre flag is needed because the currently published version is a
pre-release, and modern pip doesn't install pre-releases without being
explicitly told to.
Technical documentation, including an up-to-date API reference, is now
being published at
If you are using pywikibot for your own script, you can now package it
as a PyPI package and add a dependency on pywikibot in your setup.py.
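For example, a minimal setup.py might look like this (the project name, module name, and version pin are placeholders, not a recommendation):

```python
# Hypothetical setup.py for a script that depends on the
# pywikibot pre-release from PyPI.
from setuptools import setup

setup(
    name='my-wiki-script',                  # placeholder project name
    version='0.1',
    py_modules=['my_wiki_script'],          # placeholder module
    # Pin to the pre-release explicitly, since pip skips
    # pre-releases unless asked for them.
    install_requires=['pywikibot>=2.0rc1'],
)
```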
The pywikibot package doesn't include i18n data, and it detects when it
is not present.
pywikibot core now uses JSON i18n messages, and these can be included
in your own package. To enable your own JSON i18n messages, place
them in an i18n subdirectory of your package, and your script needs to
pywikibot does not require a user-config.py. It can be skipped by
setting the environment variable PYWIKIBOT2_NO_USER_CONFIG, either
before calling python or inside python before importing pywikibot. e.g.
>>> import os
>>> os.environ['PYWIKIBOT2_NO_USER_CONFIG'] = '1'
>>> import pywikibot
Many thanks to all the developers of 'rewrite' over the years - it is
nearly finished ;-)
I noticed a script, `pageimport.py`, that I think might do what I want
(based on its name and nothing else, granted; there's minimal
documentation @ mediawiki.org), which is:
* Import wiki page(s), specified at the command-line or listed in a text
file, from one wiki to another.
* Also transfer images embedded in the page(s) being imported, making
this part of its function loosely similar to that of `imagetransfer.py`.
* Optionally import all templates used on the page(s).
So I wanted to ask, does it? If not, is there any script that can do
this for me?
Thanks for your time,
There is a task in Phabricator about deprecating compat. Compat is "de
facto" deprecated already; it's more a matter of community engagement
and helping people to migrate. Please feel free to comment.
Hi, this is probably not important for long if we transition to
Phabricator, but I want to ask if developers could rebase only if
necessary (merge conflicts with master), and if you do rebase, do it in
a separate patch set.
The reason is that when you compare two patch sets to determine the
changes since a version you've already reviewed, the diff is cluttered
if there was a rebase in between, because the changes caused by the
rebase are included in it. Gerrit then shows unrelated changes, and
it's harder to determine which changes were introduced by the rebase
and which were intentionally added by the dev in the new patch set.
Thank you in advance ;)
For example, the changes to generate_user_files between PS25 and PS26
are part of neither PS25 nor PS26.
cosmetic_changes.py has been moved from scripts/ into pywikibot/, so
that the library package 'pywikibot' is not dependent on the 'scripts'
package.
The script can still be run using its new location explicitly; e.g.
$ python3 pwb.py pywikibot/cosmetic_changes.py -family:wikipedia
-lang:en -page:Main_Page -simulate
If you have a script that was importing cosmetic_changes, you will
need to update your import statement to import
pywikibot.cosmetic_changes.
We have a change pending to re-add 'scripts/cosmetic_changes.py', so
the breakage is only temporary; however, import statements should be
updated anyway.
Another way to reduce the impact of the move is allowing pwb to find &
execute commands located in the pywikibot/ directory.
A few people who work on some Python MediaWiki libraries started talking
on IRC today about how much duplication there is in the ecosystem,
resulting basically in wasted effort.
We've set up a meeting at the Lyon hackathon so we can talk about what
we're working on, and how we can collaborate together:
Also, are people interested in doing some kind of Pywikibot sprint?
Looking at the attendee list, I see multichill, valhallasw, jayvdb, and
myself. Is anyone else planning to attend (or did I miss anyone :/)?
GitHub has a huge community of developers, and collaborating with them
could benefit both us and them, but Wikimedia's code is in Gerrit (and
in future in Phabricator) and our bug tracker is in Phabricator.
Sometimes it feels like we are on another planet.
Wikimedia has a mirror on GitHub, but we close pull requests
immediately and barely check issues raised there. There is also a big
notice on GitHub saying, in effect, "if you want to help, do it our
way". It occurred to me that if we could synchronize GitHub activity
with Gerrit and Phabricator, it would help us by letting others help in
their own way. I was excited enough that yesterday I wrote a bot which
automatically duplicates the patches from pull requests in Gerrit and
comments on the pull request stating that we made a patch in Gerrit. I
tested it on pywikibot and it worked well.
Note that the bot doesn't create a pull request for every Gerrit patch;
rather, it creates a Gerrit patch for every (open) pull request.
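For the curious, the mechanics per pull request are roughly the following (a sketch under my own naming, not the bot's actual code):

```python
def gerrit_push_commands(pr_number, gerrit_remote="gerrit"):
    """Build the git commands that fetch a GitHub pull request head
    and submit it to Gerrit as a change on master.

    GitHub exposes every PR's head commit at refs/pull/<n>/head;
    pushing to Gerrit's magic refs/for/<branch> creates the change.
    """
    branch = "pr-%d" % pr_number
    return [
        "git fetch origin pull/%d/head:%s" % (pr_number, branch),
        "git checkout %s" % branch,
        "git push %s HEAD:refs/for/master" % gerrit_remote,
    ]

print(gerrit_push_commands(42))
```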
But before I go on, we need to discuss several important aspects of this:
1- Is it really necessary to do this? Do you agree we need something like
this?
2- I think a bot to duplicate pull requests is not the best idea, since
it creates them under the bot account and not under the original user's
account. We could create a plugin for Phabricator to do that, but issues
like privacy would bother us (using OAuth wouldn't be a bad idea). What
do you think? What do you suggest?
3- Even if we create a plugin, a bot to synchronize comments and code
reviews is still needed. I wrote my original code in a way that lets me
expand it to do this job too, but do you agree we need to do this?
4- We can also expand this bot to create a Phabricator task for each
issue created there (except pull requests). Is that okay?
I published my code in .
: https://github.com/wikimedia/pywikibot-core "GitHub mirror of
'pywikibot/core' - our actual code is hosted with Gerrit (please see
https://www.mediawiki.org/wiki/Developer_access for contributing)"