Hi!
My old problem is that replace.py can't write the pages to work on into a
file on my disk. For years I have used a modified version that makes no
changes but writes the titles of the affected pages to a subpage on Wikipedia
in automated mode, and then I can make the replacements from that page much
more quickly than directly from a dump or the live Wikipedia. This is slow
and generates plenty of dummy edits.
In other words, replace.py has a tool to get the titles from a file (-file)
or from a wikipage (-links), but has no tool to generate this file.
Now I am ready to rewrite it. This way we can start it, and the bot will find
all the possible articles to work on and save the titles without editing
Wikipedia (and without artificial delay); meanwhile we can have lunch,
run a marathon, or sleep. Then we make the replacements from this file with -file.
My idea is that replace.py should have two new parameters:
-save writes the results into a new file instead of editing the articles. It
overwrites an existing file without notice.
-saveappend writes into a file or appends to the existing one.
OR:
-save writes and appends (primary mode)
-savenew writes and overwrites
The help is here:
http://docs.python.org/howto/unicode.html#reading-and-writing-unicode-data
So we have to import codecs.
My script is:
articles = codecs.open('cikkek.txt', 'a', encoding='utf-8')
...
tutuzuzu = u'# %s\n' % page.aslink()  <-- needs rewrite to the new syntax
articles.write(unicode(tutuzuzu))  <-- needs further testing, whether unicode()
is really needed
articles.flush()
It works fine, except that '\n' is a Unix-style newline that has to be
converted by lfcr.py in order to be readable with notepad.exe.
This version uses a constant filename; it should be extended to take the name
from the command line.
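For what it's worth, the lfcr.py pass could be avoided by letting the file layer translate newlines itself. A minimal sketch of the idea (save_titles and its signature are hypothetical helpers for illustration, not existing replace.py code):

```python
import io

def save_titles(titles, filename, append=True):
    # io.open's default newline handling translates '\n' to os.linesep on
    # write, so the file is '\r\n'-separated on Windows and opens cleanly
    # in notepad.exe -- no lfcr.py conversion step needed.
    mode = 'a' if append else 'w'
    with io.open(filename, mode, encoding='utf-8') as articles:
        for title in titles:
            articles.write(u'# %s\n' % title)

# overwrite mode, as a -save (or -savenew) switch might do
save_titles([u'Budapest', u'Debrecen'], 'cikkek.txt', append=False)
```

A -saveappend switch would then simply map to mode 'a' instead of 'w'.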
Your opinions before I begin?
--
Bináris
I want to read a special page with Page.get(). The message is:
  File "C:\Program Files\Pywikipedia\wikipedia.py", line 601, in get
    raise NoPage('%s is in the Special namespace!' % self.aslink())
pywikibot.exceptions.NoPage
What is the solution?
--
Bináris
Hi everyone,
We're cleaning out the commit access request queue, and came upon a
request for commit access to pywikipediabot. Two questions:
1. How should we generally vet these requests?
2. The access request is from this person:
http://www.mediawiki.org/wiki/User:Ebraminio
Any objections to granting access for this person?
I'm "robla" on IRC, and I'm currently camped out in #pywikipediabot if
you'd like to ping me to discuss further.
Thanks
Rob
The help of replace.py says that it will work on all pages found with a
search. As far as I can see, this is not true: in two cases it listed
exactly 100 titles to work on, while there were more than 100. (I also used
-namespace:0 on the command line.)
--
Bináris
Hi xqt,
First of all, thanks for all the effort you're putting into
pywikipediabot. However,
On 26 December 2010 17:43, <xqt(a)svn.wikimedia.org> wrote:
> Log Message:
> -----------
> Some Russian and Ukrainian translations; minor changes from rewrite
Could you please split such commits into one 'translation' commit
(or even better, two: one for Russian and one for Ukrainian) and a
separate rewrite-to-trunk port?
Additionally, this:
> - ['Yes', 'No', 'Add an alternative', 'Give up'],
> - ['y', 'n', 'a', 'g'])
> + ['Yes', 'No', 'Add an alternative', 'Give up'],
> + ['y', 'n', 'a', 'g'])
seems to be a purely whitespace change. Could you check the diffs before
committing to avoid including such changes?
Thanks!
Merlijn
Hello all,
This is especially relevant for all interwiki bots on the toolserver.
Do *not* use python 2.7 for those bots.
There is a bug [1] in the unicode normalization that causes page
titles to become mangled [2]. This, in turn, results in bot wars [3].
Interwiki bots on Wikipedia should therefore use a Python version that
does not have this bug, which means a version before 2.6.5.
Although you will get a warning message when using a Python version
that exhibits the bug, the bot will still work, so you may very
well cause bot wars if you start using py2.7.
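As a quick sanity check for a given interpreter, one can verify that an already NFC-normalized title survives normalization unchanged (a generic stdlib sketch for illustration, not part of the bot framework; MediaWiki stores page titles NFC-normalized):

```python
# -*- coding: utf-8 -*-
import unicodedata

def nfc_stable(title):
    # A correct unicodedata must leave an already-NFC title unchanged;
    # an interpreter failing this check would mangle interwiki titles.
    return unicodedata.normalize('NFC', title) == title

decomposed = u'Gru\u0308n'       # 'u' + combining diaeresis: not NFC
assert not nfc_stable(decomposed)
assert nfc_stable(u'Gr\u00fcn')  # the precomposed form is NFC-stable
```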
Best regards,
Merlijn van Deen
[1] http://bugs.python.org/issue10254
[2] http://sourceforge.net/tracker/?func=detail&atid=603138&aid=3081100&group_i…
[3] http://de.wikipedia.org/w/index.php?title=GNU-Lizenz_f%C3%BCr_freie_Dokumen…
On 22 November 2010 11:22, River Tarnell <river.tarnell(a)wikimedia.de> wrote:
>
> Hi,
>
> During the general maintenance on Dec 6th, we will change the default Python
> version (/usr/bin/python) on the Solaris user servers from 2.6 to 2.7. You may
> wish to test your tools with /usr/bin/python2.7 before then.
>
> - river.
>
> _______________________________________________
> Toolserver-l mailing list (Toolserver-l(a)lists.wikimedia.org)
> https://lists.wikimedia.org/mailman/listinfo/toolserver-l
> Posting guidelines for this list: https://wiki.toolserver.org/view/Mailing_list_etiquette
Same as Bináris: only 3 minor patches submitted. One of them was based on an old version (which doesn't matter in itself), but it would also have introduced a wrong sorting order into the list it modified. On the other hand: do we have any suggestions or rules for proposing the commit flag?
xqt
----- Original Nachricht ----
Von: Bináris <wikiposta(a)gmail.com>
An: Pywikipedia discussion list <pywikipedia-l(a)lists.wikimedia.org>
Datum: 31.12.2010 11:13
Betreff: Re: [Pywikipedia-l] Commit access requests
> 2010/12/31 Rob Lanphier <robla(a)wikimedia.org>
>
> >
> > 2. The access request is from this person:
> > http://www.mediawiki.org/wiki/User:Ebraminio
> > Any objections to granting access for this person?
> >
> > 3 patches on SF (http://sourceforge.net/users/ebraminio/) and never seen
> on this mailing list.
>
> --
> Bináris
>
>
> --------------------------------
>
> _______________________________________________
> Pywikipedia-l mailing list
> Pywikipedia-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/pywikipedia-l
>
There are no LT-specific methods yet, nor any other LiquidThreads-related things, except the namespaces 90-93 defined for those sites and languages that have implemented it. The first part would be easy to implement by asking for this namespace, like:
# site method
def has_LiquidThreads(self):
    try:
        self.namespace(90)
        return True
    except KeyError:
        return False
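A self-contained sketch of the same check, with a stub standing in for the real Site class (that namespace() raises KeyError for an undefined namespace ID is an assumption about the compat API):

```python
class StubSite(object):
    """Minimal stand-in for a pywikibot Site, for illustration only."""
    def __init__(self, namespaces):
        self._namespaces = namespaces  # {id: name}

    def namespace(self, num):
        return self._namespaces[num]   # raises KeyError if undefined

    def has_LiquidThreads(self):
        try:
            self.namespace(90)  # 90 = Thread namespace on LQT-enabled wikis
            return True
        except KeyError:
            return False

lqt_wiki = StubSite({0: '', 90: 'Thread', 91: 'Thread talk'})
plain_wiki = StubSite({0: ''})
assert lqt_wiki.has_LiquidThreads()
assert not plain_wiki.has_LiquidThreads()
```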
xqt
----- Original Nachricht ----
Von: Bináris <wikiposta(a)gmail.com>
An: Pywikipedia discussion list <pywikipedia-l(a)lists.wikimedia.org>
Datum: 31.12.2010 11:21
Betreff: [Pywikipedia-l] LiquidThreads
> Hi folks,
>
> on huwiki LiquidThreads is switched on on a particular page of project
> namespace, and some people have switched it on on their own talk pages
> without any encouragement to do so. In the future this tool will spread.
> (For those not familiar with LT: this is the proposed future of talk
> pages.)
>
> Does pywikibot have any tools for LT? I mean
> *deciding whether LT is implemented in a specific wiki or not
> *deciding if a user/category/template/article/community page (not in "talk"
> namespace!) has conventional talk/page form or LT
> *handling LT pages
>
> --
> Bináris
>
>
Hi folks,
on huwiki LiquidThreads is switched on on a particular page of project
namespace, and some people have switched it on on their own talk pages
without any encouragement to do so. In the future this tool will spread.
(For those not familiar with LT: this is the proposed future of talk pages.)
Does pywikibot have any tools for LT? I mean:
* deciding whether LT is implemented on a specific wiki or not
* deciding whether a user/category/template/article/community page (not in
the "talk" namespace!) has the conventional talk-page form or LT
* handling LT pages
--
Bináris
First of all, sorry if you already got my mail earlier and responded to it.
I didn't receive any reply because my account was unconfirmed.
I have just confirmed my email and got a welcome message,
so I am sorry to ask twice.
Here is my problem:
I'm glad to meet you all!
I am looking for help to set up a bot for my own MediaWiki; I have Python 2.6 installed.
Here is my user-config.py
http://pastebin.com/AtFUHPeq
In families folder I created a file ncdb_family.py, content:
http://pastebin.com/0KW4EVBY
I edited config.py
http://pastebin.com/Zz3hrbHX
I edited family.py:
http://pastebin.com/0pEwKR9h
And finally I edited the wikipedia.py
http://pastebin.com/gEPEqNZs
I tried to follow the tutorial, but I still run into some trouble when I use
login.py or replace.py:
D:\Development\python>python.exe D:\development\pymw\login.py
Traceback (most recent call last):
  File "D:\development\pymw\login.py", line 58, in <module>
    import re, os, query
  File "d:\development\pymw\query.py", line 28, in <module>
    import wikipedia, time
  File "d:\development\pymw\wikipedia.py", line 4618, in <module>
    class Site(object):
  File "d:\development\pymw\wikipedia.py", line 4789, in Site
    def __init__(self, code, fam=ncdb, user=root, persistent_http = False ):
NameError: name 'ncdb' is not defined

D:\Development\python>python.exe D:\development\pymw\replace.py
Traceback (most recent call last):
  File "D:\development\pymw\replace.py", line 147, in <module>
    import wikipedia as pywikibot
  File "d:\development\pymw\wikipedia.py", line 4618, in <module>
    class Site(object):
  File "d:\development\pymw\wikipedia.py", line 4789, in Site
    def __init__(self, code, fam=ncdb, user=root, persistent_http = False ):
NameError: name 'ncdb' is not defined
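For context, the NameError can be reproduced in isolation: the edited default arguments are bare names, where string literals were presumably intended. A minimal sketch (the Site class here is a stub for illustration, not the real wikipedia.py one):

```python
# Default values in a def line are evaluated at definition time, so
#   def __init__(self, code, fam=ncdb, user=root, ...):
# raises NameError: name 'ncdb' is not defined.
# String literals work:
class Site(object):
    def __init__(self, code, fam='ncdb', user='root', persistent_http=False):
        self.code = code
        self.family = fam
        self.user = user

site = Site('en')
assert site.family == 'ncdb' and site.user == 'root'
```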
Could you please help me set up my Python bot? :)
Thanks