Hi all,
I'd like to suggest granting developer rights to Purodha B Blissenbach
([[de:Benutzer:Purodha]]). He has contributed some useful patches, but
they sometimes just languish because nobody seems to have time to review them.
I have met Purodha in person, and I think he's very careful in testing his
changes before uploading.
One example is his extensive bug report [ 1903113 ], "interwiki.py looses
access to Wikipedias"; see:
https://sourceforge.net/tracker/index.php?func=detail&aid=1903113&group_id=…
He has submitted two patches, 1907586 and 1918278. I think the approach is
worth a try, but the patches already seem to be outdated.
So, Purodha, would you like to have developer access to the SVN repository?
And what do others think about my suggestion?
Cheers
Daniel
Hi,
I recently started using the pywikipedia scripts again after a break.
Now when I try to output page content to the screen using
gen = pagegenerators.CategorizedPageGenerator(cat)
for page in gen:
text = page.get()
print page.title
The output I get (for example, for a page named 1095) is
<bound method Page.site of Page{[[1095]]}>
Can anyone tell me how to get rid of the
<bound method Page.site of Page{[[]]}>
part? I only need the page name.
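The output suggests that a bound method object is being printed rather than the value it returns. A minimal sketch of the difference, using a stand-in Page class (not the real pywikipedia one):

```python
# Stand-in Page class, for illustration only; not the real pywikipedia Page.
class Page:
    def __init__(self, name):
        self._name = name

    def title(self):
        return self._name

page = Page("1095")
print(page.title)    # prints the bound method object itself
print(page.title())  # prints the title string: 1095
```

In other words, adding the parentheses calls the method and yields the page name as a string.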
Thanks in advance
Ruud
--
Met vriendelijke groet,
Ruud Habets
----------------------------------
Ruud Habets
mail ruud(a)kgv.nl
www http://www.kgv.nl
tel. 045-5418899 (p)
tel. 0650-844386 (m)
tel. 045-4006037 (w)
----------------------------------
Please help, I can't do it. :-(
I had a working command:
replace.py -links:User:BinBot/try -fix:datumjav -recursive -allowoverlap
Now I don't want to edit articles that contain the templates {{szinnyei}} or
{{pallas}}, because they are not worth dealing with. I tried to insert
-excepttext:'szinnyei'
-excepttext:szinnyei
-excepttext:{{szinnyei}}
-excepttext:'\{\{szinnyei\}\}'
and so on, but none of them worked; the bot still edits those articles.
I also tried -except instead of -excepttext, because the built-in help of
replace.py mentions that option. I am a bit confused, because
http://meta.wikimedia.org/wiki/Replace.py states that only older versions
used -except, and my replace.py contains the LockedPage exception, so it
should be the newest version according to
http://pywikipediabot.cvs.sourceforge.net/pywikipediabot/pywikipedia/replac…,
yet the built-in help still says -except.
Anyhow, I cannot avoid editing articles with these templates. What would be
the correct syntax?
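One possible culprit (an assumption, since the actual wiki text isn't shown here): MediaWiki template names are case-insensitive in their first letter, so an article may transclude {{Szinnyei}} while a case-sensitive pattern only matches the lowercase form. A small regex sketch of the mismatch:

```python
import re

text = "Some article text with {{Szinnyei}} transcluded."

# A case-sensitive pattern misses the capitalized form...
assert re.search(r"\{\{szinnyei\}\}", text) is None
# ...while a pattern tolerating both initial letters finds it.
assert re.search(r"\{\{[Ss]zinnyei\}\}", text) is not None
```

If that is the cause, an exception pattern like [Ss]zinnyei (or a case-insensitive regex) might catch both spellings.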
Bináris
Bugs item #1952844, was opened at 2008-04-27 13:51
Message generated for change (Comment added) made by russblau
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1952844&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: General
Group: None
>Status: Closed
>Resolution: Duplicate
Priority: 5
Private: No
Submitted By: Ginosal (ginosal)
Assigned to: Nobody/Anonymous (nobody)
Summary: problems when pages are put with put_async
Initial Comment:
Since I've upgraded Ubuntu to 8.04 (and, subsequently, Python to 2.5, I think), I have problems with scripts that use the ''put_async'' method. These scripts crash when I try to perform the second edit in a session. The message I get is the following:
  File "solve_disambiguation.py", line 962, in <module>
    main()
  File "solve_disambiguation.py", line 956, in main
    bot.run()
  File "solve_disambiguation.py", line 869, in run
    if not self.treat(refPage, disambPage):
  File "solve_disambiguation.py", line 753, in treat
    refPage.put_async(text,comment=self.comment)
  File "/media/hda4/pywikipedia/wikipedia.py", line 1113, in put_async
    _putthread.start()
  File "/usr/lib/python2.5/threading.py", line 434, in start
    raise RuntimeError("thread already started")
I've worked around the problem by changing ''put_async'' to ''put'', but the bot then becomes slower and tedious to run. What should I do?
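For what it's worth, the traceback matches the general rule that a threading.Thread object may be started at most once; a minimal reproduction, independent of pywikipedia:

```python
import threading

t = threading.Thread(target=lambda: None)
t.start()
t.join()

try:
    t.start()  # a Thread object cannot be restarted once it has run
except RuntimeError as e:
    print("second start failed:", e)
```

So any code that calls start() twice on the same thread object, as the traceback's _putthread.start() apparently does, will raise this RuntimeError.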
----------------------------------------------------------------------
>Comment By: Russell Blau (russblau)
Date: 2008-04-30 13:23
Message:
Logged In: YES
user_id=855050
Originator: NO
What you should do is upgrade to the latest SVN version of pywikipediabot.
This issue was fixed some time ago.
----------------------------------------------------------------------
Revision: 5287
Author: russblau
Date: 2008-04-30 17:15:31 +0000 (Wed, 30 Apr 2008)
Log Message:
-----------
threading wasn't working; take it out to make debugging easier.
Modified Paths:
--------------
branches/rewrite/pywikibot/site.py
branches/rewrite/pywikibot/tools.py
Added Paths:
-----------
branches/rewrite/pywikibot/scripts/
branches/rewrite/pywikibot/scripts/__init__.py
Added: branches/rewrite/pywikibot/scripts/__init__.py
===================================================================
--- branches/rewrite/pywikibot/scripts/__init__.py (rev 0)
+++ branches/rewrite/pywikibot/scripts/__init__.py 2008-04-30 17:15:31 UTC (rev 5287)
@@ -0,0 +1 @@
+# THIS DIRECTORY IS TO HOLD BOT SCRIPTS FOR THE NEW FRAMEWORK
Modified: branches/rewrite/pywikibot/site.py
===================================================================
--- branches/rewrite/pywikibot/site.py 2008-04-30 08:39:17 UTC (rev 5286)
+++ branches/rewrite/pywikibot/site.py 2008-04-30 17:15:31 UTC (rev 5287)
@@ -578,65 +578,48 @@
api.update_page(target, pagedata)
page._redir = target
- def preloadpages(self, pagelist, size=60, lookahead=0):
+ def preloadpages(self, pagelist, groupsize=60):
"""Return a generator to a list of preloaded pages.
@param pagelist: an iterable that returns Page objects
- @param size: how many Pages to query at a time
- @type size: int
- @param lookahead: if greater than zero, preload pages in a
- separate thread for greater responsiveness; higher values
- result in more aggressive preloading
- @type lookahead: int
+ @param groupsize: how many Pages to query at a time
+ @type groupsize: int
"""
- from pywikibot.tools import itergroup, ThreadedGenerator
- gen = ThreadedGenerator(target=itergroup,
- args=(pagelist, size),
- qsize=lookahead)
- try:
- for sublist in gen:
- pageids = []
- cache = {}
- for p in sublist:
- if pageids is not None:
- if hasattr(p, "_pageid"):
- pageids.append(str(p._pageid))
- else:
- # only use pageids if all pages have them
- pageids = None
- cache[p.title(withSection=False)] = p
- rvgen = api.PropertyGenerator("revisions|info")
- if pageids is not None:
- rvgen.request["pageids"] = "|".join(pageids)
- else:
- rvgen.request["titles"] = "|".join(cache.keys())
- rvgen.request[u"rvprop"] = \
- u"ids|flags|timestamp|user|comment|content"
- for pagedata in rvgen:
- if pagedata['title'] not in cache:
- raise Error(
+ from pywikibot.tools import itergroup
+ for sublist in itergroup(pagelist, groupsize):
+ pageids = [str(p._pageid) for p in sublist if hasattr(p, "_pageid")]
+ cache = dict((p.title(withSection=False), p) for p in sublist)
+ rvgen = api.PropertyGenerator("revisions|info")
+ if len(pageids) == len(sublist):
+ # only use pageids if all pages have them
+ rvgen.request["pageids"] = "|".join(pageids)
+ else:
+ rvgen.request["titles"] = "|".join(cache.keys())
+ rvgen.request[u"rvprop"] = \
+ u"ids|flags|timestamp|user|comment|content"
+ for pagedata in rvgen:
+ if pagedata['title'] not in cache:
+ raise Error(
u"preloadpages: Query returned unexpected title '%s'"
- % pagedata['title']
- )
- page = cache[pagedata['title']]
- api.update_page(page, pagedata)
- if 'revisions' in pagedata: # true if page exists
- for rev in pagedata['revisions']:
- revision = pywikibot.page.Revision(
- revid=rev['revid'],
- timestamp=rev['timestamp'],
- user=rev['user'],
- anon=rev.has_key('anon'),
- comment=rev.get('comment', u''),
- minor=rev.has_key('minor'),
- text=rev.get('*', None)
- )
- page._revisions[revision.revid] = revision
- page._revid = revision.revid
- yield page
- finally:
- gen.stop()
+ % pagedata['title']
+ )
+ page = cache[pagedata['title']]
+ api.update_page(page, pagedata)
+ if 'revisions' in pagedata: # true if page exists
+ for rev in pagedata['revisions']:
+ revision = pywikibot.page.Revision(
+ revid=rev['revid'],
+ timestamp=rev['timestamp'],
+ user=rev['user'],
+ anon=rev.has_key('anon'),
+ comment=rev.get('comment', u''),
+ minor=rev.has_key('minor'),
+ text=rev.get('*', None)
+ )
+ page._revisions[revision.revid] = revision
+ page._revid = revision.revid
+ yield page
# following group of methods map more-or-less directly to API queries
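The new non-threaded preloadpages sends pageids only when every page in the group has one, falling back to titles otherwise. A standalone sketch of that decision, using a minimal stand-in Page class (not the real pywikibot one):

```python
class Page:
    """Minimal stand-in for pywikibot.Page, for illustration only."""
    def __init__(self, title, pageid=None):
        self._title = title
        if pageid is not None:
            self._pageid = pageid

    def title(self):
        return self._title

def request_params(sublist):
    # Mirrors the committed logic: use pageids only if all pages have them.
    pageids = [str(p._pageid) for p in sublist if hasattr(p, "_pageid")]
    if len(pageids) == len(sublist):
        return "pageids", "|".join(pageids)
    return "titles", "|".join(p.title() for p in sublist)

print(request_params([Page("A", 1), Page("B", 2)]))  # ('pageids', '1|2')
print(request_params([Page("A", 1), Page("B")]))     # ('titles', 'A|B')
```

Querying by pageids avoids title-normalization surprises, but only works when every cached page already knows its id.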
Modified: branches/rewrite/pywikibot/tools.py
===================================================================
--- branches/rewrite/pywikibot/tools.py 2008-04-30 08:39:17 UTC (rev 5286)
+++ branches/rewrite/pywikibot/tools.py 2008-04-30 17:15:31 UTC (rev 5287)
@@ -114,14 +114,14 @@
StopIteration
"""
- chunk = []
- for item in iter(iterable):
- chunk.append(item)
- if len(chunk) == size:
- yield chunk
- chunk = []
- if chunk:
- yield chunk
+ group = []
+ for item in iterable:
+ group.append(item)
+ if len(group) == size:
+ yield group
+ group = []
+ if group:
+ yield group
if __name__ == "__main__":
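For reference, the rewritten itergroup in tools.py batches an iterable into fixed-size lists; a standalone copy of the committed body:

```python
def itergroup(iterable, size):
    """Yield successive lists of up to `size` items from iterable."""
    group = []
    for item in iterable:
        group.append(item)
        if len(group) == size:
            yield group
            group = []
    if group:  # final, possibly short, group
        yield group

print(list(itergroup(range(7), 3)))  # [[0, 1, 2], [3, 4, 5], [6]]
```

This is what preloadpages now iterates over directly, in place of the removed ThreadedGenerator wrapper.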