Feature Requests item #1246815, was opened at 2005-07-28 07:38
Message generated for change (Comment added) made by nobody
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603141&aid=1246815&group_…
Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: None
Group: None
Status: Open
Priority: 5
Private: No
Submitted By: Nobody/Anonymous (nobody)
Assigned to: Nobody/Anonymous (nobody)
Summary: category.py partlyremove -cat:anything
Initial Comment:
Hello!
In our wiki, each article carries several categories. If I want to remove
one category but leave the rest untouched, the current category.py won't
help: "category.py remove" with cat=example1 removes the example1 tag from
every article in that category.
Could you please add a function so that, for all articles in a given
category, a different category can be removed? "category.py remove
-cat:example1" would then ask which category should be removed, and that
one would be stripped from all retrieved pages.
Thanks in advance
----------------------------------------------------------------------
Comment By: Nobody/Anonymous (nobody)
Date: 2008-02-21 18:28
Message:
Logged In: NO
There is a need to be able to remove one category tag from all pages that
are in some other category or one of a set of categories. For instance
[[category:people]] or [[category:biography]] on many wikis fills rapidly
with pages that belong in one of the subcategories. There needs to be a way
to automatically remove the parent category from any page that is in the
subcategories, leaving behind only the pages that have no existing
subcategorization and need to be sorted manually.
----------------------------------------------------------------------
Comment By: siebrand (siebrand)
Date: 2007-04-26 12:27
Message:
Logged In: YES
user_id=1107255
Originator: NO
Please let us know if this feature request is still applicable to the
current code. If no response is given, the feature request will be denied
and the issue will be closed one month from now. This message was added in
an effort to reduce the number of open issues on this project. Siebrand
----------------------------------------------------------------------
Comment By: Rob W.W. Hooft (hooft)
Date: 2006-12-26 04:27
Message:
Logged In: YES
user_id=47476
Originator: NO
The obvious solution is to separate the action from the page generator.
Not something I will do right now, but it prompted me to review the general
solution to make a page generator based on command line arguments.
----------------------------------------------------------------------
Comment By: Nobody/Anonymous (nobody)
Date: 2005-07-28 08:01
Message:
Logged In: NO
Or what do you think about this:
category.py remove -fromcat:cat1 -removecat:cat2
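A minimal sketch of the requested behaviour, independent of the framework's real API (the helper name and regex are illustrative): fetch the pages in -fromcat and strip the -removecat tag from their wikitext.

```python
import re

def remove_category(wikitext, category):
    """Strip one [[Category:...]] tag (with or without a sort key) from
    wikitext, leaving all other category tags untouched.  Matching is
    fully case-insensitive, a simplification of MediaWiki's
    first-letter-only rule."""
    pattern = re.compile(
        r"\[\[\s*Category\s*:\s*%s\s*(?:\|[^\]]*)?\]\]\n?" % re.escape(category),
        re.IGNORECASE)
    return pattern.sub("", wikitext)

text = "Intro.\n[[Category:example1]]\n[[Category:example2|Sort]]\n"
# only the example1 tag is removed; example2 survives
print(remove_category(text, "example1"))
```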
----------------------------------------------------------------------
Bugs item #1898707, was opened at 2008-02-21 07:36
Message generated for change (Comment added) made by russblau
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1898707&group_…
Category: General
Group: None
>Status: Closed
>Resolution: Fixed
Priority: 5
Private: No
Submitted By: Dashiva (magnusrk)
Assigned to: Nobody/Anonymous (nobody)
Summary: Infinite loop on missing api.php
Initial Comment:
If api.php isn't at the default /w/api.php, or apipath() is defined but gives the wrong value, the bot keeps requesting the page over and over with no notification to the user.
Eventually Python reaches the maximum recursion depth, throws, and the script stops for a minute before starting again. At each pause it prints a warning saying it couldn't find <url>, but none of the warnings appeared until I stopped the script with Ctrl-C (might be Windows/Py2.4-specific).
I think the best solution would be to distinguish between "Could not load api.php" and "Successfully loaded some page, but it doesn't look like api.php". In the latter case, there isn't much to gain from trying over and over.
(Off topic: How about a scriptpath() for setting path/querypath/apipath all in one? They're usually in the same directory.)
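The distinction suggested above, "could not load api.php" versus "loaded something, but it is not api.php", can be sketched as a pure classification step (the function is illustrative, not part of the framework); only the unreachable case is worth retrying:

```python
def classify_api_response(status, body):
    """Classify a probe of api.php?action=query&format=json.
    Returns 'ok' or 'not-api'; a request that got no response at all
    ('unreachable') must be mapped by the caller from the socket error.
    Only 'unreachable' is worth retrying."""
    if status != 200:
        return "not-api"   # e.g. 404: wrong path, retrying won't help
    # a real api.php answers with JSON when format=json is requested
    return "ok" if body.lstrip()[:1] in (b"{", b"[") else "not-api"

print(classify_api_response(200, b'{"query": {}}'))             # ok
print(classify_api_response(404, b"Not Found"))                 # not-api
print(classify_api_response(200, b"<html>a wiki page</html>"))  # not-api
```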
----------------------------------------------------------------------
>Comment By: Russell Blau (russblau)
Date: 2008-02-21 15:09
Message:
Logged In: YES
user_id=855050
Originator: NO
Fixed in r5066 and r5067.
CAVEAT: Although I fixed the more obvious family file problems, I may not
have caught all of the issues (and anyone using a customized family file
for a private wiki is on their own). If this bug persists on a particular
wiki, make sure that (a) the scriptpath() method returns the same string
that is obtained when entering {{SCRIPTPATH}} on a wiki page; and (b) if
your wiki does not support api.php, then the method apipath() must raise a
NotImplementedError in the family file.
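Under those constraints, a family file could look roughly like this (the class and wiki names are made up; only the method names and the NotImplementedError convention come from the comment above):

```python
class Family:
    """Illustrative base: one scriptpath() feeds the entry-point methods."""
    def scriptpath(self, code):
        # must match what {{SCRIPTPATH}} renders to on the wiki
        return "/w"

    def path(self, code):
        return self.scriptpath(code) + "/index.php"

    def querypath(self, code):
        return self.scriptpath(code) + "/query.php"

    def apipath(self, code):
        return self.scriptpath(code) + "/api.php"

class RootInstallFamily(Family):
    """Hypothetical wiki installed at the web root."""
    def scriptpath(self, code):
        return ""

class NoApiFamily(Family):
    """Hypothetical pre-api.php wiki: signal 'no API' explicitly
    instead of letting the bot loop on a bad URL."""
    def apipath(self, code):
        raise NotImplementedError("this wiki has no api.php")
```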
----------------------------------------------------------------------
Comment By: Russell Blau (russblau)
Date: 2008-02-21 12:01
Message:
Logged In: YES
user_id=855050
Originator: NO
This is not so easy to fix, because it is grafting a bit of api.php onto a
framework that was designed to use index.php. It may mean that bots will
no longer work at all on wikis using older, pre-api.php versions of
MediaWiki.
A possible approach might be to use postForm() instead of getUrl() to
retrieve the api.php data; postForm will return an HTTP response object
that can be checked for 404 and other error codes, whereas getUrl just
returns a string which may or may not be meaningful.
In the meantime, I will implement the scriptpath() suggestion, but that
will require that users hand-fix family files for every wiki that doesn't
use the default 'w/' prefix, and also won't fix the backwards-compatibility
problem mentioned above.
----------------------------------------------------------------------
Dashiva (magnusrk) wrote:
>If api.php isn't at the default /w/api.php, or apipath() is defined
but gives the wrong
>value, the bot keeps requesting the page over and over with no
notification to the user.
I am referring to my posting in vol 8, issue 40, which also describes an
issue with the default path of "/w/api.php".
I always get a 404 Not Found for this path, so I am a little surprised
that this really is the default setting; I had assumed it was a mistake.
However, my interpretation is that you are saying MediaWiki obviously
should respond on this path (I'm on version 1.11.1). I really was not
aware of that.
In any case, I agree with your off-topic comment. I also think there is a
need for a scriptpath() method or similar that would be called from
path(), querypath() and apipath().
Regards
Lee Francis
--
_____
In theory, there is no difference between theory and practice. But, in
practice, there is.
-- Jan L.A. van de Snepscheut
Bugs item #1898827, was opened at 2008-02-21 17:30
Message generated for change (Tracker Item Submitted) made by Item Submitter
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1898827&group_…
Category: interwiki
Group: None
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Šarūnas Šimkus (hugoarg)
Assigned to: Nobody/Anonymous (nobody)
Summary: pms wiki bug
Initial Comment:
There is a bug in interwiki.py. When my bot adds any interwiki link to the Piemonteis (pms) Wikipedia, the bot stops working and shows the following message:
Changing page [[pms:Ligurin]]
Dump pms (wikipedia) saved
Traceback (most recent call last):
  File "C:\Python25\pywikipedia\interwiki.py", line 1645, in <module>
    bot.run()
  File "C:\Python25\pywikipedia\interwiki.py", line 1409, in run
    self.queryStep()
  File "C:\Python25\pywikipedia\interwiki.py", line 1388, in queryStep
    subj.finish(self)
  File "C:\Python25\pywikipedia\interwiki.py", line 976, in finish
    if self.replaceLinks(page, new, bot):
  File "C:\Python25\pywikipedia\interwiki.py", line 1127, in replaceLinks
    status, reason, data = page.put(newtext, comment = wikipedia.translate(page.site().lang, msg)[0] + mods)
  File "C:\Python25\pywikipedia\wikipedia.py", line 1215, in put
    return self._putPage(newtext, comment, watchArticle, minorEdit, newPage, self.site().getToken(sysop = sysop), sysop = sysop)
  File "C:\Python25\pywikipedia\wikipedia.py", line 1307, in _putPage
    if self.site().has_mediawiki_message("spamprotectiontitle")\
  File "C:\Python25\pywikipedia\wikipedia.py", line 4302, in has_mediawiki_message
    v = self.mediawiki_message(key)
  File "C:\Python25\pywikipedia\wikipedia.py", line 4258, in mediawiki_message
    value = tree.textarea.string.strip()
AttributeError: 'NoneType' object has no attribute 'strip'
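The last frame shows an unguarded .strip() on a value that can be None (the fetched page has no textarea content). A defensive rewrite of that step might look like this; the helper and the stand-in classes are illustrative, only the attribute chain comes from the traceback:

```python
def mediawiki_message_text(tree):
    """Return the stripped textarea content, or None when the parsed
    page has no textarea or an empty one, instead of raising
    AttributeError like the unguarded tree.textarea.string.strip()."""
    textarea = getattr(tree, "textarea", None)
    if textarea is None or textarea.string is None:
        return None
    return textarea.string.strip()

# minimal stand-ins for the parsed-HTML tree, for demonstration only
class Box:
    def __init__(self, string):
        self.string = string

class Tree:
    def __init__(self, textarea):
        self.textarea = textarea

print(mediawiki_message_text(Tree(Box("  spamprotectiontitle \n"))))  # stripped text
print(mediawiki_message_text(Tree(None)))                             # None
```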
----------------------------------------------------------------------
Bugs item #1683132, was opened at 2007-03-18 12:40
Message generated for change (Settings changed) made by russblau
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=603138&aid=1683132&group_…
Category: solve_disambiguation
Group: None
>Status: Closed
>Resolution: Works For Me
Priority: 5
Private: No
Submitted By: Robin Green (greenrd)
Assigned to: Nobody/Anonymous (nobody)
Summary: gui.py: Unsupported character
Initial Comment:
I can't edit [[Computer]] using the 'e' option in solve_disambiguation.py:
>>> Computer <<<
Other ||
|-
| rowspan="9" | [[Application]] || [[Office suite]] || [[Word
Option (#, r#, s=skip link, e=edit page, n=next page, u=unlink, q=quit
m=more context, d=show disambiguation page, l=list, a=add new): e
Traceback (most recent call last):
  File "solve_disambiguation.py", line 867, in <module>
    main()
  File "solve_disambiguation.py", line 863, in main
    bot.run()
  File "solve_disambiguation.py", line 781, in run
    if not self.treat(refPage, disambPage):
  File "solve_disambiguation.py", line 580, in treat
    newText = editor.edit(text, jumpIndex = m.start(), highlight = disambPage.title())
  File "/home/greenrd/pywikipedia/editarticle.py", line 78, in edit
    return wikipedia.ui.editText(text, jumpIndex = jumpIndex, highlight = highlight)
  File "./userinterfaces/terminal_interface.py", line 218, in editText
    return editor.edit(text, jumpIndex = jumpIndex, highlight = highlight)
  File "/home/greenrd/pywikipedia/gui.py", line 87, in edit
    self.editbox.insert(END, text)
  File "/usr/lib64/python2.5/lib-tk/Tkinter.py", line 2988, in insert
    self.tk.call((self._w, 'insert', index, chars) + args)
ValueError: unsupported character
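One known cause of "ValueError: unsupported character" in Tkinter is a code point outside the Basic Multilingual Plane, which Tcl cannot represent. A sketch of a pre-insert filter (written for Python 3; the original report is from Python 2.5):

```python
def bmp_only(text, replacement="\N{REPLACEMENT CHARACTER}"):
    """Replace code points above U+FFFF, which Tk's Text.insert()
    rejects, before handing the text to the widget."""
    return "".join(ch if ord(ch) <= 0xFFFF else replacement for ch in text)

print(bmp_only("plain text"))            # unchanged
print(bmp_only("math \U0001D49E here"))  # non-BMP char replaced
```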
----------------------------------------------------------------------
Comment By: Russell Blau (russblau)
Date: 2007-11-21 10:16
Message:
Logged In: YES
user_id=855050
Originator: NO
Works for me. "editarticle.py Computer" on wikipedia:en pops up a Tkinter
edit box.
If your bug occurs on some other site, please provide more details.
----------------------------------------------------------------------
Comment By: Leonardo Gregianin (leogregianin)
Date: 2007-06-20 10:40
Message:
Logged In: YES
user_id=1136737
Originator: NO
I have never had this problem; does it still happen for you?
----------------------------------------------------------------------