On Sun, Aug 23, 2009 at 6:13 PM, <multichill(a)svn.wikimedia.org> wrote:
> Revision: 7169
> Author: multichill
> Date: 2009-08-23 16:13:23 +0000 (Sun, 23 Aug 2009)
>
> Log Message:
> -----------
> First version of flickrripper.
> It works, but I still have to do a lot to get a nice program.
>
> Added Paths:
> -----------
> trunk/pywikipedia/flickrripper.py
>
> +import flickrapi
It seems you forgot to include that one as well - or should it be
downloaded from elsewhere?
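For reference, flickrapi is a separate third-party package, not part of the pywikipedia tree, so it does have to be installed from elsewhere. A guard like the following is one way for the script to fail gracefully when it is missing (a sketch only; the actual flickrripper.py may handle this differently):

```python
# Sketch: flickrapi is a third-party package and must be installed
# separately (e.g. from PyPI); it is not shipped with pywikipedia.
try:
    import flickrapi  # only checking availability here
    HAVE_FLICKRAPI = True
except ImportError:
    HAVE_FLICKRAPI = False
    print("flickrapi not found; install it before running flickrripper.py")
```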
--
André Engels, andreengels(a)gmail.com
Is there a way I can use replace.py to replace an entire page with certain
text? There are pages I need to remove for copyright reasons, and I can list
them in a file (to call with "-file:pagelist.txt").
I tried using * as the search text, with and without quotation marks, but
it didn't work. The command I used was of this form:
$ python replace.py -regex "*" "{{XYZ}}" -file:filelist.txt
Thanks!
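A note on why "*" fails: on its own it is not a valid regular expression, because the quantifier has nothing to repeat. A pattern that matches an entire page is "(?s)\A.*\Z". A minimal sketch in plain Python's re module (whether replace.py's argument parsing accepts this exact pattern is untested, so treat it as a starting point):

```python
import re

page_text = "Old content line one.\nOld content line two.\n"

# "*" alone raises "nothing to repeat"; it is not a whole-text wildcard.
# "(?s)\A.*\Z" matches the whole page:
#   (?s)   lets "." also match newlines
#   \A/\Z  anchor the match to the start and end of the text
new_text = re.sub(r"(?s)\A.*\Z", "{{XYZ}}", page_text)
```

On the command line that would look something like: python replace.py -regex "(?s)\A.*\Z" "{{XYZ}}" -file:filelist.txt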
--
Chris Watkins
Appropedia.org - Sharing knowledge to build rich, sustainable lives.
identi.ca/appropedia / twitter.com/appropedia / blogs.appropedia.org
I like this: five.sentenc.es
On Tue, August 11, 2009 12:53 pm, alexsh(a)svn.wikimedia.org wrote:
> Revision: 7141
> Author: alexsh
> Date: 2009-08-11 10:53:06 +0000 (Tue, 11 Aug 2009)
>
> Log Message:
> -----------
> *Page()._putPage(): change to use API, old move to _putPageOld()
> --All possible error in API mode should be handled
> --need to set config.use_api to enable this, otherwise it will use
> _putPageOld()
Alex,
There is a rewrite branch that has been built from the ground up to use
the API. Please use and improve that branch if you want to work with the
API instead of hacking more API functions into the trunk.
Thanks,
Merlijn
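For anyone trying the flag mentioned in the commit message, a minimal sketch of the opt-in, assuming `use_api` is read from user-config.py like other pywikipedia settings:

```python
# user-config.py (sketch)
use_api = True   # route Page()._putPage() through the API;
                 # when False or absent, the old _putPageOld() path is used
```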
But sometimes it is useful to use both -force and -autonomous, when
many nonexistent links are present...
JAn
2009/8/4, SourceForge.net <noreply(a)sourceforge.net>:
> Bugs item #2822705, was opened at 2009-07-16 21:57
> Message generated for change (Comment added) made by valhallasw
> You can respond by visiting:
> https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2822705&group_…
>
> Please note that this message will contain a full copy of the comment
> thread,
> including the initial issue submission, for this request,
> not just the latest update.
> Category: None
> Group: None
>>Status: Closed
>>Resolution: Fixed
> Priority: 8
> Private: No
> Submitted By: xqt (xqt)
> Assigned to: Nobody/Anonymous (nobody)
> Summary: ar-wiki crossing namespace
>
> Initial Comment:
> Pywikipedia [http] trunk/pywikipedia (r7067, 2009/07/15, 19:25:32)
> Python 2.5.2 (r252:60911, Feb 21 2008, 13:11:45) [MSC v.1310 32 bit (Intel)]
>
> Please enable namespace crossing for ar-wiki. The reason is that on
> ar-wiki all year pages are located in namespace 104, whereas on all
> other projects they are not.
> See http://ar.wikipedia.org/wiki/%D9%85%D9%84%D8%AD%D9%82:1347 for example.
>
> ----------------------------------------------------------------------
>
>>Comment By: Merlijn S. van Deen (valhallasw)
> Date: 2009-08-04 10:21
>
> Message:
> Fixed in r7107. Please note the edit wars are caused by incompetent bot
> owners (in -autonomous, this is never changed, unless -force is activated,
> which is a bad idea anyway)
>
> ----------------------------------------------------------------------
>
> Comment By: xqt (xqt)
> Date: 2009-07-20 15:16
>
> Message:
> I found a lot of bot malfunctions and edit wars. This should be fixed
> asap. Thx.
>
> ----------------------------------------------------------------------
>
> Comment By: xqt (xqt)
> Date: 2009-07-19 19:01
>
> Message:
> Raising the priority due to malfunction of the bots; see
> http://de.wikipedia.org/w/index.php?title=2009&action=history
>
> ----------------------------------------------------------------------
>
> You can respond by visiting:
> https://sourceforge.net/tracker/?func=detail&atid=603138&aid=2822705&group_…
>
> _______________________________________________
> Pywikipedia-bugs mailing list
> Pywikipedia-bugs(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/pywikipedia-bugs
>
--
Sent from a mobile device
--
Ing. Jan Dudík
While running add_text.py on pl.wiki to add the template {{Przypisy}}
(references) I get the following error:
python pywikipedia/add_text.py -transcludes:"Związek chemiczny infobox"
-except:"\{\{[Pp]rzypisy" -text:"{{Przypisy}}"
Getting references to [[Szablon:Związek chemiczny infobox]]
Loading Aceton...
Exception! regex (or word) used with -except is in the page. Skip!
Traceback (most recent call last):
File "pywikipedia/add_text.py", line 311, in <module>
main()
File "pywikipedia/add_text.py", line 307, in main
(text, newtext, always) = add_text(page, addText, summary,
regexSkip, regexSkipUrl, always, up, True)
ValueError: need more than 2 values to unpack
python pywikipedia/version.py
Pywikipedia (r7133 (wikipedia.py), 2009/08/08, 04:25:51)
Python 2.6 (r26:66714, Jun 8 2009, 16:07:26)
[GCC 4.4.0 20090506 (Red Hat 4.4.0-4)]
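The traceback suggests a plain arity mismatch: main() unpacks three values, but the code path taken in add_text() (the one that skips the page) returns only two. A minimal reproduction, with a hypothetical stub standing in for add_text():

```python
def add_text(page, addText):
    # Hypothetical stub: mirrors a code path that returns only
    # (text, newtext) instead of the (text, newtext, always) 3-tuple
    # that main() expects.
    return "old text", "old text\n{{Przypisy}}"

try:
    text, newtext, always = add_text(None, "{{Przypisy}}")
except ValueError:
    # "need more than 2 values to unpack": caller and callee disagree on
    # the number of return values. The real fix is to make every return
    # statement in add_text() yield the same 3-tuple.
    text, newtext = add_text(None, "{{Przypisy}}")
    always = False
```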
masti
Related to this:
On Sun, August 2, 2009 2:05 pm, valhallasw(a)svn.wikimedia.org wrote:
> Fixed incorrect translations (using wrong number of %s-es)
(...)
> 'ca':u'Robot: Afegint [[Categoria:%s]]',
> - 'cs':u'Robot přidal kategorii [[Kategorie:%s|%s]]',
> + 'cs':u'Robot přidal [[Kategorie:%s]]',
> 'da':u'Robot: Tilføjer [[Kategori:%s]]',
I think it would be better to use named substitution
(i.e. u'Robot: Afegint [[Categoria:%(cat)s]]', which makes it possible to
have 'cs':u'Robot přidal kategorii [[Kategorie:%(cat)s|%(cat)s]]'),
but I am not sure how big the effect will be on existing scripts. Would it
be OK to introduce a possibly breaking change, or should some sort of
wrapper be thought of?
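A quick illustration of the difference, using a hypothetical key name 'cat':

```python
# Positional: the number of %s placeholders must match the arguments
# exactly, which is what broke the 'cs' translation above (two
# placeholders, one argument).
positional = u'Robot: Afegint [[Categoria:%s]]' % u'Chemie'

# Named: the same mapping key can be reused any number of times, so one
# language can mention the category twice without constraining the others.
named = u'Robot přidal kategorii [[Kategorie:%(cat)s|%(cat)s]]' % {'cat': u'Chemie'}
```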
-Merlijn
Hello. I have a bot running on Wikipedias and Wiktionaries. Pywikipedia works fine with Wikipedias, but is terrible on Wiktionaries: it makes many mistakes, mainly suggesting wrong interwiki links to add to pages. There is a very efficient bot named Interwicket, which uses other files to run. Is there anything that can be done to improve pywikipedia?
Lucas.
Hi,
I want to list all pages in our wiki that use tables. This seems like it
should be simple, but I'm not sure how to do it. Any ideas?
I know that "{|" (the beginning of a table) works as a search term, as I
tried it with replace.py. However, I don't want to replace anything, and I
don't want to sit there pressing "n" for each result.
Ideally I could capture just the names of the pages, without extended
details (such as a proposed diff given by replace.py).
Any help much appreciated.
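One low-tech approach, in case no ready-made script fits: scan the page texts yourself and print only the titles. The sketch below assumes you can iterate over (title, text) pairs (e.g. from a dump or via the framework's page generators); the helper name is made up for illustration.

```python
import re

# A wikitable starts with "{|" at the beginning of a line.
TABLE_START = re.compile(r"^\{\|", re.MULTILINE)

def titles_with_tables(pages):
    """Yield the title of every page whose wikitext contains a table.

    `pages` is assumed to be an iterable of (title, text) pairs.
    """
    for title, text in pages:
        if TABLE_START.search(text):
            yield title

# Example with inline sample data:
sample = [
    ("Rainwater", "Intro text, no table."),
    ("Solar cooker", 'Intro.\n{| class="wikitable"\n| cell\n|}\n'),
]
for title in titles_with_tables(sample):
    print(title)
```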
Cheers
--
Chris Watkins
Appropedia.org - Sharing knowledge to build rich, sustainable lives.
identi.ca/appropedia / twitter.com/appropedia / blogs.appropedia.org
I like this: five.sentenc.es