I want to read a special page with Page.get(). The message is:
File "C:\Program Files\Pywikipedia\wikipedia.py", line 601, in get
    raise NoPage('%s is in the Special namespace!' % self.aslink())
pywikibot.exceptions.NoPage
What is the solution?
I have read wikipedia.py, and I found that class Site has some methods for reading special pages, but not for all of them! I need *Special:WantedCategories*, which is not listed there. Please help!
Well, I wrote it for myself during Star Wars III. Not easily, but I learned something. A nice conversation with myself. :-) I don't understand why only some special pages are in wikipedia.py while others are not.
I'm a very rough user of APIs... so when I can't find what I'm searching for, I simply go back to the old system: *I read HTML*... and so I did when I needed a list of WantedCategories.
I guess that this horrible confession will elicit some reaction, and the solution you are searching for.
;-)
Alex
As far as I understand, the existing special-page reader methods do the same, i.e. they read HTML. They belong to the Site object rather than to the Page. But there is a terrible regex in them which I had to copy and modify. I copied from the reader of another special page, but I could not build the urlpath properly, so it is now a constant that points to huwiki. I will upload it for those who are not looking for elegance in scripts. :-) It may be useful for others: it looks for red (nonexistent) categories in user subpages and puts them in 'nowiki', while sending a message to the owner. This is useful because Special:WantedCategories is flooded with these, and people don't feel like correcting the others if they have to choose from so many redlinks.
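As a rough illustration of the HTML-scraping approach described above (a sketch only: the regex and the sample markup below are illustrative assumptions, not the actual wikipedia.py code):

```python
import re

# Toy version of scraping Special:WantedCategories from rendered HTML.
# Red links carry action=edit in their href; this pattern is an assumption
# for illustration, not the regex actually used in wikipedia.py.
WANTED_RE = re.compile(
    r'<a href="/w/index\.php\?title=([^"&]+)&(?:amp;)?action=edit[^"]*"'
)

def parse_wanted_categories(html):
    """Extract red-linked category titles from special-page HTML."""
    return [m.group(1) for m in WANTED_RE.finditer(html)]

sample = (
    '<li><a href="/w/index.php?title=Kateg%C3%B3ria:Teszt&amp;action=edit'
    '&amp;redlink=1" class="new">Kategória:Teszt</a> (3 members)</li>'
)
print(parse_wanted_categories(sample))
```

The fragility complained about above is visible here: any change to the skin's link markup breaks the pattern, which is why an API interface is preferable.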
Pywikipedia-l mailing list Pywikipedia-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/pywikipedia-l
A site generator for Special:WantedCategories is implemented in r8504.
Greetings
xqt
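For readers without r8504, the same list is also reachable directly through the MediaWiki API's generic querypage module, which exposes most special pages. A minimal request-builder sketch (the huwiki endpoint and the limit are illustrative choices):

```python
from urllib.parse import urlencode

def wantedcategories_request(api_url='https://hu.wikipedia.org/w/api.php',
                             limit=50):
    # Special:WantedCategories is served by the querypage list module,
    # the same mechanism behind most Special: page listings.
    params = {
        'action': 'query',
        'list': 'querypage',
        'qppage': 'Wantedcategories',
        'qplimit': limit,
        'format': 'json',
    }
    return api_url + '?' + urlencode(params)

print(wantedcategories_request(limit=10))
```

Fetching that URL returns JSON instead of HTML, so no skin-dependent regex is needed.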
----- Original message ----- From: Bináris wikiposta@gmail.com To: Pywikipedia discussion list pywikipedia-l@lists.wikimedia.org Date: 06.09.2010 11:27 Subject: Re: [Pywikipedia-l] How to read special pages?
2010/9/8 info@gno.de
A site generator for Special:WantedCategories is implemented in r8504.
Thank you again! My script is here:
http://hu.wikipedia.org/wiki/Szerkeszt%C5%91:BinBot/piroskat-userallap.py This uses my own version, but I will update it to r8504.
This script goes through WantedCategories, and whenever a wanted category comes from a user's subpage (possibly because the user copied an article from enwiki and began to translate it, or just wrote a nonexistent category into an article being prepared), the script puts the red category between <nowiki> tags and notifies the owner on his/her talk page. I have a community decision to run this bot; this is important. This process makes WantedCategories cleaner and more readable, and encourages the editors to clean it up with more energy. :-)
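The wrapping step can be sketched roughly like this (a simplified stand-in for the actual BinBot script; the regex and the sample text are illustrative assumptions):

```python
import re

def disable_category(text, cat_title):
    """Wrap every link to the given (red) category in <nowiki> tags,
    so it no longer populates Special:WantedCategories."""
    # 'Kategória' is the Hungarian category namespace name; a real
    # script would take the namespace aliases from the site object.
    pattern = re.compile(
        r'\[\[\s*(?:Kategória|Category)\s*:\s*%s\s*(\|[^\]]*)?\]\]'
        % re.escape(cat_title)
    )
    return pattern.sub(lambda m: '<nowiki>%s</nowiki>' % m.group(0), text)

page_text = 'Draft text [[Kategória:Nemlétező]] more text'
print(disable_category(page_text, 'Nemlétező'))
```

Keeping the link inside <nowiki> (rather than deleting it) lets the owner simply remove the tags once the category exists.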
The results may be seen at http://hu.wikipedia.org/wiki/Speci%C3%A1lis:Szerkeszt%C5%91_k%C3%B6zrem%C5%B1k%C3%B6d%C3%A9sei/BinBott, see edits on 8 September.
If anyone is interested in it, let me know, and I will help to translate.
Looks interesting for others. But please note the Python style guide:
Python coders from non-English speaking countries: please write your comments in English, unless you are 120% sure that the code will never be read by people who don't speak your language.
http://www.python.org/dev/peps/pep-0008/
Otherwise this couldn't become part of the framework ;)
Greetings xqt
----- Original message ----- From: Bináris wikiposta@gmail.com To: Pywikipedia discussion list pywikipedia-l@lists.wikimedia.org Date: 08.09.2010 20:40 Subject: Re: [Pywikipedia-l] How to read special pages?
I didn't think it had a chance to become part of the framework. But if you say it has, I will rewrite it, of course.
2010/9/11 info@gno.de
Looks interesting for others. But please note the Python style guide:
Otherwise this couldn't become part of the framework ;)
This thread originates from September 2010. Since nobody has written a library to handle special pages, and as we say in Hungary, *"Yourself, mister, if you have no servants"*, I began to write apispec.py, which tries to give interfaces to special pages through the API whenever possible. Special:Blocklist is in a good state, and I will use it to send a mail to the admins' list when blocks (and, in the next phase, protections) expire. I have just sent my commit access request, and if I get it, I will commit the first parts of the library.
Thank you! :)
A sample output from the list of blocks expiring within 24 hours:
Data for block #37919
Blocked user: Teszteszter (#147405)
Admin: Bináris (#9541)
Beginning in UTC: 2012-01-31 15:17:07
Expiry in UTC: 2012-02-01 01:17:07
Flags: nocreate, noemail, allowusertalk
Reason: Műszaki teszt, botfejlesztés
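A sketch of how such a report can be derived from the API's list=blocks data (the response fragment below is made up for illustration; apispec.py itself is not shown here):

```python
from datetime import datetime, timedelta

# Made-up fragment shaped like an action=query&list=blocks API response.
sample_blocks = [
    {'id': 37919, 'user': 'Teszteszter', 'by': 'Bináris',
     'timestamp': '2012-01-31T15:17:07Z', 'expiry': '2012-02-01T01:17:07Z'},
    {'id': 37920, 'user': 'Example', 'by': 'Admin',
     'timestamp': '2012-01-01T00:00:00Z', 'expiry': 'infinity'},
]

def expiring_soon(blocks, now, hours=24):
    """Return blocks whose expiry falls within the next `hours` hours."""
    limit = now + timedelta(hours=hours)
    result = []
    for b in blocks:
        if b['expiry'] == 'infinity':
            continue  # indefinite blocks never expire
        expiry = datetime.strptime(b['expiry'], '%Y-%m-%dT%H:%M:%SZ')
        if now <= expiry <= limit:
            result.append(b)
    return result

now = datetime(2012, 1, 31, 16, 0, 0)
for b in expiring_soon(sample_blocks, now):
    print('Block #%d on %s expires %s' % (b['id'], b['user'], b['expiry']))
```

The 'infinity' expiry value is a real special case in list=blocks output that any such report has to skip.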
2012/1/31 Nickanc Wikipedia nickanc.wiki@gmail.com
Thank you! :)
This capability was just added to the rewrite branch earlier this week.
----- Original Message ----- From: "Nickanc Wikipedia" nickanc.wiki@gmail.com To: "Pywikipedia discussion list" pywikipedia-l@lists.wikimedia.org Sent: Tuesday, January 31, 2012 8:43 AM Subject: Re: [Pywikipedia-l] How to read special pages?
Thank you! :)
2012/2/2 Russell Blau russblau@imapmail.org
This capability was just added to the rewrite branch earlier this week.
Sorry if I was late. :-) But there are some bot owners who use the trunk version. Please check out apispec (at this time it handles only Special:Blocklist, but it will be continued).
2012/2/2 Russell Blau russblau@imapmail.org
This capability was just added to the rewrite branch earlier this week.
Russell, which script do you mean? I don't use rewrite, but I tried to find it; please help.
Bináris wrote:
2012/2/2 Russell Blau russblau@imapmail.org
This capability was just added to the rewrite branch earlier this week.
Russell, which script do you mean? I don't use rewrite, but I tried to find it; please help.
Various methods were added to pywikibot/site.py to implement the methods already in trunk that depend on Special: pages; for example, Site().longpages().
Russ
2012/2/2 Russell Blau russblau@imapmail.org
various methods were added to pywikibot/site.py to implement the methods already in trunk that depend on Special: pages; for example, Site().longpages()
I reviewed these methods in wikipedia.py; some of them use the API, while many of them parse the HTML source.
On 2 February 2012 22:20, Bináris wikiposta@gmail.com wrote:
various methods were added to pywikibot/site.py to implement the methods
already in trunk that depend on Special: pages; for example, Site().longpages()
I reviewed these methods in wikipedia.py and a part of them uses API, while lot of them analyzes HTML source.
Russell is talking about the rewrite branch, at http://svn.wikimedia.org/viewvc/pywikipedia/branches/rewrite/pywikibot/ .
Merlijn
2012/2/2 Merlijn van Deen valhallasw@arctus.nl
I reviewed these methods in wikipedia.py and a part of them uses API,
while lot of them analyzes HTML source.
Russell is talking about the rewrite branch, at http://svn.wikimedia.org/viewvc/pywikipedia/branches/rewrite/pywikibot/ .
Yes, I realized that, but he said these were taken from trunk. At the moment I am only interested in trunk, and I am trying to work out which special pages still need an API interface and which are already done.
Hello Binaris! (Hello all!)
Just to be up to date: are you planning to include API things like:

* action=parse&text=
* action=expandtemplates&text=

at all? If not, I have some code that could be included into wikipedia.site if there is any interest.
Greetings DrTrigon
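For reference, the two API actions mentioned above take roughly these parameters; a minimal request-builder sketch (the endpoint URL is an illustrative assumption, and DrTrigon's actual code is not shown here):

```python
from urllib.parse import urlencode

def expandtemplates_request(text,
                            api_url='https://en.wikipedia.org/w/api.php'):
    # action=expandtemplates replaces template calls in `text`
    # with their expansion, without rendering to HTML.
    params = {'action': 'expandtemplates', 'text': text, 'format': 'json'}
    return api_url, urlencode(params)

def parse_request(text, api_url='https://en.wikipedia.org/w/api.php'):
    # action=parse renders `text` as wikitext and returns HTML.
    params = {'action': 'parse', 'text': text, 'format': 'json'}
    return api_url, urlencode(params)

print(expandtemplates_request('{{CURRENTYEAR}}')[1])
```

Both are POSTed in practice, since page-sized wikitext does not fit in a GET query string.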
2012/2/17 Dr. Trigon dr.trigon@surfeu.ch
Just to be up-to-date; Are you planning to include API things like:
- action=parse&text=
- action=expandtemplates&text=
I think the script should one day contain every special page that is not covered by other scripts, together with a list of related functions in other scripts. But these are not prioritized in my diary.
at all? If not, I have some code that could be included into wikipedia.site if there is any interest.
Feel free to do it. :-) If you have code ready, don't wait for others to write it again.
OK, done in r9902. Maybe you want to move them to your apispec.py or some other place; this should be done right now (because it will be very bad style later, when other code already uses it... ;)
Another "question" I have:
You are defining and using a bunch of date/time-related functions in apispec.py. As far as I know, some are already defined in the framework. And I myself have defined some additional ones, to introduce functions I was missing... Maybe it would be a good idea to compare all those date/time-related functions (with the ones already defined in trunk) and merge ours where needed...
What do you think?
Greetings DrTrigon
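Such a merge would mostly concern converting between the MediaWiki API's ISO-8601 timestamps and Python datetimes; the kind of helper involved, sketched (these names are illustrative, not the ones in apispec.py or trunk):

```python
from datetime import datetime

# Timestamp format used by the MediaWiki API, e.g. in list=blocks output.
MW_TS = '%Y-%m-%dT%H:%M:%SZ'

def from_mw_timestamp(ts):
    """Parse an API timestamp such as '2012-02-01T01:17:07Z'."""
    return datetime.strptime(ts, MW_TS)

def to_mw_timestamp(dt):
    """Format a datetime back into the API's timestamp form."""
    return dt.strftime(MW_TS)

print(from_mw_timestamp('2012-02-01T01:17:07Z'))
```

Having one shared pair of helpers avoids each script re-implementing the format string (and getting the 'Z' suffix subtly wrong).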