I'm sending this back to the list:
2011/3/7 Andre Engels andreengels@gmail.com
I don't know about that, but I think you can work the other way around, using a bit of regular expression magic:
import re
...
existing = [wikipedia.Page(wikipedia.getSite(), pname).title()
            for pname in re.findall(r"title=(.*?)&action=edit", fullsourcetext)]
def exists(page): return page.title() in existing
This works fine! I didn't know that encoded titles could be passed to Page(). There are already some regexes in my code. Thank you!
A slight correction:
This lists red titles. :-)
not_existing = [wikipedia.Page(wikipedia.getSite(), pname).title()
                for pname in re.findall(r"title=(.*?)&action=edit", fullsourcetext)]
def exists(page): return page.title() not in not_existing
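To see the corrected snippet in one piece, here is a minimal, self-contained sketch, assuming the old pywikipedia "compat" framework (import wikipedia) is installed and configured; the toy fullsourcetext fragment and the final check are illustrative additions, since the real script fills fullsourcetext from the actual list page:

import re
import wikipedia  # old pywikipedia "compat" framework

# Toy stand-in for whatever the real script reads into fullsourcetext;
# the point is only that red links carry "title=...&action=edit".
fullsourcetext = ('<a href="/w/index.php?title=Nonexistent_page&action=edit" '
                  'class="new">Nonexistent page</a>')

# Titles that occur as red links, i.e. pages that do not exist yet.
# Page() accepts the encoded form; title() returns it normalized.
not_existing = [wikipedia.Page(wikipedia.getSite(), pname).title()
                for pname in re.findall(r"title=(.*?)&action=edit",
                                        fullsourcetext)]

def exists(page):
    return page.title() not in not_existing

print exists(wikipedia.Page(wikipedia.getSite(), "Nonexistent page"))  # -> False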
Bináris wikiposta@gmail.com wrote:
A slight correction:
This lists red titles. :-)
What are you trying to do? Wouldn't using an SQL query be much simpler?
An example of this (my equivalent of the Special:WantedPages) is in /home/saper/sql/wantedpages/query.sh
//Marcin
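I have not seen /home/saper/sql/wantedpages/query.sh, so what follows is only a rough sketch of the kind of Toolserver replica query being hinted at: the host and database names and the credentials file are placeholders, and the statement is the generic "wanted pages" pattern against the standard pagelinks/page tables, not Marcin's actual script.

import os
import MySQLdb  # MySQL-python, as commonly used on the Toolserver

# Placeholder connection details for a Toolserver replica of huwiki.
conn = MySQLdb.connect(
    host="huwiki-p.rrdb.toolserver.org",               # placeholder host
    db="huwiki_p",                                     # placeholder database
    read_default_file=os.path.expanduser("~/.my.cnf"))
cur = conn.cursor()

# Generic "wanted pages" pattern: titles that are linked to but have no
# page row (red links), ordered by how often they are linked.
cur.execute("""
    SELECT pl_namespace, pl_title, COUNT(*) AS wanted
    FROM pagelinks
    LEFT JOIN page
        ON page_namespace = pl_namespace AND page_title = pl_title
    WHERE page_id IS NULL
    GROUP BY pl_namespace, pl_title
    ORDER BY wanted DESC
    LIMIT 50
""")
for namespace, title, wanted in cur.fetchall():
    print namespace, title, wanted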
2011/3/7 Marcin Cieslak saper@saper.info
What are you trying to do? Wouldn't using an SQL query be much simpler?
Perhaps it would be, if
* I were familiar with the whole process, so I didn't have to learn it,
* I had all the necessary tools installed,
* we got dumps regularly.
But even in that case I would have to download a lot of dumps from different wikis that I don't otherwise need. So no. My solution in pywiki is ready; this was just a refinement.
I make lists like http://hu.wikipedia.org/wiki/Wikip%C3%A9dia:K%C3%A9rt_cikkek/es, http://hu.wikipedia.org/wiki/Wikip%C3%A9dia:K%C3%A9rt_cikkek/en, http://hu.wikipedia.org/wiki/Wikip%C3%A9dia:K%C3%A9rt_cikkek/ru etc. These are articles of foreign Wikipedias from category:Hungary which have no interwiki to huwiki. Either they must be supplied with an iw, or they are good ideas for new articles in huwiki. Besides a lot of interwikis, there are already 6 new articles that were born thanks to these lists. Now I am writing one more thing (merging new titles into an existing list), and afterwards I would like to see my first script in the framework. :-)