We could let the site do the parsing for us, but that would mean fetching http://en.wikipedia.org/wiki/fr:User:AllyUnion instead of http://en.wikipedia.org/w/index.php?title=fr%3AUser%3AAllyUnion&action=e... (the edit URL doesn't follow the redirect).
Alternatively, we could parse the relevant parts of the MediaWiki source files and pull our information from those...
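To make the URL distinction concrete, here is a minimal sketch of splitting an interwiki-prefixed title and building the percent-encoded index.php edit URL, along the lines of the example above. The prefix set and function names are illustrative assumptions, not pywikipedia's actual API:

```python
import urllib.parse

# Illustrative subset of interwiki prefixes; a real bot would load the
# full prefix table from the wiki rather than hard-code it.
KNOWN_INTERWIKI_PREFIXES = {"fr", "de", "es", "m", "wiktionary"}

def split_interwiki(title):
    """Return (prefix, rest) if title starts with a known interwiki prefix,
    else (None, title)."""
    if ":" in title:
        prefix, rest = title.split(":", 1)
        if prefix.lower() in KNOWN_INTERWIKI_PREFIXES:
            return prefix.lower(), rest
    return None, title

def edit_url(title, host="en.wikipedia.org"):
    """Build the direct index.php edit URL with the title percent-encoded,
    avoiding the redirect that the short /wiki/ form goes through."""
    query = urllib.parse.urlencode({"title": title, "action": "edit"})
    return "http://%s/w/index.php?%s" % (host, query)

print(split_interwiki("fr:User:AllyUnion"))  # ('fr', 'User:AllyUnion')
print(edit_url("fr:User:AllyUnion"))
# http://en.wikipedia.org/w/index.php?title=fr%3AUser%3AAllyUnion&action=edit
```

Note that urlencode encodes the colons as %3A, matching the edit-URL form quoted above.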
---- Jason Y. Lee
On Tue, Nov 29, 2005 at 12:12:23AM -0600, Scot Wilcoxon wrote:
Looks like you're working within the existing regular-expression and string methods. My citations bot has somewhat complex needs, so I've been looking at some more powerful approaches to parsing. At the moment mxTextTools looks useful, with parsing.py being less applicable. Since in my case it will be useful to parse much of the WikiSyntax... has someone already looked at whether MediaWiki's parser can be accessed by the pywikipedia tools?
Jason Y. Lee wrote:
I've been wondering about adding some kind of parser to the Python Wikipedia project so that it knows how to handle transwiki links as well as trans-interwiki links.
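As a rough sketch of what such a parser might start from, here is a minimal regex pass that pulls link targets out of wikitext and flags those carrying a colon prefix. The function name is hypothetical, and the colon test is a crude heuristic (it can't tell an interwiki prefix from a namespace like User: without a real prefix table):

```python
import re

# Match [[target]] and [[target|label]] wiki links; group 1 is the target.
LINK_RE = re.compile(r"\[\[([^\]|]+)(?:\|[^\]]*)?\]\]")

def find_links(wikitext):
    """Yield (target, has_prefix) for every [[...]] link in the text.

    has_prefix is True when the target contains a colon, which may mean
    an interwiki link (fr:...) or just a namespace (User:...); real code
    would check the prefix against the wiki's interwiki table.
    """
    for match in LINK_RE.finditer(wikitext):
        target = match.group(1).strip()
        has_prefix = ":" in target and not target.startswith(":")
        yield target, has_prefix

sample = "See [[fr:User:AllyUnion]] and the local page [[Sandbox|sandbox]]."
print(list(find_links(sample)))
# [('fr:User:AllyUnion', True), ('Sandbox', False)]
```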
Wikibots-l mailing list
Wikibots-l@wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/wikibots-l