At the moment it looks like using 'site._load(...)' after 'site.getUrl'
helps to solve the issue, e.g.:

    if self.site.loggedInAs() is None:
        self.site._load(force=True)

but I feel a little uncomfortable having the bot log out and then
re-login again - it would be nice if it could STAY logged in... *hope*
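The workaround above could be wrapped into a small helper. This is a minimal sketch assuming the compat-era Site methods mentioned in this thread (getUrl, loggedInAs, _load); the helper name get_foreign and the wrapping itself are illustrative, not part of pywikibot:

```python
def get_foreign(site, url):
    """Fetch a foreign-wiki URL, then restore the session if needed."""
    data = site.getUrl(url, no_hostname=True)
    # The foreign wiki's response may have clobbered our session
    # cookies, so force a session reload (re-login) if we now
    # appear to be logged out.
    if site.loggedInAs() is None:
        site._load(force=True)
    return data
```

This keeps the re-login logic in one place instead of scattering the loggedInAs() check after every foreign fetch.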
Thanks in advance for any ideas or suggestions!

Greetings
On 28.01.2012 00:17, Merlijn van Deen wrote:
> On 27 January 2012 17:31, Dr. Trigon <dr.trigon(a)surfeu.ch> wrote:
>> I have a quite confusing situation happening to my bot when trying
>> to access any URL that points to a foreign (but MediaWiki software)
>> wiki, like this:
>>
>>     pywikibot.getSite().getUrl(foreign_wiki_url, no_hostname=True)
>>
>> But while doing this the bot seems to log out, since afterwards it
>> is not able to edit any page anymore; the "traceback" from the logs
>> is:
> Random guess: the bot sends the old site's cookies to the foreign
> wiki, gets new cookies back, and writes those to the user-data file.
> Then in the next request it tries to use those cookies, which fails.
>
> Check your cookie data file in user-data to confirm.
>
> In any case: why are you trying to use a function that is clearly
> not made for this purpose, instead of using, say, urlopen directly,
> or creating a family file?
>
> Merlijn
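Merlijn's urlopen suggestion could look roughly like this (my wiring, stated in modern Python 3 terms, not tested against a real wiki): fetch the foreign URL through the standard library with a throwaway CookieJar, so the foreign wiki's cookies never end up in pywikibot's user-data cookie file.

```python
import urllib.request
from http.cookiejar import CookieJar

def fetch_foreign(url):
    # A fresh CookieJar per call: any Set-Cookie the foreign wiki
    # sends stays in this jar and is discarded when we return, so
    # the bot's own session cookies are never touched.
    jar = CookieJar()
    opener = urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(jar))
    with opener.open(url) as response:
        return response.read()
```

Because the opener is built per call with its own jar, there is no shared cookie state to corrupt between the bot's home wiki and the foreign one.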
_______________________________________________
Pywikipedia-l mailing list
Pywikipedia-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/pywikipedia-l