First, thanks a lot for your reply!!
> Random guess: the bot sends the old site's cookies to the foreign wiki, gets new cookies back and writes those to the user-data file. Then in the next request it tries to use those cookies, which fails.
> Check your cookie data file in user-data to confirm.
What do you mean by 'user-data'? I looked at 'login-data', since there are some files stored there... I am not sure whether they change, but as mentioned in my last mail the bot is still able to log in under some circumstances. Also 'python login.py -test' claims to be logged in... What would be the best thing to do in your opinion? Wipe out all those files, re-login once, and then store a copy of the files to compare against later?
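If that is the way to go, I would script the comparison roughly like this (just an untested sketch; 'login-data' is the directory I have, the snapshot directory name is made up):

    import filecmp
    import os
    import shutil

    SNAP_DIR = 'login-data.snapshot'   # made-up name for the saved copy

    def snapshot(src='login-data'):
        # keep a copy of the login/cookie files right after a fresh login
        if os.path.exists(SNAP_DIR):
            shutil.rmtree(SNAP_DIR)
        shutil.copytree(src, SNAP_DIR)

    def changed_files(src='login-data'):
        # report which files differ from, were added to, or removed since the snapshot
        cmp = filecmp.dircmp(src, SNAP_DIR)
        return cmp.diff_files + cmp.left_only + cmp.right_only

Then I could call snapshot() right after 'python login.py', run the bot against the foreign url, and print changed_files() to see whether the cookie files really get overwritten.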
> In any case: why are you trying to use a function that is clearly not made for this purpose, instead of using, say, urlopen directly, or creating a family file?
You are right, that is true. But the function works very well except on the rare occasions mentioned here. The main reason why I use this function is that it does re-loading (retry) attempts AND it applies the correct unicode decoding to the html page contents. Neither is done by urlopen, as far as I know...(?) Also, creating a family file is not what I want (sorry ;) since I would like to handle this url like any arbitrary url from the web and not as a wiki.
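Just to illustrate what I mean: with plain urlopen I would have to re-implement roughly this myself (untested sketch, the helper name and retry values are made up):

    import time
    import urllib2

    def fetch(url, retries=3, wait=5, fallback_charset='utf-8'):
        # roughly what getUrl() gives me for free: retry on failure and
        # decode the raw bytes to unicode using the charset from the headers
        for attempt in range(retries):
            try:
                response = urllib2.urlopen(url)
                data = response.read()
                charset = response.info().getparam('charset') or fallback_charset
                return data.decode(charset)
            except urllib2.URLError:
                if attempt == retries - 1:
                    raise
                time.sleep(wait)

That is what getUrl already handles for me, which is why I would prefer to keep using it.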
As far as I can see, the point where things go wrong is at the very end of 'getUrl':

    # If a wiki page, get user data
    self._getUserDataOld(text, sysop = sysop)
Everything else seems to be fine.
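One idea I had as a quick experiment: guard that call so it is skipped for foreign urls, roughly like this (untested, and I am only guessing that the requested url is available inside getUrl under some name; I call it 'path' below):

    # untested idea inside getUrl: only harvest user data for my own wiki
    # and skip it for foreign urls like the one causing trouble here
    # ('path' is a placeholder for whatever variable holds the requested url)
    if self.hostname() in path:
        # If a wiki page, get user data
        self._getUserDataOld(text, sysop = sysop)

Would that make sense, or does it break something else?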
Greetings
DrTrigon