Hi!
I've customised AuthPlugin in order to authenticate users against another user database. It seems to work: users are able to log in and edit articles. However, they are not able to save the user preferences. (I tried to change mine, and by checking user_options I confirmed that it is not a caching issue.)
The code:
http://pastey.net/8021 (the actual authentication)
http://pastey.net/8022 (JuventeAuth.php, the modified AuthPlugin.php)
The extension was installed by putting these two lines in LocalSettings.php:

    require_once('extensions/JuventeAuth.php');
    $wgAuth = new JuventeAuth();
When I disabled the last line, preference saving worked, so it is likely that the error is in my files somewhere. I'm quite a newbie to MediaWiki, and skipped everything that contained the word "domain" when configuring.
Then I had a look through SpecialUserlogin.php to check whether real name and e-mail might be mandatory, since my code doesn't fetch those. That was a shot in the dark, as my profile is the same as before the change, and my username/password is the same in both databases.
Finally, I disabled strict() and asked two users to log in with their original wiki accounts (i.e. not the new ones they got after we added the external authentication), but they weren't able to save their preferences either.
A helpful person in #mediawiki on freenode suggested that it might be a session issue, and that I was responsible for setting the session variables.
Things to note:
- autoCreate() returns true
- my authentication database is not the same as the wiki database
- MW version 1.6.8 (as we're only using PHP 4)
I'm grateful for any comments or suggestions!
On Fri, 09 Mar 2007 18:08:03 +0100, Martin Strand martin.strand@gmail.com wrote:
However, they are not able to save the user preferences.
Thanks to brion and _syphilis_ in #MediaWiki, who pointed out that updateExternalDB() had to return true. A minor mistake that caused a lot of confusion.
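For anyone who runs into the same symptom, here is a minimal sketch of the relevant AuthPlugin overrides, assuming MediaWiki 1.6.x and PHP 4. The class name follows this thread, but the external-lookup helpers (checkExternalUser, checkExternalPassword) are placeholders, not the actual code from the pastey.net links:

    # AuthPlugin is part of MediaWiki core; adjust the path (or drop
    # this line) if your setup already loads it.
    require_once('includes/AuthPlugin.php');

    class JuventeAuth extends AuthPlugin {
        # Placeholder: check whether the user exists in the external database.
        function userExists($username) {
            return $this->checkExternalUser($username);
        }

        # Placeholder: verify the password against the external database.
        function authenticate($username, $password) {
            return $this->checkExternalPassword($username, $password);
        }

        # Let MediaWiki auto-create local accounts for external users.
        function autoCreate() {
            return true;
        }

        # The fix from this thread: if this returns false, MediaWiki treats
        # the external update as failed and the preference changes are
        # never saved.
        function updateExternalDB($user) {
            return true;
        }
    }

With updateExternalDB() returning true (or actually writing the changes back to the external database), preference saving works again.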
Hi,
Regarding the image dumps for the various Wikipedia languages...
Why are all the latest dumps from 2005? Have they stopped for some reason? Are they expected to be available again any time soon?
The thing is, I have imported the pt.wiki XML dump into a fresh MediaWiki install, and now it is missing most of the images. Is there an alternative to showing empty image placeholders? Getting the image dump some other way? Removing the image-inclusion wiki markup from the articles?
Thanks.
Carlos Jorge Andrade wrote:
Hi,
Regarding the image dumps for the various Wikipedia languages...
Why are all the latest dumps from 2005? Have they stopped for some reason? Are they expected to be available again any time soon?
The thing is, I have imported the pt.wiki XML dump into a fresh MediaWiki install, and now it is missing most of the images. Is there an alternative to showing empty image placeholders? Getting the image dump some other way? Removing the image-inclusion wiki markup from the articles?
Thanks.
select il_to from imagelinks; and grab all referenced images? Though probably not very good for server load... :/
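If it helps, a rough sketch of that idea (PHP 4 style, plain mysql_* calls; the connection details and database name are placeholders to adjust for your own install, and this only lists the filenames, it doesn't download anything):

    # List every image filename referenced from the imported articles.
    $db = mysql_connect('localhost', 'wikiuser', 'secret');
    mysql_select_db('wikidb', $db);

    $res = mysql_query('SELECT DISTINCT il_to FROM imagelinks', $db);
    while ($row = mysql_fetch_assoc($res)) {
        # il_to holds the image title in DB-key form (underscores,
        # no "Image:" prefix); print one per line for later use.
        print $row['il_to'] . "\n";
    }
    mysql_free_result($res);
    mysql_close($db);

That still leaves the question of where to get the files themselves.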
On Mar 12, 2007, at 11:34 AM, Platonides wrote:
Carlos Jorge Andrade wrote:
Hi,
Regarding the image dumps for the various Wikipedia languages...
Why are all the latest dumps from 2005? Have they stopped for some reason? Are they expected to be available again any time soon?
The thing is, I have imported the pt.wiki XML dump into a fresh MediaWiki install, and now it is missing most of the images. Is there an alternative to showing empty image placeholders? Getting the image dump some other way? Removing the image-inclusion wiki markup from the articles?
Thanks.
select il_to from imagelinks; and grab all referenced images? Though probably not very good for server load... :/
Thought of that... ;-) but is that allowed? Can I download all the individual files from the upload.wikimedia.org site?
On 12/03/07, Carlos Jorge Andrade carlos.andrade@gmail.com wrote:
Thought of that... ;-) but is that allowed? Can I download all the individual files from the upload.wikimedia.org site?
No, please don't.
I would guess that image dumps stopped being provided due to the file sizes and the bandwidth needed for distribution. There have been relatively recent discussions about providing BitTorrents for this sort of thing, to take advantage of distributed downloading, but no concrete plans have come to fruition (that I can recall or am aware of at this time)... "watch this space" might be good advice.
With a bit of luck, this thread could reopen the issue and perhaps produce some viable proposals and a schedule for implementation.
Rob Church
Rob Church wrote:
I would guess that image dumps stopped being provided due to the file sizes and the bandwidth needed for distribution. There have been relatively recent discussions about providing BitTorrents for this sort of thing, to take advantage of distributed downloading, but no concrete plans have come to fruition (that I can recall or am aware of at this time)... "watch this space" might be good advice.
Probably a system like jigdo (http://atterer.net/jigdo/) would perform better. I doubt peer-to-peer can help much with sharing full tars; if each image is shared individually... it might help a bit.
Carlos Jorge Andrade wrote:
Regarding the image dumps for the various Wikipedia languages...
Why are all the latest dumps from 2005? Have they stopped for some reason?
Insufficient working space to create dumps.
Are they expected to be available again any time soon ?
At some point. New storage servers are partly in place, but this work is still in progress.
-- brion vibber (brion @ pobox.com / brion @ wikimedia.org)
Hello,
What is the alternative if we want the images? I was told on IRC a while back that it was safe to download the files individually, and I do not see what else to do. I do realize that it is not vital for there to even be distributions of Wikimedia projects, and that they just make things easier for everyone, but the distributions really do not work without images.
What should everyone using dumps do in the meantime, before image dumps are available?
Thanks, Kasimir
On 3/12/07, Brion Vibber brion@pobox.com wrote:
Carlos Jorge Andrade wrote:
Regarding the image dumps for the various Wikipedia languages...
Why are all the latest dumps from 2005? Have they stopped for some reason?
Insufficient working space to create dumps.
Are they expected to be available again any time soon ?
At some point. New storage servers are partly in place, but this work is still in progress.
-- brion vibber (brion @ pobox.com / brion @ wikimedia.org)
On Mon, 2007-03-12 at 08:59 -0600, Kasimir Gabert wrote:
What should everyone using dumps do in the meantime, before image dumps are available?
Use the 2005 dump, or don't use images.
Besides, at 400+ GiB for the current enwiki images, why would you want to download this? Do you really have 1/2TB of storage to hold them? And can you wait the 2 months it will take to download them all?
It would be cheaper/faster/easier on bandwidth to just send them a drive and have them copy the files onto it, at some charge for the labor/time to do so.
Hello David,
I was not talking specifically about the English Wikipedia dumps, sorry for the confusion. I was just talking about dumps in general.
Thanks, Kasimir
On 3/12/07, David A. Desrosiers desrod@gnu-designs.com wrote:
On Mon, 2007-03-12 at 08:59 -0600, Kasimir Gabert wrote:
What should everyone using dumps do in the meantime, before image dumps are available?
Use the 2005 dump, or don't use images.
Besides, at 400+ GiB for the current enwiki images, why would you want to download this? Do you really have 1/2TB of storage to hold them? And can you wait the 2 months it will take to download them all?
It would be cheaper/faster/easier on bandwidth to just send them a drive and have them copy the files onto it, at some charge for the labor/time to do so.
-- David A. Desrosiers desrod@gnu-designs.com Skype username: setuid http://gnu-designs.com "Strive to silence the voices that oppose you."