I've attached an email that I sent to the group in April describing how I
hacked this into User.php to allow transparent authentication. Hopefully
that will help.
Cheers,
Al.
-----Original Message-----
From: Chris Blake [mailto:cblake@pembroke.sa.edu.au]
Sent: Sunday, 4 September 2005 5:44 p.m.
To: mediawiki-l(a)Wikimedia.org
Subject: [Mediawiki-l] RE: MediaWiki-l Digest, Vol 24, Issue 11
login id as parameter ?
Here's an apparently simple question whose solution is evading me.
We have MediaWiki running on Ubuntu on a Windows intranet (in a school).
Everyone who gains access to the wiki has already logged on. I can easily
grab their login name and pass it as a parameter in the web link to the
wiki.
So I want a user to launch the wiki with something like
http://localintranetserver/wiki?mylogin=fredbloggs, which would mean that
all editing that Fred Bloggs does bears his name. Sounds easy... it would
solve lots of issues... but I can't quite figure it out.
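Spelling the idea out, a hypothetical sketch of the receiving end --
unsafe as-is, since anyone could forge the parameter (the IIS AUTH_USER
approach further down avoids that):

// Hypothetical only: a URL parameter like ?mylogin=fredbloggs is
// trivially forgeable, so don't trust it for real authentication.
$login = isset( $_GET['mylogin'] ) ? $_GET['mylogin'] : '';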
-----Original Message-----
From: Alistair Johnson [mailto:JohnsonA@rembrandt.co.nz]
Sent: Thursday, 28 April 2005 8:24 a.m.
To: MediaWiki announcements and site admin list
Subject: RE: [Mediawiki-l] How to require Sign In
I posted info on how to do this back at the end of March (based on info
posted by David Cameron). Below is the modification I made to User.php to
achieve it. You need to enable Windows authentication in IIS to make this
work.
You can also look at AuthPlugin, which seamlessly creates MediaWiki users
based on another authentication mechanism, but as far as I can tell it
doesn't offer the automatic logon that the code below gives you.
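For anyone who goes the AuthPlugin route instead, a bare-bones subclass
looks roughly like this -- a sketch against the 1.5-era AuthPlugin
interface; the class name and the method bodies are invented for
illustration:

require_once( 'includes/AuthPlugin.php' );

// Sketch only: 'IntranetAuth' is a made-up name, and the method
// bodies just show where your directory lookups would go.
class IntranetAuth extends AuthPlugin {
	function userExists( $username ) {
		return true; // trust the external directory
	}
	function authenticate( $username, $password ) {
		// check $username/$password against your directory here
		return true;
	}
	function autoCreate() {
		return true; // auto-create wiki accounts for unknown users
	}
	function strict() {
		return false; // fall back to wiki passwords if needed
	}
}
$wgAuth = new IntranetAuth();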
Al.
function loadFromSession() {
	global $wgMemc, $wgDBname;

	if ( isset( $_SESSION['wsUserID'] ) ) {
		if ( 0 != $_SESSION['wsUserID'] ) {
			$sId = $_SESSION['wsUserID'];
		} else {
			return new User();
		}
	} else if ( isset( $_COOKIE["{$wgDBname}UserID"] ) ) {
		$sId = IntVal( $_COOKIE["{$wgDBname}UserID"] );
		$_SESSION['wsUserID'] = $sId;
	} else if ( isset( $_SERVER["AUTH_USER"] ) ) {
		// Modification to allow logon via authentication information
		// passed from IIS.

		// Get the username: strip the domain prefix from AUTH_USER.
		// 'DOMAINNAME' is a placeholder -- replace it with your own
		// domain, backslash included (AUTH_USER arrives as DOMAIN\user).
		$temp = explode( 'DOMAINNAME', $_SERVER["AUTH_USER"] );
		if ( $temp[1] == "" ) {
			$name = $temp[0];
		} else {
			$name = $temp[1];
		}

		// Pull in the credentials we'll need for the database lookup.
		global $wgDBprefix, $wgDBuser, $wgDBpassword, $wgDBserver;

		// We'll use PHP's MySQL module to access the MediaWiki
		// database, as it's quick and dirty. Note this assumes the
		// user already exists in the wiki's user table.
		$link = mysql_connect( $wgDBserver, $wgDBuser, $wgDBpassword );
		@mysql_select_db( $wgDBname, $link )
			or die( "Unable to select user database for NTLM authentication" );
		// Escape the name before interpolating it into the SQL.
		$query = "SELECT * FROM " . $wgDBprefix . "user WHERE LOWER(user_name) = '" .
			mysql_real_escape_string( strtolower( $name ), $link ) . "'";
		$result = mysql_query( $query, $link );
		$row = mysql_fetch_array( $result, MYSQL_ASSOC );
		mysql_close( $link );

		// Set the session variables we need to transparently authenticate.
		$sId = $row['user_id'];
		$_SESSION['wsUserID'] = $row['user_id'];
		$_SESSION['wsUserName'] = $row['user_name'];
		$_SESSION['wsToken'] = $row['user_token'];

		// Set cookies with this info to make life easier for us in the future.
		global $wgCookiePath, $wgCookieDomain;
		setcookie( $wgDBname . 'UserID', $row['user_id'], 0,
			$wgCookiePath, $wgCookieDomain );
		setcookie( $wgDBname . 'UserName', $row['user_name'], 0,
			$wgCookiePath, $wgCookieDomain );
		setcookie( $wgDBname . 'Token', $row['user_token'], 0,
			$wgCookiePath, $wgCookieDomain );
	} else {
		return new User();
	}

	if ( isset( $_SESSION['wsUserName'] ) ) {
		$sName = $_SESSION['wsUserName'];
	} else if ( isset( $_COOKIE["{$wgDBname}UserName"] ) ) {
		$sName = $_COOKIE["{$wgDBname}UserName"];
		$_SESSION['wsUserName'] = $sName;
	} else {
		return new User();
	}

	$passwordCorrect = FALSE;
	$user = $wgMemc->get( $key = "$wgDBname:user:id:$sId" );
	if ( $makenew = !$user ) {
		wfDebug( "User::loadFromSession() unable to load from memcached\n" );
		$user = new User();
		$user->mId = $sId;
		$user->loadFromDatabase();
	} else {
		wfDebug( "User::loadFromSession() got from cache!\n" );
	}

	if ( isset( $_SESSION['wsToken'] ) ) {
		$passwordCorrect = $_SESSION['wsToken'] == $user->mToken;
	} else if ( isset( $_COOKIE["{$wgDBname}Token"] ) ) {
		$passwordCorrect = $user->mToken == $_COOKIE["{$wgDBname}Token"];
	} else {
		return new User(); # Can't log in from session
	}

	// Compare names case-insensitively to allow for case differences
	// between MediaWiki and NTLM usernames.
	if ( ( strtolower( $sName ) == strtolower( $user->mName ) ) && $passwordCorrect ) {
		if ( $makenew ) {
			if ( $wgMemc->set( $key, $user ) ) {
				wfDebug( "User::loadFromSession() successfully saved user\n" );
			} else {
				wfDebug( "User::loadFromSession() unable to save to memcached\n" );
			}
		}
		$user->spreadBlock();
		return $user;
	}
	return new User(); # Can't log in from session
}
-----Original Message-----
From: Toscano, Ashley [mailto:atoscano@edmunds.com]
Sent: Thursday, 28 April 2005 7:49 a.m.
To: MediaWiki announcements and site admin list
Subject: [Mediawiki-l] How to require Sign In
Is there a way to hook the Sign In function to Active Directory on a
corporate Windows network? Also, how do I require that users sign in before
updating content on the wiki?
- Ashley Toscano Office: 310-309-6431
Edmunds.com "where smart car buyers start"
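On the second question, the usual answer is a switch in LocalSettings.php.
A sketch, noting that the setting changed between versions:

// MediaWiki 1.4.x: require login before editing
$wgWhitelistEdit = true;
// MediaWiki 1.5+: the equivalent group permission
$wgGroupPermissions['*']['edit'] = false;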
Hi all!
Is there actually a way to execute the scripts in the maintenance folder
without the command line? The reason for my question is that I don't have
shell access on my host.
Thanks for any replies!
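One workaround, if your host allows PHP to spawn processes: drop a
throwaway wrapper script next to the wiki and fetch it in a browser. A
sketch only -- the filename is invented, rebuildall.php is just an example
script, and you should password-protect or delete the wrapper when done:

<?php
// run-maintenance.php (hypothetical): runs one maintenance script via
// the web. Assumes shell_exec() is not disabled and 'php' is on PATH.
echo '<pre>';
echo htmlspecialchars( shell_exec( 'php maintenance/rebuildall.php 2>&1' ) );
echo '</pre>';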
I just discovered this problem with my MW 1.4.0 site:
* UserA begins editing section1 of a wiki page.
* UserA also begins editing section2 of a wiki page.
* While the section2 edit is still open, section1 edits are saved.
* Section2 edits are then saved.
* The changes made to section1 are lost.
Is this fixed in versions of MediaWiki after 1.4.0?
-Matt
Last night we had about 59 new accounts created on our Wiki. They then
proceeded to do multiple edits to about 40 or so different pages. Each
page had about 15-20 changes made.
The weird thing is that the last spider/bot to visit each page erased
all the stuff that the others had created, leaving each page (from what
I can tell so far) as it was originally.
The sites they were injecting into the pages were:
http://WTHP5.disney.com
Now for my questions:
I had enabled "only registered users can edit", but this didn't help
because they obviously automated the account creation as well. Is there
something stronger I can do while still preserving the spirit of the wiki?
Some of the pages won't seem to revert to my last edit. Can I somehow
completely delete the changes they made and remove their record of
touching the pages?
Is there a faster way to revert a bunch of pages at once? This is taking
forever, reading each page and verifying that things are OK.
I'm pretty disheartened by this. If it continues, I'll have to turn off
external internet access to our wiki (this is for an academic library).
We already see quite a few visits from other libraries, and we have some
valuable information to share.
-Tony
I have a simple, 100% effective way of dealing with spambots: change
the login page. (Oh yes -- this only works if you restrict edits to
logged-in users.)
On http://www.IslandSeeds.org/wiki, I wanted to get first and last
names, so I hacked that page to include the extra field, added it to the
database schema, and changed the username function to concatenate the two
fields.
This wiki is fairly well known and linked, and yet, NEVER gets hit by
spambots. I think they get confused about the login requirements.
This suggests that doing something as trivial as changing the NAME
attribute of either the username or password field may frustrate
spambots.
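As a concrete illustration of that idea, a sketch: MediaWiki's stock
login form posts the username as wpName, and 'wpMemberHandle' below is an
invented replacement. After renaming the field in the login template, map
it back early in the request (e.g. near the top of LocalSettings.php):

// 'wpMemberHandle' is a made-up field name; spambots posting to the
// stock wpName field simply never log in.
if ( isset( $_POST['wpMemberHandle'] ) && !isset( $_POST['wpName'] ) ) {
	$_POST['wpName'] = $_POST['wpMemberHandle'];
}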
But what I REALLY want is an email or Turing authentication scheme.
Is that in 1.5, or coming down the road?
:::: Any sufficiently advanced political correctness is
indistinguishable from irony. -- Erik Naggum
:::: Jan Steinman <http://www.Bytesmiths.com/Item/85AE31>
I'm trying to debug an extension. I found the wfDebug function which
seems to be widely used in the rest of the mediawiki code, but I can't
seem to get it to do anything. I know that I'm running through at
least one wfDebug() call but it seems to have no effect.
I've set $wgDebugLogFile to the path of a file and made sure that the
web server has write access to it. I don't get any output. I then set
$wgDebugComments to true but still no output.
Is there something I'm missing?
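For comparison, the relevant LocalSettings.php lines would look something
like this (the log path is just an example; the file must be writable by
the web server user):

$wgDebugLogFile = '/tmp/mediawiki-debug.log'; // wfDebug() appends here
$wgDebugComments = true; // also emit debug output as HTML comments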
That only included a link to "Whatlinkshere", not the actual list of
pages linking to that page.
-----Original Message-----
From: mediawiki-l-bounces(a)Wikimedia.org
[mailto:mediawiki-l-bounces@Wikimedia.org] On Behalf Of Joshua Oreman
Sent: Sunday, September 04, 2005 1:44 PM
To: MediaWiki announcements and site admin list
Subject: Re: [Mediawiki-l] Embed "What Links Here" on pages?
On 9/4/05, Bass, Joshua L <joshua.l.bass(a)lmco.com> wrote:
> How can I embed the listing of pages that link to a page? I would like
> to have a section at the bottom of certain pages displaying all the
> pages linking to it.
In new (1.5+, I believe) MediaWiki versions, use
{{Special:Whatlinkshere}}.
-- Josh
I heard that MediaWiki 1.5 requires more memory than previous versions.
So is it fair to say that the server where you host the wiki must have
more memory (about 40M or more)?
Greets, Robert Hartmann
How can I force the server to provide a fresh page for each page request?
I know it is not the most efficient way of doing things, but I have a lot
of people commenting that they get blank pages.
I have done it before and know it fixes the problem ... but my memory fails
me at the moment.
Bryan
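In case it jogs someone's memory, the usual switches look like this -- a
sketch, and availability varies a little by MediaWiki version:

// In LocalSettings.php:
$wgCachePages = false;        // stop sending client-side caching headers
$wgEnableParserCache = false; // re-render every page from wikitext
$wgUseFileCache = false;      // keep the static file cache off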