Hi,
I'm evaluating the use of MediaWiki on a corporate server.
In such an environment, my only need for the "login" stage is to identify
people; I do not need them to authenticate (no password required). The
need is to know who did what, in a trusted environment.
Is it possible to configure MediaWiki to only require a login (without a
password) and only check that this login exists in an LDAP database?
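Something like the following AuthPlugin subclass is what I imagine (a rough, untested sketch; ldapUserExists() is a hypothetical helper standing in for a real LDAP lookup):

```php
<?php
# Sketch for LocalSettings.php (untested). ldapUserExists() is a
# hypothetical helper that checks whether the login exists in LDAP.
require_once 'includes/AuthPlugin.php';

class NoPasswordLdapAuth extends AuthPlugin {
    # Accept any password, as long as the login exists in the directory.
    function authenticate( $username, $password ) {
        return ldapUserExists( $username );
    }
    function userExists( $username ) {
        return ldapUserExists( $username );
    }
    # Automatically create local wiki accounts for known logins.
    function autoCreate() {
        return true;
    }
    # Reject logins that this plugin does not validate.
    function strict() {
        return true;
    }
}
$wgAuth = new NoPasswordLdapAuth();
```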
Thanks in advance for any tips.
--
Guilhem BONNEFILLE
-=- #UIN: 15146515 JID: guyou(a)im.apinc.org MSN: guilhem_bonnefille(a)hotmail.com
-=- mailto:guilhem.bonnefille@gmail.com
-=- http://nathguil.free.fr/
Hi,
Does anyone have an idea how to embed one extension in another? For
example, I have a simple extension:
<?php
$wgExtensionFunctions[] = "wfLCExtension";

function wfLCExtension() {
    global $wgParser;
    # Register the extension with the WikiText parser.
    # The first parameter is the name of the new tag;
    # in this case it defines the tag <mlc> ... </mlc>.
    # The second parameter is the callback function for
    # processing the text between the tags.
    $wgParser->setHook( "mlc", "rendercontent" );
}

# The callback function for converting the input text to HTML output
function rendercontent( $input, $argv, &$parser ) {
    $localParser = new Parser();
    $output = $localParser->parse( $input, $parser->mTitle, $parser->mOptions );
    return $output->getText();
}
?>
It just returns the text placed between the <mlc></mlc> tags. I use it to mark
this section while the article is being imported to XML (which I use
further on).
The problem is that I'd like to make it possible for users to add different
content between these tags (including other extensions). For example:
http://meta.wikimedia.org/wiki/Google_extension
but instead of properly rendered content I get something like:
standard content
<google></google>
standard content
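One idea I am wondering about (untested, and assuming my MediaWiki version already has the Parser::recursiveTagParse() method): re-entering the current parse from the tag callback, which should let nested tag hooks such as <google> run as well:

```php
# Untested sketch: assumes Parser::recursiveTagParse() exists in this
# MediaWiki version. It re-enters the current parse, so nested tag
# extensions inside <mlc>...</mlc> get expanded too.
function rendercontent( $input, $argv, &$parser ) {
    return $parser->recursiveTagParse( $input );
}
```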
Thanks in advance,
Regards,
Aretai
The "Remember me" checkbox is not working on our wiki. If users terminate
and restart their browser, they have to log into the wiki again. This
doesn't happen on Wikipedia, so I assume our wiki is having a session
problem.
The PHP session directory (session.save_path) exists, is writable by the
webserver, and session files ARE being created within it:
session.save_path="C:\temp\php\Session"
(and I have already tried moving the directory underneath the PHP
installation directory, but that didn't help).
If I watch the browser's cookies and the session directory, the browser
switches to a new session without preserving the username information. A
detailed case study is below.
Setup is: MediaWiki 1.9.3 on Windows Server 2003; Apache 2.2.4; PHP 5.2.1;
MySQL 5.
Case study:
1. I visit our MediaWiki site and log in, taking care to check the
checkbox, "Remember my login on this computer". Note: Our wiki requires
logins to edit a page:
# LocalSettings.php:
$wgGroupPermissions['*']['edit'] = false;
Authentication is done using normal MediaWiki accounts.
2. Confirm that a PHP session file was created, its name is stored in a
browser cookie (wikidb_tablePrefix__session), and it contains this
information:
kw_bread_crumbs: TechWiki (1.9.3.vp.2.e)|a:2:{i:0;s:17:"Special:Userlogin";i:1;s:16:"Name of page";}wsUserID|i:16;wsUserName|s:4:"Danb";wsToken|s:32:"f12a2d7154f0ea29190a5c0fb91a6e1a";
3. Close the browser. Confirm the session file still exists as above.
4. Launch the browser and visit the wiki's home page. The correct username
is displayed at the top of the page, as if the user were still logged
in. Everything looks good so far, but...
5. Visit ANY OTHER page on the wiki. The username at the top of the page
disappears, the IP address is shown instead, and a new session file gets
associated with the browser (as verified in the session cookie above). The
new session file contains no username information:
kw_bread_crumbs: TechWiki (1.9.3.vp.2.e)|a:1:{i:0;s:36:"Name of page";}
and the old session file still exists, unchanged.
Now, if I repeat this experiment but use en.wikipedia.org instead of my
wiki, the session information is remembered even after I close and reopen
the browser.
This problem exists in both IE 7 and Firefox 2.0. And the problem existed
before we installed the "kw_bread_crumbs" extension mentioned in the
session data above.
Any help appreciated!
- Dan Barrett
dbarrett(a)vistaprint.com
I just discovered DekiWiki (http://doc.opengarden.org/DekiWiki) and thought
I'd share it with others on this list (I don't believe it has been mentioned
on here before).
DekiWiki is a fork of MediaWiki (appears to be based on the 1.4.x line).
They have provided numerous features that many on this list desire from
MediaWiki, including WYSIWYG, granular access permissions, a definite page
hierarchy, and per-page attachments. It is worth noting that the target
audience and desired usage of DekiWiki is substantially different from
MediaWiki's. I would say DekiWiki is more like Confluence or XWiki than
MediaWiki. As such, many of the MediaWiki features you have grown to love
(including the wiki syntax) are not included. So, evaluate with care.
I'm not trying to incite rage against MediaWiki or actively praise DekiWiki,
I'm just trying to spread the word. The MediaWiki developers will
acknowledge that MediaWiki serves a targeted audience, and getting MediaWiki
to play nicely in some environments (per-page access permissions, for example)
is like pounding a square peg into a round hole. Well, DekiWiki might be your
round peg and you might even be able to salvage your MediaWiki knowledge
when it comes to hacking DekiWiki.
Cheers,
Gregory Szorc
gregory.szorc(a)gmail.com
I would like to add a banner/ad at the top of each page of my wiki site (http://PESWiki.com).
Could someone tell me how to do this?
I've been able to add such things on the left and at the bottom of the site, but can't see in the code how to add something at the top.
I'm guessing it has something to do with the "siteNotice" function, but I can't find anything relevant about that at MediaWiki.org.
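For example, I was hoping something along these lines in LocalSettings.php would do it (I am guessing at the setting name from "siteNotice"):

```php
# Guess based on the "siteNotice" name: text shown at the top of every page.
$wgSiteNotice = "<div style='text-align: center;'>Your banner HTML here</div>";
```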
Sterling
I'd like to know if MediaWiki is compatible with a PHP accelerator such
as Zend Platform.
We want our wiki pages to be very dynamic and to adapt their content
according to the user, so the native MediaWiki cache does not help us.
I'd like to investigate the possible speed gain from using a code optimizer.
Any hint will be welcome.
Date: Fri, 09 Mar 2007 09:47:46 -0500
From: Brion Vibber <brion(a)pobox.com>
Subject: Re: [Mediawiki-l] Bulk import 6500 records into the template
namespace
To: MediaWiki announcements and site admin list
<mediawiki-l(a)lists.wikimedia.org>
Message-ID: <45F17392.3040700(a)pobox.com>
Content-Type: text/plain; charset=ISO-8859-1
khuffman(a)NGS.ORG wrote:
> Hi, I need advice on how to bulk import 6500 records/articles from Excel
> into MediaWiki's Template namespace. I'm using MediaWiki 1.8.x (but may
> be upgrading/installing a new version for this project, so the version
> may be 1.9.x).
brion vibber (brion @ pobox.com / brion @ wikimedia.org) wrote:
> Use Special:Import or importDump.php with the XML wrapper format, or
> importTextFile.php with individual text files, or have one of many
> client-side bot tools do the batch-editing.
Additional question:
Are there more detailed instructions on how to use/access
importTextFile.php or the client-side bot tools? I see the file
importTextFile.php in the maintenance folder, but I'm not sure how to get to
it on my web host. When I navigate to it through my browser, I don't have
access/permission to these files, even if I change permissions to 0777.
So, any pointers to more detailed directions on how to use these files
would be great. Thank you!
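From what I can tell, the maintenance scripts are meant to be run from a command line (for example over SSH), not requested through the browser; I imagine an invocation roughly like this (I am guessing at the exact option names):

```shell
# Guessed invocation; option names may differ by MediaWiki version.
cd /path/to/wiki/maintenance
php importTextFile.php --title "Template:Example" example.txt
```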
On IRC chat yesterday, Ryan Lane gave me some advice regarding my
attempt to write an extension that would append some wikitext to
articles prior to parsing. As he and a couple of other people
suggested, the hook that I was trying to use for this purpose --
ArticleAfterFetchContent -- didn't work as hoped because of some
caching issues. However, I have been able to get something working
using the ParserBeforeStrip hook. Since Ryan said that he was
interested in knowing how to do this for something he was trying to
do, I thought I should document my technique here.
Here's a simplified version of the code I settled on:
<?php
$wgHooks['ParserBeforeStrip'][] = 'wfMyTextAppend';

function wfMyTextAppend( &$parser, &$text, &$strip_state ) {
    global $wgMyTextAppendFlag, $wgRequest;
    if ( !$wgMyTextAppendFlag && $parser->getTitle()->getNamespace() == NS_MAIN ) {
        $action = $wgRequest->getVal( 'action', 'view' );
        if ( empty( $action ) || $action == 'submit' || $action == 'view'
            || $action == 'purge' ) {
            $text .= "\nThis article is titled '''" . $parser->getTitle()->getText() . "'''";
        }
        $wgMyTextAppendFlag = true;
    }
    # Hook functions should return true so later hooks still run.
    return true;
}
?>
Explanation:
(1) I created the $wgMyTextAppendFlag global because other extensions
(such as the GIS/Wiki RSS extension) invoke the parser separately
from the parsing of the entire page, in order to parse their input.
Since my hook will be triggered at the beginning of parsing the
wikitext for the entire page, I use the $wgMyTextAppendFlag as a way
of signalling subsequent invocations of the parser that the
wfMyTextAppend function has already run, thus ensuring that my text
will only be appended at the end of the entire article and not
elsewhere.
(2) The test $parser->getTitle()->getNamespace() == NS_MAIN ensures
that my text only gets appended on articles in the main namespace and
not on Special pages, Talk pages, etc.
(3) I use $wgRequest to test what kind of action is being performed,
thus ensuring that my text is not appended when action=edit or some
other unwanted context.
(4) The line that begins "$text .=" appends wikitext that produces a
phrase followed by the name of the article in bold. This, of course,
is pointless. In my actual extension, I did some further munging to
get wikitext that displays results of an RSS feed which pulls in a
search result based on the article title.
--------------------------------
| Sheldon Rampton
| Research director, Center for Media & Democracy (www.prwatch.org)
| Author of books including:
| Friends In Deed: The Story of US-Nicaragua Sister Cities
| Toxic Sludge Is Good For You
| Mad Cow USA
| Trust Us, We're Experts
| Weapons of Mass Deception
| Banana Republicans
| The Best War Ever
--------------------------------
| Subscribe to our free weekly list serve by visiting:
| http://www.prwatch.org/cmd/subscribe_sotd.html
|
| Donate now to support independent, public interest reporting:
| https://secure.groundspring.org/dn/index.php?id=1118
--------------------------------
We discovered this when we had a split wiki with private and public
parts. And then we ran a Nutch search engine against our own wiki
and saw all these hits that weren't supposed to be there.
The watermark idea sounds like it might be interesting. You wouldn't
have to map the IPs...so what if the public users see the fact that
it's a public wiki?
Or, maybe you could just set up your private users to use a different
skin that displays the warning.
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
On Mar 9, 2007, at 8:12 PM, Jason Armistead wrote:
> Jim Hu wrote:
>
>> Unless your users are a lot more disciplined than mine, new page
>> creation will leak across the namespaces.
>>
>> Actually, what will happen is that everyone in the protected
>> namespaces will create new pages in Main.
>
> Maybe the answer is to somehow use the IP address of the "Private"
> users to map the page background of the "Public" wiki to a
> watermarked background that says "Warning: Public wiki". Everyone
> outside that IP net-block just gets a regular background.
>
> Or maybe it's possible to re-map the standard NS_MAIN namespace to
> the NS_PRIVATE (custom) one.
>
> It all sounds like it's getting messy ...
>
> Sigh !
>
> Anyone got any other bright ideas ???
Referring to discussions about this topic
http://lists.wikimedia.org/mailman/htdig/mediawiki-l/2005-October/007450.ht…
I want to have two wikis installed on the one server.
Both will use the same MySQL database, wikidb
Both will have the same database prefix (or both will have NO database prefix).
Both wikis will have separate virtual paths, e.g. /wikiA and /wikiB, configured in Apache.
I want to configure one (wiki "A") with extra custom namespaces (via $wgExtraNamespaces in LocalSettings.php), and limit Apache to serving those pages only to users within a specific in-house IP address block. Consider this to be the "Private:" namespace, plus an associated "Private_Talk:" namespace for discussion pages.
The other wiki (wiki "B") will have just the standard MediaWiki namespaces. This is the "Ordinary user" namespace.
Each wiki will have its own LocalSettings.php to account for this, but all the other files will be shared between the two wikis.
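Concretely, for wiki "A" I have in mind something like this in its LocalSettings.php (IDs 100 and 101 are arbitrary choices; custom namespace IDs start at 100, with the talk namespace on the next odd number):

```php
# Custom namespace IDs start at 100; talk namespaces use the next odd ID.
define( 'NS_PRIVATE', 100 );
define( 'NS_PRIVATE_TALK', 101 );
$wgExtraNamespaces[NS_PRIVATE] = 'Private';
$wgExtraNamespaces[NS_PRIVATE_TALK] = 'Private_Talk';
```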
Some questions:
1. Will searches by users on wiki "B" only see articles in the "Ordinary user" namespaces, or will they also see the "Private" namespaces?
2. What takes precedence: interwiki prefixes or namespace prefixes? If I set up the interwiki table so that Private: and Private_Talk: redirect to wiki "B"'s HTTP address, will that also affect wiki "A"'s users?
It seems, perhaps simplistically, to be an easy way of having one set of wiki tables while giving some simple security to the "Private" namespaces that wiki "A"'s users can view and edit.
Is this possible? Or am I miles off track? Are there any security aspects I haven't considered?
Cheers
Jason