An automated run of parserTests.php showed the following failures:
This is MediaWiki version 1.10alpha (r20506).
Reading tests from "maintenance/parserTests.txt"...
Reading tests from "extensions/Cite/citeParserTests.txt"...
Reading tests from "extensions/Poem/poemParserTests.txt"...
17 still FAILING test(s) :(
* URL-encoding in URL functions (single parameter) [Has never passed]
* URL-encoding in URL functions (multiple parameters) [Has never passed]
* TODO: Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html) [Has never passed]
* TODO: Link containing double-single-quotes '' (bug 4598) [Has never passed]
* TODO: message transform: <noinclude> in transcluded template (bug 4926) [Has never passed]
* TODO: message transform: <onlyinclude> in transcluded template (bug 4926) [Has never passed]
* BUG 1887, part 2: A <math> with a thumbnail- math enabled [Has never passed]
* TODO: HTML bullet list, unclosed tags (bug 5497) [Has never passed]
* TODO: HTML ordered list, unclosed tags (bug 5497) [Has never passed]
* TODO: HTML nested bullet list, open tags (bug 5497) [Has never passed]
* TODO: HTML nested ordered list, open tags (bug 5497) [Has never passed]
* TODO: Inline HTML vs wiki block nesting [Has never passed]
* TODO: Mixing markup for italics and bold [Has never passed]
* TODO: dt/dd/dl test [Has never passed]
* TODO: Images with the "|" character in the comment [Has never passed]
* TODO: Parents of subpages, two levels up, without trailing slash or name. [Has never passed]
* TODO: Parents of subpages, two levels up, with lots of extra trailing slashes. [Has never passed]
Passed 494 of 511 tests (96.67%)... 17 tests failed!
When I click xyz, a template should be created where the user can enter some
details (say, a schedule).
When I click abc, a different template should be created where the user can
enter some other details (say, news).
This should be done using MediaWiki.
I will create one page called Template:<xyz>
and another page called Template:<abc>.
Should I do this in the function performAction in wiki.php, using switch case
'raw', or is it something else?
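One possible alternative to patching performAction in wiki.php (a sketch, assuming a reasonably recent MediaWiki): the edit form accepts a `preload` URL parameter that fills the edit box with the contents of another page, so ordinary wiki links can open a pre-filled form without any core changes. The page and template names here are illustrative:

```
[{{fullurl:Some_schedule_page|action=edit&preload=Template:Xyz}} Create a schedule page]
[{{fullurl:Some_news_page|action=edit&preload=Template:Abc}} Create a news item]
```

Each link opens Some_schedule_page (or Some_news_page) in the editor with the text of the named template already loaded.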
Hi,
I've been working on running MediaWiki on SQL Server 2000. It's mostly
working and feels close to fully working, but I still have a few major
wrinkles to iron out.
1. Almost nothing that involves a "pager" is working yet, with just a couple
of exceptions.
2. Interlanguage links are showing up at the end of the pages instead of in
a sidebar as on Wikipedia.
3. Template documentation that's referenced from a template as {{/doc}}
doesn't get transcluded.
4. Some tags generated by templates that are clearly meant to be
interpreted as HTML are instead being rendered as literal text (for example,
if I use the text and templates of the page [[Arnold Schwarzenegger]] from
Wikipedia, I see <small/> under Arnold's picture within the infobox at
the top of the page).
5. I'm not convinced that absolutely all the tables are working yet -- for
example, I have yet to see a record created in the redirect table.
I can code a bit, but I'm not a programmer by any means, and I could use
some help (Ruby on Rails is hard), or at least answers to questions
about how to do things.
I posted a blog entry on why I think the filter would be so cool; I'll
copy it here:
- - - - -
Here's why I think wikimouse would be incredibly cool:
1. There are more great, quirky, insightful Wikipedia articles than
any individual could blog about in a lifetime.
2. We won't need to worry at all about spam prevention,
self-promotion, etc., which cuts down the workload a lot -- if users
are only allowed to link to Wikipedia, all of that is already taken
care of inside the Wikipedia project. Basically, there's a floor on
how bad the user-submitted content can be, and no ceiling (or is that
the other way around?).
3. It could be an open-source, non-profit project on donated server
space; but if it's for-profit, it would be cool to programmatically
generate merchandise based on the Wikipedia article text (I've already
written a program that does this). That means we wouldn't have to bog
down the site with banner and text ads (which wouldn't work well
anyway, since the content is very diverse).
I was just looking at the code for the edit toolbar, and it's hardcoded
to take the images from the common skin rather than being
customisable. Is there a reason for that?
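For reference, this is roughly what the relevant part of EditPage.php does -- a reconstruction from memory of the 1.x source for illustration, not a verbatim copy. Each button's image URL is assembled from $wgStylePath plus a fixed common/images segment, so there is no setting that points it elsewhere:

```php
<?php
// Illustrative reconstruction, not the actual MediaWiki source.
// The button image URL is $wgStylePath plus a hardcoded
// "common/images" path component, so no configuration variable
// can redirect it to another skin's image directory.
$wgStylePath = '/w/skins';        // typical value; an assumption here
$image       = 'button_bold.png'; // one of the toolbar button images

$imageUrl = "$wgStylePath/common/images/$image";
echo $imageUrl, "\n";
```

Making it customisable would presumably mean replacing that fixed segment with a new global or a per-skin lookup.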
Hi.
We created a translation file for MediaWiki and would like to
distribute it. It's not based on the very latest version, but on a previous one.
I've been reading
http://meta.wikimedia.org/wiki/MediaWiki_localization
Shall we create a patch against that version and then submit it to the
MediaWiki Bugzilla?
Is there a way to announce the creation of this new translation?
Thank you very much.
--
Miguel A. Cuesta
Alianzo Networks
"We make social networks"
tel: (+34) 944 371 684 - skype me: cuesta-alianzo
mailto:cuesta@alianzo.com - IM/MSN: cuesta@alianzo.com
Hi,
I'm trying to add LDAP Authentication to my MediaWiki 1.6.5 server and
downloaded LdapAuthentication.php v1.0h.
I'm having a lot of trouble trying to restrict the wiki to a specific
LDAP group...
here's the block I've added to LocalSettings.php:
# begin LDAP authentication part
require_once( 'includes/LdapAuthentication.php' );
$wgAuth = new LdapAuthenticationPlugin();
$wgLDAPDomainNames = array( "NVIDIA.COM" );
$wgLDAPServerNames = array( "NVIDIA.COM"=>"ldap.nvidia.com" );
//$wgLDAPUseSSL = true;
$wgLDAPUseSSL = false;
$wgLDAPUseLocal = false;
$wgMinimalPasswordLength = 1;
$wgLDAPAddLDAPUsers = false;
$wgLDAPUpdateLDAP = false;
$wgLDAPMailPassword = false;
$wgLDAPRetrievePrefs = true;
$wgLDAPDebug = 99;
//$wgLDAPSearchAttributes = array( "NVIDIA.COM"=>"sAMAccountName" );
$wgLDAPSearchStrings = array( "NVIDIA.COM"=>"NVIDIA.COM\\USER-NAME" );
# testing group restriction below
$wgLDAPRequiredGroups = array( "NVIDIA.COM"=>array(
    "cn=neteng-contractors,ou=departments,ou=distribution lists,ou=groups,dc=nvidia,dc=com" ) );
$wgLDAPGroupUseFullDN = array( "NVIDIA.COM"=>true );
$wgLDAPGroupObjectclass = array( "NVIDIA.COM"=>"group" );
$wgLDAPGroupAttribute = array( "NVIDIA.COM"=>"member" );
$wgLDAPGroupSearchNestedGroups = array( "NVIDIA.COM"=>false );
$wgLDAPBaseDNs = array( "NVIDIA.COM"=>"ou=groups,dc=nvidia,dc=com" );
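As a sanity check on the settings above, here is a minimal sketch (the helper function is ours, not part of the plugin) of how the group-membership filter that appears in the debug output is assembled from $wgLDAPGroupAttribute, the bound user DN, and $wgLDAPGroupObjectclass:

```php
<?php
// Illustrative helper, not part of LdapAuthentication.php: shows how
// the group attribute, the user DN, and the group objectclass combine
// into the LDAP search filter seen in the debug log.
function buildGroupFilter( $groupAttr, $userDn, $objectClass ) {
    return "(&($groupAttr=$userDn)(objectclass=$objectClass))";
}

// With $wgLDAPGroupUseFullDN = true, the "DN" used here is whatever
// the bind step produced -- in this case the NVIDIA.COM\\Cvo string,
// which is a down-level logon name rather than a real directory DN.
$filter = buildGroupFilter( 'member', 'NVIDIA.COM\\\\Cvo', 'group' );
echo $filter, "\n";
```

If the directory's member attribute holds real DNs, a filter built from that string will never match, which is consistent with the empty group list below.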
Here is the error output that I'm getting:
Entering validDomain
User is using a valid domain
Entering getCanonicalName
Munged username: Cvo
Entering Connect
Entering Connect
Not Using SSL
Using servers: ldap://ldap.nvidia.com
Connected successfully
Entering getSearchString
Doing a straight bind
userdn is: NVIDIA.COM\\Cvo
Binding as the user
Binded successfully
Checking for (new style) group membership
Entering isMemberOfRequiredLdapGroup
Required groups:cn=neteng-contractors,ou=departments,ou=distribution
lists,ou=groups,dc=nvidia,dc=com
Entering getGroups
Search string: (&(member=NVIDIA.COM\\Cvo)(objectclass=group))
Warning: ldap_get_entries(): supplied argument is not a valid ldap result resource in /srv/www/htdocs/wiki/includes/LdapAuthentication.php on line 857
Warning: array_shift() [function.array-shift]: The argument should be an array in /srv/www/htdocs/wiki/includes/LdapAuthentication.php on line 860
Warning: Invalid argument supplied for foreach() in /srv/www/htdocs/wiki/includes/LdapAuthentication.php on line 863
Returned groups:
Couldn't find the user in any groups (1).
If I uncomment the $wgLDAPSearchAttributes line and comment out
$wgLDAPSearchStrings, I get the following:
Entering validDomain
User is using a valid domain
Entering getCanonicalName
Munged username: Cvo
Entering Connect
Entering Connect
Not Using SSL
Using servers: ldap://ldap.nvidia.com
Connected successfully
Entering getSearchString
Doing a proxy or anonymous bind
Entering getUserDN
Doing an anonymous bind
Created a regular filter: (sAMAccountName=Cvo)
Using base: ou=groups,dc=nvidia,dc=com
Couldn't find an entry
userdn is:
User DN is blank
Can someone please help out? I can run dsquery on a Win2k3 server against
my account name (cvo) and it returns fine...
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
Wikimedia's been accepted as a mentoring organization for the 2007
Google Summer of Code program.
Here's our organization page:
http://code.google.com/soc/wikimedia/about.html
I put up an initial project list here:
http://meta.wikimedia.org/wiki/Summer_of_Code_2007
It's semi-protected so it won't be too vandalized ;) but additional
suggestions are welcome. I'd like to ask that people who aren't directly
involved in development not add too much to the main page directly,
though; last year we ended up with lots of project submissions for
things that weren't really considered high priority, so I'd like to keep
the list a little more ordered this time.
We don't know for sure how many projects we'll get assigned, so we'll
see. :) At least Tim and I will serve as mentors for the student
projects; if a couple more experienced developers would like to help out
with that too that would be super.
Last year's projects went really well up to the public demo stage but
never quite got integrated into the mainline; I'm hoping that this year
we can stick with projects that will be easier to slip in and take live
much earlier in the process, which should help keep the students
interested and the projects active.
- -- brion vibber (brion @ pobox.com / brion @ wikimedia.org)
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.2.2 (Darwin)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org
iD8DBQFF+UgqwRnhpk1wk44RAhh9AKCPp/5MJd8OICbdMM79DdzCP7WsywCfSIbn
jNizZK1TYIBpSlxZ9Ph6R0o=
=uAms
-----END PGP SIGNATURE-----
Using BitTorrent Python Package
BitTorrent files:
ftp://www.wikigadugi.org/wiki/torrents/
Tracker URL:
http://www.wikigadugi.org:60292
These torrent files have only been tested against BitTorrent, Azureus,
CTorrent, and rTorrent. CTorrent cannot read files over 4 GB without
crashing. Azureus has changed its DHT distributed tracking format
and has problems with generic BitTorrent .torrent files.
It may take a day or so to fully flesh this out, but the torrents are up.
Jeff