Hi there,
what about implementing OpenID across some Wikimedia projects? I know this might be a "political" rather than a "technical" decision, but I think it would bring only positives. I tried to bring it up on Metapub and the Village pump, but little interest was shown...
Jan
Nostalgia is a MediaWiki skin that was introduced before 1.3, when
MonoBook became the default. It is available to registered users on
Wikipedia and other Wikimedia sites. It is the skin I like best, but I
think it is in need of some improvement.
The skin has some usability problems that other skins don't: you cannot
easily find your user pages when using Nostalgia, and there is no easy
link for uploads apart from a drop-down menu, which is poorly
organised.
I would like to know whether there is any interest from other
users/developers in Nostalgia (or whether I am the only one using
it!), and if so, what you would like to see in an improved, more modern
version of Nostalgia (with the same general looks, but without putting
the user at a disadvantage from a usability standpoint compared
to the other skins, particularly the default one). What do you think is
wrong with the current Nostalgia, and how would you like it to be?
PS: I have put a quick patch (15868) on bugzilla to include user pages
in Nostalgia.
--
Thanks,
NSK Nikolaos S. Karastathis, http://karastathis.org/
Possibly off-topic.
Here is a script that replaces normal whitespace with one of the
whitespace characters supported by UTF-8 (others are
          ​  ).
I have made a few vandalism tests here:
http://en.wikipedia.org/wiki/User:Tei/lalaland
What do you guys think? Could this be a problem? You can break links
like [[Mr Thonson]] replacing it by [[Mr Thonson]]
# Replace every ASCII space with U+2000 (EN QUAD), whose UTF-8 encoding
# is the byte sequence 0xE2 0x80 0x80.
while (<DATA>) {
    foreach my $ch (split //, $_) {
        if ($ch eq " ") {
            print pack("CCC", 0xe2, 0x80, 0x80);
        } else {
            print $ch;
        }
    }
}
__DATA__
Text to be vandalized goes here
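On the defensive side, a filter could normalise such look-alike characters back to a plain ASCII space before saving or comparing link targets. A minimal sketch in Python (the character list is illustrative, not exhaustive, and this is not MediaWiki's own sanitiser):

```python
import re

# Unicode whitespace look-alikes: U+00A0 (no-break space) and the
# U+2000..U+200B range (en quad .. zero-width space). Illustrative only.
FAKE_SPACES = re.compile("[\u00a0\u2000-\u200b]")

def normalize_spaces(text):
    # Collapse disguised space characters to a plain ASCII space.
    return FAKE_SPACES.sub(" ", text)
```

Running the script's output back through such a filter would restore links like [[Mr Thonson]] to their original form.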
I've googled, I've MediaWiki.org'd, I've searched the mailing list archives and I haven't found an answer to this so I'm taking it to the sources.
I have a wiki family with interwiki links between each wiki. I often have to dump my wikis to make flat deliverables for people to run off a CD (or DVD). When I dump this wiki family I get <iw_prefix>:/f/i/l/0.html when it tries to parse an interwiki link. I would like to be able to rewrite those links during the dump to whatever destination directory I tell it to use.
I thought this was being done in the getHashedFilename function of dumpHTML, but after adding some print statements there I don't see any of the interwiki links.
My ultimate problem is that when the wiki is dumped the interwiki links are all dead and I don't know any way to hook them back up short of going through each page by hand. I was hoping, since I know the destination directory of the wiki family dumps, I could rewrite the links mid-dump.
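If dumpHTML itself can't be changed, one workaround is to post-process the dumped HTML and rewrite the interwiki hrefs afterwards. A rough Python sketch (the prefix-to-directory map, the href layout, and the function name are assumptions about a setup like the one described, not dumpHTML code):

```python
import re

# Map each interwiki prefix to the dump directory of that wiki
# (hypothetical layout; adjust to where your family dumps land).
IW_MAP = {"otherwiki": "../otherwiki"}

def rewrite_interwiki(html):
    # Turn href="otherwiki:/f/i/l/Foo.html" into
    # href="../otherwiki/f/i/l/Foo.html"; leave unknown prefixes alone.
    def repl(match):
        prefix, path = match.group(1), match.group(2)
        target = IW_MAP.get(prefix)
        return 'href="%s%s"' % (target, path) if target else match.group(0)
    return re.sub(r'href="(\w+):(/[^"]*)"', repl, html)
```

Run over every dumped .html file, this would hook the dead interwiki links back up to the sibling dump directories.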
Any solutions, pointers, encouragement, or other comments that are not negative are very much appreciated!
Courtney Christensen
Hi there,
I've recently built my first extension for MediaWiki, using the
CustomUserCreateForm extension as a base along with a bit of the
ConfirmEdit extension. My goal was to have a checkbox on the
registration form where new users have to agree to the wiki's terms
before they can become users. Using the previously mentioned extensions
as a base I succeeded in this, yet I have the feeling the extension
could/should use some changes (probably a lot, actually) that would make it better.
For instance, I replace the PHP template for the registration of new
users with my own, but I have hardcoded the link to the terms. I would
prefer this to be set by a user of this extension through a variable
instead of forcing them to change a template. Actually, I would rather
not replace the registration PHP template at all, but just add a new
form element to it.
Anyway, I hope some of you have some time to review my work so far and
point me in the right direction to do things the MediaWiki way :)
The plugin code can be found as a zip file on my site:
http://www.burobjorn.nl/customUserCreateForm.zip (approx 5Kb)
Looking forward to your feedback and comments,
--
kind regards,
Bjorn Wijers
* b u r o b j o r n .nl *
digitaal vakmanschap | digital craftsmanship
Concordiastraat 68-126
3551 EM Utrecht
The Netherlands
http://www.burobjorn.nl
brion(a)svn.wikimedia.org wrote:
> Revision: 41264
> Author: brion
> Date: 2008-09-25 18:43:33 +0000 (Thu, 25 Sep 2008)
>
> Log Message:
> -----------
> * Improved upload file type detection for OpenDocument formats
>
> Added a check for the magic value header in OpenDocument zip archives which specifies which subtype it is. Such files will get detected with the appropriate mime type and matching extension, so ODT etc uploads will work again where enabled.
>
> (Previously the general ZIP check and blacklist would disable them.)
>
I think you're missing the point. It's trivial to make a file which is
both a valid OpenDocument file, and a valid JAR file subject to the same
origin policy.
http://noc.wikimedia.org/~tstarling/odjar/
> print $mm->guessMimeType('.../odjar.odt')
application/vnd.oasis.opendocument.text
Just done with zip/unzip, no hex editing involved.
-- Tim Starling
On Fri, Oct 3, 2008 at 2:38 AM, <brion(a)svn.wikimedia.org> wrote:
> Revision: 41584
> Author: brion
> Date: 2008-10-03 00:38:33 +0000 (Fri, 03 Oct 2008)
>
> Log Message:
> -----------
> Back out r41548 "Classes derived from SpecialPage can now specify a run() method, which will be executed after all magic performed by SpecialPage::execute()"
>
> It's unclear what benefit this brings, and the magic calling is scary and icky.
>
> Modified Paths:
> --------------
> trunk/phase3/RELEASE-NOTES
> trunk/phase3/includes/SpecialPage.php
>
> Modified: trunk/phase3/RELEASE-NOTES
> ===================================================================
> --- trunk/phase3/RELEASE-NOTES 2008-10-03 00:28:52 UTC (rev 41583)
> +++ trunk/phase3/RELEASE-NOTES 2008-10-03 00:38:33 UTC (rev 41584)
> @@ -145,8 +145,6 @@
> $wgExternalLinkTarget
> * api.php now sends "Retry-After" and "X-Database-Lag" HTTP headers if the maxlag
> check fails, just like index.php does
> -* Classes derived from SpecialPage can now specify a run() method, which will
> - be executed after all magic performed by SpecialPage::execute()
>
>
The problem currently is that if you have your own special page
overriding SpecialPage::execute(), you have to do all the setHeaders()
and similar calls yourself, as well as run the hooks. This basically
means that you are copying code from the parent, which sucks and also
is not future-proof in case somebody decides to add more magic to
SpecialPage::execute().
Bryan
Dear devs,
I would like to enquire about
https://bugzilla.wikimedia.org/show_bug.cgi?id=709. While it is marked as
"fixed", image renaming is not yet activated on WMF wikis as far as I can see.
It was said on that bugzilla page that image renaming needs more testing, on
test.wikipedia. It also says that extensive testing was done back in July.
So, is there more testing to be done? Anything ordinary users could help with?
This is one of the 10 most-voted bugs, so it would be great to see it
activated sometime :-)
Thanks for what's being done already!
Rémi Kaupp
(User:Korrigan)