> MinuteElectron wrote:
>> vasilvv at gmail.com <https://lists.wikimedia.org/mailman/listinfo/wikitech-l> wrote:
>> It is for displaying (or not displaying) tabs, I think.
>> Yes, I thought about this as I was going to bed last night (already
>> switched the computer off). That's the only logical explanation, and
>> there are no tabs on special pages which explains the reduced number of
>> calls. It is still odd why the hook is not called on any special pages
>> while logged out though.
>>
>> MinuteElectron.
>>
> Fixed in r32324 -- a hack was circumventing checks on read locked wikis.
Can this fix be backported to the stable version?
This feature is important for us and we cannot upgrade without it.
Separately, would it be possible in the future to add such a feature (read-access protection) to the $wgNamespaceProtection system directly in core?
Looking through the code of the Lockdown extension,
it seems it is not very intrusive; such a feature would be quite simple to implement and would not need many lines of code.
It would also be ideal if we could "hide the existence of pages" too (not list a page in search results or list queries when the user does not have read access), like this:
http://www.mediawiki.org/wiki/Extension:Lockdown/hiding_pages (as an example of a simple implementation)
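For reference, a core implementation along these lines could hang off the userCan hook. The sketch below is purely illustrative -- the setting name and function are hypothetical, not MediaWiki or Lockdown code -- and only models the group-membership check such a feature would need:

```php
<?php
// Illustrative sketch only, not actual Lockdown or core code. It models
// per-namespace read restrictions as a userCan-style check.
// $restrictions is a hypothetical setting mapping a namespace ID to the
// user groups allowed to read pages in that namespace.

function wouldAllowRead( array $restrictions, int $namespace, array $userGroups ): bool {
    // Namespaces without an entry stay readable by everyone.
    if ( !isset( $restrictions[$namespace] ) ) {
        return true;
    }
    // Otherwise the user must belong to at least one allowed group.
    return (bool)array_intersect( $restrictions[$namespace], $userGroups );
}

// Example: namespace 4 (Project) restricted to sysops.
$restrictions = [ 4 => [ 'sysop' ] ];
var_dump( wouldAllowRead( $restrictions, 0, [ 'user' ] ) );  // bool(true)
var_dump( wouldAllowRead( $restrictions, 4, [ 'user' ] ) );  // bool(false)
var_dump( wouldAllowRead( $restrictions, 4, [ 'sysop' ] ) ); // bool(true)
```

A real implementation would of course read the restriction map from configuration and return false from the hook to short-circuit further permission checks.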
One other question:
Is it possible to check out an SVN "stable" branch (if one exists) that follows the stable releases of MediaWiki, to make updating our installation easier?
Regards
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
rainman(a)svn.wikimedia.org wrote:
> Search frontend:
> * let the backend provide snippets and other info, fill only what is not
> provided
> * wrap textual results in a div, should make the snippets look more
> compact and consistent over hits
> * added a did you mean.. container
> * show total number of hits if available
> * added messages for "redirects to article", and "relevant section" hits
A good start. :) Probably some of this code for default size, date,
snippets, etc. should get moved into the base SearchResult class, though,
so it doesn't have to be duplicated in the Special:Search display.
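For illustration, a base-class default along those lines might look like the following sketch; the class and method names are hypothetical, not MediaWiki's actual SearchResult API:

```php
<?php
// Hypothetical sketch of the suggestion: push shared defaults (here, the
// snippet) into a base search-result class so each search backend only
// overrides what it can actually provide. Names are illustrative, not
// the real MediaWiki API.

class BaseSearchResult {
    protected $snippet = null;

    // Backends that have a real snippet set it; others fall back to a
    // plain text excerpt computed here, once, for every backend.
    public function getTextSnippet( string $pageText ): string {
        if ( $this->snippet !== null ) {
            return $this->snippet;
        }
        // Default: first 120 characters of the page text.
        return substr( $pageText, 0, 120 );
    }

    public function setSnippet( string $s ): void {
        $this->snippet = $s;
    }
}

$r = new BaseSearchResult();
echo strlen( $r->getTextSnippet( str_repeat( 'a', 200 ) ) ) . "\n"; // 120
$r->setSnippet( 'backend-provided snippet' );
echo $r->getTextSnippet( 'ignored' ) . "\n"; // backend-provided snippet
```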
- -- brion vibber (brion @ wikimedia.org)
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.8 (Darwin)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org
iEYEARECAAYFAkfn7q0ACgkQwRnhpk1wk45sYgCgxl7R13suvt2tOBC82FaA1dUV
c54An15NZyPE6MFZutXmG1JeCgw9i1mr
=/1k5
-----END PGP SIGNATURE-----
LuceneSearch is dead, long live MWSearch!
I've made a couple more cleanup fixes to the core search UI behavior, so
namespace selections work more consistently, and have gone ahead and
switched it in as the sole search interface on all Wikimedia wikis.
This means the LuceneSearch extension is officially obsolete. The
MWSearch extension provides a back-end plugin for MediaWiki's core
search user interface, and all further front-end work should be done in
core where it'll benefit everybody.
Note that many Wikimedia sites have put in local JavaScript hacks to add
extra external search options to the form; unfortunately they have used
particular form IDs specific to the old, obsolete extension.
I took the liberty of adapting the English Wikipedia's JS to work with
either case:
http://en.wikipedia.org/w/index.php?diff=199925186&oldid=194328556
Please feel free to pass that fix on to other wikis.
- -- brion vibber (brion @ wikimedia.org)
Hello,
Since MediaWiki 1.12 the Lockdown extension has been non-functional for
some unknown reason. I've been trying to track it down and have found
that the userCan hook is not being run on all requests. For
example, when logged out, it does not run on any special page; when
logged in it runs on some special pages but not others (it ran on
Special:Whatlinkshere, but not Special:Specialpages). However, it
consistently runs on all normal namespace pages.
I do not have the necessary experience and it's getting late here so I
just wanted to jot down what I had uncovered so far. Perhaps someone
with more experience could take a look at why the userCan hook is not
always executing.
Thanks,
MinuteElectron.
Hello,
Sorry for the repeat post, but I've uncovered further problems with the
userCan hook.
For some reason it is passing the $action parameter as 'move' or 'edit'
(the hook is run several times per request) when the action is actually
'read' -- this in itself makes no sense to me, on top of the failure to
execute described before.
Is this intentional?
Thanks,
MinuteElectron.
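Whatever the cause, a handler can defend itself by acting only on the action it cares about. A minimal sketch with hypothetical names, not Lockdown's actual code:

```php
<?php
// Sketch of a defensive userCan-style handler: since the hook fires
// several times per request with different $action values ('edit',
// 'move', 'read'), a read-restriction handler should ignore everything
// except 'read'. $recordedActions is just for demonstration.

function handleUserCan( string $action, bool &$result, array &$recordedActions ): bool {
    $recordedActions[] = $action;
    if ( $action !== 'read' ) {
        return true; // not our concern; let other handlers decide
    }
    $result = false; // deny read in this toy example
    return false;    // stop further permission processing
}

$seen = [];
$result = true;
handleUserCan( 'edit', $result, $seen ); // ignored
handleUserCan( 'move', $result, $seen ); // ignored
handleUserCan( 'read', $result, $seen ); // denies
var_dump( $result ); // bool(false)
```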
Please translate this announcement into other languages and forward it
to other mailing lists and village pumps. (The translators list has
already been notified and will help with this process.)
The Wikimedia Foundation and Wikimedia Germany have collaborated, with
financial support from Wikimedia France, to support development of a
new extension to our software
which makes it possible to flag versions of wiki articles as having
reached a certain quality. This new toolset could mark the beginning
of a new era for Wikipedia and its sister projects, giving readers
more transparency than ever about the quality of a given article. A
special note of thanks to Aaron Schulz, who has developed much of the
functionality as a volunteer -- we would not be where we are today
without him. The ongoing support and patience of Philipp Birken from
the German chapter was also critical.
Before this functionality is enabled on any Wikimedia project, it
needs to be tested thoroughly for usability, bugs, security and
performance. Test wikis have been set up in English and German
(because the German Wikimedia community has been driving the
development of this functionality from the beginning).
http://en.labs.wikimedia.org/
http://de.labs.wikimedia.org/
These wikis contain a copy of the Wikibooks database. This copy is
completely separate from the "real" Wikibooks, so do not worry about
destroying anything of value. Please follow the instructions on the
Main Pages to participate. If you do not speak English or German, we
encourage you right now to
- set up test wikis independently using the open source extension
available from http://www.mediawiki.org/wiki/Extension:FlaggedRevs ,
or
- change the user interface preference, and create pages in the
English test wiki in your language.
This is due to our limited capacity to set up additional wikis. If you
feel you really, absolutely, strongly need a test wiki in your
language, please file a request through:
https://bugzilla.wikimedia.org/
Wikimedia communities will also have to decide what kind of
configuration to use for their project. Key questions to answer
include:
- What quality attributes should there be?
- Who should be permitted to flag changes as having been reviewed for
vandalism, or for other quality attributes?
- Should the default view for unregistered users change to the "stable
version" on all pages, some pages, or no pages?
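These questions map directly onto the extension's configuration. Below is an illustrative LocalSettings.php fragment; the variable names follow the FlaggedRevs documentation of the time as best recalled, so treat everything here as an example and check Extension:FlaggedRevs before relying on it:

```php
<?php
// Illustrative configuration fragment only -- verify names and formats
// against Extension:FlaggedRevs documentation before use.

require_once "$IP/extensions/FlaggedRevs/FlaggedRevs.php";

// What quality attributes should there be? (tag => number of levels)
$wgFlaggedRevTags = [ 'accuracy' => 2, 'depth' => 2 ];

// Who should be permitted to flag changes? Grant the review right.
$wgGroupPermissions['editor']['review'] = true;

// Should unregistered readers see the stable version by default?
$wgFlaggedRevsOverride = true;

// Limit flagging to the main namespace.
$wgFlaggedRevsNamespaces = [ NS_MAIN ];
```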
The German Wikimedia community has implemented a particular
long-standing community proposal and will probably go live the soonest
with this configuration; other communities will still have to develop
consensus.
== What's next? ==
The test will run at least until April 10, 2008 before the extension
is implemented live on any wiki. This is to allow any serious problems
to be surfaced by the community. If there are no critical open issues
as of April 10, any language/project community will be permitted to
file a request through https://bugzilla.wikimedia.org/ to activate the
extension. This request will have to point to pages in the project
indicating a consensus to move forward. Detailed instructions to do so
will be posted on the test wikis.
--
Erik Möller
Deputy Director, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
On Thu, Mar 20, 2008 at 7:00 PM, <brion(a)svn.wikimedia.org> wrote:
> Don't force a white background on <table>s appearing in a <fieldset>;
> this'll take care of most uses of table layouts in special forms on
> sites like English Wikipedia which add a non-white background color
> to the special page namespace.
I never understood why Wikipedia doesn't just fix its broken styles.
When it changed the colors, it simply missed some instances where the
background was declared white, is all.
On Fri, Mar 21, 2008 at 4:51 AM, <werdna(a)svn.wikimedia.org> wrote:
> Log Message:
> -----------
> * Disallow deletion of big pages by means of moving a page to its title and using the
> "delete and move" option.
Surely these checks should be lower down so that they don't have to be
duplicated.
On Thu, Mar 20, 2008 at 9:10 PM, <greg(a)svn.wikimedia.org> wrote:
> Log Message:
> -----------
> Don't attempt to insert if there are no rows.
>
> Modified Paths:
> --------------
> trunk/phase3/includes/Article.php
>
> Modified: trunk/phase3/includes/Article.php
> ===================================================================
> --- trunk/phase3/includes/Article.php 2008-03-21 00:39:32 UTC (rev 32274)
> +++ trunk/phase3/includes/Article.php 2008-03-21 01:10:14 UTC (rev 32275)
> @@ -3375,7 +3375,8 @@
> foreach( $insertCats as $cat ) {
> $insertRows[] = array( 'cat_title' => $cat );
> }
> - $dbw->insert( 'category', $insertRows, __METHOD__, 'IGNORE' );
> + if ( count( $insertRows ) )
> + $dbw->insert( 'category', $insertRows, __METHOD__, 'IGNORE' );
How could there be no rows at this point? The few lines before this are:
$insertCats = array_merge( $added, $deleted );
if( !$insertCats ) {
# Okay, nothing to do
return;
}
$insertRows = array();
$insertRows is guaranteed to have one element per element of
$insertCats, and if $insertCats is empty, we've already returned.
Note that it was only today or last night or so that I added the check
on $insertCats, though, so if you were seeing problems maybe it's
because you didn't svn up recently enough.
Gentlemen, I am having a problem in that I am ending up reading the
same articles over and over, only to get halfway through them before
realizing "didn't I read something like this last month?"
How could that be? My browser (actually WWWOFFLE) keeps track of what
links I've already clicked on. They will be in an "already clicked"
color so I don't end up clicking again.
Ah, it is all because MediaWiki insists on calling the same article
different names. Consider these three cases:
1) [[ADSL]]
2) [[Asymmetric Digital Subscriber Line]]
3) [[Asymmetric Digital Subscriber Line|ADSL]]
which produce
1) <a href="/wiki/ADSL" class="mw-redirect" title="ADSL">ADSL</a>
2) <a href="/wiki/Asymmetric_Digital_Subscriber_Line" title="Asymmetric
Digital Subscriber Line">Asymmetric Digital Subscriber Line</a>
3) <a href="/wiki/Asymmetric_Digital_Subscriber_Line" title="Asymmetric
Digital Subscriber Line">ADSL</a>
I hereby propose that input 1 now produce output 3 instead of output 1.
'But what about the "(Redirected from ADSL)" message?' you ask.
Is that really the only difference? It seems so. Well, losing it would
be a small price to pay compared to all the worldwide cache space and
network traffic consumed by the same article needing a separate copy
under each of its many names. Implementing this might even delay new
hardware purchase needs by a year.
Note that, no, we are not asking users to change their writing habits.
They can still go ahead and use their favorite redirect names, the more
the merrier. All we are doing is canonicalizing the HTTP link. Since
MediaWiki is already quite aware (class="mw-redirect") that the link is
a redirect, we simply bridge the gap and remove the runaround by going
one step further and linking directly to the target.
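The proposed canonicalization can be sketched as follows; the $redirects map and renderLink function below are hypothetical stand-ins for MediaWiki's redirect table and linker, not real core code:

```php
<?php
// Sketch of the proposal: when a wikilink points at a redirect, emit the
// canonical target URL while keeping the text the editor wrote -- i.e.
// turn output 1 into output 3. The $redirects map stands in for
// MediaWiki's redirect table; renderLink is hypothetical.

function renderLink( array $redirects, string $target, ?string $text = null ): string {
    $display = $text ?? $target;
    // Follow the redirect to the canonical title, if there is one.
    $canonical = $redirects[$target] ?? $target;
    $href = '/wiki/' . str_replace( ' ', '_', $canonical );
    return '<a href="' . $href . '" title="' . $canonical . '">' . $display . '</a>';
}

$redirects = [ 'ADSL' => 'Asymmetric Digital Subscriber Line' ];

// Input 1, [[ADSL]], now produces output 3:
echo renderLink( $redirects, 'ADSL' ) . "\n";
// <a href="/wiki/Asymmetric_Digital_Subscriber_Line"
//    title="Asymmetric Digital Subscriber Line">ADSL</a>
```

A real implementation would also have to decide what to do about the "(Redirected from ...)" notice, as discussed above.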