I just returned from vacation and found the new software/skin in place.
Great! And thanks to everyone involved.
However...
* There is no longer a "section edit" link for the very first section
(before the first heading).
* In Mono, I cannot turn off the underlined links. When I'm not logged
in, links are not underlined, as I prefer. Logged in, they always are
(ugh!), and fiddling with the "underline links" setting in my
preferences doesn't help. Strangely, on de.wikipedia, I don't get the
underlines even when logged in.
Anyone up for some hotfixes? :-)
Magnus
I've made some changes to Parser.php on my local install to make
category pages a little neater. You can take a look at
http://meta.wikipedia.org/wiki/Image:Category_sort_example.png for a
fairly recent screenshot.
Things changed are
* Three column view of sub categories and articles
* List view of subcategories and articles (if there are only a few)
* headings for the first letter of sort keys in article and subcategory
lists
* "Category" doesn't appear before subcategories and
* Display of the number of articles and subcategories.
I have no idea whether this code is safe or well thought out, but it
works fine on my local MediaWiki installation.
And no picking on my PHP skills ;-)
Tobin.
My version of the categoryMagic function follows. Just about everything
after "# For all pages that link to this category" is changed.
----
# This method generates the list of subcategories and pages for a category
function categoryMagic ()
{
	global $wgLang , $wgUser ;
	if ( !$this->mOptions->getUseCategoryMagic() ) return ; # Doesn't use categories at all
	$cns = Namespace::getCategory() ;
	if ( $this->mTitle->getNamespace() != $cns ) return "" ; # This ain't a category page

	$r = "<br style=\"clear:both;\"/>\n";
	$sk =& $wgUser->getSkin() ;

	$articles = array() ;
	$articles_start_char = array() ;
	$children = array() ;
	$children_start_char = array() ;
	$data = array() ;
	$id = $this->mTitle->getArticleID() ;

	# FIXME: add limits
	$t = wfStrencode( $this->mTitle->getDBKey() );
	$sql = "SELECT DISTINCT cur_title,cur_namespace,cl_sortkey FROM cur,categorylinks " .
	       "WHERE cl_to='$t' AND cl_from=cur_id ORDER BY cl_sortkey" ;
	$res = wfQuery( $sql, DB_READ ) ;
	while ( $x = wfFetchObject( $res ) ) $data[] = $x ;

	# For all pages that link to this category
	foreach ( $data AS $x )
	{
		$t = $wgLang->getNsText( $x->cur_namespace ) ;
		if ( $t != "" ) $t .= ":" ;
		$t .= $x->cur_title ;

		if ( $x->cur_namespace == $cns ) {
			# Subcategory
			array_push( $children, $sk->makeKnownLink( $t, $x->cur_title ) ) ;
			array_push( $children_start_char, $x->cl_sortkey[0] ) ;
		} else {
			# Page in this category
			array_push( $articles, $sk->makeLink( $t ) ) ;
			array_push( $articles_start_char, $x->cl_sortkey[0] ) ;
		}
	}
	wfFreeResult( $res ) ;

	# Showing subcategories
	if ( count( $children ) > 20 ) {
		// long list: divide it into three roughly equal columns
		$chunk = (int)( count( $children ) / 3 );
		$h = wfMsg( "subcategories" );
		$r .= "<h2>{$h}</h2>\n" ;
		$r .= "There are " . count( $children ) . " subcategories in this category<br>" ;
		$r .= "<table width=\"100%\"><tr valign=\"top\">";
		// loop through the chunks; the first holds $chunk entries, the
		// remaining two $chunk + 1, which covers any division remainder
		for ( $startChunk = 0, $endChunk = $chunk, $chunkIndex = 0;
		      $chunkIndex < 3;
		      $chunkIndex++, $startChunk = $endChunk, $endChunk += $chunk + 1 )
		{
			$r .= "<td><ul>";
			// output all subcategories in this chunk
			for ( $index = $startChunk ; $index < $endChunk && $index < count( $children ) ; $index++ )
			{
				// check for the beginning of a chunk or a change of starting letter
				if ( $index == $startChunk ||
				     $children_start_char[$index] != $children_start_char[$index - 1] )
				{
					$r .= "</ul><h3>" . $children_start_char[$index] . "</h3>\n<ul>";
				}
				$r .= "<li>" . $children[$index] . "</li>";
			}
			$r .= "</ul></td>";
		}
		$r .= "</tr></table>";
	} else if ( count( $children ) > 0 ) {
		// short list of subcategories (guard against an empty list)
		$h = wfMsg( "subcategories" );
		$r .= "<h2>{$h}</h2>\n" ;
		$r .= "There are " . count( $children ) . " subcategories in this category<br>" ;
		$r .= "<h3>" . $children_start_char[0] . "</h3>\n";
		$r .= "<ul><li>" . $children[0] . "</li>";
		for ( $index = 1; $index < count( $children ); $index++ )
		{
			if ( $children_start_char[$index] != $children_start_char[$index - 1] )
			{
				$r .= "</ul><h3>" . $children_start_char[$index] . "</h3>\n<ul>";
			}
			$r .= "<li>" . $children[$index] . "</li>";
		}
		$r .= "</ul>";
	}

	# Showing articles in this category
	$ti = $this->mTitle->getText() ;
	if ( count( $articles ) > 20 ) {
		// long list: same three-column layout as for subcategories
		$chunk = (int)( count( $articles ) / 3 );
		$h = wfMsg( "category_header", $ti );
		$r .= "<h2>{$h}</h2>\n" ;
		$r .= "There are " . count( $articles ) . " articles in this category<br>";
		$r .= "<table width=\"100%\"><tr valign=\"top\">";
		for ( $startChunk = 0, $endChunk = $chunk, $chunkIndex = 0;
		      $chunkIndex < 3;
		      $chunkIndex++, $startChunk = $endChunk, $endChunk += $chunk + 1 )
		{
			$r .= "<td><ul>";
			// output all articles in this chunk
			for ( $index = $startChunk ; $index < $endChunk && $index < count( $articles ) ; $index++ )
			{
				if ( $index == $startChunk ||
				     $articles_start_char[$index] != $articles_start_char[$index - 1] )
				{
					$r .= "</ul><h3>" . $articles_start_char[$index] . "</h3>\n<ul>";
				}
				$r .= "<li>" . $articles[$index] . "</li>";
			}
			$r .= "</ul></td>";
		}
		$r .= "</tr></table>";
	} else if ( count( $articles ) > 0 ) {
		// short list of articles (guard against an empty list)
		$h = wfMsg( "category_header", $ti );
		$r .= "<h2>{$h}</h2>\n" ;
		$r .= "There are " . count( $articles ) . " articles in this category<br>" ;
		$r .= "<h3>" . $articles_start_char[0] . "</h3>\n";
		$r .= "<ul><li>" . $articles[0] . "</li>";
		for ( $index = 1; $index < count( $articles ); $index++ )
		{
			if ( $articles_start_char[$index] != $articles_start_char[$index - 1] )
			{
				$r .= "</ul><h3>" . $articles_start_char[$index] . "</h3>\n<ul>";
			}
			$r .= "<li>" . $articles[$index] . "</li>";
		}
		$r .= "</ul>";
	}
	return $r ;
}
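For anyone who wants to play with the column logic outside of PHP, here is a
rough Python sketch (my own, not part of the patch) of the same idea: the
first column gets `chunk` entries and the later two `chunk + 1`, and a
heading is emitted whenever the first letter of the sort key changes.

```python
def three_columns(items):
    """Split a sorted list into 3 columns of first-letter-grouped entries.

    Mirrors the PHP above: the first column holds chunk entries, the
    remaining two chunk + 1, which covers any integer-division remainder.
    """
    chunk = len(items) // 3
    columns = []
    start = 0
    for i in range(3):
        end = min(start + chunk + (1 if i > 0 else 0), len(items))
        column = []
        for index in range(start, end):
            # heading at the start of a column or on a first-letter change
            if index == start or items[index][0] != items[index - 1][0]:
                column.append("== " + items[index][0].upper() + " ==")
            column.append(items[index])
        columns.append(column)
        start = end
    return columns
```

Running it on seven sorted names produces three columns of sizes 2, 3 and 2,
each with its own letter headings.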
I'd like to report a major bug.
I lost contributions (i.e. some contributions I made are not listed in my
contributions list), and I have contributions listed that I never made.
See my contributions list
http://fr.wikipedia.org/w/wiki.phtml?title=Special:Contributions&target=Ant…
------
According to that list, I made only 4 contribs these past 3 days:
* 2 Jun 2004 at 19:52 (hist) Discuter:Métazoaire (last) [rollback]
* 2 Jun 2004 at 19:50 (hist) Zoologie (last) [rollback]
* 2 Jun 2004 at 19:48 (hist) Animal (last) [rollback]
* 1 Jun 2004 at 13:59 (hist) M Wikipédia:Statuts de la Fondation
Wikimedia (test) (last) [rollback]
Well, I am sure I did many more.
See for example
http://fr.wikipedia.org/wiki/Impuissance
History :
http://fr.wikipedia.org/w/wiki.phtml?title=Impuissance&action=history
That article I created is listed nowhere in my contributions.
------
According to that contribution list, I made no contributions between the
17th and the 28th of May, which I very strongly doubt, as I make edits
basically every day.
------
According to my contribution list, I contributed to
http://fr.wikipedia.org/wiki/Wikipedia-garbage
while history shows...
http://fr.wikipedia.org/w/wiki.phtml?title=Wikipedia-garbage&action=history
And I am really sure I never did anything on that article.
-------
According to that list, I have not edited the Bistro for many days, even
though I did a lot of cleanup there on the 30th.
http://fr.wikipedia.org/w/wiki.phtml?title=Wikip%C3%A9dia:Le+Bistro&limit=1…
In short, the contribution list is ALL MESSED UP.
Is it the same for other users ?
"Tim Starling" <ts4294967296(a)hotmail.com> schrieb:
> The compromise agreed to by Erik and Timwi is to allow the Klingon
> Wikipedia, but to avoid interlanguage links. Accordingly, I have
> commented out the Klingon entry in $wgLanguageNames. This means that
> markup of the form [[tlh:wIqIpe'DIya]] will create an external link
> rather than an interlanguage link, just like links to meta or sep11.
> Otherwise the wiki is fully functional.
I find this a rather disappointing compromise. Either we consider this
a valid Wikipedia, in which case I would like to use it as such; or we
consider it not a valid Wikipedia, in which case I have to wonder why
we put it on Wikimedia in the first place. Can I get my own wiki here
too? I really get the feeling that the compromise is worse than either
of the alternatives it is compromising between.
And I also want to reiterate that if we find Klingon not worthy of a
Wikipedia, then this should be doubly so for Tokipona.
Andre Engels
"Jay Bowks" <jjbowks(a)adam.cheshire.net> schrieb:
> The Ethnologue lists
> http://www.ethnologue.com/show_family.asp?subid=827
> Esperanto, Europanto, and Interlingua.
> It further mentions that Interlingua is
> a language of France...
> http://www.ethnologue.com/show_language.asp?code=INR
> It also claims that Esperanto is a language
> of France, and that it has "200 to 2,000 people who
> speak it as first language". If so it would be a
> natural and non-artificial language for them
> wouldn't it, those French native speakers of
> Esperanto.... Highly irregular!
As strange as it may sound, it is not complete nonsense. As I
understand it, our own contributor and Steward Arno Lagrange grew up
in a family where Esperanto was the language spoken at home, and he
can thus be considered a native speaker of the language.
Andre Engels
I ran some aggregate live profiling on en, and noticed that wfMsg was
responsible for 70% of request time. Each wfMsg() call was leading to a
database query. Investigating, I noticed that the memcached key for messages
was set to "error" -- this indicates that the script attempted to load
messages from the database and save them to memcached, but setting the large
value failed. This error is typical of the old slabs reassignment problem.
However, that wasn't the problem in this case: none of the memcached servers
were at their memory quota.
In 1.2, only the internal messages from the MediaWiki namespace were cached.
In 1.3, custom messages are moved to the Template namespace and the whole
namespace is cached. But since I only just started running the move script,
we've been attempting to cache the whole namespace, including all the
templates, for the last week. Presumably the size of the namespace was
larger than memcached's value size limit, so it failed.
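For context, stock memcached refuses to store values above its item size
limit (about 1 MB by default), which is presumably what the message blob ran
into. A hypothetical sketch of guarding a cache write with a size check, so
an oversized value is simply skipped rather than leaving a poisoned key (the
function name and limit are illustrative, not MediaWiki code):

```python
import pickle

MAX_VALUE_BYTES = 1024 * 1024  # memcached's default item size limit

def try_cache(cache, key, value):
    """Store value under key unless its serialized form is too large.

    Returns True if the value was cached, False if it was skipped so
    that callers fall back to the database instead of a bad sentinel.
    """
    blob = pickle.dumps(value)
    if len(blob) > MAX_VALUE_BYTES:
        return False  # too big for memcached: don't even try
    cache[key] = blob
    return True
```

With a plain dict standing in for the memcached client, a small message
table is stored while a 2 MB blob is refused.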
To restore site performance adversely affected by loading the messages, I
temporarily switched off $wgUseDatabaseMessages. Then I ran the move script
on en. It is still running as I type. When it is finished, I will clear the
error value from memcached and re-enable $wgUseDatabaseMessages. The first
web request after that should then cache the namespace successfully.
-- Tim Starling
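The win being chased here is memoization: each message key should hit the
database at most once per request, instead of once per wfMsg() call. A
hypothetical Python sketch of that pattern (the loader is made up, not
MediaWiki's actual API):

```python
def make_msg_getter(load_from_db):
    """Wrap a message loader so each key is fetched at most once.

    Repeated lookups of the same key are served from a per-request
    dict rather than issuing another database query.
    """
    cache = {}
    def get_msg(key):
        if key not in cache:
            cache[key] = load_from_db(key)  # one DB round trip per key
        return cache[key]
    return get_msg
```

Calling the wrapped getter twice for the same key invokes the underlying
loader only once.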