Hello,
Mediazilla seems broken somehow. On attempting to log in (I had cleared my
cookies), I receive the following backtrace:
---------------------------------------------------------------------
Software error:
Undef to trick_taint at Bugzilla/Util.pm line 67
Bugzilla::Util::trick_taint('undef') called at
Bugzilla/Auth/Persist/Cookie.pm line 61
Bugzilla::Auth::Persist::Cookie::persist_login('Bugzilla::Auth::Persist::Cookie=ARRAY(0x8d15218)',
'Bugzilla::User=HASH(0x8d900dc)') called at Bugzilla/Auth.pm line 147
Bugzilla::Auth::_handle_login_result('Bugzilla::Auth=ARRAY(0x8c3221c)',
'HASH(0x8d8fe00)', 2) called at Bugzilla/Auth.pm line 92
Bugzilla::Auth::login('Bugzilla::Auth=ARRAY(0x8c3221c)', 2) called at
Bugzilla.pm line 218
Bugzilla::login('Bugzilla') called at
/srv/org/wikimedia/bugzilla/show_bug.cgi line 38
---------------------------------------------------------------------
--
Ashar Voultoiz - WP++++
http://en.wikipedia.org/wiki/User:Hashar
http://www.livejournal.com/community/wikitech/
IM: hashar(a)jabber.org
I am looking at how (among other things) OpenID is integrated into
MediaWiki. Some questions around that:
-> Is the current strategy (pick a 'fake' username in the local wiki;
upgrade it once a nick is passed -or- ask the user to make
a choice/enter one) generally the proper way any user-management API
should operate? Or is there another path/set of thoughts
towards externalizing some of the account management?
-> As 'sreg' gets expanded - could you see a generic, API-driven way of
making that data available, say to a rendering or skinning layer; or
is the consensus that normally the first registration would
update/create the user's data within MediaWiki, and that would
drive such behaviour?
Thanks,
Dw
--
Dirk-Willem van Gulik
2008/6/14 Kat Walsh <kat(a)mindspillage.org>:
> On Sat, Jun 14, 2008 at 5:28 PM, Gregory Maxwell <gmaxwell(a)gmail.com> wrote:
>> People here might be interested in my post at
>> http://en.wikipedia.org/wiki/Wikipedia:Village_pump_%28policy%29#Image_subm…
> After reading his post I'm disappointed to see this also; I think this
> consequence wasn't in the minds of the people supporting this
> proposal. Since I do take a hardline position on image copyright
> issues, I'd also like it to be easy for us to get free images from
> people who aren't our regular user base and who already find it
> difficult to contribute media.
> I'm interested in hearing thoughts from David, Mark, or someone else
> actively involved in soliciting free images from outside the usual
> Wikimedians...
My thought is "what the fuck?"
Is there any way this can be reversed, quickly, for en:wp?
- d.
Nikola Smolenski wrote:
> On Wednesday 11 June 2008 23:03:28 Platonides wrote:
>> I'm Ccing Wikitech, i suggest we follow this thread there.
>
> I'm answering on the foundation-l, given that I don't follow wikitech-l, you
> do follow foundation-l, and the issues you raise are more community than
> software related.
I'm not, and I disagree on that, but if the messages arrive on one of
the other two lists, I'll read them anyway :)
>> Nikola Smolenski wrote:
>>> (thread about interwiki bots at toolserver)
>>> Coincidentally, yesterday I released a MediaWiki extension which, if
>>> accepted on Wikimedia projects, may make interwiki bots much less busy.
>>> See http://meta.wikimedia.org/wiki/A_newer_look_at_the_interwiki_link
>> It also works by manual writing of the interwikis. I don't think that's
>> the right way.
>> *You're not taking into account page moves. What will you do when a page
>> is moved? (by a low-tech user who knows nothing about the global wiki)
>
> I am taking into account page moves. Right now, when a page is moved, if it
> has 20 interwiki links, someone has to update 20 pages on 20 Wikipedias. With
> the extension, someone has to update a single page on a single wiki -
> clearly, something that is easier to do.
But the page has 20 interwikis to the right version. So a
>> *The articles will still have a 'preferred' title at the interwiki wiki.
>> That means discussing about article titles, "Move to English name", "No,
>> that's not", "Interwikis with pages on Chinese are ugly!"...
>
> I proposed an easy and fair solution: use the name of the page on the first
> wiki that covered the topic. If a topic has first been written about on the
> Vietnamese Wikipedia, use the Vietnamese name. Either way, redirects work,
> and even edit wars of this kind should pose no problem.
I know. But I think avoiding any name is better.
>> IMHO it should be a shared table referencing the wiki and page ids.
>> Then you provide a Special page showing all pages on that group. You'd
>> reference it as 'include this page into the group XX:sometitle is on'.
>> You can also provide some space for free-form commenting (such as
>> explaining the difference with another page).
>> Obviously, all of that must be properly logged, which with SUL should be
>> much easier.
>
> Everything that you described already exists, without the special page. The
> shared table is the langlinks table on the central wiki; you reference it by
> using {{#interlanguage:sometitle}}; free-form commenting is the text on the
> central wiki page; it is properly logged in the page history.
Mmm, you're right. I'd prefer using page_ids, but someone with more
database expertise than me should determine the efficiency difference of
using ll_title (the page title) instead. I notice now that ll_title can't
hold every wiki title, as it's a varchar(255) including the namespace,
while titles are stored as varchar(255) without the namespace everywhere
else.
In r36253 [1] I made some changes to the $linkTrail. Primarily, I changed
the use of a-z to \p{L&} instead. According to [2], \p{L&} is the Unicode
class corresponding to [[:alpha:]]; in other words, all alphabetic
characters.
This fixes link trails so that they accept all valid characters. As
you'll see in [3], all the characters in Wikipedia's EditTools, and a few
more, are now considered part of the linkTrail (before, 99% of those
would not have been part of the link).
But in addition to that, I fixed an old complaint that things like
[[Bar]]'s did not include the 's as part of the link. (Don't worry, I
made sure that things like ''[[Foo]]'' and '[[Foo]]' do not break.)
TimStarling pointed out that some other languages have their own
punctuation characters. I need some translator help compiling a list of
foreign language punctuation characters used similarly to ' which should
become part of the link when they come immediately after the closing ]].
I can't compile a list like this myself because I can't identify what
foreign languages do with certain punctuation characters.
When that list is created, we can add it to a [] inside of the default
$linkTrail. That way punctuation should be linked correctly for all
languages no matter what locale is used.
As TimStarling pointed out, any "ambiguity that can only be resolved ...
at the language [level]" we can fix in individual languages. But for the
most part, things should work for any language no matter what the
current locale is. After all, there's nothing wrong with having Japanese
text in an English wiki; we do it all over Wikia and in the Anime and
Manga WikiProject on Wikipedia.
[1] http://svn.wikimedia.org/viewvc/mediawiki?view=rev&revision=36253
[2] http://www.regular-expressions.info/posixbrackets.html#class
[3] http://dev.wiki-tools.com/purge/Link_Codes
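As a rough illustration of the link-trail idea (this is NOT the actual MediaWiki parser code; the pattern is a deliberate simplification, and Python 3's Unicode-aware \w stands in for \p{L&}):

```python
import re

# Sketch: letters immediately after a closing ]] are absorbed into the
# link, and, per the change discussed above, so is an apostrophe that is
# immediately followed by a letter. Lone or doubled quote marks (wiki
# italics/bold) are left outside the link.
LINK = re.compile(r"\[\[([^\]|]+)\]\]((?:\w|'(?=\w))*)")

def render(text):
    """Replace [[Target]]trail with <a>Targettrail</a> (sketch only)."""
    return LINK.sub(lambda m: "<a>{}{}</a>".format(m.group(1), m.group(2)), text)

print(render("[[Bar]]'s"))    # the 's joins the link text
print(render("''[[Foo]]''"))  # italic quote marks stay outside the link
```

The lookahead `'(?=\w)` is what keeps `[[Bar]]'s` together while leaving `''[[Foo]]''` untouched: the apostrophe only joins the trail when a letter follows it.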
--
~Daniel Friesen(Dantman) of:
-The Nadir-Point Group (http://nadir-point.com)
--It's Wiki-Tools subgroup (http://wiki-tools.com)
--Games-G.P.S. (http://ggps.org)
-And Wikia ACG on Wikia.com (http://wikia.com/wiki/Wikia_ACG)
On Fri, Jun 13, 2008 at 2:52 PM, <ialex(a)svn.wikimedia.org> wrote:
> - if ( $wgTitle->getText() != $rctitle->getText() ) {
> + if ( $wgTitle->getPrefixedText() != $rctitle->getPrefixedText() ) {
Don't we have a Title::equals() method for this?
Hi jojo,
I do not really get this change. You changed the help page handling from
something maintainable inside the wiki to a configuration file. And there
you use a non-default interwiki code ("mw:" is not present in
interwiki.sql; "mediawikiwiki:" points to www.mediawiki.org). This means
that a default MediaWiki installation will not have a proper link to the
help page. It will also no longer facilitate localisation of the help
page URL.
I am not sure this is an improvement... I suggest you change it back and
set 'coll-helppage' to "mediawikiwiki:Extension:Collection/Help". I will
mark the message as 'optional' so that translators will not localise it
unless there is a need for it (for example because
"mediawikiwiki:Extension:Collection/Help" was translated). This ensures a
proper implementation of i18n while possibilities for L10n remain. Wiki
sysops will also still be able to change the documentation link, and
there is sufficient guarantee that the doc link initially points where
you want it to point.
It may of course be that you have a different reason for changing the help
page, and if so, please explain...
Cheers! Siebrand
-----Original Message-----
From: mediawiki-cvs-bounces(a)lists.wikimedia.org
[mailto:mediawiki-cvs-bounces@lists.wikimedia.org] On behalf of
jojo(a)mayflower.knams.wikimedia.org
Sent: Friday, 13 June 2008 16:00
To: mediawiki-cvs(a)lists.wikimedia.org
Subject: [MediaWiki-CVS] SVN: [36258] trunk/extensions/Collection
Revision: 36258
Author: jojo
Date: 2008-06-13 14:00:01 +0000 (Fri, 13 Jun 2008)
Log Message:
-----------
changed help page handling
Modified Paths:
--------------
trunk/extensions/Collection/Collection.body.php
trunk/extensions/Collection/Collection.i18n.php
trunk/extensions/Collection/Collection.php
Modified: trunk/extensions/Collection/Collection.php
===================================================================
--- trunk/extensions/Collection/Collection.php 2008-06-13 13:22:50 UTC (rev
36257)
+++ trunk/extensions/Collection/Collection.php 2008-06-13 14:00:01 UTC (rev
36258)
@@ -55,6 +55,9 @@
/** Template blacklist article */
$wgPDFTemplateBlacklist = 'MediaWiki:PDF Template Blacklist';
+/** Help page for the Collection extension */
+$wgCollectionHelpPage = 'mw:Extension:Collection/Help';
On Fri, Jun 13, 2008 at 5:27 AM, <raymond(a)svn.wikimedia.org> wrote:
> Remove non standard, Gecko specific CSS.
> At least Opera whines about it and the round corners are not consistent to all other boxes with square corners.
Browsers whining about it should not be a reason not to use it. CSS
deliberately specifies that unsupported properties should be ignored,
and it's no problem if they are (whether or not some caution is noted
in a log somewhere). Firefox will whine if you use any number of CSS
3 properties that it doesn't yet support.
If the CSS added a useful feature, it should have stayed,
vendor-specific or not. However, it shouldn't be made
browser-specific if that can be avoided; in this case, we should have
had standard border-radius properties as well. Personally I don't
care whether the borders are square or round, but if they're round,
-moz-border-radius should be there as well as border-radius, to ensure
better support among browsers.
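A minimal sketch of that pairing (the selector and radius here are illustrative, not the actual stylesheet rule that was removed):

```css
/* Pair the vendor prefix with the standard property so non-Gecko
   browsers that already support border-radius also render the
   rounded corners; browsers ignore whichever property they don't
   understand, per the CSS error-handling rules. */
.infobox {
    -moz-border-radius: 4px;  /* Gecko (Firefox 2/3) */
    border-radius: 4px;       /* CSS3 standard property */
}
```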
For example, categorylinks.sql:
http://download.wikimedia.org/enwiki/latest/enwiki-latest-categorylinks.sql…
When extracted, it is around 1.3 GB. I started the import on a P3 1 GHz
with 256 MB RAM, and the job has been running for more than a day now.
Sure, I have locked the table, disabled indexing, disabled the unique key
check, etc., and followed the recommended InnoDB settings in the MySQL
manual.
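For reference, the bulk-load settings described above usually look something like this in a mysql client session (the file name matches the dump mentioned; whether these were the exact statements used is an assumption):

```sql
-- Commonly recommended before sourcing a large InnoDB dump;
-- restore the defaults (value 1) once the import finishes.
SET autocommit = 0;
SET unique_checks = 0;
SET foreign_key_checks = 0;
SOURCE enwiki-latest-categorylinks.sql;
COMMIT;
```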
Maybe I should try MyISAM later to compare.
But does anyone have experience with slow imports?
Howard
A tip from Yahoo:
http://developer.yahoo.com/performance/rules.htm
==============
One of the previous best practices states that CSS should be at the
top in order to allow for progressive rendering.
In IE @import behaves the same as using <link> at the bottom of the
page, so it's best not to use it.
=============
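A minimal illustration of the pattern the tip warns about (the file name is illustrative):

```css
/* Avoid loading stylesheets like this from a <style> block: IE
   defers @import as if the sheet were linked at the bottom of the
   page, blocking progressive rendering. */
@import url("shared.css");

/* Prefer an ordinary <link rel="stylesheet" href="shared.css">
   element in the document <head> instead. */
```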
Maybe Wikipedia should consider this?
Howard