Hi all,
No new blockers.
The two current blockers are still open, and the discussion is
ongoing. In both cases the ball is in our court right now:
* The Wikidata branch
(https://bugzilla.wikimedia.org/show_bug.cgi?id=38622): the latest
comment by Tim <https://bugzilla.wikimedia.org/show_bug.cgi?id=38622#c15>
needs to be addressed by us. It has been on hold internally for a few
days, as Daniel K. is on vacation. He plans to work on it first
thing next week.
* The Sites table
(https://bugzilla.wikimedia.org/show_bug.cgi?id=38705): a discussion
has started here, leading to a few open questions for us to answer and
an RFC to join. We plan to get to this very fast (I have been writing
that mail for two days now and cannot find enough time to finish it). I
can only point everyone who wants to join to the RFC and ask them to
add their use cases to it:
<https://www.mediawiki.org/wiki/Requests_for_comment/New_sites_system>
From the collected use cases at the RFC, we want to draft a solution
that is most advantageous for MediaWiki as a whole, and then see how
we can fit our current needs into it.
We are working on both. Thanks to everyone who joined and helped so far!
Cheers,
Denny
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
== Gerrit ==
For the time being, we're sticking with gerrit for ops, MediaWiki core, and
extensions:
* it works for what we need
* we have existing deployment workflows built on it
* it's being actively developed, and various improvements are coming in
That said, there are known negatives: the Java + Google Web Toolkit front-end
is intimidating to people who might want to help improve the UI; even
Gerrit devs don't love it. :)
Improvements to the UI and to the git-review CLI tool are welcome...
The good news for those who aren't so fond of Gerrit is that we do have
some nice alternatives on the horizon, which we can start working with and
improve... we'll be re-evaluating things across the board for core &
extensions next year. (Ops can keep Gerrit forever if they like it, that's
not my concern. ;)
== Phabricator ==
We've been pretty impressed with Phabricator and I encourage continued
experimentation with it. At present it's not capable of fully taking on all
our stuff -- actual repository hosting isn't finished, and I get the
impression there's some UI work needed for some workflows, e.g. handling
large numbers of repos (all our extensions).
Being PHP-based, it should be easier for our engineers and volunteers to
jump in and modify or contribute back on Phabricator; this is something
that shouldn't be underestimated. Phabricator devs seem very open to
talking about improvements, and that makes me happy.
Note that projects hosted separately on GitHub can make use of Phabricator
in parallel with GitHub's own pull request review system -- the test
instance at https://phabricator.wmflabs.org/ includes the Wiki Loves
Monuments mobile app and I plan to do more testing with it myself.
We may also be able to devise a Gerrit+Phabricator hybrid setup using the
gerrit repository hosting, but that'll be something we'll have to
experiment with in the future.
== GitHub strategy ==
I very strongly recommend having official mirrors on github that can be
easily forked for experimentation. I understand this is currently waiting
on the Gerrit 2.5 update, which makes things easier for the mirroring setup
-- that should be ready in a month or so, unless Chad can figure out how
to make it work on 2.4 sooner.
Even if we don't have an automated way of accepting pull requests, a pull
request is easier to work with than a patch posted to Bugzilla -- you can
do the entire pull/test workflow within git, then squash (if necessary) and
send up to gerrit for final review.
Note also that git-review should be able to push up to Gerrit even if you
cloned from a GitHub mirror -- so a mirror may also relieve some stress on
the main Gerrit server for large clones etc.
As a potential primary repository, GitHub is pretty good but loses on a
couple of fronts:
* not open source -> limited in what we can customize
* third-party hosting -> potential availability and customization issues
** their "GitHub enterprise" self-hosted service is not inconceivable, but
still isn't open source and it's unclear what we could modify etc
Note that we are still using GitHub for projects like the Wikimedia mobile
applications, which mostly predate the gerrit switch for MW core:
* https://github.com/wikimedia/WikipediaMobile
* https://github.com/wikimedia/WLMMobile
as well as for patched forks of other projects using GitHub, such as Apache
Cordova (PhoneGap) which hosts on Apache git servers but takes a lot of
contributions through forks on their GitHub mirror.
Having an official GitHub presence just makes sense.
-- brion vibber (bvibber @ wikimedia.org / brion @ pobox.com)
On Wed, Aug 15, 2012 at 12:00 PM, Thomas Gries <mail(a)tgries.de> wrote:
> On 15.08.2012 20:58, Mark Holmquist wrote:
>>> Saves much time and efforts
>>> You don't use it, otherwise you wouldn't have asked that question.
>>
>> Clearly--but could you elaborate more as to why it's helpful? (Thomas
>> is currently in #ethereditor having this conversation)
> OpenID was discussed in other threads; a discussion of pros and cons
> would be off-topic here.
>
I think it would be useful to have the OpenID extension enabled on the
labs prototypes if Wikimedia ran as a provider and the OpenID
extension was forced to use the Wikimedia provider for login. Just
adding the OpenID extension without thought makes things more
confusing and less usable.
- Ryan
Hey,
A quick grep search cannot tell me where the method Status::getXML is
used. It doesn't seem to be called anywhere in core. Maybe some extensions
use it? I'm asking primarily because if it's not used, then it should
probably be removed. Logic for processing Status objects into export
formats such as XML shouldn't be handled (and apparently isn't handled) by
the Status object itself.
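If the method does turn out to be unused and gets removed, the separation argued for here could look something like this hypothetical sketch (StatusXmlFormatter is an invented name, not anything in core; isOK() and getErrorsArray() are the Status accessors I assume are available):

```php
// Hypothetical sketch: Status stays a plain value object, and an external
// formatter owns the XML export logic instead of a Status::getXML() method.
class StatusXmlFormatter {
	public static function format( Status $status ) {
		$xml = '<status ok="' . ( $status->isOK() ? '1' : '0' ) . '">';
		foreach ( $status->getErrorsArray() as $error ) {
			// Each entry is assumed to be a message key plus parameters.
			$xml .= '<error>' .
				htmlspecialchars( implode( ', ', (array)$error ) ) .
				'</error>';
		}
		return $xml . '</status>';
	}
}
```

The same pattern would extend to other export formats (JSON, etc.) without touching Status at all.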
--
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerromeo(a)gmail.com
Hello,
On Tue, Aug 7, 2012 at 11:48 PM, Srikanth Lakshmanan (WMF) <
slakshmanan(a)wikimedia.org> wrote:
> Hello everyone,
>
> You're invited to the IRC office hours with the Localisation team[1] at
> the Wikimedia Foundation.
>
> Date: 2012-08-15
> Time: 16.30 UTC
> Venue: #wikimedia-office
>
Thanks for your participation. You can find the log here.[1]
[1] http://meta.wikimedia.org/wiki/IRC_office_hours/Office_hours_2012-08-15
--
Srikanth L
Wikimedia Localisation Team
Since MediaWiki 1.18 we have the variable $wgUseCombinedLoginLink [1]
which is set to true per default.
During edit workshops with students and seniors I have observed that new
editors are confused by the combined login page: they tried to
register new accounts on the login page.
These observations are admittedly not representative, but I think that
usability could be improved by setting $wgUseCombinedLoginLink = false.
If I missed a prior discussion about this issue, I apologize and would be
happy if someone could point me to it.
Otherwise I suggest setting $wgUseCombinedLoginLink to false for all WMF
wikis.
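For anyone who wants to try this on a local wiki, it is a one-line change (a sketch using nothing beyond the documented setting itself):

```php
# LocalSettings.php: show separate "Log in" and "Create account" links
# instead of the combined login link that has been the default since 1.18.
$wgUseCombinedLoginLink = false;
```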
Raimond.
[1] https://www.mediawiki.org/wiki/Manual:$wgUseCombinedLoginLink
I've been tracking a performance problem which took me into this
profiler autoloading code by way of a false trail, so I am ignorant of
the development history, but at least I have come to this with fresh
eyes. I have also trawled the DL for relevant threads, and my topic is
related to a thread "Dropping StartProfiler.php", 25-28 Dec 2011,
largely between Daniel Friesen and Tim Starling. (I've just subscribed
to the DL, so can't pick up the thread references).
I have some general observations:
1) Profiling is a rare activity undertaken by developers and sysadmins,
so given the autoload architecture, it makes a lot of sense simply not
to load it at all for normal MediaWiki use. The use of the functions
wfProfileIn and wfProfileOut facilitates this.
2) However, at the moment db/Database.php (line 793) and other routines
invoke Profiler::instance() unconditionally, which causes at a minimum a
ProfilerStub instantiation.
3) The profiling architecture allows context parameters to be bound
during the instantiation of profiler subclasses, yet configuration by
setting the $wgProfiler[] array does not enable these to be passed to
the invoked $wgProfiler['class'] class.
4) Whether the singleton pattern is recommended is at least a
controversial topic. Nonetheless, MW does use a number of classic
singletons, for example MessageCache. However, IMO, the Profiler class
is an anomaly in that it is a sort of singleton hybrid -- see the
instance() and setInstance() methods. Doing this is confusing and seems
to add zero value.
5) As discussed by Daniel in the above referenced thread, the only
modules included between Profiler and LocalSettings are Defines,
StartProfiler and DefaultSettings. Setting StartProfiler aside, Defines
and DefaultSettings merely set configuration constants and global
variables, so little is to be gained by profiling them.
And so to my suggestions:
A) Modify WebStart.php to remove all Profiler-related requires and
move the wfProfileIn( 'WebStart.php-conf' ) statement below the
LocalSettings load.
B) Move wfProfileIn() and wfProfileOut() to GlobalFunctions, since this
is truly what they are.
C) Add an additional wfProfileEnabled() to GlobalFunctions which guards
any reference to Profiler or its subclasses with a test of
$wgProfiler. Replace the unguarded Profiler::instance()->isSub() in
db/Database.php et al. with a wfProfileEnabled() test. This plus (B)
means that the Profiler classes will not be autoloaded at all under
normal circumstances.
D) Profiling can now simply be enabled by setting
$wgProfiler = new ProfilerWhateverWanted( ... whatever params are
wanted ... )
in LocalSettings. The profiler subclass and base class will then be
loaded by the autoloader, except in the case of a custom profiler, which
will need an explicit require. Forget the idea of using an array format
for $wgProfiler: it seems to have been adopted for dogmatic reasons and
isn't fully implemented anyway.
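Under (D), enabling one of the bundled profilers would then look roughly like this in LocalSettings (a sketch: ProfilerSimpleText exists in core, but the options array shown here is illustrative, not an established constructor signature):

```php
# LocalSettings.php -- proposed enabling step from (D).
# The autoloader pulls in ProfilerSimpleText and the Profiler base class;
# no StartProfiler.php require is needed.
$wgProfiler = new ProfilerSimpleText( array( 'visible' => true ) );
```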
E) If dynamic enabling/disabling of profiling is needed to achieve
timeline windowing, this can be done easily by keeping the profiler
instance in, say, $wgProfilerTemp, then setting the $wgProfiler global
to it and unsetting it as required.
F) Hence the guard test in wfProfileIn(), wfProfileOut() and
wfProfileEnabled() becomes the simple and lean (a function call plus 10
opcodes according to VLD):
if ( !isset( $wgProfiler ) ) return;
G) Make Profiler a simple (non-pseudo-singleton) class, as it will be
autoloaded as the base class of any specific Profiler subclass. As
$wgProfiler holds the global profiler instance, instance() and
setInstance() add no value.
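Putting (B), (D) and (F) together, the wrappers in GlobalFunctions would reduce to something like the following sketch (profileIn() and profileOut() are the existing Profiler methods; the guard structure is my proposal, not current core):

```php
// Proposed guards: with no $wgProfiler set, the fast path is a single
// isset() test and no Profiler class is ever autoloaded.
function wfProfileEnabled() {
	global $wgProfiler;
	return isset( $wgProfiler );
}

function wfProfileIn( $functionname ) {
	global $wgProfiler;
	if ( !isset( $wgProfiler ) ) {
		return;
	}
	$wgProfiler->profileIn( $functionname );
}

function wfProfileOut( $functionname = 'missing' ) {
	global $wgProfiler;
	if ( !isset( $wgProfiler ) ) {
		return;
	}
	$wgProfiler->profileOut( $functionname );
}
```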
I can provide my patch which implements this, if wanted. Comments?
Regards Terry Ellison
Hello all,
I would like to invite you to the upcoming localisation and
internationalisation bug triage. The bug triage preparation is
available[1]. Please feel free to add bugs / use the chat window / email me
about i18n issues that you would like to discuss in the triage.
When : August 22, 16:00 - 17:00 UTC
Where : Freenode IRC channel #mediawiki-i18n
[1] http://etherpad.wikimedia.org/BugTriage-i18n-2012-08
--
Srikanth L
Wikimedia Localisation Team