I know it has been annoying a couple of people other than me, so now that I've learned how to make it work I'll share the knowledge here.
tl;dr: Star the repositories. No, seriously. (And yes, you need to star each extension repo separately.)
(Is there a place on mw.org to put this tidbit?)
------- Forwarded message -------
From: "Brian Levine" <support(a)github.com> (GitHub Staff)
To: matma.rex(a)gmail.com
Cc:
Subject: Re: Commits in mirrored repositories not showing up on my profile
Date: Tue, 09 Jul 2013 06:47:19 +0200
Hi Bartosz
In order to link your commits to your GitHub account, you need to have some association with the repository other than authoring the commit. Usually, having push access gives you that connection. In this case, you don't have push permission, so we don't link you to the commit.
The easy solution here is for you to star the repository. If you star it - along with the other repositories that are giving you this problem - we'll see that you're connected to the repository and you'll get contribution credit for those commits.
Cheers
Brian
--
Matma Rex
Hi, I'd like to present a new RFC for your consideration:
https://www.mediawiki.org/wiki/Requests_for_comment/Minifier
It is about how we can shave 10-15% off the size of JavaScript
delivered to users.
Your comments are highly welcome! :)
--
Best regards,
Max Semenik ([[User:MaxSem]])
Thank you for the quick fix!
Best,
--
Sukyoung
On Jan 29, 2014, at 9:55 AM, Nathan wrote:
> FYI in case you aren't subscribed to the list.
>
> ---------- Forwarded message ----------
> From: Yair Rand <yyairrand(a)gmail.com>
> Date: Tue, Jan 28, 2014 at 7:25 PM
> Subject: Re: [Wikitech-l] Bug in the Wikipedia main web page
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
>
>
> Thank you for pointing out this bug. Your suggested change to
> MediaWiki:Gadget-wm-portal.js has been implemented by Meta-Wiki
> administrator User:PiRSquared17.
>
>
> On Tue, Jan 28, 2014 at 6:50 PM, Sukyoung Ryu <sukyoung.ryu(a)gmail.com> wrote:
>
> > Dear all,
> >
> > We are researchers at KAIST in Korea working on finding JavaScript bugs in
> > web pages. While analyzing top websites from Alexa.com, we found an issue,
> > which seems to be a bug, on the Wikipedia main web page (wikipedia.org).
> > We would be grateful if you could either confirm that it is a bug (and,
> > even better, fix it) or let us know what we're missing.
> >
> > Here's the issue. When a user selects a language in which search results
> > are displayed via the language selection button from the Wikipedia main web
> > page, the following JavaScript function is executed:
> >
> > 1  function setLang(lang) {
> > 2      var uiLang = navigator.language || navigator.userLanguage, date = new Date();
> > 3
> > 4      if (uiLang.match(/^\w+/) === lang) {
> > 5          date.setTime(date.getTime() - 1);
> > 6      } else {
> > 7          date.setFullYear(date.getFullYear() + 1);
> > 8      }
> > 9
> > 10     document.cookie = "searchLang=" + lang + ";expires=" + date.toUTCString() + ";domain=" + location.host + ";";
> > 11 }
> >
> > Depending on the evaluation result of the conditional expression on line
> > 4, "uiLang.match(/^\w+/) === lang", the function either leaves or does not
> > leave the selected language information on the user's computer through a
> > cookie. But we found that the expression "uiLang.match(/^\w+/) === lang"
> > always evaluates to false, which means the function always leaves cookies
> > on users' computers. We think that changing the conditional expression
> > "uiLang.match(/^\w+/) === lang" to "uiLang.match(/^\w+/) == lang" will
> > solve the problem.
> >
> > This problem may occur in the main web pages of all the Wikimedia sites.
> > Could you kindly let us know what you think? Thank you in advance.
> >
> > Best,
> > Changhee Park and Sukyoung Ryu
> >
> >
> > _______________________________________________
> > Wikitech-l mailing list
> > Wikitech-l(a)lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> _______________________________________________
> Wikitech-l mailing list
> Wikitech-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
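The always-false comparison described in the quoted report can be reproduced in isolation. A minimal sketch (here "en-US" just stands in for whatever `navigator.language` returns):

```javascript
// String.prototype.match returns an array (or null), never a plain string.
var uiLang = "en-US";               // assumed browser UI language
var match = uiLang.match(/^\w+/);   // -> [ "en" ]

// Strict equality compares the array object to the string: always false.
console.log( match === "en" );      // false

// Loose equality first coerces the array to its string form "en": true.
console.log( match == "en" );       // true
```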
Hey all,
TL;DR: jQuery will soon be upgraded from v1.8.3 to v1.11.x (the latest). This
major release removes deprecated functionality. Please migrate away from this
deprecated functionality as soon as possible.
It's been a long time coming but we're now finally upgrading the jQuery package
that ships with MediaWiki.
We used to regularly upgrade jQuery in the past, but got stuck at v1.8 a couple
of years ago due to lack of time and concern about disruption. Because of this,
many developers have needed to work around bugs that were already fixed in later
versions of jQuery. jQuery v1.9 (and its v2 counterpart) is the
first release in jQuery history to need an upgrade guide[1][2]: it's a
major release that cleans up deprecated and dubious functionality.
Migration of existing code in extensions, gadgets, and user & site scripts
should be trivial (swapping one method for another, maybe with a slight change
to the parameters passed). This is all documented in the upgrade guide[1][2].
The upgrade guide may look scary (as it lists many of your favourite methods),
but most of the changes just address edge cases.
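As an example of the kind of swap involved (a sketch, not taken verbatim from the upgrade guide): jQuery 1.9 removes $.browser, and the recommended replacement is feature detection, i.e. testing for the capability you actually need rather than sniffing the browser. The helper below is just an illustration, not a jQuery API:

```javascript
// Before (removed in jQuery 1.9):
//     if ( $.browser.msie && parseInt( $.browser.version, 10 ) < 9 ) { ... }

// After: detect the feature you actually depend on. "supports" is a
// made-up helper for this sketch.
function supports( name, obj ) {
    return typeof obj[ name ] !== "undefined";
}

if ( !supports( "map", Array.prototype ) ) {
    // old engine: load a polyfill or take a fallback code path
}
console.log( supports( "map", Array.prototype ) ); // true in any ES5 engine
```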
== Call to action ==
This is a call for you to:
1) Get familiar with http://jquery.com/upgrade-guide/1.9/.
2) Start migrating your code.
jQuery v1.9 is about removing deprecated functionality. The new functionality is
already present in jQuery 1.8 or, in some cases, earlier.
3) Look out for deprecation warnings.
Once instrumentation has begun, using "?debug=true" will log jQuery deprecation
warnings to the console. Look for ones marked "JQMIGRATE" [7]. You might also
find deprecation notices from mediawiki.js, for more about those see the mail
from last October [8].
== Plan ==
1) Instrumentation and logging
The first phase is to instrument jQuery to work out all the areas which will
need work. I have started work on loading jQuery Migrate alongside the current
version of jQuery. I expect that to land in master this week [6], and roll out on
Wikimedia wikis the week after. This will enable you to detect usage of most
deprecated functionality through your browser console. Don't forget the upgrade
guide[1], as Migrate cannot detect everything.
2) Upgrade and Migrate
After this, the actual upgrade will take place, whilst Migrate stays. This
should not break anything since Migrate covers almost all functionality that
will be removed. The instrumentation and logging will remain during this phase;
the only effective changes at this point are whatever jQuery didn't think was
worth covering in Migrate, plus the many regular bug fixes.
3) Finalise upgrade
Finally, we will remove the migration plugin (both the Migrate compatibility
layer and its instrumentation). This will bring us to a clean version of latest
jQuery v1.x without compatibility hacks.
A rough timeline:
* 12 May 2014 (1.24wmf4 [9]): Phase 1 – Instrumentation and logging starts. This
will run for 4 weeks (until June 9).
* 19 May 2014 (1.24wmf5): Phase 2 – "Upgrade and Migrate". This will run for 3
weeks (up to June 9). The instrumentation continues during this period.
* 1 June 2014 (1.24wmf7): Phase 3 – Finalise upgrade.
== FAQ ==
Q: The upgrade guide is for jQuery v1.9, what about jQuery v1.10 and v1.11?
A: Those are regular updates that only fix bugs and/or introduce non-breaking
enhancements. Like jQuery v1.7 and v1.8, we can upgrade to those without any
hassle. We'll be fast-forwarding straight from v1.8 to v1.11.
Q: What about the jQuery Migrate plugin?
A: jQuery developed a plugin that adds back some of the removed features (not
all, consult the upgrade guide[2] for details). It also logs usage of these to
the console.
Q: When will the upgrade happen?
A: In the next few weeks, once we are happy that the impact is reasonably low.
An update will be sent to wikitech-l just before this is done as a final reminder.
This will be well before the MediaWiki 1.24 branch point for extension authors
looking to maintain compatibility.
Q: When are we moving to jQuery v2.x?
A: We are not currently planning to do this. Despite the name, jQuery v2.x
doesn't contain any new features compared to jQuery v1 [3]. The main difference
is in the reduced support for older browsers and environments; most
notably, jQuery 2.x drops support for Internet Explorer 8 and below, which
MediaWiki still supports. Dropping that support is outside the scope of this work.
Both v1 and v2 continue to enjoy simultaneous releases for bug fixes and new
features. For example, jQuery released v1.11 and v2.1 together[4][5].
-- Krinkle
[1] http://blog.jquery.com/2013/01/15/jquery-1-9-final-jquery-2-0-beta-migrate-
final-released/
[2] http://jquery.com/upgrade-guide/1.9/
[3] http://blog.jquery.com/2013/04/18/jquery-2-0-released/
[4] http://blog.jquery.com/2014/01/24/jquery-1-11-and-2-1-released/
[5] http://blog.jquery.com/2013/05/24/jquery-1-10-0-and-2-0-1-released/
[6] https://gerrit.wikimedia.org/r/131494
[7] https://github.com/jquery/jquery-migrate/blob/master/warnings.md
[8] http://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg72198.html
[9] https://www.mediawiki.org/wiki/MediaWiki_1.24/Roadmap
Are you good at swearing? WE NEED YOU
Huggle 3 comes with vandalism prediction: it precaches the diffs,
including their contents, even before they are enqueued. Each edit has a
so-called "score", a numerical value; the higher it is, the more likely
the edit is vandalism.
If you want to help us improve this feature, we need a "score words"
list defined for every wiki where Huggle is to be used, for example the
English Wikipedia.
Each list has the following syntax:
(see https://en.wikipedia.org/w/index.php?title=Wikipedia:Huggle/Config&diff=573…)
score-words(score):
list of words separated by comma, can contain newlines but comma
must be present
example
score-words(200):
these, are, some, words, which, presence, of, increases, the, score,
each, word, by, 200,
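For illustration, a list in this format could be parsed along these lines (a sketch only; this is not Huggle's actual parser, and the function name is made up):

```javascript
// Turn "score-words(N):" blocks into a word -> score map.
function parseScoreWords( config ) {
    var scores = {};
    var current = null;
    config.split( "\n" ).forEach( function ( line ) {
        var header = line.match( /^score-words\((\d+)\):/ );
        if ( header ) {
            // Start of a new block: remember its score.
            current = parseInt( header[ 1 ], 10 );
        } else if ( current !== null ) {
            // Comma-separated words belonging to the current block.
            line.split( "," ).forEach( function ( word ) {
                word = word.trim().toLowerCase();
                if ( word ) {
                    scores[ word ] = current;
                }
            } );
        }
    } );
    return scores;
}

var demo = "score-words(200):\n  these, are, some, words,";
console.log( parseScoreWords( demo ) );
// logs: { these: 200, are: 200, some: 200, words: 200 }
```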
So, if you know English better than I do, which you likely do, go ahead
and improve the configuration file there. No worries, Huggle's config
parser is very tolerant of syntax errors.
If you have any other suggestions for improving Huggle's prediction,
go ahead and tell us!
I just love this Google I/O 2013 talk on human perception and
cognition, and its implications for interaction and visual design. It
is accessible, yet packed with information, and I think it applies very
well to us.
I'm sure that many designers know all about this and some have
probably seen the clip before, but it is also very good for
developers: we know many of these things subconsciously, but they're
not really part of our vocabulary.
https://www.youtube.com/watch?v=z2exxj4COhU
DJ
Hi everyone,
I've written an RfC about changing the way extensions store metadata
about themselves and also how we load them. It's at
<https://www.mediawiki.org/wiki/Requests_for_comment/Extension_registration>.
A brief summary:
Extensions register a lot of things, like classes, hooks, special pages,
configuration options, and plenty more. Currently all of these are
usually stored in the extension's main entry point
($IP/extensions/Foo/Foo.php).
This creates two problems. First, it's difficult to tell what an
extension is going to register without actually enabling the extension.
Second, registration currently depends on global state ($wgHooks,
$wgSpecialPages, etc.) which we are trying to move away from.
My proposal is that we store this information in a JSON file (I've
provided an example on the RfC), and have MediaWiki read them when
loading extensions. We would no longer use the current method of
require_once "Foo.php";, but instead $wgEnableExtensions[] = 'foo';, and
MediaWiki would take care of the rest.
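To make the idea concrete, such a descriptor might look roughly like this (the field names here are invented for illustration; see the RfC for the actual proposed example):

```json
{
    "name": "Foo",
    "AutoloadClasses": {
        "SpecialFoo": "SpecialFoo.php"
    },
    "SpecialPages": {
        "Foo": "SpecialFoo"
    },
    "Hooks": {
        "BeforePageDisplay": [ "FooHooks::onBeforePageDisplay" ]
    },
    "config": {
        "FooEnableBar": true
    }
}
```

Because this is plain data rather than an executed PHP entry point, tools (and MediaWiki itself) could inspect what an extension registers without enabling it.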
Please leave any comments or thoughts you might have on the talk page.
Thanks,
-- Legoktm
Hello everyone,
I am Diwanshi Pandey, an OPW intern. I'd like to have your feedback on the
course I have created on codecademy for mediawiki api with help of my
mentor Yuri Astrakhan.
A little insight:
The course is about parsing and querying mediawiki api.
Initially we created one course which included 44 exercises, but according
to codecademy's guidelines their courses are for beginners and should have
a maximum of 30 exercises per course.
So we split it into two courses:
one is Introduction to Wikipedia API
<http://www.codecademy.com/courses/web-beginner-en-vj9nh/0/1> and the
other is Wikipedia:Query API
<http://www.codecademy.com/courses/web-beginner-en-yd3lp/0/1>.
Also, due to API security and restrictions, we couldn't implement a tutorial
on "editing wiki pages through an API call" from a non-wiki site yet. We are
waiting until we find a good and easy way to demo that.
Feedback may include:
* Are the exercises easy to understand for novice users/developers?
* Are changes needed in the look of exercises?
* Are there any exercises that are unnecessary or go into too much depth?
* Any other thing?
Thanks,
--
*Regards,*
*Diwanshi Pandey*
Hi everyone,
A big thank you to everyone who participated in the Architecture Summit
this year! We covered a lot of ground, and collectively learned a lot
about how to put these things together.
A lot of our work from this summit is only just beginning.
Speaking of which, the first step is simply processing the notes.
MatmaRex and Legoktm pulled together a concise list of the
etherpads here, which I've annotated with the corresponding agenda pages:
https://www.mediawiki.org/wiki/Architecture_Summit_2014/Retrospective#Notes…
Our next course of action is to get all of these copied to mediawiki.org to
the agenda pages, so that we have a record of each session that will
outlive Etherpad's temporary storage. I've done a little bit (the
Architecture value, process, and guidelines discussion), but I'd love help
from others on both getting it all copied in place, and wikifying it. Any
takers?
Those of you that couldn't make it, I'd encourage you to read through the
notes. We'll be discussing a lot of this over the coming days, so they'll be
useful context to bring you up to speed on these things.
Rob
tl;dr Let's adopt a better structure for skins. A more detailed proposal is at the bottom.
As you might know, I am doing a Google Summer of Code project aiming to disentangle the mess of MediaWiki's skinning system a little bit, make creating custom skins a bit less painful and improve the separation between MediaWiki and its core skins [0] (https://www.mediawiki.org/wiki/Separating_skins_from_core_MediaWiki).
I want this thread to result in code changes :)
----
So, MediaWiki supports skins, and apart from the four core ones there's a couple dozen skins available for installation [1]. And looking at them, it seems as if every single one used a different directory structure, and thus a different installation method.
I think this is bad, and that we should standardize on something – preferably one of the widely used methods – and use it for the core skins as well to provide a good example.
----
There seem to be three popular ways:
* $IP/skins/SkinName.php for the main file plus $IP/skins/skinname/ for assets, using an autodiscovery mechanism to automagically make the skin available after the files are copied in the right place. This is used by all of the core skins (Vector has some special cases, but let's ignore that for now), as well as many external skins (e.g. Cavendish [2]), at a glance mostly older ones.
* $IP/skins/SkinName/ for both assets and PHP files ($IP/skins/skinname/SkinName.php etc.), using require_once in LocalSettings like extensions to load the skin, manually adding an entry to $wgValidSkinNames in the main PHP file. This seems to be the preferred method among "modern" skins, for example Erudite [3] or Nimbus [4].
* $IP/extensions/SkinName/ for everything, the rest as above. This makes the skin work exactly like an extension. The only example I could find on mediawiki.org is the Nostalgia skin [5].
----
The first one sounds like a no-go for me (in spite of being currently used for core skins, ugh):
* The directory structure makes it annoying to both manage and write such skins (you need to copy/delete the PHP file and the directory separately, many text editors provide additional features for files contained in a single directory, and just look at our .gitignore file for skins oh god why [6]).
* The usage of autodiscovery, while making installation and testing a bit simpler, makes it impossible or unpleasant to temporarily disable a skin or to provide configuration settings for it (the last point doesn't affect core skins).
This leaves us with the two latter options: packaging skins similarly to extensions and sticking them in /skins, or packaging them like extensions and treating them like extensions. These two options are pretty similar and discussing them will be a bit bikesheddy, but let's do it anyway.
(Note also that even if we wanted to, we can't stop anyone from using either of these if they feel like it, as MediaWiki supports loading everything from anywhere if you really want. We can, however, deprecate skin autodiscovery.)
----
Personally I'm leaning towards the /skins/SkinName option. The pros are:
* It seems to be more widely used, which means that it "felt right" to a lot of people, and that shouldn't be underestimated.
* It's less revolutionary, and rather a simple improvement over the current system.
* It's more intuitive when compared to how other applications / projects work. (Corollary: just because MediaWiki skins can do everything that extensions can, it doesn't mean we should encourage that.)
* Since it's still similar to how extensions work, adapting the current system (WMF deployments, tarball packaging, installation via web installer) should be straightforward.
* Switching current skins to this system within the mediawiki/core repo will be trivial.
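Concretely, a skin packaged the /skins/SkinName way might look like the following (the skin and file names are illustrative):

```
skins/
  Erudite/
    Erudite.php        # init code: adds the skin to $wgValidSkinNames
    SkinErudite.php    # the Skin subclass (one class per file)
    resources/         # CSS, JavaScript, images

# and in LocalSettings.php, loaded like an extension:
#   require_once "$IP/skins/Erudite/Erudite.php";
```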
The pros of using /extensions/SkinName are:
* We already have a battle-tested system for doing things with extensions (WMF deployments, tarball packaging, installation via web installer).
* All non-core code in one place.
I would like to settle this within a week or two. Help! :)
Thoughts?
I will document the result and, if feasible, convert core skins to be closer to the recommended format afterwards.
----
tl;dr Let's start putting all of a skin's files in a single directory, and let's use a grown-up structure with one class per file + separate init code for them. Okay?
[1] https://www.mediawiki.org/wiki/Category:Skin (this category tree is a mess, huh)
[2] https://www.mediawiki.org/wiki/Skin:Cavendish
[3] https://www.mediawiki.org/wiki/Skin:Erudite
[4] https://www.mediawiki.org/wiki/Skin:Nimbus
[5] https://www.mediawiki.org/wiki/Extension:Nostalgia
[6] https://git.wikimedia.org/blob/mediawiki%2Fcore/master/skins%2F.gitignore
--
Matma Rex