I know it has been annoying a couple of people other than me, so now that I've learned how to make it work I'll share the knowledge here.
tl;dr: Star the repositories. No, seriously. (And yes, you need to star each extension repo separately.)
(Is there a place on mw.org to put this tidbit on?)
------- Forwarded message -------
From: "Brian Levine" <support(a)github.com> (GitHub Staff)
To: matma.rex(a)gmail.com
Cc:
Subject: Re: Commits in mirrored repositories not showing up on my profile
Date: Tue, 09 Jul 2013 06:47:19 +0200
Hi Bartosz
In order to link your commits to your GitHub account, you need to have some association with the repository other than authoring the commit. Usually, having push access gives you that connection. In this case, you don't have push permission, so we don't link you to the commit.
The easy solution here is for you to star the repository. If you star it - along with the other repositories that are giving you this problem - we'll see that you're connected to the repository and you'll get contribution credit for those commits.
Cheers
Brian
--
Matma Rex
Hi, I'd like to present a new RFC for your consideration:
https://www.mediawiki.org/wiki/Requests_for_comment/Minifier
It is about how we can shave 10-15% off the size of JavaScript
delivered to users.
Your comments are highly welcome! :)
--
Best regards,
Max Semenik ([[User:MaxSem]])
Thank you for the quick fix!
Best,
--
Sukyoung
On Jan 29, 2014, at 9:55 AM, Nathan wrote:
> FYI in case you aren't subscribed to the list.
>
> ---------- Forwarded message ----------
> From: Yair Rand <yyairrand(a)gmail.com>
> Date: Tue, Jan 28, 2014 at 7:25 PM
> Subject: Re: [Wikitech-l] Bug in the Wikipedia main web page
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
>
>
> Thank you for pointing out this bug. Your suggested change to
> MediaWiki:Gadget-wm-portal.js has been implemented by Meta-Wiki
> administrator User:PiRSquared17.
>
>
> On Tue, Jan 28, 2014 at 6:50 PM, Sukyoung Ryu <sukyoung.ryu(a)gmail.com> wrote:
>
> > Dear all,
> >
> > We are researchers at KAIST in Korea working on finding JavaScript bugs in
> > web pages. While analyzing top websites from Alexa.com, we found an issue,
> > which seems to be a bug, on the Wikipedia main web page (wikipedia.org).
> > We would be grateful if you can either confirm that it is a bug and even
> > better fix it or let us know what we're missing.
> >
> > Here's the issue. When a user selects a language in which search results
> > are displayed via the language selection button from the Wikipedia main web
> > page, the following JavaScript function is executed:
> >
> > 1  function setLang(lang) {
> > 2      var uiLang = navigator.language || navigator.userLanguage,
> >            date = new Date();
> > 3
> > 4      if (uiLang.match(/^\w+/) === lang) {
> > 5          date.setTime(date.getTime() - 1);
> > 6      } else {
> > 7          date.setFullYear(date.getFullYear() + 1);
> > 8      }
> > 9
> > 10     document.cookie = "searchLang=" + lang + ";expires=" +
> >            date.toUTCString() + ";domain=" + location.host + ";";
> > 11 }
> >
> > Depending on the evaluation result of the conditional expression on line
> > 4, "uiLang.match(/^\w+/) === lang", the function either leaves or does
> > not leave the selected language information on the user's computer
> > through a cookie. But we found that the expression
> > "uiLang.match(/^\w+/) === lang" always evaluates to false, which means
> > the function always leaves cookies on users' computers. We think that
> > changing the conditional expression "uiLang.match(/^\w+/) === lang" to
> > "uiLang.match(/^\w+/) == lang" will solve the problem.
> >
> > This problem may occur in the main web pages of all the Wikimedia sites.
> > Could you kindly let us know what you think? Thank you in advance.
> >
> > Best,
> > Changhee Park and Sukyoung Ryu
> >
> >
> > _______________________________________________
> > Wikitech-l mailing list
> > Wikitech-l(a)lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
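The comparison bug discussed in the forwarded mail above can be reproduced in isolation. A standalone sketch (`uiLang` is hard-coded here instead of coming from `navigator`):

```javascript
// String.prototype.match returns an array (or null), never a string, so a
// strict comparison against a string can never succeed. Loose equality (==)
// coerces the one-element array to its string form, which is why the
// suggested one-character fix works.
var uiLang = "en-US"; // stand-in for navigator.language
var lang = "en";

var match = uiLang.match(/^\w+/); // ["en"] — an array, not a string
var strict = match === lang;      // always false: array vs. string
var loose = match == lang;        // true: ["en"] coerces to "en"
```

An even more explicit alternative would be to compare `match[0] === lang` after checking that `match` is not null.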
Hey all,
TL;DR: jQuery will soon be upgraded from v1.8.3 to v1.11.x (the latest). This
major release removes deprecated functionality. Please migrate away from this
deprecated functionality as soon as possible.
It's been a long time coming but we're now finally upgrading the jQuery package
that ships with MediaWiki.
We used to regularly upgrade jQuery in the past, but got stuck at v1.8 a couple
of years ago due to lack of time and concern about disruption. Because of this,
many developers have needed to work around bugs that were already fixed in later
versions of jQuery. Notably, jQuery v1.9 (and its v2 counterpart) is the
first release in jQuery's history to need an upgrade guide[1][2]. It's a
major release that cleans up deprecated and dubious functionality.
Migration of existing code in extensions, gadgets, and user & site scripts
should be trivial (swapping one method for another, maybe with a slight change
to the parameters passed). This is all documented in the upgrade guide[1][2].
The upgrade guide may look scary (as it lists many of your favourite methods),
but most of the changes only address edge cases.
== Call to action ==
This is a call for you, to:
1) Get familiar with http://jquery.com/upgrade-guide/1.9/.
2) Start migrating your code.
jQuery v1.9 is about removing deprecated functionality. The new functionality is
already present in jQuery 1.8 or, in some cases, earlier.
3) Look out for deprecation warnings.
Once instrumentation has begun, using "?debug=true" will log jQuery deprecation
warnings to the console. Look for ones marked "JQMIGRATE" [7]. You might also
find deprecation notices from mediawiki.js, for more about those see the mail
from last October [8].
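As background for what the instrumentation gives you, the shape of a Migrate-style shim can be sketched like this (a hypothetical stand-in, not jQuery Migrate's actual code; `api`, `api.live`, and `api.on` are made-up names for illustration):

```javascript
// A Migrate-style compatibility shim keeps a removed method working while
// logging a JQMIGRATE warning, so callers can locate and fix their usage.
var warned = [];

function migrateWarn(msg) {
    // Like jQuery Migrate, emit each distinct warning only once.
    if (warned.indexOf(msg) === -1) {
        warned.push(msg);
        console.warn('JQMIGRATE: ' + msg);
    }
}

// A stand-in object with only the current API...
var api = {
    handlers: [],
    on: function (event, handler) {
        this.handlers.push([event, handler]);
        return this;
    }
};

// ...plus a shim restoring the removed .live() in terms of .on().
api.live = function (event, handler) {
    migrateWarn('.live() is deprecated; use .on() instead');
    return this.on(event, handler);
};

api.live('click', function () {});
api.live('click', function () {});
// Both deprecated calls still work, but the warning is logged only once.
```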
== Plan ==
1) Instrumentation and logging
The first phase is to instrument jQuery to work out all the areas which will
need work. I have started work on loading jQuery Migrate alongside the current
version of jQuery. I expect that to land in master this week [6], and roll out on
Wikimedia wikis the week after. This will enable you to detect usage of most
deprecated functionality through your browser console. Don't forget the upgrade
guide[1], as Migrate cannot detect everything.
2) Upgrade and Migrate
After this, the actual upgrade will take place, whilst Migrate stays. This
should not break anything since Migrate covers almost all functionality that
will be removed. The instrumentation and logging will remain during this phase;
the only effective changes at this point are whatever jQuery didn't think was
worth covering in Migrate, plus ordinary bug fixes.
3) Finalise upgrade
Finally, we will remove the migration plugin (both the Migrate compatibility
layer and its instrumentation). This will bring us to a clean version of latest
jQuery v1.x without compatibility hacks.
A rough timeline:
* 12 May 2014 (1.24wmf4 [9]): Phase 1 – Instrumentation and logging starts. This
will run for 4 weeks (until June 9).
* 19 May 2014 (1.24wmf5): Phase 2 – "Upgrade and Migrate". This will run for 3
weeks (up to June 9). The instrumentation continues during this period.
* 1 June 2014 (1.24wmf7): Phase 3 – Finalise upgrade.
== FAQ ==
Q: The upgrade guide is for jQuery v1.9, what about jQuery v1.10 and v1.11?
A: Those are regular updates that only fix bugs and/or introduce non-breaking
enhancements. Like jQuery v1.7 and v1.8, we can upgrade to those without any
hassle. We'll be fast-forwarding straight from v1.8 to v1.11.
Q: What about the jQuery Migrate plugin?
A: jQuery developed a plugin that adds back some of the removed features (not
all, consult the upgrade guide[2] for details). It also logs usage of these to
the console.
Q: When will the upgrade happen?
A: In the next few weeks, once we are happy that the impact is reasonably low.
An update will be sent to wikitech-l just before this is done as a final reminder.
This will be well before the MediaWiki 1.24 branch point for extension authors
looking to maintain compatibility.
Q: When are we moving to jQuery v2.x?
A: We are not currently planning to do this. Despite the name, jQuery v2.x
doesn't contain any new features compared to jQuery v1 [3]. The main difference
is reduced support for older browsers and environments; most notably, jQuery
2.x drops support for Internet Explorer 8 and below, which MediaWiki still
supports for now. Dropping that support is outside the scope of this work.
Both v1 and v2 continue to enjoy simultaneous releases for bug fixes and new
features. For example, jQuery released v1.11 and v2.1 together[4][5].
-- Krinkle
[1] http://blog.jquery.com/2013/01/15/jquery-1-9-final-jquery-2-0-beta-migrate-
final-released/
[2] http://jquery.com/upgrade-guide/1.9/
[3] http://blog.jquery.com/2013/04/18/jquery-2-0-released/
[4] http://blog.jquery.com/2014/01/24/jquery-1-11-and-2-1-released/
[5] http://blog.jquery.com/2013/05/24/jquery-1-10-0-and-2-0-1-released/
[6] https://gerrit.wikimedia.org/r/131494
[7] https://github.com/jquery/jquery-migrate/blob/master/warnings.md
[8] http://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg72198.html
[9] https://www.mediawiki.org/wiki/MediaWiki_1.24/Roadmap
Are you good at swearing? WE NEED YOU
Huggle 3 comes with vandalism prediction: it pre-caches diffs, including
their contents, even before they are enqueued. Each edit gets a so-called
"score", a numerical value; the higher it is, the more likely the edit is
vandalism.
If you want to help us improve this feature, a "score words" list needs to
be defined for every wiki where Huggle is about to be used, for example on
the English wiki.
Each list has following syntax:
(see https://en.wikipedia.org/w/index.php?title=Wikipedia:Huggle/Config&diff=573…)
score-words(score):
    a list of words separated by commas; the list can span multiple lines,
    but the commas must be present
example
score-words(200):
these, are, some, words, which, presence, of, increases, the, score,
each, word, by, 200,
So, if you know English better than me (which you likely do), go ahead and
improve the configuration file there. No worries: Huggle's config parser is
very tolerant of syntax errors.
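To make the scoring concrete, here is a sketch of how such a score-words list could be parsed and applied. This is a hypothetical illustration, not Huggle's actual implementation; the function names are made up:

```javascript
// Parse config blocks of the form "score-words(200):\n word, another, ..."
// into groups of {score, words}.
function parseScoreWords(config) {
    var groups = [];
    var re = /score-words\((\d+)\):\s*([^]*?)(?=score-words\(|$)/g;
    var m;
    while ((m = re.exec(config)) !== null) {
        groups.push({
            score: parseInt(m[1], 10),
            words: m[2].split(',')
                .map(function (w) { return w.trim(); })
                .filter(function (w) { return w.length > 0; })
        });
    }
    return groups;
}

// Score an edit: every listed word found in the edit text adds that
// group's score to the total.
function scoreEdit(text, groups) {
    var score = 0;
    var lower = text.toLowerCase();
    groups.forEach(function (g) {
        g.words.forEach(function (w) {
            if (lower.indexOf(w.toLowerCase()) !== -1) {
                score += g.score;
            }
        });
    });
    return score;
}

var config = 'score-words(200):\n    these, are, some, words,';
var groups = parseScoreWords(config);
```

Note how the trailing comma in the list is harmless here: the empty fragment after the final comma is simply filtered out, which matches the "syntax-error proof" spirit of the real parser.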
If you have any other suggestions for improving Huggle's prediction,
go ahead and tell us!
I just love this Google I/O 2013 talk on human perception and
cognition, and its implications for interactive and visual design. It
is accessible yet packed with information, and I think it applies very
well to us.
I'm sure that many designers know all about this, and some have
probably seen the clip before, but it is also very good for
developers: we know many of these things subconsciously, yet they're
not really part of our vocabulary.
https://www.youtube.com/watch?v=z2exxj4COhU
DJ
I asked some folks about
https://www.mediawiki.org/wiki/Requests_for_comment/Standardized_thumbnails…
.
Antoine, the original author, said on the talk page:
"We had several mailing list discussions in 2012 / beginning of 2013
regarding optimizing the thumbnail rendering. That RFC is merely a summary
of the discussions and is intended to avoid repeating ourselves in each
discussion. I am not leading the RFC by any means; it would be nice to have
the new multimedia team take leadership there."
Gergo of the multimedia team has a question about whether he should start a
new RfC, and a question for Ops (below), which he said I could forward to
this list, so I'm doing so. :-)
If we can settle this onlist, cool. Otherwise I'll be setting up an IRC
chat for later this week.
Sumana Harihareswara
Senior Technical Writer
Wikimedia Foundation
On Fri, Jun 20, 2014 at 12:27 PM, Gergo Tisza <gtisza(a)wikimedia.org> wrote:
>
> Hi Sumana!
>
> We are working on some form of standardized thumbnail sizes, but it is not
> exactly the same issue that is discussed in the RfC
> <https://www.mediawiki.org/wiki/Requests_for_comment/Standardized_thumbnails…>
> .
>
> The problem we have run into is that MediaViewer fits the image size to
> the browser window size (which means a huge variety of image sizes even
> when the browser window is fully enlarged, and practically infinite
> otherwise),
> but thumbnail rendering is very slow and waiting for it would result in a
> crappy user experience. We started using a list of standardized thumbnail
> sizes, so that MediaViewer always requests one of these sizes from the
> server and rescales it with CSS, but even so the delay remains
> problematic for the first user who requests the image with a given bucket.
> To address that, we are working with ops towards automatically rendering
> the thumbnails in those sizes as soon as the image is uploaded.
>
> Another possibility related to standardized thumbnail sizes that we are
> exploring is to speed up the thumbnail generation for large images by
> having a list of sizes for which the thumbnail is pregenerated and always
> present, and resize one of those thumbnails instead of the original to
> generate the size requested by the user. The goal of this would be to avoid
> overloading the scalers when several large images need to be thumbnailed at
> the same time (GWToolset caused outages this way on a few occasions).
>
> I can create an RfC about one or both of the above issues if there is
> interest in wider discussion. I don't know whether the current thumbnail
> size standardization RfC should be replaced with those, though; its goals
> are not stated, but seem to be mainly operations concerns (how to make sure
> thumbnails don't take up too much storage space). Maybe ops wants to take
> it over, or provide clearer goals in that regard for the multimedia team to
> work towards.
>
>
>
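The size-bucketing approach Gergo describes can be sketched as follows (illustrative only; the bucket widths and function name here are made up, not MediaViewer's actual list):

```javascript
// Pick the smallest standard thumbnail width that covers the requested
// width, so the browser only ever downscales (never upscales) with CSS.
var THUMB_BUCKETS = [320, 640, 800, 1024, 1280, 1920, 2560];

function findBucket(targetWidth) {
    for (var i = 0; i < THUMB_BUCKETS.length; i++) {
        if (THUMB_BUCKETS[i] >= targetWidth) {
            return THUMB_BUCKETS[i];
        }
    }
    // Wider than any bucket: fall back to the largest available size.
    return THUMB_BUCKETS[THUMB_BUCKETS.length - 1];
}
```

Because every window width maps onto a small fixed set of sizes, the first viewer of an image effectively warms the thumbnail cache for everyone with a similar screen, and those sizes can be pre-rendered at upload time as described above.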
Hi everyone,
I've written an RfC about changing the way extensions store metadata
about themselves and also how we load them. It's at
<https://www.mediawiki.org/wiki/Requests_for_comment/Extension_registration>.
A brief summary:
Extensions register a lot of things, like classes, hooks, special pages,
configuration options, and plenty more. Currently all of these are
usually stored in the extension's main entry point
($IP/extensions/Foo/Foo.php).
This creates two problems. First, it's difficult to tell what an
extension is going to register without actually enabling the extension.
Second, registration currently depends on global state ($wgHooks,
$wgSpecialPages, etc.) which we are trying to move away from.
My proposal is that we store this information in a JSON file (I've
provided an example on the RfC), and have MediaWiki read them when
loading extensions. We would no longer use the current method of
require_once "Foo.php"; instead, extensions would be enabled with
$wgEnableExtensions[] = 'foo'; and MediaWiki would take care of the rest.
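As a rough illustration of the idea (hypothetical field names; the authoritative example is the one on the RfC page), such a JSON file might look like:

```json
{
    "name": "Foo",
    "version": "1.0.0",
    "AutoloadClasses": {
        "FooHooks": "FooHooks.php",
        "SpecialFoo": "SpecialFoo.php"
    },
    "Hooks": {
        "BeforePageDisplay": "FooHooks::onBeforePageDisplay"
    },
    "SpecialPages": {
        "Foo": "SpecialFoo"
    },
    "config": {
        "FooEnableBar": true
    }
}
```

MediaWiki could then read this file when the extension is enabled and register the classes, hooks, special pages, and configuration defaults itself, without executing any extension code at load time.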
Please leave any comments or thoughts you might have on the talk page.
Thanks,
-- Legoktm
Hello everyone,
I am Diwanshi Pandey, an OPW intern. I'd like to have your feedback on the
course I have created on Codecademy for the MediaWiki API with the help of
my mentor Yuri Astrakhan.
A little insight:
The course is about parsing and querying the MediaWiki API.
Initially we created one course with 44 exercises, but according to
Codecademy's guidelines their courses are aimed at beginners and should
have at most 30 exercises each.
So we split it up into two courses:
One is Introduction to Wikipedia API
<http://www.codecademy.com/courses/web-beginner-en-vj9nh/0/1> and the
other is Wikipedia:Query API
<http://www.codecademy.com/courses/web-beginner-en-yd3lp/0/1>.
Also, due to API security and restrictions, we couldn't implement a
tutorial on "editing wiki pages through an API call" from a non-wiki site
yet. We are waiting until we find a good and easy way to demo that.
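For readers who haven't used the API before, a query of the kind the course teaches boils down to building a URL against the api.php endpoint. A minimal sketch (the parameters are standard MediaWiki API parameters; the helper function itself is made up for illustration):

```javascript
// Build a MediaWiki API query URL asking for basic page info as JSON.
function buildQueryUrl(titles) {
    var params = {
        action: 'query',          // the query module
        prop: 'info',             // basic page information
        titles: titles.join('|'), // multiple titles are '|'-separated
        format: 'json'
    };
    var parts = [];
    for (var key in params) {
        parts.push(encodeURIComponent(key) + '=' +
                   encodeURIComponent(params[key]));
    }
    return 'https://en.wikipedia.org/w/api.php?' + parts.join('&');
}

var url = buildQueryUrl(['Main Page', 'Wikipedia']);
```

Fetching that URL returns a JSON document whose structure the second course (Wikipedia:Query API) walks through.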
Feedback may include:
* Are the exercises easy to understand for novice users/developers?
* Are changes needed in the look of exercises?
* Are there any exercises that need not be implemented, or that go into too much depth?
* Anything else?
Thanks,
--
*Regards,*
*Diwanshi Pandey*
Hi everyone,
A big thank you to everyone who participated in the Architecture Summit
this year! We covered a lot of ground, and collectively learned a lot
about how to put these things together.
A lot of our work from this summit is only just beginning, and the first
step is processing the notes. MatmaRex and Legoktm pulled together a
concise list of the etherpads here, which I've annotated with the
corresponding agenda pages:
https://www.mediawiki.org/wiki/Architecture_Summit_2014/Retrospective#Notes…
Our next course of action is to get all of these copied to mediawiki.org to
the agenda pages, so that we have a record of each session that will
outlive Etherpad's temporary storage. I've done a little bit (the
Architecture value, process, and guidelines discussion), but I'd love help
from others on both getting it all copied in place, and wikifying it. Any
takers?
For those of you who couldn't make it, I'd encourage you to read through the
notes. We'll be discussing a lot of this over the coming days, so it'll be
useful context to bring you up to speed on these things.
Rob