Hey,
I just wanted to check in about the status of enabling JavaScript package
management in MediaWiki. I am basically talking about a JS equivalent of
what we have with Composer for PHP.
Real-world example:
The "data-values/value-view" package[0] defines
"jquery.event.special.eachchange.js":
ValueView/lib/jquery.event/jquery.event.special.eachchange.js
Now, recently I needed the same functionality in one of my extensions, so
I just copied it over. [1]
I know that this is the worst way one could do this, but as far as I can
see we don't have that much of a choice right now. Here are the alternative
options I can see:
Moving "jquery.event.special.eachchange.js" out of the
"data-values/value-view" package into its own "WMDE/jquery-eachchange"
package...
1. ... and using it in my extension via composer.
+ pro: two or more extensions or other packages requiring this package
will still result in only one MW-wide installation.
- con: requires MW-specific code (to feed ResourceLoader) that is not
actually related to the otherwise MW-independent package.
- con: using Composer to manage pure JavaScript packages! Uuuh, ugly!
2. ... and having a build step in packages using it, pulling
"WMDE/jquery-eachchange" somewhere into the file structure of the
packages/extensions that use it.
+ pro: no need to abuse Composer; we can use npm, Bower or any other
JS package manager here.
- con: we still have to tell ResourceLoader about the files somehow (I
haven't thought much about that yet).
- con: if more than one extension or other package requires this package,
we still end up with the same code two or more times in one MW
installation.
3. Combining 1 and 2: start with 2, using a JS package manager, then move
to 1, creating a Composer package and pulling the "WMDE/jquery-eachchange"
package in via some build script.
+ pro: combines the pros from 1 and 2.
- con: we still have to tell ResourceLoader somehow...
- con: overhead; we now create two packages, where the Composer one is
just a bridge to the MW world, still polluting packagist.org. Still kind of
ugly, and more effort for publishing a package, potentially scaring
programmers away since they have better things to do than work that could
be automated.
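To make option 2 concrete, a minimal sketch of what the consuming extension's
package.json could look like, assuming the hypothetical "WMDE/jquery-eachchange"
package were published on GitHub and fetched with npm (all names and the copy
destination are made up for illustration):

```json
{
  "name": "my-mediawiki-extension",
  "private": true,
  "dependencies": {
    "jquery-eachchange": "WMDE/jquery-eachchange"
  },
  "scripts": {
    "build": "cp node_modules/jquery-eachchange/jquery.event.special.eachchange.js lib/"
  }
}
```

The "build" script is the build step mentioned above: it copies the file into
the extension's own file structure, which is exactly where the duplication con
comes from.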
I have not seen approaches 2 and 3 in use yet, though I could imagine that
the VisualEditor team has used something like that.
Approach 1 is the way the "data-values/value-view" package itself is being
handled. That package should actually be an MW-independent pure JS
package, but right now it contains MW-specific code and uses Composer for
distribution!
There is still another option, but it would have to be properly implemented
first:
4. Choose one native JS package manager for now and go with it (add support
for others later perhaps). Integrate it properly with MW (resource loader
to begin with), document how to use it and finally distribute JS code
coming from the MW world but useful for other projects in a way where it
can actually be used in a non-MW context.
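For illustration, a proper integration as in option 4 might eventually let an
extension register an npm-installed file with ResourceLoader roughly like this
(entirely hypothetical; no such integration exists today, and the module and
extension names are made up):

```php
// Hypothetical sketch in an extension's setup file: registering a file
// installed by npm as a ResourceLoader module.
$wgResourceModules['jquery.event.special.eachchange'] = array(
	'scripts' => 'node_modules/jquery-eachchange/jquery.event.special.eachchange.js',
	'dependencies' => array( 'jquery' ),
	'localBasePath' => __DIR__,
	'remoteExtPath' => 'MyExtension',
);
```

The point being that the MW-specific glue lives in the extension (or in core),
not in the pure JS package itself.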
This has already been bugging me when working on Wikidata. Now I'd like to
reuse some of the code I have written there without spending hours and
hours on option 3, because there should be support for option 4 sooner
rather than later.
So I am wondering: does anyone have any thoughts or alternatives, or is
there a roadmap for anything like the option 4 that I have outlined?
Cheers,
Daniel
[0]: https://packagist.org/packages/data-values/value-view
[1]:
https://github.com/DanweDE/mediawiki-ext-UserBitcoinAddresses/blob/master/r…
Hi, I'd like to present a new RFC for your consideration:
https://www.mediawiki.org/wiki/Requests_for_comment/Minifier
It is about how we can shave 10-15% off the size of JavaScript
delivered to users.
Your comments are highly welcome! :)
--
Best regards,
Max Semenik ([[User:MaxSem]])
Hello, wikitech-l,
tl;dr trying to make mw-core pass mw-codesniffer, expect large patches on
Gerrit, and please help
A lot of work has been done on MediaWiki codesniffer
<https://phabricator.wikimedia.org/tag/mediawiki-codesniffer/> (the
PHP_CodeSniffer standard for MediaWiki) over the last few months and this
might be a good time to get core to pass our coding conventions
<https://www.mediawiki.org/wiki/Manual:Coding_conventions/PHP>.
Work on this has already started at T102609
<https://phabricator.wikimedia.org/T102609>. There were two primary reasons
to send this email:
1. This is a pretty big task and any help would be very welcome! If we can
get phpcs to run against core's master, it would make every patch
contributor's work a little easier.
2. Work is being organized as subtasks of T102609, and has been divided on
the basis of sniffs. Because of this, every patch is going to change a lot
of unrelated files and a lot of reviewers are going to get added on Gerrit.
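For anyone who wants to help, running the sniffs locally looks roughly like
this (a sketch assuming Composer is installed; the exact ruleset path and the
directory you target may differ in your setup):

```
# in your MediaWiki core checkout
composer require --dev mediawiki/mediawiki-codesniffer

# run phpcs with the MediaWiki standard against part of the codebase;
# -p shows progress, -s shows which sniff produced each warning
vendor/bin/phpcs -p -s --standard=vendor/mediawiki/mediawiki-codesniffer/MediaWiki includes/
```

The -s flag is what lets you match failures to the per-sniff subtasks of
T102609.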
Let's make this happen!
Vivek Ghaisas (polybuildr)
I'm happy to announce a Gerrit Cleanup Day on Wed, September 23.
It's an experiment to reduce Wikimedia's code review backlog, which
hurts the growth of our long-term code contributor base.
Development/engineering teams of the Wikimedia Foundation are supposed
to join and use the day to primarily review recently submitted open
Gerrit changesets without a review, focussing on volunteer
contributions. Developers from other organizations and individual
developers are of course also very welcome to join and help! :)
https://phabricator.wikimedia.org/T88531 provides more information,
steps, links. Note it's still work in progress.
Your questions and feedback are welcome.
Thanks,
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
In the next RFC meeting, we will discuss the following RFC:
* Improving extension management
<https://www.mediawiki.org/wiki/Requests_for_comment/Improving_extension_man…>
The meeting will be on the IRC channel #wikimedia-office on
chat.freenode.net at the following time:
* UTC: Wednesday 21:00
* US PDT: Wednesday 14:00
* Europe CEST: Wednesday 23:00
* Australia AEST: Thursday 07:00
-- Tim Starling
Are there any stable APIs for an application to get a parse tree in
machine-readable format, manipulate it and send the result back without
touching HTML?
I'm sorry if this question doesn't make any sense.
Is there any easy way to find all citations of specified academic
articles on Wikipedias in all languages, and the text that is supported by
those references, so that the citations of questionable articles can be
removed and the article texts can be quickly reviewed for possible changes
or removal?
See
https://www.washingtonpost.com/news/morning-mix/wp/2015/08/18/outbreak-of-f…
If we don't have easy ways to deal with this (and I believe that we don't),
I'd like to suggest that the Community Tech team work on tools to help when
these situations happen.
Thanks,
Pine
There's a pretty hilarious American police procedural TV show in 2015
called "CSI: Cyber", featuring mostly cybercrime. Obviously they have to
dredge up snippets of code from places for screenshots on the show.
Episode 4 happened to include a tidbit from MediaWiki 1.25/wmf3. Supposedly
the code was a hack to make your printer blow up.
Original lulz and screenshots via
http://moviecode.tumblr.com/post/114815574587/this-is-from-csi-cyber-s01e04…
Hi all!
We have some breaking API changes that will soon be deployed to wikidata.org.
The deployment date should be: 9th September 2015 (just under 2 weeks)
The change causing the breaks can be found at:
https://gerrit.wikimedia.org/r/#/c/227686/
The breaking changes are:
- XML output aliases are now grouped by language
- XML output may no longer give elements when they are empty
- In XML output, any claim, qualifier, reference or snak elements that had
an '_idx' element will no longer have it
- ALL output may now give empty elements, i.e. labels when an entity has none
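As an illustration of the first change, the aliases section of the XML output
moves from one flat list to per-language groups, roughly like this (the element
and attribute names here are a sketch, not the exact serialization):

```xml
<!-- before: aliases in one flat list -->
<aliases>
  <alias language="en" value="NYC" />
  <alias language="en" value="Big Apple" />
</aliases>

<!-- after: aliases grouped by language -->
<aliases>
  <language id="en">
    <alias value="NYC" />
    <alias value="Big Apple" />
  </language>
</aliases>
```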
For a wikipage explaining these changes, take a look at:
https://www.wikidata.org/wiki/User:Addshore/API_Break_September_2015
If you have any questions regarding these breaking changes please ask!
Addshore