Our release notes process is quite straightforward.
New release notes go into the RELEASE-NOTES file, and after
releasing a new major version they are moved into
HISTORY and a clean RELEASE-NOTES is created.
This easily preserves a linear history.
However, there is a kind of change that doesn't have
a single version of introduction. Changes that deserve backporting
are made in trunk, then backported by merging into the
supported releases. The RELEASE-NOTES of those versions are
updated under a "Changes since xyz" heading.
However, it is not clear where those release notes should go
in the HISTORY file, especially since the history is no longer linear.
Currently, a single revision could go into 1.15.6, 1.16.1
and 1.17.0. Under which version should it be placed?
Old versions (up to 1.5.x) do have entries for point-release updates
in the HISTORY file, and so does 1.13.x, but that's it.
We don't seem to be maintaining it, as brought up in r72587.
What should be the procedure when backporting?
* When merging to an earlier release, the release notes go there,
and are added to the trunk HISTORY file at the same time.
* When merging to an earlier release, the release notes go there.
On release, the new section is added as a whole to the trunk HISTORY.
* We don't keep HISTORY for release updates. Changes get an entry in
the branch RELEASE-NOTES, but also in the trunk one. A change is shipped
as a fix in the next major release, even if it was also fixed in some
minor release in between.
The last option is the laziest, and completely avoids the
on-which-version issue.
However, it can't cope with problems that are only fixed in the branch
(e.g. a minor fix for a feature that was completely rewritten in trunk).
It also misleads users by presenting fixes that they (should) already
have.
I would prefer one of the other two, perhaps with a common entry in
HISTORY such as "Bug fixes in 1.15.x and 1.16.y".
Opinions?
MediaWiki Developers,
Over the past couple of months, Roan Kattouw and I (Trevor Parscal) have
been working on a JavaScript and CSS delivery system called
ResourceLoader. We're really excited about this technology, and hope
others will be too.
This system has proven able to seriously improve
front-end performance. Just for starters, we're talking about taking the
Vector skin from 35 requests @ 30 kB gzipped to 1 request @ 9.4 kB gzipped
(see http://www.mediawiki.org/wiki/ResourceLoader/Benchmarks )
We are looking to make this the standard way to deliver JavaScript, CSS,
and small images in MediaWiki and on Wikimedia projects, and we're
seeking your comments and help.
== Background ==
The goals of the project were to improve front-end performance, reduce
the complexity of developing JavaScript libraries and user-interfaces,
and get the ball rolling on a rewrite/refactoring of all JavaScript and
CSS code in MediaWiki.
What's wrong with things as they are now?
* Too many individual requests are being made. All JavaScript, CSS and
image resources are being loaded individually, which causes poor
performance on the cluster and users experience the site as being slow.
* We are wasting too much bandwidth. We are sending JavaScript and CSS
resources with large amounts of unneeded whitespace and comments.
* We are purging our caches too much. Many user interface changes
require purging page caches to take effect, and many assets are
unnecessarily purged from client machines due to the use of a
single style version for all assets.
* We are sending people code they don't even use. Lots of JavaScript is
being sent to clients whose browsers will either crash when it arrives
(BlackBerry comes to mind), not use it at all (older versions of many
browsers) while still parsing it unnecessarily (which is slow on older
browsers, especially IE 6), or not fully utilize it
(UsabilityInitiative's plugins.combined.min.js, for instance).
* Internationalization in JavaScript is a mess. Developers are using
many different ways -- most of which are not ideal -- to get their
translated messages to the client.
* Right-to-left support in CSS is awkward. Right-to-left stylesheets
must either be hand-coded separately, regenerated with CSSJanus each
time a change is made, or maintained as an extra stylesheet containing
a series of overrides.
* There's more! These and other issues were captured in our requirements
gathering process (see
http://www.mediawiki.org/wiki/ResourceLoader/Requirements )
What does ResourceLoader do to solve this?
* Combines resources together. Multiple scripts, styles, messages to be
delivered in a single request, either at initial page load or
dynamically; in both cases resolving dependencies automatically.
* Allows minification of JavaScript and CSS.
* Dramatically reduces the number of requests for small images. Small
images referenced from CSS can be automatically inlined as data URLs
(when the developer marks them with a special comment); this happens as
the file is served, without requiring the developer to perform such
steps manually.
* Allows deploying changes to all pages for all users within minutes,
without purging any HTML. ResourceLoader provides a short-expiry
start-up script which decides whether to continue loading more
JavaScript, and if so has a complete manifest of all scripts and styles
on the server and their most recent versions. Also, this startup script
will be able to be inlined using ESI (see
http://en.wikipedia.org/wiki/Edge_Side_Includes ) when using Squid or
Varnish, reducing requests and improving performance even further.
* Provides a standard way to deliver translated messages to the client,
bundling them together with the code that uses them.
* Performs automatic left-to-right/right-to-left flipping for CSS files.
In most cases the developer won't have to do anything before deploying.
* Does all kinds of other cool tricks, which should soon make everyone's
lives better.
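To illustrate the image-inlining trick from the list above, here is a
minimal sketch (in Python, purely illustrative; the actual
ResourceLoader implementation is PHP and more involved) of what turning
a small image into a CSS data URL looks like:

```python
import base64

def inline_image(image_bytes, mime="image/png"):
    # Embed the raw image bytes directly into the stylesheet as a
    # base64 data URL, so the browser needs no extra HTTP request.
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return "url(data:%s;base64,%s)" % (mime, encoded)

# Example: a CSS rule using an inlined 1x1 transparent GIF.
gif = base64.b64decode("R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7")
print(".spinner { background-image: %s; }" % inline_image(gif, mime="image/gif"))
```

The trade-off is that base64 inflates the bytes by roughly a third,
which is why this only pays off for small images.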
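And as a toy illustration of the left/right flipping idea (nothing like
the real CSSJanus logic, which also handles shorthand values, background
positions and exclusion comments), the core swap can be sketched as:

```python
import re

def naive_css_flip(css):
    # Swap the words "left" and "right", using a placeholder so the
    # second substitution doesn't undo the first.
    css = re.sub(r"\bleft\b", "\x00", css)
    css = re.sub(r"\bright\b", "left", css)
    return css.replace("\x00", "right")

print(naive_css_flip("div { margin-left: 1em; float: right; }"))
# div { margin-right: 1em; float: left; }
```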
What do you want from me?
* Help by porting existing code! While ResourceLoader and traditional
methods of adding scripts to MediaWiki output can co-exist, the
performance gains of ResourceLoader are directly related to the amount
of software utilizing it. There's some more stuff in core that needs to
be tweaked to utilize the ResourceLoader system, such as user scripts
and site CSS. We also need extensions to start using it, especially
those we are deploying on Wikimedia sites or thinking about deploying
soon. Only basic documentation exists on how to port extensions, but
much more will be written very shortly, and we (Roan and I) will be
leading by example by porting the UsabilityInitiative extensions
ourselves. If you
need help, we're usually on IRC. (See
http://www.mediawiki.org/wiki/ResourceLoader/Getting_Started )
* Help writing new code! While wikibits.js is now also known as the
"mediawiki.legacy.wikibits" module, the functionality that it and
basically all other existing MediaWiki JavaScript code provide is being
deprecated, in favor of new modules which take advantage of jQuery and
can be written using a lot less code while eliminating the current
dependence on a large number of globally accessible variables and
functions (see
http://www.mediawiki.org/wiki/ResourceLoader/JavaScript_Deprecations )
* Some patience and understanding... Please... While we are integrating
into trunk, things might break unexpectedly. We're diligently tracking
down issues and resolving them as fast as we can, but help in this
regard is much needed and really appreciated. But most of all, we're
sorry if something gets screwed up, and we're trying our best to make
this integration smooth.
* Enthusiasm!
Documentation is coming online as fast as we can write it. There's a
very detailed design specification document at
http://www.mediawiki.org/wiki/ResourceLoader/Design_Specification and
more information in general at
http://www.mediawiki.org/wiki/ResourceLoader , where we will be adding
more and more documentation as time goes on. If you can help with
documentation, please feel free to edit boldly - just try not to modify
the design specification unless you are also modifying the software :)
While this project has been bootstrapped by Roan and myself in a branch,
we're really excited about bringing it to trunk and hope the community
can start taking advantage of the new features right away.
Tracking bug for the issues that ResourceLoader will fix:
http://bugzilla.wikimedia.org/show_bug.cgi?id=24415
Bugzilla Component:
https://bugzilla.wikimedia.org/buglist.cgi?query_format=advanced&bug_status…
- Trevor (and Roan, who's committing the merge to SVN right now)
I know there's some discussion about "what's appropriate" for the
Wikipedia API, and I'd just like to share my recent experience.
I was trying to download the Wikipedia entries for people, of
which I found about 800,000. I had a scanner already written that
could do the download, so I got started.
After running for about a day, I estimated that it would take
about 20 days to bring all of the pages down through the API (running
single-threaded). At that point I gave up, downloaded the data dump (3
hours) and wrote a script to extract the pages -- it then took about an
hour to do the extraction, gzip-compressing the text and inserting it
into a MySQL database.
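For what it's worth, the arithmetic behind that estimate checks out:
800,000 pages in 20 days works out to roughly two seconds per page, a
plausible round-trip time for single-threaded API requests.

```python
pages = 800_000
days = 20
per_second = pages / (days * 86_400)  # 86,400 seconds in a day
print("%.2f pages/s, about %.1f s per page" % (per_second, 1 / per_second))
```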
Don't be intimidated by working with the data dumps. If you've got
an XML API that does streaming processing (I used .NET's XmlReader) and
use the old unix trick of piping the output of bunzip2 into your
program, it's really pretty easy.
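To make the streaming approach concrete, here is a small sketch in
Python (an assumption on my part: the pages-articles export format with
one revision per page; the author used .NET's XmlReader, but any
streaming XML parser works the same way):

```python
import xml.etree.ElementTree as ET

def iter_pages(stream):
    # Stream (title, text) pairs out of a MediaWiki XML export without
    # ever holding the whole dump in memory.
    title = None
    for event, elem in ET.iterparse(stream, events=("end",)):
        tag = elem.tag.rsplit("}", 1)[-1]  # ignore the export namespace
        if tag == "title":
            title = elem.text
        elif tag == "text":
            yield title, elem.text or ""
        elif tag == "page":
            elem.clear()  # free the finished page's subtree
```

Fed from the pipe trick described above (bunzip2 -c
pages-articles.xml.bz2 | your-script), a loop over
iter_pages(sys.stdin.buffer) can then compress and store each page
however you like.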
I'm referring to the Foo.deps.php files which contain:
// This file exists to ensure that base classes are preloaded before
// <filename> is compiled, working around a bug in the APC opcode
// cache on PHP 5, where cached code can break if the include order
// changed on a subsequent page view.
// see http://lists.wikimedia.org/pipermail/wikitech-l/2006-January/021311.html
The referenced bug http://pecl.php.net/bugs/bug.php?id=6503 is marked
as fixed in PHP 5.1.x.
Can we remove those files, or is there something else going on with this
problem that we should be aware of?
-Niklas
--
Niklas Laxström
I was fine until now;
https://secure.wikimedia.org/wikipedia/en/wiki/Fat_Man now has the top
image consistently broken. I have checked on multiple computers, with
the same results.
Cc'ed wikitech-l. I'd post to IRC but I'm not able to access it at
the moment...
-george
On Thu, Nov 4, 2010 at 12:28 PM, Alan Liefting <aliefting(a)ihug.co.nz> wrote:
> Some are not loading for me at all this morning.
>
>
> Alan
>
>
> On 5/11/2010 6:19 a.m., Charles Matthews wrote:
>> I've noticed a very much slower rate of loading of images for several
>> days now. It's affecting the work I can do. Is this a general
>> experience, or is it perhaps my ISP?
>>
>> Charles
>>
>>
>> _______________________________________________
>> WikiEN-l mailing list
>> WikiEN-l(a)lists.wikimedia.org
>> To unsubscribe from this mailing list, visit:
>> https://lists.wikimedia.org/mailman/listinfo/wikien-l
>>
--
-george william herbert
george.herbert(a)gmail.com
Hey,
Will there be any MediaWiki representation at FOSDEM 2011? Two years back
Brion gave a presentation, and last year there was nothing about MediaWiki
AFAIK, so I'm curious as to what will happen this year.
Cheers
--
Jeroen De Dauw
* http://blog.bn2vs.com
* http://wiki.bn2vs.com
Don't panic. Don't be evil. 50 72 6F 67 72 61 6D 6D 69 6E 67 20 34 20 6C 69
66 65!
Swedish Wikipedia long ago disabled all local
uploading of files (pictures) and has moved all files
to Wikimedia Commons. When explaining Wikipedia and
Commons to newcomers, it's very frustrating to land
on the intermediary file description page on Wikipedia
before moving on to Commons, where image categories
and other extra functions are found. To address this, a
"gadget" has been installed on sv.wikipedia that changes
all image links to point directly to Commons. This is
very convenient, except that gadgets require a user
account and a personal setting, so they are not
available to the newcomers who need them.
Could we please just skip local file description pages
for everybody? Perhaps a reversed gadget could enable
them back, but the default should be for image links
to go directly to Commons. Is this possible? Does any
project already use this? What should I write in the
site request in Bugzilla?
Yes, we have discussed this, and there is broad consensus
on the Swedish Wikipedia village pump:
http://sv.wikipedia.org/wiki/Wikipedia:Bybrunnen#Avskaffa_svenska_bildbeskr…
--
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
Hi,
How do I perform cross-wiki script importing?
I tried *importScript('w:Mediawiki:rules.js');* on ml.wikibooks, but it
is not working.
Thank you.
--
Junaid P V
http://junaidpv.in