We are proud to announce the first stable release of the 1.18 series.
*********************************************************************
What's new?
*********************************************************************
MediaWiki 1.18 brings the usual host of bug fixes and new features.
jQuery 1.6.4 is now included as standard, along with several more
jQuery plugins.
Breaking changes:
* action=watch / action=unwatch now require a token
As of 1.18, some commonly used extensions are included in the release
tarball, to ease the installation of these extensions on new MediaWiki
installs. If you already use one of these extensions, just replace its
files as you have done for MediaWiki itself. The following extensions
are bundled with MediaWiki as of 1.18. All are currently in use on
Wikimedia sites.
* ConfirmEdit - Various CAPTCHA techniques to try to prevent spambots and
other automated tools from editing your wiki.
* Gadgets - A system to allow users to enable or disable JavaScript or CSS
tools made available to users site-wide.
* Nuke - A special page allowing administrators to mass-delete content
added by a spammer or vandal.
* ParserFunctions - Additional parser functions (like #if and #switch) to
supplement the "magic words" present in MediaWiki.
* Renameuser - A special page which allows authorized users to rename user
accounts.
* Vector - Enhancements to the Vector skin.
* WikiEditor - An improved and customizable editing toolbar developed
alongside the Vector skin.
Major features
--------------
Better gender support
---------------------
Until version 1.17, MediaWiki used neutral nouns to address and identify
users on their user page.
In English, this was not an issue since "User" matches both genders, but in
some languages the
neutral gender is always masculine; for example, this would cause
French-speaking female
Wikipedia users to be referred to as "Utilisateur" (male user) instead of
"Utilisatrice" (female user).
With version 1.18, user pages reflect the user's gender, if they have
specified it in their preferences.
More gender support (for instance in logs and user lists) will be available
in MediaWiki 1.19.
Improved file metadata support
------------------------------
MediaWiki now detects the camera orientation from Exif metadata, and rotates
the
picture preview accordingly. The original file remains unchanged.
The overall metadata support in MediaWiki has been greatly extended.
Previously, MediaWiki could only
extract limited Exif metadata, and showed a subset of it on file description
pages. Since 1.18, MediaWiki
can extract IPTC and XMP metadata from uploaded files, and more Exif
information. This includes an
embedded description, author information, GPS coordinates, or copyright
statement.
Improved directionality support
-------------------------------
A lot of work has been done to fix directionality bugs (left-to-right,
right-to-left). Most notably, bug 6100 is fixed, which makes it possible
to properly display an RTL interface on an LTR wiki (and vice versa).
This work was developed under $wgBetterDirectionality, a setting which is
no longer used now that the improvements have been merged into the core
code.
A positive consequence is that the page content on wikis with multiple
scripts is aligned according to the
direction of the selected variant. For example, on a Kazakh language wiki,
selecting the Arabic script
variant will align the text as RTL, while selecting the Latin or Cyrillic
variant will align it as LTR.
Easily find where to customize interface messages
-------------------------------------------------
MediaWiki allows you to customize the user interface by editing pages in the
MediaWiki namespace.
However, even though they can all be viewed at Special:AllMessages, the
sheer number of these messages makes it difficult to find the one that
needs to be customized. In MediaWiki 1.18, a new pseudo-language code
(qqx) is introduced to help people find such messages: it displays each
message's key instead of its actual text. All one has to do is append
?uselang=qqx to the page's index.php URL
(see https://www.mediawiki.org/w/index.php?title=MediaWiki_1.18&uselang=qqx
as an example).
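For scripted checks it can be handy to add the parameter
programmatically. A minimal sketch in Python (the helper's name is ours,
not part of MediaWiki):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def with_qqx(url):
    """Return the same URL with uselang=qqx added, so MediaWiki shows
    message keys instead of the localized message text."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query["uselang"] = "qqx"
    return urlunsplit(parts._replace(query=urlencode(query)))
```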
New plugin for collapsible elements
-----------------------------------
The new jQuery.makeCollapsible plugin allows you to create collapsible
tables, lists and so on, by adding the class mw-collapsible to the
elements.
See the manual for
details: https://www.mediawiki.org/wiki/Manual:Collapsible_elements
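For example, a wikitext table can be made collapsible (and start out
collapsed, via the additional mw-collapsed class) just by adding the
classes; the table content here is a hypothetical placeholder:

```
{| class="mw-collapsible mw-collapsed wikitable"
! A collapsible table
|-
| This row is hidden until the reader clicks the toggle link.
|}
```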
Protocol-relative URLs
----------------------
MediaWiki now supports protocol-relative URLs in links, interwiki targets
and $wgServer.
Protocol-relative URLs look like //example.com/wiki/Foo; the browser will
recognize this as http://example.com/wiki/Foo when following a link from
an HTTP page, and https://example.com/wiki/Foo when following a link from
an HTTPS page.
This way, protocol-relative URLs enable a wiki to support HTTP and HTTPS
while serving
the same HTML for both, which means the parser cache doesn't have to be
split.
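The resolution rule is the browser's standard relative-URL handling,
which can be illustrated with Python's urllib (the host names are made
up):

```python
from urllib.parse import urljoin

link = "//example.com/wiki/Foo"

# The scheme is inherited from the page the link appears on.
print(urljoin("http://en.example.org/wiki/Bar", link))   # http://example.com/wiki/Foo
print(urljoin("https://en.example.org/wiki/Bar", link))  # https://example.com/wiki/Foo
```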
More personalisable styles and scripts
--------------------------------------
MediaWiki now automatically loads JavaScript and stylesheets that are
specific to each user. There is a separate CSS and JS page for each user
group (MediaWiki:Group-sysop.css, MediaWiki:Group-autoconfirmed.js, etc.),
as well as a CSS page for users browsing without JavaScript
(MediaWiki:Noscript.css).
Other changes
-------------
$wgEnableDublinCoreRdf and $wgEnableCreativeCommonsRdf no longer
work in core; the functionality has been moved to the relevant
extensions.
See http://www.mediawiki.org/wiki/Extension:DublinCoreRdf and
http://www.mediawiki.org/wiki/Extension:CreativeCommonsRdf as appropriate.
Math
----
$wgUseTeX has been superseded by the Math extension. To re-enable
math conversion after upgrading, obtain the Math extension from SVN or from
http://www.mediawiki.org/wiki/Extension:Math and add to LocalSettings.php:
require_once "$IP/extensions/Math/Math.php";
Language support
----------------
As with every release, MediaWiki 1.18 brings improved support for
languages in MediaWiki, with improved translation and features for
the many supported languages.
New languages:
* Angika (anp)
* Brahui (brh)
* Central Dusun (dtp)
* Jamaican Creole English (jam)
* Khowar (khw)
* Liv (liv)
* Kichwa (qug)
API
---
API bug fixes and new features have been added to 1.18, providing
more options for input and output.
* API modules were added to access QueryPage-based special pages, to
compare pages, to revert files, and to access other special pages such
as Special:UnwatchedPages, Special:MimeSearch and Special:ActiveUsers
* The output of the generated help page has been improved
The API contains a breaking change against previous releases:
* action=watch now requires a POST request and a token.
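In practice that means a client must first fetch a watch token and then
send it in a POST body. A sketch of building the request body (the helper
name is ours; the parameter names are the API's):

```python
from urllib.parse import urlencode

def watch_post_body(title, token):
    """Build the application/x-www-form-urlencoded body for an
    action=watch POST request; the token must be fetched beforehand."""
    return urlencode({"action": "watch", "title": title,
                      "token": token, "format": "json"})
```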
Other
-----
Our thanks go to everyone who helped to improve MediaWiki
by testing the beta release and submitting bug reports and patches.
For more information about what's new in the MediaWiki 1.18 branch, see:
http://www.mediawiki.org/wiki/MediaWiki_1.18
Frequently asked questions about upgrading:
http://www.mediawiki.org/wiki/Manual:FAQ#Upgrading
Changes since 1.18.0rc1
-----------------------
* (bug 32228) Regression in Special:Search which did not preserve the
search profile on a new search
* (bug 32460) Categories were improperly aligned in Simple and CologneBlue
* (bug 32412) TOC links on [[Special:EditWatchlist]] now point to the fieldsets
* (bug 32582) Fix TOC show/hide link regression on IE 8
Release notes
-------------
Complete release notes are at
http://www.mediawiki.org/wiki/Release_notes/1.18
**********************************************************************
Download:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.0.tar.gz
Patch to previous version (1.18.0rc1), without interface text:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.0.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-i18n-1.18.0.patch.gz
GPG signatures:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.0.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.0.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.18/mediawiki-i18n-1.18.0.patch.gz.sig
Public keys:
https://secure.wikimedia.org/keys.html
Bug 24207 requests switching the math rendering preference default from its
current setting (which usually produces a nice PNG and occasionally produces
some kinda ugly HTML) to the "always render PNG" setting.
I'd actually propose dropping the rendering options entirely...
* "HTML if simple" and "if possible" produce *horrible* ugly output that
nobody likes, so people use hacks to force PNG rendering. Why not just
render to PNG?
* "MathML" mode is even *MORE* limited than "HTML if simple", making it
entirely useless.
* nobody even knows what "Recommended for modern browsers" means, but it
seems to be somewhere in that "occasionally crappy HTML, usually PNG"
continuum.
So we're left with only two sane choices:
* Always render PNG
* Leave it as TeX (for text browsers)
Text browsers will show the alt text on the images, which is... the TeX
code. So even this isn't actually needed for its stated purpose. (Hi
Jidanni! :) lynx should show the tex source when using the PNG mode.)
It's conceivable that a few folks really honestly prefer to see the latex
source in their graphical browsers (should at least do a quick stat check to
see if anybody uses it on purpose), but I wouldn't mind removing that
either.
Fancier rendering like MathJax etc should be considered as a separate thing
(and implemented a bit differently to avoid parser cache fragmentation!), so
don't let future mode concerns worry y'all. Any thoughts on whether this
makes sense to do for 1.18 or 1.19?
https://bugzilla.wikimedia.org/show_bug.cgi?id=24207#c9
-- brion
Now that the Wikipedia CAPTCHA has been comprehensively broken by
Bursztein et al. in their paper "Text-based CAPTCHA Strengths and
Weaknesses":
http://elie.im/publication/text-based-captcha-strengths-and-weaknesses
it's time to fix the current CAPTCHA system while there is still no
evidence that it is yet being automatically exploited on a large scale.
Accordingly, I've reworked the 2005-era CAPTCHA-image-generating Python
script in the CAPTCHA engine in a way that I hope should be a drop-in
replacement for the existing script.
Following the recommendations of the paper's authors, I've made several
improvements, each of which is relatively weak, but which all put
together I hope should present a defence in depth against the techniques
described in the paper.
I've also reduced the strength of some of the existing features of the
current CAPTCHA that the paper identified as not being sufficiently
effective against modern attacks.
For example, the paper shows that noise-based blurring / fragmenting of
individual characters is not an effective measure against modern shape
classifiers. This was a major feature of the previous code, which used
it to try to confuse edge-slope-based recognizers, one of the most
promising attacks being developed at the time it was written. Now that
this has been shown to be less useful than I had thought, I've backed
off quite a bit on it -- without removing it completely -- while trying
to strengthen other features of the CAPTCHA.
Similarly, the paper identifies geometric distortion alone as being a
relatively weak technique, unless combined with effective geometric
confusion and anti-segmentation measures.
So I've added the following:
* Negative kerning of the characters to join them together at the edges,
making segmentation more difficult
* The addition of a randomly-placed near-horizontal long and shallowly
curved confusion line to make the job of segmenters and shape
recognizers just that bit more difficult. This line is added in the
middle of the image stirring process, so that it is not aligned either
with the text or the output raster, and should thus not help either
undoing the distortion or recovering the text baseline, while still
breaking up the topology and geometry of the text.
* More stages of more subtle image stirring, intended solely to add
enough extra geometric distortion and character-outline disruption to
make undoing the two measures above more difficult.
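The negative-kerning step can be sketched as plain arithmetic: each
glyph after the first is pulled back by a random fraction of its
predecessor's width, so adjacent glyphs touch or overlap. This is an
illustrative sketch, not the actual script's code; the widths and
overlap fractions are made up:

```python
import random

def x_offsets(char_widths, min_overlap=0.10, max_overlap=0.25, seed=None):
    """Left edges for each glyph, with negative kerning: every glyph
    after the first is shifted left by a random fraction of the
    previous glyph's width, making character segmentation harder."""
    rng = random.Random(seed)
    xs, x = [], 0
    for i, width in enumerate(char_widths):
        if i > 0:
            x -= int(rng.uniform(min_overlap, max_overlap) * char_widths[i - 1])
        xs.append(x)
        x += width
    return xs
```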
To counteract the reduction in human readability, I've upped the default
font size a bit, and I now suggest a serif font such as Droid Serif
instead of the previous sans-serif font.
As ever, there are a vast number of twiddly arbitrary parameters in the
code, which I have determined entirely by trial and error, in an attempt
to balance the anti-machine-recognition measures with one another, while
maintaining reasonable levels of human readability. There is plenty of
scope for adjusting these.
The current results look to me like they are more likely to be resistant
to the attacks described in the paper than the current code, but I'd be
interested in getting some more eyes on the problem.
Would anyone be interested in taking a look at the code and some sample
output?
-- Neil
In https://bugzilla.wikimedia.org/enter_bug.cgi I don't see any
appropriate category for gadgets like Twinkle or Image Annotator. Let
me know if I'm missing something.
Roan, Timo and others have been working towards making gadgets
manageable through a shared repository ( see
http://www.mediawiki.org/wiki/ResourceLoader/Version_2_Design_Specification
). As part of moving towards better systems for publishing,
internationalizing and sharing gadgets, I suggest we also standardize
how we track gadget bugs.
Analogous to the "MediaWiki extensions" product, I suggest that we create
a "MediaWiki gadgets" category, initially with only an "[Other]"
component.
We could make it part of the process of publishing a gadget to a shared
repository on MediaWiki.org that the gadget receives a Bugzilla
component, and that its initial author is added as a default CC for it.
Thoughts?
Erik
--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
Hi everyone,
Here's an attempt at putting together a more sophisticated (though not
necessarily correct) goal for 1.19:
https://docs.google.com/a/wikimedia.org/spreadsheet/ccc?key=0Agte_lJNpi-OdD…
The goal was not to make something 100% accurate based on our past
record, but to come up with a goal that's informed by our history, but
also nudges us a little to get better than before.
During the 1.18 cycle, our goals were based on a linear review rate
(same number of revisions every day). That may have been too easy on
us in the early going. What was clear looking at the numbers from
last time around was that, when focused on code review, the curve
looked a lot more like an exponential decay curve (more reviews in the
early going, tapering off toward the end). That's assuming that our
backlog increase in July was an anomaly that we won't repeat.
So, the 1.19 cycle has new review goals that are based on an exponential
decay curve, with a linear fudge factor built in so that we don't just
approach zero, but actually make it there. Fixmes, however, did seem to
have a linear rate during the 1.18 cycle, so I left our goals for those
linear for this release.
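A sketch of what such a target curve could look like; the half-life and
the shape of the linear fudge are our assumptions, not the actual
spreadsheet's formula:

```python
import math

def review_target(day, backlog, days_total, half_life=14.0):
    """Remaining-backlog target for a given day: exponential decay
    (fast progress early, tapering off later), minus a linear term
    sized so the target actually reaches zero on the final day."""
    decay = backlog * math.exp(-math.log(2) * day / half_life)
    # the residue the pure decay would still leave at the end,
    # paid down evenly over the whole cycle
    tail = backlog * math.exp(-math.log(2) * days_total / half_life)
    return max(0.0, decay - tail * day / days_total)
```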
I've accounted for a plateau in the last two weeks of December when
many WMF staff will likely be taking holiday vacation.
The goal is to get done with review by January 31, 2012. While this
is much later than we were previously hoping for, it's still pretty
aggressive given the backlog we have. This assumes that the huge
swath of revisions that Hashar marked "deferred" on Friday all stay
deferred, for example, and that we'll still find other pockets of
revisions that we can use to knock off 50+ revs a day in the early
going.
Does this look like a workable goal to everyone? Assuming so, we'll
report our progress against this goal, and plan the 1.19 deploy for
early February.
Rob
> Message: 2
> Date: Wed, 16 Nov 2011 01:48:48 +0900
> From: Philip Chang <pchang(a)wikimedia.org>
> Subject: [Wikitech-l] Final mobile switch-over
> To: wikipedia-l(a)lists.wikimedia.org, wiktionary-l(a)lists.wikimedia.org,
> wikiquote-l(a)lists.wikimedia.org, textbook-l(a)lists.wikimedia.org,
> wikisource-l(a)lists.wikimedia.org, wikinews-l(a)lists.wikimedia.org,
> wikiversity-l(a)lists.wikimedia.org, wikispecies-l(a)lists.wikimedia.org,
> wikitech-l(a)lists.wikimedia.org, commons-l(a)lists.wikimedia.org,
> incubator(a)lists.wikimedia.org
> Message-ID:
> <CAE1jH8LD=drV79wNb4iJMGm9_XYxCh7b5rPFEpKyhfY+gmuuiQ(a)mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> TO: All Wikimedia Project Administrators
>
>
> As a follow-up to this original blog post about the new mobile site:
>
> http://blog.wikimedia.org/2011/09/14/new-mobile-site-launched-on-wikipedia-…
>
> Please note that the conversion to the new Mobile Frontend extension will
> be completed in the week of November 28.
>
> This means that by default any user of a mobile device will see the mobile
> interface rather than the desktop version. Users will no longer have to add
> the extra .m by hand.
>
> This also means any home pages not yet designed for mobile viewing will
> appear with a search bar only - unless you create a home page in the next
> three weeks, which is very easy to do!
>
> Instructions for creating a home page are here:
>
> http://meta.wikimedia.org/wiki/Mobile_Projects/Mobile_Gateway#Mobile_homepa…
>
> Please forward this email as necessary.
>
> As always, the mobile-l and mobile-feedback-l mailing lists are available.
> There will be many more announcements in the coming months on mobile-l, and
> always feel free to send comments to mobile-feedback-l.
>
> Thank you.
>
> Phil
>
> --
> Phil Inje Chang
> Product Manager, Mobile
> Wikimedia Foundation
> 415-812-0854 m
> 415-882-7982 x 6810
Surely it would make more sense for the mobile frontend to detect that
none of the IDs match the pattern it looks for, and to treat the main
page as a normal page in that case, instead of totally and utterly
breaking if the admin didn't customize the front page properly?
-bawolff
Hello!
I don't know if the subject of this question belongs in the scope of this
group. Anyway, I will be pleased if I can find an answer to my question.
I'm writing some Java code to perform NLP tasks on texts using Wikipedia.
What can I do to extract the first paragraph of a Wikipedia article?
Thanks a lot.
Truly yours
Ben Sidi Ahmed
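One common way to get the lead section is through the MediaWiki web API
rather than by parsing dumps: action=parse with section=0 returns the
HTML of the zeroth (lead) section, from which the first <p> element can
be taken with any HTML parser. A sketch (shown in Python for brevity;
the same HTTP request works from Java) that only builds the request URL:

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def lead_section_url(title):
    """URL for an API request returning, as JSON, the HTML of the
    article's lead (zeroth) section."""
    params = {"action": "parse", "page": title, "section": 0,
              "prop": "text", "format": "json"}
    return API + "?" + urlencode(params)
```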