All,
When we end up moving MW core to Phabricator I'd like us to jettison
our history. The repo is large and clunky and not conducive to
development. It's only going to grow in size unless we do something to
cut back on the junk we're carrying around.
This is my ideal Phabby world:
mediawiki (no /core, that was always redundant)
mediawiki/i18n (as submodule)
mediawiki/historical (full history, previous + all mediawiki going forward)
If we jettison all our history we can get the repo size down to
30-35MB, which is very nice. Doing it on Gerrit isn't worthwhile
because it'd basically break everything. We're gonna be breaking
things with the move to Phab... it's then or never if we're going to
do this.
Being able to stitch with the old history would be nice, and I think
it might be doable with git-replace. If not, I still think it's worth
discussing for developer and deployer productivity.
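For the curious, the git-replace idea can be sketched end to end. The following is a toy demonstration, not the actual migration plan: the repo names, paths, and commit messages are made up, with `historical` standing in for a `mediawiki/historical` repo. A trimmed repo's root commit is grafted onto the tip of the full-history repo with `git replace --graft` (Git 2.0+), after which `git log` walks the complete history again. Python here only drives git.

```python
import os
import subprocess
import tempfile

def git(*args, cwd):
    """Run a git command in the given directory and return its stdout."""
    return subprocess.run(
        ["git", *args], cwd=cwd, check=True,
        capture_output=True, text=True,
    ).stdout

base = tempfile.mkdtemp()
ident = ["-c", "user.name=demo", "-c", "user.email=demo@example.org"]

# "historical" stands in for the repo keeping the full old history.
hist = os.path.join(base, "historical")
git("init", "-q", hist, cwd=base)
for msg in ["old commit 1", "old commit 2"]:
    git(*ident, "commit", "-q", "--allow-empty", "-m", msg, cwd=hist)
old_tip = git("rev-parse", "HEAD", cwd=hist).strip()

# "trimmed" stands in for the jettisoned-history repo going forward.
trimmed = os.path.join(base, "trimmed")
git("init", "-q", trimmed, cwd=base)
for msg in ["new root", "new commit"]:
    git(*ident, "commit", "-q", "--allow-empty", "-m", msg, cwd=trimmed)
new_root = git("rev-list", "--max-parents=0", "HEAD", cwd=trimmed).strip()

# Fetch the old history, then graft the trimmed root onto its tip.
git("fetch", "-q", hist, "HEAD:refs/old-history", cwd=trimmed)
git(*ident, "replace", "--graft", new_root, old_tip, cwd=trimmed)

# git log now walks all four commits as one continuous history.
print(len(git("log", "--oneline", cwd=trimmed).splitlines()))
```

Because the graft lives in `refs/replace/`, it is purely local unless pushed, so deployers who never fetch the replacement refs still get the small repo.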
Thoughts?
-Chad
Hello everyone,
I am happy to announce the availability of the first stable release of the new MediaWiki 1.23 release series.
MediaWiki 1.23 is a large release that contains many new features and bug fixes. This is a summary of the major changes of interest to users. You can consult the RELEASE-NOTES-1.23 file for the full list of changes in this version.
This is a Long Term Support release (LTS) and will be supported until May 2017.
Our thanks to everyone who helped to improve MediaWiki by testing the release candidates and submitting bug reports.
== What's new? ==
* MediaWiki 1.23 includes all changes released in the smaller 1.23wmfX software deployments to Wikimedia sites.
=== Skin autodiscovery deprecated ===
Skin autodiscovery, the legacy skin installation mechanism used by MediaWiki since very early versions (around 2004), has been officially deprecated and will be removed in MediaWiki 1.25.
* MediaWiki 1.23 will emit warnings in production if a skin using the deprecated mechanism is found.
* See Manual:Skin autodiscovery for more information and a migration guide for site admins and skin developers.
=== Notifications ===
With 1.23, MediaWiki starts to behave more like a modern website with regard to notifications, keeping the editors of your wiki engaged and always up to date about what interests them. This used to require several custom settings.
* (bug 45020) Make preferences "Add pages I create and files I upload to my watchlist" and "pages and files I edit" true by default.
* (bug 45022) Make preference "Email me when a page or file on my watchlist is changed" true by default.
* (bug 49719) Watch user page and user talk page by default.
This will allow your new users to immediately start benefiting from the watchlist and email notification features, without first having to read all the docs to discover how useful they are.
=== Merged extensions ===
Merged into 1.23:
* ExpandTemplates (bug 28264).
* AssertEdit (bug 27841) - documented at API:Assert.
=== Interface ===
* (bug 42026) Add option to only show page creations in Special:Contributions (and API).
* Add new special page to list duplicate files, Special:ListDuplicatedFiles.
* (bug 60333) Add new special page listing tracking categories (Special:TrackingCategories).
=== Editing ===
* A new special page Special:Diff was added, allowing users to create internal links to revision comparison pages using syntax such as Special:Diff/12345, Special:Diff/12345/prev or Special:Diff/12345/98765.
=== Help pages ===
With 1.23, MediaWiki begins a process of consolidating its help pages. Most now use the Translate extension and can be easily translated and updated in hundreds of languages.
In the coming months, we'll focus on making more of the central help pages translatable and on linking them from the relevant MediaWiki interfaces for better discoverability. Please help: add your own translations; update existing pages and cover missing MediaWiki topics.
Traditionally, help pages have been scattered on countless wikis and poorly translated; most of those on mediawiki.org were migrated with the help of some Google Code-in students.
=== CSS refresh for Vector ===
* Various Vector CSS properties have been converted to LESS variables.
* The font size of <code>#bodyContent</code>/<code>.mw-body-content</code> has been increased to 0.875em.
* The line-height of <code>#bodyContent</code>/<code>.mw-body-content</code> has been increased to 1.6.
* The line-height of superscript (sup) and subscript (sub) are now set to 1.
* The default color for content text (but not the headers) is now #252525 (dark grey).
* All headers have updated sizes and margins.
* H1 and H2 headers now use a serif font.
* Body font is "sans-serif" as always.
For more information see Typography refresh.
=== Configuration ===
Add Config and GlobalConfig classes:
* Allows configuration options to be fetched from context.
* Only one implementation, GlobalConfig, is provided, which simply returns $GLOBALS[$name]. More classes may be added in the future, possibly a database-backed one. For convenience the "wg" prefix is added automatically.
* This adds the $wgConfigClass global variable which is used to determine which implementation of Config to use by default.
* The ContextSource getConfig and setConfig methods were introduced.
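For illustration, the shape of the new interface is roughly as follows. MediaWiki itself is PHP; this is a hypothetical Python analogue in which a plain dict stands in for PHP's `$GLOBALS`, so only the class and method names (`Config`, `GlobalConfig`, `get`) mirror the release notes, and everything else is illustrative.

```python
# Illustrative analogue of 1.23's Config/GlobalConfig (the real code is PHP).
GLOBALS = {"wgSitename": "MyWiki", "wgLanguageCode": "en"}  # stand-in for $GLOBALS

class Config:
    """Abstract accessor for configuration options."""
    def get(self, name):
        raise NotImplementedError

class GlobalConfig(Config):
    """Default implementation: reads globals, adding the "wg" prefix."""
    def get(self, name):
        return GLOBALS["wg" + name]

# A context hands out whichever implementation $wgConfigClass selects;
# GlobalConfig is the only one shipped in 1.23.
config = GlobalConfig()
print(config.get("Sitename"))
```

The point of the indirection is that callers ask the context for settings instead of touching globals directly, which is what would let a future database-backed implementation slot in unchanged.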
Full release notes:
https://git.wikimedia.org/blob/mediawiki%2Fcore.git/1.23.0/RELEASE-NOTES-1.…
https://www.mediawiki.org/wiki/Release_notes/1.23
**********************************************************************
Download:
http://download.wikimedia.org/mediawiki/1.23/mediawiki-1.23.0.tar.gz
GPG signatures:
http://download.wikimedia.org/mediawiki/1.23/mediawiki-1.23.0.tar.gz.sig
Public keys:
https://www.mediawiki.org/keys/keys.html
Markus Glaser
(Release Team)
Hello and welcome to the latest edition of the WMF Engineering Roadmap
and Deployment update.
The full log of planned deployments next week can be found at:
<https://wikitech.wikimedia.org/wiki/Deployments#Week_of_June_9th>
A quick list of notable items...
== Monday ==
* ElasticSearch
** There will be an upgrade of the backend/server software of the new
search system (to ElasticSearch 1.2.1). This should not have any
user-facing impact.
== Tuesday ==
* MediaWiki deploy
** group1 to 1.24wmf8: All non-Wikipedia sites (Wiktionary, Wikisource,
Wikinews, Wikibooks, Wikiquote, Wikiversity, and a few other sites)
** <https://www.mediawiki.org/wiki/MediaWiki_1.24/wmf8>
* Wikiquote/Wikidata: Wikiquote will now have access to data in
Wikidata.
== Thursday ==
* MediaWiki deploy
** group2 to 1.24wmf8 (all Wikipedias)
** group0 to 1.24wmf9 (test/test2/testwikidata/mediawiki)
* MediaViewer
** Will be enabled by default on all wikis
** <https://www.mediawiki.org/wiki/Multimedia/Media_Viewer/Release_Plan#Timeline>
Thanks and as always, questions and comments welcome,
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
One of Dan's next steps is coordinating with a few language wikis for a
~month-long public rollout test period, similar to what MediaViewer did.
I agree with you, Erik, that there should be a way for readers to
disable it; the team is currently thinking about possible alternatives,
as well as making sure that logged-in users have full control over using
Hovercards, Navigation Popups, or turning off link previews altogether.
Jan, we can possibly plan on including the Sverige wiki in the pilot
program if that works for Dan and the community there; however, the beta
feature is really geared more toward content wikis (Wikipedia,
Wiktionary) than toward other wiki types.
*Jared Zimmerman * \\ Director of User Experience \\ Wikimedia Foundation
M +1 415 609 4043 \\ @jaredzimmerman <http://loo.ms/g0>
On Fri, Jun 6, 2014 at 6:44 AM, Jan Ainali <jan.ainali(a)wikimedia.se> wrote:
> Very nice!
>
> You can turn it on at the Swedish chapter wiki, se.wikimedia.org
>
>
> *Best regards, Jan Ainali*
>
> CEO, Wikimedia Sverige <http://se.wikimedia.org/wiki/Huvudsida>
> +46 729 67 29 48
>
>
> *Imagine a world where every human being has free access to the sum of
> all human knowledge. That is what we do.*
> Become a member. <http://blimedlem.wikimedia.se>
>
>
>
> 2014-06-06 0:10 GMT+02:00 Nick Wilson (Quiddity) <nwilson(a)wikimedia.org>:
>
>> The Hovercards Beta Feature [1] is inspired by the Navigation Popups
>> gadget. The goal of Hovercards is to make the reading experience better
>> for our readers. When readers hover over a link to another article, they
>> are shown a short summary and an image, and can decide whether they need
>> to visit that article in full before continuing with the current one.
>>
>> We are looking for local project wikis that are interested in turning on
>> the Hovercards feature for all readers, ahead of the feature's future
>> graduation from Beta Features. Please let us know if your wiki might be
>> interested.
>>
>> We're also looking for additional feedback on all aspects of the
>> extension.[2]
>>
>> -
>>
>> [1] https://www.mediawiki.org/wiki/Beta_Features/Hovercards
>> [2] https://www.mediawiki.org/wiki/Talk:Beta_Features/Hovercards
>>
>> Sidenotes:
>> * The recent flickering bug (primarily in Firefox) was fixed today.
>> * There is a Gerrit patch examining the conflict between Navigation
>> Popups and Hovercards. https://gerrit.wikimedia.org/r/#/c/120188/
>> * We also plan to suggest some potential updates to the styles in the
>> original Navpopups gadget, so that it has better usability and is visually
>> consistent with Hovercard and Reference-tooltip styles.
>>
>> --
>>
>> Quiddity
>> Community Liaison
>>
>> _______________________________________________
>> Wikitech-ambassadors mailing list
>> Wikitech-ambassadors(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-ambassadors
I'd like to restart the conversation about hardening Wikipedia (or
possibly Wikimedia in general) against traffic analysis. I brought
this up ... last November, I think, give or take a month? but it got
lost in a larger discussion about HTTPS.
For background, the type of attack that it would be nice to be able to
prevent is described in this paper:
http://sysseclab.informatics.indiana.edu/projects/sidebuster/sidebuster-fin…
Someone is eavesdropping on an encrypted connection to
LANG.wikipedia.org. (It's not possible to prevent the attacker from
learning the DNS name and therefore the language the target reads,
short of Tor or similar. It's also not possible to prevent them from
noticing accesses to ancillary servers, e.g. Commons for media.) The
attacker's goal is to figure out things like
* what page is the target reading?
* what _sequence of pages_ is the target reading? (This is actually
easier, assuming the attacker knows the internal link graph.)
* is the target a logged-in user, and if so, which user?
* did the target just edit a page, and if so, which page?
* (... y'all are probably better at thinking up these hypotheticals than me ...)
Wikipedia is different from a tax-preparation website (the case study
in the above paper) in that all of the content is public, and edit
actions are also public. The attacker can therefore correlate their
eavesdropping data with observations of Special:RecentChanges and the
like. This may mean it is impossible to prevent the attacker from
detecting edits. I think it's worth the experiment, though.
What I would like to do, in the short term, is perform a large-scale
crawl of one or more of the encyclopedias and measure what the above
eavesdropper would observe. I would do this over regular HTTPS, from
a documented IP address, both as a logged-in user and an anonymous
user. This would capture only the reading experience; I would also
like to work with prolific editors to take measurements of the traffic
patterns generated by that activity. (Bot edits go via the API, as I
understand it, and so are not reflective of "naturalistic" editing by
human users.)
With that data in hand, the next phase would be to develop some sort
of algorithm for automatically padding HTTP responses to maximize
eavesdropper confusion while minimizing overhead. I don't yet know
exactly how this would work. I imagine that it would be based on
clustering the database into sets of pages with similar length but
radically different contents. The output of this would be some
combination of changes to MediaWiki core (for instance, to ensure that
the overall length of the HTTP response headers does not change when
one logs in) and an extension module that actually performs the bulk
of the padding. I am not at all a PHP developer, so I would need help
from someone who is with this part.
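To make the padding idea concrete, here is one possible scheme (a sketch, not something the proposal commits to): round every response length up to the nearest of a geometric series of bucket sizes, so that all pages within a bucket are indistinguishable by length on the wire. The base size and growth factor below are arbitrary placeholders.

```python
import math

def bucket_size(length: int, base: int = 4096, factor: float = 1.25) -> int:
    """Smallest bucket >= length, with buckets growing geometrically.

    Geometric buckets bound the padding overhead at (factor - 1),
    i.e. at most 25% here, while collapsing many distinct page
    lengths onto each bucket boundary.
    """
    if length <= base:
        return base
    # number of geometric steps needed above the base bucket
    steps = math.ceil(math.log(length / base, factor))
    return math.ceil(base * factor ** steps)

def pad(body: bytes) -> bytes:
    """Pad a response body up to its bucket size.

    NUL bytes are used here for simplicity; a real implementation
    would hide the padding somewhere harmless, e.g. in an HTTP
    header or an HTML comment.
    """
    return body + b"\0" * (bucket_size(len(body)) - len(body))

assert len(pad(b"x" * 100)) == 4096                     # small pages share one bucket
assert len(pad(b"x" * 5000)) == len(pad(b"x" * 5100))   # nearby lengths collapse
```

The clustering step described above would essentially be choosing bucket boundaries so that each bucket contains many pages with radically different contents, rather than using a fixed geometric series.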
What do you think? I know some of this is vague and handwavey but I
hope it is at least a place to start a discussion.
zw
The Hovercards Beta Feature [1] is inspired by the Navigation Popups
gadget. The goal of Hovercards is to make the reading experience better
for our readers. When readers hover over a link to another article, they
are shown a short summary and an image, and can decide whether they need
to visit that article in full before continuing with the current one.
We are looking for local project wikis that are interested in turning on
the Hovercards feature for all readers, ahead of the feature's future
graduation from Beta Features. Please let us know if your wiki might be
interested.
We're also looking for additional feedback on all aspects of the
extension.[2]
-
[1] https://www.mediawiki.org/wiki/Beta_Features/Hovercards
[2] https://www.mediawiki.org/wiki/Talk:Beta_Features/Hovercards
Sidenotes:
* The recent flickering bug (primarily in Firefox) was fixed today.
* There is a Gerrit patch examining the conflict between Navigation
Popups and Hovercards. https://gerrit.wikimedia.org/r/#/c/120188/
* We also plan to suggest some potential updates to the styles in the
original Navpopups gadget, so that it has better usability and is
visually consistent with Hovercard and Reference-tooltip styles.
--
Quiddity
Community Liaison
Hi,
I need some help with my GSoC project, in which I need to implement an
OAuth login system for a browser-based plugin so we can identify users.
I am stuck and not able to get anywhere with this:
https://github.com/apsdehal/Mediawiki-login-api/blob/master/lib/OAuth/MWOAu…
Kindly help me, and tell me if further information is needed.
--
Amanpreet Singh,
IIT Roorkee