I followed the instructions in the announcement email:
"To do a simple Git clone, the command is:
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git"
However, this gives me version 1.20alpha.
'git branch' shows no branches other than master.
I can't find any documentation at mediawiki.org.
Do I need to clone a different repository?
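(For reference, a plain 'git branch' only lists local branches; remote release branches, if any exist, should show up with -r. A minimal sketch — the REL1_19 branch name is an assumption, check the actual output first:)

```shell
# Clone the core repository, as given in the announcement email.
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
cd core

# 'git branch' without options only shows local branches;
# -r lists the branches on the remote.
git branch -r

# Check out a release branch locally, if one exists
# (the branch name REL1_19 here is an assumption).
git checkout -b REL1_19 origin/REL1_19
```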
~~John Laurino
Hello!
Jenkins is somehow back in action. I have disabled IRC notification to
avoid any possible spam over the weekend.
Jenkins is connected directly to Gerrit and gets notified whenever a
change is submitted in any repository.
Whenever a change is made to mediawiki/core.git, Jenkins triggers the
MediaWiki-GIT-Fetching job, which fetches the change. Then, two
other jobs are triggered on that code:
- MediaWiki-Tests-Databaseless
-> runs tests using make databaseless and an empty LocalSettings.php
- MediaWiki-Tests-Parser
-> runs the parser tests using a sqlite backend
Over the course of next week, I will create the remaining jobs.
You can have a look at the current status at:
https://integration.mediawiki.org/ci/job/MediaWiki-GIT-Fetching/
For each build, it lists the Gerrit change number and patchset. That
should give a good overview of what is going wrong.
Known bugs:
- somehow, test results from child jobs are not aggregated back up to
the main MediaWiki-GIT-Fetching job.
- there might be some race conditions when several changes are
submitted at the same time.
--
Antoine "hashar" Musso
Hi,
I was playing with the properties described at
https://www.mediawiki.org/wiki/API:Properties#Info:_Parameters
I have spent a lot of time, but I cannot get the watched and preload
properties at all. preload is empty for an existing page and None for a
non-existing page with a preload; watched does not appear at all. I tried
it in the browser as well as with a bot, logged in as admin. Could somebody
please provide me with a working link that shows watched and preload?
An example page with preload:
<http://hu.wikipedia.org/w/index.php?action=edit&preload=Wikip%C3%A9dia%3AJa…>
I am interested in new pages in the MediaWiki namespace where a text is
preloaded but without a preload subpage (I guess it is taken from
translatewiki). Do they qualify as preload, and can they be retrieved via
the API?
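(For what it's worth, a request along these lines is what the documentation page above seems to describe; the inprop values, the example title, and the requirement to be logged in for watched are assumptions drawn from that page, not verified behaviour:)

```shell
# Ask the API for page info, requesting the watched and preload properties.
# 'watched' is presumably only returned for a logged-in user who watches
# the page, and 'preload' only where a preloaded text applies (both are
# assumptions here). %7C is the URL-encoded '|' separator.
curl 'https://hu.wikipedia.org/w/api.php?action=query&prop=info&inprop=watched%7Cpreload&titles=Kezd%C5%91lap&format=json'
```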
--
Bináris
Hi everyone,
Git day is upon us (actual cutover happened a few hours ago). Chad,
Antoine, Roan, Sumana and many others made a heroic effort to get this
pushed out while many of us were instead doing budget stuff (Sumana
gets bonus points for doing both budget stuff and Git stuff). My role
in this process: copy and paste this email out of Etherpad ;-)
Important documentation:
* https://www.mediawiki.org/wiki/Git <- The hub
* Requesting an account (Git and Labs share account infrastructure):
https://www.mediawiki.org/wiki/Project:Labsconsole_accounts
* The list of repositories: https://gerrit.wikimedia.org/r/#admin,projects
* The list of extensions that have moved:
https://gerrit.wikimedia.org/mediawiki-extensions.txt
* How we're dealing with Gerrit project ownership:
http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/59681
(give us a day to set up the page/templates for requesting Gerrit
project ownership)
Since a lot of people are likely curious, now is a good time to talk
about what our deployment strategy is going to be in the short term,
and how this migration affects our plans moving forward.
In the short term, we're still deploying from 1.19wmf1 in SVN.
Therefore, things that need immediate deployment will need to be
manually merged back to the deployment branch (and all non-urgent
deployments should hang on until we finish the work here). We plan to
be a little more relaxed for this short period about things going
directly into 1.19wmf1. Deployment from fenari doesn't change yet.
We're currently planning security releases for 1.17, 1.18, and 1.19.
These will be released from git.
In the medium term, we plan to have far more frequent deployments,
starting as early as April 9. More in a separate email on the
subject.
Moving over the deployment process and ironing out the remaining
issues with our current migrated projects is Chad and Antoine's number
one priority right now, so there should be significant continuing
progress on this over the next several days. General workflow issues
are Chad's responsibility. The work to support Translatewiki is well
underway, and Antoine is taking the lead with that.
Thanks for your patience with this move. We hope you enjoy working
with Git (or, if you're currently a skeptic, at least come to
appreciate it). With the combination of Git and the workflow changes
it enables, we're pretty excited by our new ability to deploy code
more frequently, and we're pretty optimistic that we'll be able to
actually get that benefit sooner rather than later.
Rob (and Chad, and Sumana)
Markus-
Congrats, because SMW 1.7 is a real bellringer - can't wait to upgrade. I
was thinking that "named subobjects" can parallel wiki pagenames,
[[#interwiki:en:ns:pgnm]]. Some namespaces it would be nice to reserve are
#Dublin Core:, #OWL:, #SKOS:, #Topic Map:, etc., with their information
elements easily defined. And should SF be able to associate forms with
subobject namespaces (and categories), it seems a page can maintain a
virtual wiki. What a coup!
Topic maps distinguish type, subjects, scopes, names, associations &
occurrences - an ISO model of an index at the back of any reference book. A
minimalist implementation is to associate subjects, like categories, in a
page's #DublinCore:Subject property. Subjects can be SKOS objects defined in
Subject: namespace, where eg the Library of Congress Subject Headings are
defined. Markup is just [[Subject:subjectname]] on any page or within any
subobject. [By within, I mean a #^text property for content that's copied
into a page should it be "moved" from subobject to page -- this text can
contain category and subject markup too]. To be fully wiki,
{{pgnm#DublinCore}} might transclude the same material as
{{pgnm/DublinCore}} using this #^text property.
Their idea of "type" in <xtm:scope>:<xtm:type>:<xtm:name> maps to wiki
namespaces more naturally than categories can. Today, categories confuse
plural lists of things with singular types of thing, causing a few practical
problems. Singular nouns seem best defined in Type namespace with instances
located in an appropriately named MW namespace or MW namespace alias. Note,
MW can handle 64K namespaces, maybe for a seriously good dictionary of
common nouns and noun-phrases like Wiktionary?
A few grammatical ideas. A tilde unary operator that does a "like" page or
object selection, and filters for categories and subjects that apply to
pages & subobjects alike, seems valuable. A magic word {{ANAMESPACE}} ("A"
prefix = "A-box") might be nice too to identify a subobject's namespace.
page filters
[[~{{FULLPAGENAME}}#DublinCore:]]
[[Category:Active]][[Subject:LCSH Science]]
subobject filters
[[~{{FULLPAGENAME}}#DublinCore:]]
[[#Category:Active]][[#Subject:LCSH Science]]
both filters
[[{{NAMESPACE}}:+]]
[[Category:Active ]][[Subject:LCSH Science]]
[[#Category:Active]][[#Subject:LCSH Science]]
One alternative subobject naming is {{FULLPAGENAME}}#{{FULLPAGENAME}}, which
is a Dublin Core object, i.e. [{{FULLPAGENAME}}#{{FULLPAGENAME}}]
|?Contributor|?Coverage...|?Subject|?Title|?Type
So [[Subject:pgnm]] markup for a page can be stored in the Subject property
in a {{FPN#FPN}} or {{FPN#DublinCore}} object. Another approach is for
MediaWiki to build a Category-like mechanism, with its own set of tables and
indexes, for Subjects. This may boil down to which part of the wiki stack a
[[Subject:pgnm]] tag is best processed in. I think [[Subject:pgnm]] could be
a useful addition to wiki markup, so I hope that doesn't get obscured.
Wikidata may want to incorporate ideas concerning Topic Maps -
strategically it's possibly an easy big win, so I'll bother them too.
>-----Original Message-----
>From: Markus Krötzsch [mailto:markus@semantic-mediawiki.org]
>Sent: Wednesday, March 21, 2012 7:43 AM
>To: jmcclure(a)hypergrove.com
>Subject: Re: [Wikitech-l] Topic Maps
>
>
>On 21/03/12 13:06, John McClure wrote:
>> (1) official ISO versions are purchaseable while unofficial
>versions are
>> foss.
>
>Ok, that's a pity. A standard that nobody can read is not so
>helpful ...
>We already had this issue with other ISO standards before. In the end,
>people just ignore them because they do not get to them.
>
>> (2) there is a TMAPI which I think is conceptual not a
>software library. Use
>> XML parser.
>
>I was more thinking of a library to represent and manipulate
>topic maps
>in a program. Parsing it is one thing, but representing TMs as
>a kind of
>DOM tree (that you would get from XML) would not be so great I guess.
>
>> (3) Drupal has an extension now. There are a few others.
>> (4) again, there's a QL but it's far more SQL - check it
>with XTM for data
>> structures.
>
>Ok, so there is a standard mapping from TM to RDB that allows
>SQL to be
>used for querying?
>
>> Personally, topic maps should be implemented in WP using
>SMW's subobjects.
>
>Ah, so I suppose you are viewing TMs more as a kind of
>conceptual model
>than as a concrete standard with a fixed syntax/semantics. Fair enough.
>
>Regards
>
>Markus
>
>>
>>> -----Original Message-----
>>> From: Markus Krötzsch [mailto:markus@semantic-mediawiki.org]
>>> Sent: Wednesday, March 21, 2012 3:28 AM
>>> To: jmcclure(a)hypergrove.com
>>> Subject: Re: [Wikitech-l] Topic Maps
>>>
>>>
>>> [Off-list, maybe off-topic too ;-)]
>>>
>>> Hi John,
>>>
>>> I would actually like to learn a bit more about topic maps.
>Could you
>>> maybe help me to answer some of the following questions?
>>>
>>> (1) Where can I see the official standard? All the documents that I
>>> found start by saying that they are not the ISO standard.
>Where is the
>>> official spec?
>>>
>>> (2) Which software libraries support topic maps? (parsing,
>API, etc.)
>>> For which other tasks is there support in existing software
>(authoring
>>> tools? database systems? reasoners?).
>>>
>>> (3) What are the major users of topic maps today? Where are
>they used?
>>>
>>> (4) How does one query information that is stored in topic maps? (In
>>> other words: which kinds of data access are supported?)
>>>
>>> Thanks,
>>>
>>> Markus
>>>
>>>
>>> On 21/03/12 00:53, John McClure wrote:
>>>> Adding Topic Maps to MW base software could be a winner --
>>> it can generate a
>>>> wiki-site map (some think WP needs one!); it can be used to
>>> corelate the
>>>> contents of documents loaded into a wiki (like conference
>>> proceedings) with
>>>> a wiki's topic map; and would make a cool tool for any page
>>> in a wiki, most
>>>> clearly on a user page. It's perhaps a smart strategic move
>>> - ISO 13250
>>>> Topic Maps are the fruit of SGML/Hytime n-ary models that
>>> 'lost' to RDF
>>>> triples back when. Being a superset of RDF, TMs can type
>associations
>>>> between articles while capturing all infobox data.
>>>>
>>>> Topic maps may be a compelling FUNCTIONAL upgrade for MW as
>>> it captures
>>>> subjects of an article for the first time. Given topic-map
>>> to RDF transforms
>>>> amid continuing W3 research, this could be enough for the
>>> semantic world. By
>>>> adopting say the Lib of Congress' Subject Headings, a wiki
>>> like Wikipedia
>>>> could play an important role in the semantic web. The
>>> current situation with
>>>> Wikipedia is that it's hard to have a large library of
>>> information without a
>>>> subject catalogue... right now, wikis have an author
>>> catalogue sort of, fine
>>>> for smaller handcrafted wikis, but doesn't scale well for many.
>>>>
>>>> Since other platforms now have maturing topic map extensions,
>>> I'm worried about the
>>>> impact on wikis of not having that technology.
>>>>
>>>> John McClure
>>>>
>>>>
>>>> _______________________________________________
>>>> Wikitech-l mailing list
>>>> Wikitech-l(a)lists.wikimedia.org
>>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>>>
>>>
>>
>>
>
Hi folks,
Is it possible to place a page on top of Special:Emailuser for each user
that allows e-mails? What I have in mind is a page with a given special
name that may be edited (possibly by the owner and admins only, like .css
and .js subpages), may be watched, and is used to display a message to
those wanting to send a mail. E.g. "Please write me only if it is
confidential, otherwise use my talk page" or "May blue death harm your
computer twice a day if you don't supply a proper subject" or "I usually
read mails every second Friday of the month, be patient". Just in case the
developer team suffers from a lack of great new ideas.
--
Bináris
I would like to announce the release of MediaWiki 1.17.3. Five security
issues were discovered.
It was discovered that the API had a cross-site request forgery (CSRF)
vulnerability in the block/unblock modules. It was possible for a user
account with block privileges to block or unblock another user without
providing a token.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34212
It was discovered that the resource loader can leak certain kinds of private
data across domain origin boundaries, by providing the data as an executable
JavaScript file. In MediaWiki 1.18 and later, this includes the leaking of
CSRF protection tokens. This allows compromise of the wiki's user accounts,
say by changing the user's email address and then requesting a password
reset.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34907
Jan Schejbal of Hatforce.com discovered a cross-site request forgery (CSRF)
vulnerability in Special:Upload. Modern browsers (since at least as early as
December 2010) are able to post file uploads without user interaction,
violating previous security assumptions within MediaWiki.
Depending on the wiki's configuration, this vulnerability could lead to
further compromise, especially on private wikis where the set of allowed
file types is broader than on public wikis. Note that CSRF allows compromise
of a wiki from an external website, even if the wiki is behind a firewall.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35317
George Argyros and Aggelos Kiayias reported that the method used to generate
password reset tokens is not sufficiently secure. We now use various more
secure random number generators, depending on what is available on the
platform. Windows users are strongly advised to install either the openssl
extension or the mcrypt extension for PHP, so that MediaWiki can take
advantage of the cryptographic random number facility provided by Windows.
Any extension developers using mt_rand() to generate random numbers in
contexts where security is required are encouraged to instead make use of
the MWCryptRand class introduced with this release.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35078
A long-standing bug in the wikitext parser (bug 22555) was discovered to
have security implications. In the presence of the popular CharInsert
extension, it leads to cross-site scripting (XSS). XSS may be possible with
other extensions, or perhaps even with MediaWiki core alone, although this
is not confirmed at this time. A denial-of-service attack (infinite loop) is
also possible regardless of configuration.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35315
Full release notes:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob_plain;f=RELEASE-NOTES;hb=1.17.3
https://www.mediawiki.org/wiki/Release_notes/1.17
Coinciding with these security releases, the MediaWiki source code
repository has moved from SVN (at
https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3) to Git
(https://gerrit.wikimedia.org/gitweb/mediawiki/core.git), so the relevant
commits for these releases will not be appearing in our SVN repository. If
you use SVN checkouts of MediaWiki for version control, you need to migrate
these to Git. If you are using tarballs, there should be no change in the
process for you.
Please note that all WMF-deployed extensions have also been migrated to
Git, along with some other non-WMF-maintained ones.
Please bear with us: some of the Git-related links for this release may not
work immediately, but should work later on.
To do a simple Git clone, the command is:
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
More information is available at https://www.mediawiki.org/wiki/Git
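(Migrating an SVN working copy usually means replacing it with a fresh clone and re-applying your local configuration. A minimal sketch — the directory names, the tag name, and the existence of release tags in the new repository are all assumptions here:)

```shell
# Move the old SVN checkout aside (path is an assumption).
mv mediawiki mediawiki-svn-backup

# Clone the new Git repository in its place, using the command above.
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git mediawiki
cd mediawiki

# List available tags and switch to the release you were tracking
# (the tag name 1.17.3 is an assumption -- check 'git tag' first).
git tag
git checkout 1.17.3

# Copy back your local configuration from the old checkout.
cp ../mediawiki-svn-backup/LocalSettings.php .
```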
For more help, please visit the #mediawiki IRC channel on freenode.net
(irc://irc.freenode.net/mediawiki) or email the MediaWiki-l mailing list
at mediawiki-l(a)lists.wikimedia.org.
**********************************************************************
Download:
http://download.wikimedia.org/mediawiki/1.17/mediawiki-1.17.3.tar.gz
Patch to previous version (1.17.2), without interface text:
http://download.wikimedia.org/mediawiki/1.17/mediawiki-1.17.3.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.17/mediawiki-i18n-1.17.3.patch.gz
GPG signatures:
http://download.wikimedia.org/mediawiki/1.17/mediawiki-1.17.3.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.17/mediawiki-1.17.3.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.17/mediawiki-i18n-1.17.3.patch.gz.sig
Public keys:
https://secure.wikimedia.org/keys.html
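(To check a downloaded tarball against its detached signature, something like the following should work; the key-import step depends on the keys page above, and which keys you choose to trust is up to you:)

```shell
# Fetch the tarball and its detached GPG signature.
wget http://download.wikimedia.org/mediawiki/1.17/mediawiki-1.17.3.tar.gz
wget http://download.wikimedia.org/mediawiki/1.17/mediawiki-1.17.3.tar.gz.sig

# Import the release signing keys first (see the public keys page above),
# e.g. by saving them to a file and running 'gpg --import keys.asc'.
# Then verify the signature against the tarball.
gpg --verify mediawiki-1.17.3.tar.gz.sig mediawiki-1.17.3.tar.gz
```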
I would like to announce the release of MediaWiki 1.18.2. Five security
issues were discovered.
It was discovered that the API had a cross-site request forgery (CSRF)
vulnerability in the block/unblock modules. It was possible for a user
account with block privileges to block or unblock another user without
providing a token.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34212
It was discovered that the resource loader can leak certain kinds of private
data across domain origin boundaries, by providing the data as an executable
JavaScript file. In MediaWiki 1.18 and later, this includes the leaking of
CSRF protection tokens. This allows compromise of the wiki's user accounts,
say by changing the user's email address and then requesting a password
reset.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34907
Jan Schejbal of Hatforce.com discovered a cross-site request forgery (CSRF)
vulnerability in Special:Upload. Modern browsers (since at least as early as
December 2010) are able to post file uploads without user interaction,
violating previous security assumptions within MediaWiki.
Depending on the wiki's configuration, this vulnerability could lead to
further compromise, especially on private wikis where the set of allowed
file types is broader than on public wikis. Note that CSRF allows compromise
of a wiki from an external website, even if the wiki is behind a firewall.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35317
George Argyros and Aggelos Kiayias reported that the method used to generate
password reset tokens is not sufficiently secure. We now use various more
secure random number generators, depending on what is available on the
platform. Windows users are strongly advised to install either the openssl
extension or the mcrypt extension for PHP, so that MediaWiki can take
advantage of the cryptographic random number facility provided by Windows.
Any extension developers using mt_rand() to generate random numbers in
contexts where security is required are encouraged to instead make use of
the MWCryptRand class introduced with this release.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35078
A long-standing bug in the wikitext parser (bug 22555) was discovered to
have security implications. In the presence of the popular CharInsert
extension, it leads to cross-site scripting (XSS). XSS may be possible with
other extensions, or perhaps even with MediaWiki core alone, although this
is not confirmed at this time. A denial-of-service attack (infinite loop) is
also possible regardless of configuration.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35315
Full release notes:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob_plain;f=RELEASE-NOTES-1.18;hb=1.18.2
https://www.mediawiki.org/wiki/Release_notes/1.18
Coinciding with these security releases, the MediaWiki source code
repository has moved from SVN (at
https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3) to Git
(https://gerrit.wikimedia.org/gitweb/mediawiki/core.git), so the relevant
commits for these releases will not be appearing in our SVN repository. If
you use SVN checkouts of MediaWiki for version control, you need to migrate
these to Git. If you are using tarballs, there should be no change in the
process for you.
Please note that all WMF-deployed extensions have also been migrated to
Git, along with some other non-WMF-maintained ones.
Please bear with us: some of the Git-related links for this release may not
work immediately, but should work later on.
To do a simple Git clone, the command is:
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
More information is available at https://www.mediawiki.org/wiki/Git
For more help, please visit the #mediawiki IRC channel on freenode.net
(irc://irc.freenode.net/mediawiki) or email the MediaWiki-l mailing list
at mediawiki-l(a)lists.wikimedia.org.
**********************************************************************
Download:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.2.tar.gz
Patch to previous version (1.18.1), without interface text:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.2.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-i18n-1.18.2.patch.gz
GPG signatures:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.2.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.2.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.18/mediawiki-i18n-1.18.2.patch.gz.sig
Public keys:
https://secure.wikimedia.org/keys.html