An automated run of parserTests.php showed the following failures:
This is MediaWiki version 1.10alpha (r20017).
Reading tests from "maintenance/parserTests.txt"...
Reading tests from "extensions/Cite/citeParserTests.txt"...
Reading tests from "extensions/Poem/poemParserTests.txt"...
18 still FAILING test(s) :(
* URL-encoding in URL functions (single parameter) [Has never passed]
* URL-encoding in URL functions (multiple parameters) [Has never passed]
* TODO: Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html) [Has never passed]
* TODO: Link containing double-single-quotes '' (bug 4598) [Has never passed]
* TODO: message transform: <noinclude> in transcluded template (bug 4926) [Has never passed]
* TODO: message transform: <onlyinclude> in transcluded template (bug 4926) [Has never passed]
* BUG 1887, part 2: A <math> with a thumbnail- math enabled [Has never passed]
* TODO: HTML bullet list, unclosed tags (bug 5497) [Has never passed]
* TODO: HTML ordered list, unclosed tags (bug 5497) [Has never passed]
* TODO: HTML nested bullet list, open tags (bug 5497) [Has never passed]
* TODO: HTML nested ordered list, open tags (bug 5497) [Has never passed]
* TODO: Inline HTML vs wiki block nesting [Has never passed]
* TODO: Mixing markup for italics and bold [Has never passed]
* TODO: 5 quotes, code coverage +1 line [Has never passed]
* TODO: dt/dd/dl test [Has never passed]
* TODO: Images with the "|" character in the comment [Has never passed]
* TODO: Parents of subpages, two levels up, without trailing slash or name. [Has never passed]
* TODO: Parents of subpages, two levels up, with lots of extra trailing slashes. [Has never passed]
Passed 493 of 511 tests (96.48%)... 18 tests failed!
See http://ha.ckers.org/blog/20070220/mediawiki-192-utf-7-xss/ for
details. I'm sure we get these all the time, but since RSnake picked it
up it probably will get a bit more publicity than normal. Has it been
fixed on the trunk yet?
February 20, 2007
MediaWiki 1.9.3 is a security and bug-fix update to the Winter 2007
quarterly release. Minor compatibility fixes for IIS and PostgreSQL are
included.
An XSS injection vulnerability based on Microsoft Internet Explorer's
UTF-7 charset autodetection was located in the AJAX support module,
affecting MSIE users on MediaWiki 1.6.x and up when the optional setting
$wgUseAjax is enabled.
If you are using an extension based on the optional Ajax module,
either disable it or upgrade to a version containing the fix:
* 1.9: fixed in 1.9.3
* 1.8: fixed in 1.8.4
* 1.7: fixed in 1.7.3
* 1.6: fixed in 1.6.10
There is no known danger in the default configuration, with $wgUseAjax off.
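For administrators who prefer to switch the module off until they can
upgrade, the setting goes in LocalSettings.php (a one-line sketch; off
is also the shipped default):

$wgUseAjax = false;   // keep the optional Ajax module disabled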
* (bug 8992) Fix a remaining raw use of REQUEST_URI in history
* (bug 8984) Fix a database error in Special:Recentchangeslinked
when using the PostgreSQL database.
* Add 'charset' to Content-Type headers on various HTTP error responses
to forestall additional UTF-7-autodetect XSS issues. PHP sends only
'text/html' by default when the script doesn't specify more detail,
which some inconsiderate browsers consider a license to autodetect
the deadly, hard-to-escape UTF-7.
This fixes an issue with the Ajax interface error message on MSIE
when $wgUseAjax is enabled (not the default configuration); this UTF-7
variant on a previously fixed attack vector was discovered by Moshe BA
from BugSec: http://www.bugsec.com/articles.php?Security=24
(A minimal illustration of the header change follows the change list.)
* Trackback responses now specify XML content type
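For illustration only, here is a minimal sketch of the kind of header
change described above; it is not the actual MediaWiki patch, and the
error message is a placeholder:

<?php
// Send an explicit charset on an error response so MSIE cannot fall
// back to UTF-7 autodetection; a bare "text/html" invites sniffing.
$errorMessage = 'Invalid request';   // placeholder text

header( 'HTTP/1.1 403 Forbidden' );
header( 'Content-Type: text/html; charset=utf-8' );
echo htmlspecialchars( $errorMessage, ENT_QUOTES, 'UTF-8' );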
Full release notes:
http://svn.wikimedia.org/svnroot/mediawiki/tags/REL1_9_3/phase3/RELEASE-NOT…
Download:
http://download.wikimedia.org/mediawiki/1.9/mediawiki-1.9.3.tar.gz
Patch against 1.9.2:
http://download.wikimedia.org/mediawiki/1.9/mediawiki-1.9.3.patch
Downloads, checksums, and GPG signatures for all versions:
http://download.wikimedia.org/mediawiki/1.9/
http://download.wikimedia.org/mediawiki/1.8/
http://download.wikimedia.org/mediawiki/1.7/
http://download.wikimedia.org/mediawiki/1.6/
Before asking for help, try the FAQ:
http://www.mediawiki.org/wiki/Manual:FAQ
Low-traffic release announcements mailing list:
(Please subscribe to receive announcements of security updates.)
http://lists.wikimedia.org/mailman/listinfo/mediawiki-announce
Wiki admin help mailing list:
http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
Bug report system:
http://bugzilla.wikimedia.org/
Play "stump the developers" live on IRC:
#mediawiki on irc.freenode.net
-- brion vibber (brion @ pobox.com / brion @ wikimedia.org)
Currently, an interwiki link like this:
[[wiktionary:de:Kopf]]
is expanded in the HTML output as:
http://en.wiktionary.org/wiki/de:Kopf
The server at en.wiktionary.org has special code that redirects a
request for this URL to:
http://de.wiktionary.org/wiki/Kopf
This is a clever hack, and we've been using it for a long time. I
wonder, however, if we can make interwiki links render correctly in the
output in the first place. This would save an unnecessary hit on the
server.
Our default interwiki table maps "wiktionary" to
"http://en.wiktionary.org/wiki/$1". I think we could have the URL
pattern include a second argument, the language code, at $2, like so:
"http://$2.wiktionary.org/wiki/$1".
Note that this will only work if every language version of a project has
URLs matching this pattern. I think this is the case for all Wikimedia
projects, but someone correct me if I'm wrong.
The code in Title::getFullURL that creates an interwiki URL could work
as follows:
if (the interwiki url pattern doesn't have a '$2' in it) {
    substitute the title text for $1
    in the interwiki URL pattern;
} else if (the title text has a colon AND
           the part before the first colon is a language code) {
    substitute the part after the first colon for $1,
    and the part before the first colon for $2,
    in the interwiki URL pattern;
} else {
    substitute the title text for $1,
    and the current content language code for $2,
    in the interwiki URL pattern;
}
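A rough PHP sketch of that logic as a standalone function; the function
name, the isLanguageCode() helper, and the use of rawurlencode() are
illustrative only, not the actual Title::getFullURL code:

<?php
// $pattern is the interwiki URL pattern, e.g. "http://$2.wiktionary.org/wiki/$1".
function expandInterwikiUrl( $pattern, $titleText, $contentLang ) {
	if ( strpos( $pattern, '$2' ) === false ) {
		// Old-style pattern: only the page title is substituted.
		return str_replace( '$1', rawurlencode( $titleText ), $pattern );
	}
	$colon = strpos( $titleText, ':' );
	if ( $colon !== false && isLanguageCode( substr( $titleText, 0, $colon ) ) ) {
		// e.g. "de:Kopf" -> $2 = "de", $1 = "Kopf"
		$lang  = substr( $titleText, 0, $colon );
		$title = substr( $titleText, $colon + 1 );
	} else {
		// No language prefix: fall back to the current content language.
		$lang  = $contentLang;
		$title = $titleText;
	}
	return str_replace( array( '$2', '$1' ),
	                    array( $lang, rawurlencode( $title ) ),
	                    $pattern );
}

// Hypothetical helper; the real check would consult the wiki's list of
// known language codes.
function isLanguageCode( $code ) {
	return in_array( $code, array( 'de', 'en', 'fr' ) );   // stub list
}

With the pattern "http://$2.wiktionary.org/wiki/$1", a link to
[[wiktionary:de:Kopf]] would come out as http://de.wiktionary.org/wiki/Kopf
directly, with no redirect.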
If there's no objection, I'm going to go ahead and make this change.
~Evan
________________________________________________________________________
Evan Prodromou <evan(a)prodromou.name>
http://evan.prodromou.name/
Brion Vibber wrote:
>
> I've fixed up the image-based captcha to read from subdirectories, which
> should put less load on the file server.
Sorry to ask for more information about the captcha, but I am having
some trouble understanding how the various system messages work.
I have found 4 system messages in Allmessages
* captcha-createaccount
* captcha-edit
* fancycaptcha-createaccount
* fancycaptcha-edit
(as well as some other related messages, like captcha-createaccount-fail)
From what I have seen on it.wikinews, the captcha-createaccount message
was used some time ago, and the user was asked to enter a word displayed
distorted in an image; later the same message was used, but what the
user had to do was solve a math operation; then from today (or
yesterday) the fancycaptcha-createaccount message is used, and an image
of a distorted word is displayed.
Which system messages are used?
Can I assume that, from now on, an image of a word will be displayed and
the user will be asked to enter the distorted word shown in the image?
By the way, the default (English) message reads "... enter the words ...",
using the plural, but I have seen only one word in the image. (Well,
actually I have made just a few tests.)
Thanks
AnyFile
Is there a way to trick the Parser into rendering fully qualified URLs
_other than_ setting "action=render" at the URL level?
The last time I wanted this feature was circa MW 1.5.1 - and I know at that
time you had to hijack global $action to get any results. I'm hoping this
is no longer the case?
I'm developing an extension that needs to run on 1.6.x, 1.8.x, 1.9.x and
1.10 - Thanks in advance for any help.
-- Jim R. Wilson (jimbojw)
A few hours ago Tim changed how the wiki handles output buffering and
compression so that a Content-Length HTTP header can be sent.
The primary benefit of this is that the Squid proxies in Amsterdam and
Seoul will be better able to maintain persistent connections to Tampa,
which should improve performance for logged-in visitors in Europe and Asia.
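For anyone wondering what the change amounts to, here is a minimal
sketch under the assumption of a single full-page buffer; it is not
Tim's actual code, and the page body is a placeholder:

<?php
// Buffer the whole response so an accurate Content-Length header can be
// sent; with a known length, upstream proxies can reuse persistent
// connections instead of waiting for the connection to close.
ob_start();
echo '<html><body>page body goes here</body></html>';   // placeholder
$body = ob_get_clean();

if ( !headers_sent() ) {
	header( 'Content-Length: ' . strlen( $body ) );
}
echo $body;

If the output is also gzip-compressed, the length has to be that of the
compressed body, which is part of what makes changes like this easy to
get subtly wrong.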
There were a couple small bugs which unfortunately we didn't catch
before putting it live:
* Breakage causing Special:Export to cut off unexpectedly with bogus
'Content-Length:' header, thus breaking transwiki import:
This is now fixed.
* Incorrect 'Vary: Accept-Encoding' header for anonymous page views,
causing logged-in visitors to sometimes see logged-out pages (with
default skin and login link instead of link to own user/talk/contribs):
The bug is fixed, but a fair number of cached entries are still present.
They'll vanish over time, but I'm not sure if we currently have a clean
way to flush them in bulk.
-- brion vibber (brion @ pobox.com / brion @ wikimedia.org)
An automated run of parserTests.php showed the following failures:
This is MediaWiki version 1.10alpha (r20001).
Reading tests from "maintenance/parserTests.txt"...
Reading tests from "extensions/Cite/citeParserTests.txt"...
Reading tests from "extensions/Poem/poemParserTests.txt"...
18 still FAILING test(s) :(
* URL-encoding in URL functions (single parameter) [Has never passed]
* URL-encoding in URL functions (multiple parameters) [Has never passed]
* TODO: Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html) [Has never passed]
* TODO: Link containing double-single-quotes '' (bug 4598) [Has never passed]
* TODO: message transform: <noinclude> in transcluded template (bug 4926) [Has never passed]
* TODO: message transform: <onlyinclude> in transcluded template (bug 4926) [Has never passed]
* BUG 1887, part 2: A <math> with a thumbnail- math enabled [Has never passed]
* TODO: HTML bullet list, unclosed tags (bug 5497) [Has never passed]
* TODO: HTML ordered list, unclosed tags (bug 5497) [Has never passed]
* TODO: HTML nested bullet list, open tags (bug 5497) [Has never passed]
* TODO: HTML nested ordered list, open tags (bug 5497) [Has never passed]
* TODO: Inline HTML vs wiki block nesting [Has never passed]
* TODO: Mixing markup for italics and bold [Has never passed]
* TODO: 5 quotes, code coverage +1 line [Has never passed]
* TODO: dt/dd/dl test [Has never passed]
* TODO: Images with the "|" character in the comment [Has never passed]
* TODO: Parents of subpages, two levels up, without trailing slash or name. [Has never passed]
* TODO: Parents of subpages, two levels up, with lots of extra trailing slashes. [Has never passed]
Passed 493 of 511 tests (96.48%)... 18 tests failed!
Hi there,
I'd like to use MediaWiki as storage for items which don't have
"natural" titles (much like bugs in Bugzilla). So I'd like to allow
users to create a new article without specifying the title, and have
the system generate a unique title by itself. The best solution I see
here is to use consecutive numbers in a particular namespace -- e.g.
when the user clicks "Add item", the system lets him or her fill in a
form (with textareas and other input fields), then creates an article
containing the data from this form (via a template) with a title like
"Items:1234" (assuming we already have "Items:1233").
Is it possible to implement such a feature as an extension?
Actually, I see that it could be done using an external script which
communicates with MediaWiki via the API, but I'm not sure it would be
reliable with respect to collisions (e.g. if two users click "Submit"
almost simultaneously, they could both obtain the same ID for their
articles, and data would be lost or an editing conflict would occur).
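One way to sidestep that race, sketched below under the assumption of a
MySQL backend and a hypothetical item_counter table: let the database
hand out the next number atomically via AUTO_INCREMENT, then build the
title from it.

<?php
// Sketch only: allocate the next item number atomically so two
// simultaneous "Submit" clicks can never receive the same ID.
// The DSN, credentials and item_counter table are all hypothetical.
$db = new PDO( 'mysql:host=localhost;dbname=wiki', 'wikiuser', 'secret' );
$db->setAttribute( PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION );

$db->exec( 'CREATE TABLE IF NOT EXISTS item_counter (
                id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY
            ) ENGINE=InnoDB' );

// Each insert reserves a fresh number, even under concurrent requests.
$db->exec( 'INSERT INTO item_counter (id) VALUES (NULL)' );
$title = 'Items:' . $db->lastInsertId();   // e.g. "Items:1234"

The article itself can then be created under $title through the normal
edit path (or the API), so the only thing the counter table decides is
the number.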
Thanks,
--
With best regards,
Ilya V. Schurov.
Recently I have read several news reports regarding funding for the
Wikimedia Foundation. Regardless of whether the Foundation is in need
of a little cash, I believe it would be possible to greatly reduce the
costs of operation by divvying up some of the server load among
volunteers.
This would be done in a way analogous to projects such as SETI@home,
where anyone with access to a server could install a client and host
data. For instance: as a student in college, I sometimes feel bad for
not being able to contribute monetarily to the fundraising campaigns.
I do, however, have access to a server that's using roughly 0.5% of its
CPU and 1.5% of its allocated bandwidth, and I would be more than
willing to contribute those resources if it were possible.
I'm raising this topic from the standpoint of the 'idea' and would be
most interested in discussion based around the assumption that it would
be trivial to implement such a system. Whether or not that is the case
is something to be explored as well; however, I'd rather not get bogged
down in implementation before discussing the concept.
If anyone has any suggestions or would be interested in helping, please reply.
I'm new around here, so I'm not exactly sure what the next step should be :)