This list is under temporary moderation.
Remember, this list is for working, not flames.
-- brion vibber (brion @ pobox.com / brion @ wikimedia.org)
A quick development question.
If an AJAX-enabled, or "next-gen", MediaWiki were to be built, would it be
acceptable for it to be written in a format such as .aspx (i.e. built
using Microsoft Visual Web Developer)? I was wondering whether this might
clash with GFDL/copyleft principles, or with accessibility or
compatibility. Would web pages in the .aspx format be acceptable as the
output of the software that Wikipedia runs on?
Thanks.
George Herbert wrote:
> XML is more of a data interchange format than a storage format. If
> you're storing data, and the software using it is the only source or
> destination, there is no clear reason to use XML unless you're using
> it for data structure. But MW currently needs no structure; you're
> either storing metadata (in the db) or the article/page contents
> (text, with some output parsing, but just text; also in the DB, but
> only for convenience of access).
Actually, though, its "text with some output parsing" uses wikitext
as a shorthand for HTML, so it *does* have a data structure. You just
have to parse it to get the structure. So why not use a data
structure that is already close to HTML and that is a widely-accepted
standard rather than the idiosyncratic shorthand of wikitext?
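To make the point concrete: even a trivial subset of wikitext maps mechanically onto HTML structure. A minimal sketch (the rules below are purely illustrative and nowhere near MediaWiki's real grammar):

```python
import re

def wikitext_to_html(text):
    """Convert a tiny, illustrative subset of wikitext to HTML."""
    # '''bold''' -> <b>bold</b>  (apply before italics, since ''' contains '')
    text = re.sub(r"'''(.+?)'''", r"<b>\1</b>", text)
    # ''italic'' -> <i>italic</i>
    text = re.sub(r"''(.+?)''", r"<i>\1</i>", text)
    # [[Page]] -> <a href="/wiki/Page">Page</a>
    text = re.sub(r"\[\[([^]|]+)\]\]", r'<a href="/wiki/\1">\1</a>', text)
    return text

print(wikitext_to_html("'''Bold''' text with a [[Link]]"))
```

The point is that the "structure" Sheldon describes is already implicit in the markup; the parser's job is just to recover it.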
> If the parser is going to stay the same, then there's no reason to
> change the storage format to something which requires an intermediate
> parser before the output parser.
I think there's a feeling already that the parser *shouldn't* stay
the same. It works, but it is the result of an original idea that has
been repeatedly hacked and modified to the point that it has become
unwieldy and imprecise in its ability to map unambiguously to HTML.
Since it needs a rewrite anyway, why not be forward-looking and
ambitious and rewrite with an eye to better standardization and
reusability?
--------------------------------
| Sheldon Rampton
| Research director, Center for Media & Democracy (www.prwatch.org)
| Author of books including:
| Friends In Deed: The Story of US-Nicaragua Sister Cities
| Toxic Sludge Is Good For You
| Mad Cow USA
| Trust Us, We're Experts
| Weapons of Mass Deception
| Banana Republicans
| The Best War Ever
--------------------------------
| Subscribe to our free weekly list serve by visiting:
| http://www.prwatch.org/cmd/subscribe_sotd.html
|
| Donate now to support independent, public interest reporting:
| https://secure.groundspring.org/dn/index.php?id=1118
--------------------------------
I'm upgrading Subversion to 1.4.3 on the server; theoretically, if I
upgrade the repository format this can save some disk space as well, though
I'm not going to do that just yet.
There might be some brief repo downtime.
-- brion vibber (brion @ pobox.com / brion @ wikimedia.org)
An automated run of parserTests.php showed the following failures:
This is MediaWiki version 1.10alpha (r20223).
Reading tests from "maintenance/parserTests.txt"...
Reading tests from "extensions/Cite/citeParserTests.txt"...
Reading tests from "extensions/Poem/poemParserTests.txt"...
3 previously passing test(s) now FAILING! :(
* Blank ref followed by ref with content [Introduced between 07-Mar-2007 08:15:31, 1.10alpha (r20187) and 08-Mar-2007 08:17:23, 1.10alpha (r20223)]
* Regression: non-blank ref "0" followed by ref with content [Introduced between 07-Mar-2007 08:15:31, 1.10alpha (r20187) and 08-Mar-2007 08:17:23, 1.10alpha (r20223)]
* Regression sanity check: non-blank ref "1" followed by ref with content [Introduced between 07-Mar-2007 08:15:31, 1.10alpha (r20187) and 08-Mar-2007 08:17:23, 1.10alpha (r20223)]
18 still FAILING test(s) :(
* URL-encoding in URL functions (single parameter) [Has never passed]
* URL-encoding in URL functions (multiple parameters) [Has never passed]
* TODO: Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html) [Has never passed]
* TODO: Link containing double-single-quotes '' (bug 4598) [Has never passed]
* TODO: message transform: <noinclude> in transcluded template (bug 4926) [Has never passed]
* TODO: message transform: <onlyinclude> in transcluded template (bug 4926) [Has never passed]
* BUG 1887, part 2: A <math> with a thumbnail- math enabled [Has never passed]
* TODO: HTML bullet list, unclosed tags (bug 5497) [Has never passed]
* TODO: HTML ordered list, unclosed tags (bug 5497) [Has never passed]
* TODO: HTML nested bullet list, open tags (bug 5497) [Has never passed]
* TODO: HTML nested ordered list, open tags (bug 5497) [Has never passed]
* TODO: Inline HTML vs wiki block nesting [Has never passed]
* TODO: Mixing markup for italics and bold [Has never passed]
* TODO: 5 quotes, code coverage +1 line [Has never passed]
* TODO: dt/dd/dl test [Has never passed]
* TODO: Images with the "|" character in the comment [Has never passed]
* TODO: Parents of subpages, two levels up, without trailing slash or name. [Has never passed]
* TODO: Parents of subpages, two levels up, with lots of extra trailing slashes. [Has never passed]
Passed 490 of 511 tests (95.89%)... 21 tests failed!
Hello,
The syntax {{ns:x}} resolves to the name of namespace x. Is there a
syntactical construct providing the name, URL, or interwiki code of
the shared upload wiki used by the local installation of MediaWiki? For
example, one returning "Commons" for Wikimedia projects. If there is none,
shouldn't there be one? It would be useful for messages like
MediaWiki:Sharedupload, but also for other pages.
Marcus Buck
User:Slomox
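A construct like the one Marcus asks for would just be another lookup into per-wiki configuration, alongside the namespace table that {{ns:x}} consults. A sketch of the idea (the "sharedrepo" magic word is hypothetical, and the namespace table is abridged):

```python
# Illustrative sketch of how a {{ns:x}}-style construct could be resolved.
# The "sharedrepo" variable name is hypothetical -- no such magic word
# exists in MediaWiki; it is what Marcus is asking for.

NAMESPACES = {0: "", 1: "Talk", 2: "User"}       # abridged namespace table
SITE_CONFIG = {"sharedrepo": "Commons"}          # per-wiki configuration

def expand(construct):
    """Expand a {{...}} construct against local configuration."""
    inner = construct.strip("{}")
    if inner.startswith("ns:"):
        return NAMESPACES.get(int(inner[3:]), "")
    return SITE_CONFIG.get(inner, construct)     # unknown -> left as-is

print(expand("{{ns:2}}"))        # -> User
print(expand("{{sharedrepo}}"))  # -> Commons
```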
Simetrical wrote:
>On 3/7/07, Mark Clements <gmane(a)kennel17.co.uk> wrote:
>> We already have several instances of this happening:
>>
>> 1) Expanding signatures: ~~~~
>> 2) Pipe trick: [[Help:Contents|]]
>> 3) {{subst:Page}}
>> ...and maybe others.
>
>Nonsense. There is no possible way a null edit should be able to
>activate any of those, because those strings cannot exist in a saved
>page (barring bugs) unless escaped somehow, in which case a null edit
>still won't trigger them.
{{subst:NonExistingPage}} doesn't transform on save.
If [[NonExistingPage]] is then created, and a null edit is done to the
original page, the original page will transform even though no changes
have been made (I've just tested this: [[w:en:User:ais523/Sandbox]]).
So this sort of behaviour happens already.
--
ais523
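The behaviour ais523 describes can be modelled in a few lines: substitution happens at save time and only for pages that exist, so a null edit (resaving identical text) after the target page is created does change the page. A simplified sketch, not MediaWiki's actual code:

```python
import re

# Simplified model of save-time substitution: {{subst:X}} is expanded
# only if page X exists at the moment of saving.

pages = {}  # title -> text

def save(title, text):
    """Save a page, substituting {{subst:X}} only for existing pages."""
    def substitute(m):
        target = m.group(1)
        return pages[target] if target in pages else m.group(0)
    pages[title] = re.sub(r"\{\{subst:([^}]+)\}\}", substitute, text)

save("Original", "before {{subst:NonExistingPage}} after")
print(pages["Original"])             # subst left untouched
save("NonExistingPage", "CONTENT")
save("Original", pages["Original"])  # a "null edit": resave the same text
print(pages["Original"])             # now substituted
```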
Simetrical wrote:
> I don't think it's realistic to
> expect total lack of change in the parser anyway: often it's a good
> idea to tweak parser output slightly to add new classes, style things
> a bit better, or whatever.
I think it helps to break things down conceptually into three
categories: input format, storage format, and output format.
STORAGE format should be extremely well-defined and stable, hence XML.
There could be (and probably should be) a variety of INPUT formats:
wikitext, WYSIWYG, etc., each of which is defined through conversion
rules that turn them back and forth into the storage format.
There could also be a variety of OUTPUT formats. In addition to the
current HTML, there could also be an audio output format, a "plain
text" format, maybe a "summary" format (consisting of the lead
paragraph and a table of contents for the rest), and probably some
others I haven't thought of. Again, these would be defined through
conversion rules.
I imagine there would be some third-party development of input and
output formats for a variety of uses.
Hi,
I created an external authentication plug-in to get MediaWiki
authenticated against an external database... it works!!! I want MediaWiki
to automatically add the user once he/she is authenticated to its local
database, and to disable the login/create user form. Any ideas how I
should go about implementing this? Thanks in advance.
Regards,
Kalyan
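In MediaWiki this is the role of the AuthPlugin class (its autoCreate() method tells MediaWiki whether to create local accounts for externally authenticated users). The logic itself is simple; a language-neutral sketch in Python, with all names invented:

```python
# Sketch of the desired flow: authenticate against an external database,
# then auto-create the user locally on first successful login.
# These names are illustrative, not MediaWiki's AuthPlugin API.

external_users = {"kalyan": "s3cret"}    # stand-in for the external database
local_users = set()                      # stand-in for MediaWiki's user table

def login(username, password):
    if external_users.get(username) != password:
        return False                     # external auth failed
    if username not in local_users:
        local_users.add(username)        # auto-create the local account
    return True

print(login("kalyan", "s3cret"))   # True; local account created
print("kalyan" in local_users)     # True
```

Disabling the normal login/create-account form is a separate, configuration-level step rather than part of the plug-in logic.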
On the Dutch Wikipedia we found a site that gets live feeds from the Dutch
Wikipedia and posts them without mentioning Wikipedia or the GFDL,
replacing all occurrences of 'Wikipedia' with the site name. Is this the
right place to ask for them to be blocked? The site is at
http://www.jan-karel.nl/Speciaal:Recentchanges.
--
Andre Engels, andreengels(a)gmail.com
ICQ: 6260644 -- Skype: a_engels