It's strange (but I guess there's a sound reason) that plain wikilinks
point to a variable field of wiki records (the name of the page), while many
troubles would be solved if they could point to the invariable field of
such records: the id. The obvious result is that all links are broken (and
need fixing) as soon as a page is "moved" (i.e. renamed).
My question is: what is the sound reason for this strange thing? Is there
any idea about fixing it?
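To illustrate the breakage I mean, here is a minimal sketch (hypothetical
data structures, not MediaWiki's actual schema): a link stored as a title
stops resolving after a rename, while a link stored as an id does not.

```javascript
// Hypothetical sketch, not MediaWiki's real tables: one page record,
// reachable either by title or by numeric id.
const pagesById = new Map([[42, { id: 42, title: "Old_Title" }]]);

const linkByTitle = "Old_Title"; // link stored as a page name
const linkById = 42;             // link stored as a page id

// "Move" (rename) the page:
pagesById.get(42).title = "New_Title";

// Resolve both kinds of link after the rename:
const titleHit = [...pagesById.values()].some(p => p.title === linkByTitle);
const idHit = pagesById.has(linkById);

console.log(titleHit); // false: the title-based link is now broken
console.log(idHit);    // true: the id-based link still resolves
```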
Alex
> Message: 4
> Date: Mon, 04 Oct 2010 00:31:05 +0200
> From: Marcus Buck <wiki(a)marcusbuck.org>
> Subject: Re: [Wikitech-l] Fwd: [Mediawiki-l] Please delete mo.
> wikipedia
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Message-ID: <4CA90429.1080606(a)marcusbuck.org>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> On 04.10.2010 00:02, M. Williamson wrote:
> >> As automatic script conversion is possible, there's really no reason to
[...]
(Without knowing anything about the issues involved at all:) If an
automatic conversion script is possible, why not just do the whole
variant thing? For example, Serbian has both a Latin and a Cyrillic
version, and the user can select which one they want. Then everyone
wins (?)
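A toy illustration of why per-user variant selection is mechanically
cheap (this is not MediaWiki's LanguageConverter, and the letter table
below is only a tiny subset of Serbian Cyrillic — digraphs like "lj"
and "nj" need extra care in a real converter):

```javascript
// Minimal one-way Cyrillic-to-Latin transliteration for a handful of
// Serbian letters; a real converter would cover the full alphabet.
const sr = {
  "а": "a", "б": "b", "в": "v", "г": "g", "д": "d",
  "е": "e", "и": "i", "ј": "j", "к": "k", "о": "o", "р": "r"
};

function toLatin(text) {
  // Map each character through the table, passing unknown ones through.
  return [...text].map(ch => sr[ch] ?? ch).join("");
}

console.log(toLatin("београд")); // "beograd"
```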
-bawolff
Folks,
The server that hosts the XML dumps will be undergoing maintenance (it's
going to be moved to another rack) on Saturday Oct 1, starting at about
15:00 GMT. We expect the server to be back up by 17:00 GMT. During
that time the XML dumps will be unavailable.
In other news the first run of the full en.wikipedia history in chunks
has completed. The recompression to 7z has not been done, nor the
recompression into a single large bz2 file for people who prefer it.
However, for those interested, please have a look at the files:
http://dumps.wikimedia.org/enwiki/20100904/
Each file has its own MediaWiki header and footer, and each covers a range
of 2 million (sequential) page IDs, except for the last "chunk", which
covers rather more than it should.
As you can see, the chunk sizes are rather disparate. The next such run
should split up more evenly with roughly the same number of revisions in
each chunk, and as such, they should all take nearly the same time to
complete.
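The rebalancing described above can be sketched like this (illustrative
only — this is not the actual dump code, and `revCounts` is a made-up
input): walk the per-page revision counts and start a new chunk each time
the running total reaches total/numChunks.

```javascript
// Sketch: compute chunk start boundaries so each chunk holds roughly
// the same number of revisions, instead of a fixed span of page IDs.
// revCounts[i] = number of revisions of page i (hypothetical input).
function chunkByRevisions(revCounts, numChunks) {
  const total = revCounts.reduce((a, b) => a + b, 0);
  const target = total / numChunks;
  const boundaries = [];
  let acc = 0;
  for (let pageId = 0; pageId < revCounts.length; pageId++) {
    acc += revCounts[pageId];
    if (acc >= target && boundaries.length < numChunks - 1) {
      boundaries.push(pageId + 1); // next chunk starts at this page
      acc = 0;
    }
  }
  return boundaries;
}

// Pages 0-1 are heavily edited; fixed-size ID ranges would be lopsided,
// but splitting by revisions puts page 0 alone in the first chunk.
console.log(chunkByRevisions([900, 800, 10, 20, 30, 40], 2)); // [ 1 ]
```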
Ariel Glenn
-------- Original Message --------
Subject: [Mediawiki-l] Please delete mo. wikipedia
Date: Sun, 3 Oct 2010 16:47:41 +0200
From: Cetateanu Moldovanu <cetateanumd(a)gmail.com>
Reply-To: MediaWiki announcements and site admin list
<mediawiki-l(a)lists.wikimedia.org>
To: foundation-l(a)lists.wikimedia.org, mediawiki-l(a)lists.wikimedia.org,
Brion Vibber <brion(a)wikimedia.org>
Hello everyone, I'd like to remind you that the existence of the mo. wikipedia
is extremely insulting to us from Moldova.
Those with the power, please take action and delete it.
The cause "Delete "moldovan" Wikipedia" <http://www.causes.com/causes/39775>
has 5,140 members.
Have a good day.
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
On 29 September 2010 06:51, Rob Lanphier <robla [at] wikimedia> wrote:
> > Another idea is to work with the "Signpost"; these people are already
> > pushing out a tech report weekly and have been doing so for years.
>
> My hope is that a lot of this work is complementary to Signpost and
> other sources. Part of making the drafting process public is to give
> Signpost the opportunity to scoop us.
Just to let you know that the lead* Technology Report writer
appreciates and fully supports your efforts to increase transparency
and (re)build the connections between paid and amateur developers of
all shapes and sizes :)
I don't know how the demographics compare, though, Techblog vs. Signpost
Tech. I would think there are probably people who read one or
the other but not both, but they can't be many. On the other hand, I
know we at the Signpost are keen to expand the readership now that we
feel more secure about the format and details like that, so maybe the
difference will increase, who knows. Perhaps the optimal solution lies
in adequate summaries for the less technically minded.
Anyway, I'm drifting off topic, but do keep up the good work in this
important area. Oh, and anything you can do to make us writers' lives
easier is much appreciated; I'm certainly looking forward to seeing
Zak in action in the weeks to come, for example.
--
Jarry1250
* Okay, you got me: the only one. Not that the occasional help I get is
unwanted, nor do I doubt that if for some reason I couldn't do it,
someone else might be able to. Aude did a sterling job before me, for
example; probably a better one, in fact.
Early on in the requirements stage of ResourceLoader development we
decided to use ISO 8601 as the format for representing timestamps in
URLs. This was chosen for its legibility, conformance to a standard, and
ease of generation. However, this was somewhat of an oversight, since the
timestamp "1970-01-01T00:00:00Z" gets URL-encoded to
"1970-01-01T00%3A00%3A00Z", which leaves something to be desired. Also,
generating this format in JavaScript requires sending an extra 220 bytes
(minified and compressed).
So, before we seal the deal on using 8601, I would like to collect some
ideas about alternatives which would ideally...
* Be legible in a URL
* Conform to a well-defined/well-known standard
* Be easy to generate from a unix timestamp in both PHP and JavaScript
Proposals wanted.
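To make the request concrete, here is one possible direction (a sketch,
not a decided format): encode the unix timestamp in base 36. It uses only
[0-9a-z], so nothing gets percent-escaped; it is much shorter than
ISO 8601; and it is a one-liner in both JavaScript (`toString(36)` /
`parseInt(..., 36)`) and PHP (`base_convert()`). ISO 8601's "basic"
format, which drops the colons, would also dodge the escaping problem,
at some cost in brevity.

```javascript
// Encode a unix timestamp (seconds) as a URL-safe base-36 string.
function encodeTimestamp(unixSeconds) {
  return unixSeconds.toString(36);
}

// Decode it back to a unix timestamp.
function decodeTimestamp(encoded) {
  return parseInt(encoded, 36);
}

const ts = 1286150400; // 2010-10-04T00:00:00Z
const enc = encodeTimestamp(ts);

console.log(enc);                             // "l9qo00"
console.log(decodeTimestamp(enc) === ts);     // true: round-trips
console.log(encodeURIComponent(enc) === enc); // true: nothing to escape
```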
- Trevor