Hello,
If you look at http://download.wikipedia.org/dawiki/20090109/, the dump
process for the new dawiki dump seems to be frozen. Perhaps it should be
killed manually.
Best regards
Andim
Gentlemen, let's say one is afraid that one day the forces of evil will
confiscate one's small wiki, so one wants to encourage all loyal users to
keep a backup of the whole wiki (current revisions are fine, no need for
full history).
OK, we want this to be as simple as possible for our loyal users: just
one click needed. (So forget Special:Export!)
And we want this to be as simple as possible for our loyal
administrator, me. I.e., use existing facilities: no cron jobs running
dumpBackup.php (or even mysqldump, which would give away too much
information) and then offering a link to whatever they produce.
The desired format is one that can later build a new wiki via
Special:Import, so the output of Special:Export or of
dumpBackup.php --current is exactly what's wanted.
I just can't figure out the right
http://www.mediawiki.org/wiki/API URL recipe...
api.php ? action=query & generator=allpages & format=xmlfm & ...?
Could it be that the API lacks the "bulk export of XML formatted data"
capability of Special:Export?
If one click is not enough, then at least one click per namespace. I
would just have the users back up Main: and Category:, for example.
Embedding the API URL would be no problem; I would just use
[{{SERVER}}/api.php?... Back up this whole site to your disk]
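One candidate recipe: recent MediaWiki versions accept an export flag on
action=query, which wraps the generator's result set in Special:Export-style
XML. A minimal sketch of building such a one-click URL (the server path and
namespace numbers are illustrative; check that your MediaWiki version
supports the export/exportnowrap parameters):

```python
# Build a one-click backup URL that streams current revisions of all pages
# in one namespace as Special:Export-compatible XML via the MediaWiki API.
from urllib.parse import urlencode

def backup_url(server, namespace=0, limit="max"):
    """Return an api.php URL exporting the given namespace.

    Assumes api.php honors the 'export' and 'exportnowrap' flags on
    action=query (present in recent MediaWiki releases).
    """
    params = {
        "action": "query",
        "generator": "allpages",
        "gapnamespace": namespace,  # 0 = Main, 14 = Category
        "gaplimit": limit,
        "export": "1",              # wrap results in export XML
        "exportnowrap": "1",        # emit bare <mediawiki> XML for Special:Import
    }
    return f"{server}/api.php?{urlencode(params)}"

print(backup_url("https://example.org/w"))
```

That URL could then be dropped into the [{{SERVER}}/api.php?... Back up
this whole site] link directly, one link per namespace.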
I'm very happy to announce that the Wikimedia Foundation is now hiring
for the Wikipedia Usability Initiative!
Made possible by a grant from the Stanton Foundation, the goal of this
initiative is to measurably increase the usability of Wikipedia for new
contributors by improving the underlying software on the basis of user
behavioral studies, thereby reducing barriers to public participation.
We have three positions open, all local in San Francisco. See the linked
pages for details and how to submit your CV:
http://wikimediafoundation.org/wiki/Job_openings/Interaction_Designer_(proj…
http://wikimediafoundation.org/wiki/Job_openings/Sr._Software_Developer_(pr…
http://wikimediafoundation.org/wiki/Job_openings/Software_Developer_(projec…
The new team will be led by project manager Naoko Komura, who was very
helpful in organizing localization and translations for our recent
fundraiser, and will coordinate closely with me and the rest of
Wikimedia's core developers. Also joining the project will be Wikimedia
staff developer Trevor Parscal.
As always, all of Wikimedia's software development is open-source, and
we expect to be able to roll improvements into the live Wikipedia
environment and general MediaWiki releases over the course of the project.
-- brion vibber (brion @ wikimedia.org)
CTO, Wikimedia Foundation
San Francisco
Can we look into enabling $wgAllowCopyUploads on Wikimedia projects?
This would let Wikimedia work a lot better with external archives,
especially around large video files that are cumbersome for users to
download and then re-upload over our POST upload interface on home
Internet connections. In particular, archive.org's huge repository of
public domain and freely licensed footage now supports temporal URLs for
video:
http://metavid.org/blog/2008/12/08/archiveorg-ogg-support/
Not to mention adding some clips from Metavid's public domain
legislative collection to relevant articles ;)
As I understand it, we need to set up a proxy for the internal Apache
nodes to access the outside web? Who can look at enabling this?
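For reference, the LocalSettings.php side of the change is small; a sketch
(both globals are real MediaWiki settings, but the proxy host below is a
placeholder for whatever the ops setup actually provides):

```
# LocalSettings.php fragment (PHP)
$wgAllowCopyUploads = true;   # enable "upload by URL" on Special:Upload
$wgHTTPProxy = 'http://internal-proxy.example:8080';  # outbound proxy for the Apaches
```

The proxy question above is the real blocker, since the Apache nodes
can't reach the outside web directly.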
peace,
--michael
--------------------------------------------------
From: "Greg L" <greg_l_at_wikipedia(a)comcast.net>
Sent: Thursday, January 08, 2009 2:43 PM
To: "Voice of All" <jschulz_4587(a)msn.com>
Subject: StringFunctions
> JSchulz:
>
> Have you read the link here:
> http://en.wikipedia.org/wiki/Wikipedia_talk:Manual_of_Style_(dates_and_numb…
>
> …describing a scientific notation-formatting template? It could use a
> character-counting parser function that StringFunctions could handle if it
> worked well (which it doesn’t).
>
> I ran this by Jimbo and he thought having a character-counting parser
> function for Wikipedia made sense. He said he couldn’t pressure Wikipedia’s
> paid developers to make it and I should run it by Erik to see how he
> would like to have it handled. Erik referred me to wikitech.
>
> Do you have a volunteer developer in mind who might be interested in
> making StringFunctions bullet-proof?
>
> Greg
Hi,
I’m new to this venue so please have patience with me. Jimbo suggested
I contact Erik and Erik said I should post here.
Wikipedia authors of magic words and templates could really use a
character-counting parser function. All the background information can
be found here:
http://en.wikipedia.org/w/index.php?title=User_talk:Jimbo_Wales&oldid=26081…
- Developer_support_for_parser_function
In a nutshell, there is currently a template on en.Wikipedia called
{{val}} that delimits numbers (placing what appear to be thin spaces
every three characters in scientific notation). It currently must use
math-based techniques to parse the value, and this results in rounding
errors 5–10% of the time.
A character-counting parser function would accept interrogations such
as “Are there more than four characters remaining in the string when
counting right from the decimal point?” and “If so, feed me three more
characters.” Such a parser function would be very handy for many other
purposes. With a good, bullet-proof parser function, our small army of
template authors could produce some nice new tools.
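The rounding problem with math-based digit extraction is easy to
reproduce outside wikitext; a minimal illustration (not the {{val}}
template's actual code, just the same class of technique):

```python
# Math-based extraction: recover the two digits after the decimal point
# of 0.29 by shifting and truncating. Binary floating point makes the
# intermediate product fall just short of 29, so truncation loses a digit.
math_digits = int(0.29 * 100)        # 0.29 * 100 == 28.999999999999996
print(math_digits)                   # -> 28, not the expected 29

# String-based extraction never rounds: treat the number as text.
string_digits = str(0.29).split(".")[1]
print(string_digits)                 # -> "29"
```

A character-counting parser function would let templates work on the
string form directly, sidestepping this entire class of error.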
I can be reached at Greg_L_at_Wikipedia(a)comcast.net
Greg