I'm trying to get started with a stats project, and the data size for
en.wp.org is a bit daunting for initial testing.
Unfortunately, I am only fluent in English. Is there a download for
the nostalgia edition of wp.org? I see that simple.wikipedia.org is
available, but this one's a bit too small for my purposes.
Hi list!
David Monniaux and I will have to move the French servers (a change
of bay) tomorrow from 9:00 CEST. I'd like the corresponding DNS
entries to be changed to redirect the traffic somewhere else. Could
someone do this as soon as possible, as I'd like to perform some
maintenance on the servers?
Thank you very much,
Médéric
PS: please CC the answer, as I'm not subscribed to the list.
> Hi,
>
> I am trying to enhance the wikimedia syntax to
> include a simple, easy-to-use
> citation format. This will include an automatically
> generated reference page
Amruta,
There seem to be a couple of proposed projects already
addressing this issue, the most prominent being
Wikicite:
http://meta.wikimedia.org/wiki/Wikicite
There is also a proposal for storing relational data
entered through a wiki called Wikidata:
http://meta.wikimedia.org/wiki/Wikidata
I think it would be best to try to use these projects as foundations
for any further work on references/citations. Thanks.
>
> Message: 5
> Date: Fri, 2 Sep 2005 09:54:38 -0400
> From: Amruta Lonkar <gtg808u(a)mail.gatech.edu>
> Subject: [Wikitech-l] treating references as first
> class objects in
> wiki
> To: wikitech-l(a)wikimedia.org
> Message-ID:
> <1125669278.4318599ecab56(a)webmail.mail.gatech.edu>
> Content-Type: text/plain; charset=ISO-8859-1
>
> Hi,
>
> I am trying to enhance the wikimedia syntax to
> include a simple, easy-to-use
> citation format. This will include an automatically
> generated reference page
> in addition to the existing talk, discussion and
> history pages. In addition,
> each reference on the reference page will have its
> own page and will function the
> same as an article in a wiki, with its own talk page.
> Thus the wiki might be
> effectively used by students in writing papers.
>
> I have just started going through the MediaWiki code
> and I wanted to know what
> would be a good starting point to work on this kind
> of a project. I am getting
> very confused with the actual flow of the code. I
> will be creating a separate
> database for the references, so I need to understand
> the flow.
>
> --
> Thanks,
> Amruta
>
>
We've been curious for a while about the overhead due to array setup
time, especially in the context of language files. Although Turck caches
the bytecode, the interpreter still has to process that bytecode and
allocate the array data structures. How long does this take?
The answer is about 4 ms to load both the French and English messages:
a mean of 4.771 ms per request with the language files loaded versus
0.776 ms without, as the figures below show.
Criticism of my method would be welcome; I'll present it here in full.
I used two files, slow.php:
<?php
# Minimal bootstrap so Language.php will load outside the wiki proper.
define('MEDIAWIKI', 1);
# Stubs for helpers that Language.php expects to exist.
function wfSuppressWarnings() {}
function wfRestoreWarnings() {}
$wgLanguageCode = 'fr';
$IP = '/apache/common/php-1.5';
ini_set('include_path', "$IP/languages");
# This pulls in both the French and English message arrays.
require_once("$IP/languages/Language.php");
print time() . "\n";
?>
And fast.php, which is exactly the same except without the require_once:
<?php
define('MEDIAWIKI', 1);
function wfSuppressWarnings() {}
function wfRestoreWarnings() {}
$wgLanguageCode = 'fr';
$IP = '/apache/common/php-1.5';
ini_set('include_path', "$IP/languages");
# Identical to slow.php up to here; the require_once is simply omitted.
print time() . "\n";
?>
Turck MMCache only works with the Apache module SAPI, so I benchmarked
these two files using ab, on hypatia, an otherwise idle 3 GHz P4. Here
is the output:
[0625][tstarling@hypatia:~]$ ab -Xlocalhost:80 -n10000
http://en.wikipedia.org/w/benchmark/fast.php
This is ApacheBench, Version 2.0.41-dev <$Revision: 1.141 $> apache-2.0
Copyright (c) 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Copyright (c) 1998-2002 The Apache Software Foundation,
http://www.apache.org/
Benchmarking en.wikipedia.org [through localhost:80] (be patient)
Completed 1000 requests
Completed 2000 requests
Completed 3000 requests
Completed 4000 requests
Completed 5000 requests
Completed 6000 requests
Completed 7000 requests
Completed 8000 requests
Completed 9000 requests
Finished 10000 requests
Server Software: Apache
Server Hostname: en.wikipedia.org
Server Port: 80
Document Path: /w/benchmark/fast.php
Document Length: 11 bytes
Concurrency Level: 1
Time taken for tests: 7.759175 seconds
Complete requests: 10000
Failed requests: 0
Write errors: 0
Total transferred: 1530000 bytes
HTML transferred: 110000 bytes
Requests per second: 1288.80 [#/sec] (mean)
Time per request: 0.776 [ms] (mean)
Time per request: 0.776 [ms] (mean, across all concurrent requests)
Transfer rate: 192.55 [Kbytes/sec] received
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       0
Processing:     0    0  12.0      0     851
Waiting:        0    0  12.0      0     851
Total:          0    0  12.0      0     851
Percentage of the requests served within a certain time (ms)
50% 0
66% 0
75% 0
80% 0
90% 0
95% 0
98% 0
99% 0
100% 851 (longest request)
[0628][tstarling@hypatia:/apache/common/live-1.5/benchmark]$ ab
-Xlocalhost:80 -n10000 http://en.wikipedia.org/w/benchmark/slow.php
This is ApacheBench, Version 2.0.41-dev <$Revision: 1.141 $> apache-2.0
Copyright (c) 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Copyright (c) 1998-2002 The Apache Software Foundation,
http://www.apache.org/
Benchmarking en.wikipedia.org [through localhost:80] (be patient)
Completed 1000 requests
Completed 2000 requests
Completed 3000 requests
Completed 4000 requests
Completed 5000 requests
Completed 6000 requests
Completed 7000 requests
Completed 8000 requests
Completed 9000 requests
Finished 10000 requests
Server Software: Apache
Server Hostname: en.wikipedia.org
Server Port: 80
Document Path: /w/benchmark/slow.php
Document Length: 11 bytes
Concurrency Level: 1
Time taken for tests: 47.707920 seconds
Complete requests: 10000
Failed requests: 0
Write errors: 0
Total transferred: 1530000 bytes
HTML transferred: 110000 bytes
Requests per second: 209.61 [#/sec] (mean)
Time per request: 4.771 [ms] (mean)
Time per request: 4.771 [ms] (mean, across all concurrent requests)
Transfer rate: 31.32 [Kbytes/sec] received
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       0
Processing:     4    4   0.5      4      29
Waiting:        0    4   0.5      4      28
Total:          4    4   0.5      4      29
Percentage of the requests served within a certain time (ms)
50% 4
66% 4
75% 4
80% 4
90% 4
95% 4
98% 4
99% 6
100% 29 (longest request)
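As a sanity check, one could also time the require_once in-process
rather than per-request. This is just a sketch under the same setup as
slow.php, not something I ran:
<?php
// Sketch: time the Language.php include directly (PHP 4 compatible;
// microtime() returns the string "usec sec").
define('MEDIAWIKI', 1);
function wfSuppressWarnings() {}
function wfRestoreWarnings() {}
$wgLanguageCode = 'fr';
$IP = '/apache/common/php-1.5';
ini_set('include_path', "$IP/languages");

list($usec, $sec) = explode(' ', microtime());
$start = (float)$sec + (float)$usec;

require_once("$IP/languages/Language.php");

list($usec, $sec) = explode(' ', microtime());
$end = (float)$sec + (float)$usec;

printf("Language.php took %.1f ms\n", ($end - $start) * 1000);
?>
A single request is noisier than ab's mean over 10000 of them, but it
isolates the include itself from the rest of the request overhead.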
Thanks a lot for your mail, Gerard.
As I am not so on top of your technological discussions, I'd like to
ask when it will be possible to try out this implementation.
My main concern is to expose it widely as part of your open content
initiative and to learn how it fares in use, including suggestions to
alter terms/concepts or to add new ones, in whichever of the many
languages.
Is any action required from our side?
Greetings, and thanks a lot for all the enthusiastic work you are
doing here,
Stefan
-----Original Message-----
From: Gerard Meijssen [mailto:gerard.meijssen@gmail.com]
Sent: 01 September 2005 23:50
To: wiktionary-l(a)wikipedia.org; Wikimedia developers
Cc: Anthere; Stefan Jensen
Subject: The data design for the GEMET phase (part of the Ultimate Wiktionary)
Hoi,
We have always planned the implementation of the GEMET thesaurus as
one of the steps in the implementation of the Ultimate Wiktionary. In
some discussions we had, we considered that it might be a better idea
to implement GEMET as a subset of the Ultimate Wiktionary instead of
in its own data design. Once it has been implemented, we will be able
to learn many of the lessons we need to learn as reality shatters many
false dreams. With this implementation, we implement several of the
key functionalities of the UW. It will demonstrate the "eat your own
dogfood" idea: the words for a Language and for a WordRelation need to
exist in order to implement their functionality and their localised
value. It also demonstrates how a thesaurus can be implemented in UW,
and it will show off how we can have relations between different
meanings.
The thing I like best, however, is that it demonstrates the core
functionality of the Ultimate Wiktionary much better, and the data
design for this is much easier to understand. The absolute core
functionality stands without the Collection, CollectionLanguage and
CollectionMeaning tables; those three are needed specifically for the
implementation of GEMET.
As it is a subset, it does not have many of the features that will
exist in the full-blown version of the UW. These can, however, be
added one at a time. That allows for more frequent updates, which will
probably generate much more excitement, as we will have new features
to show more often.
More info can be found here:
http://meta.wikimedia.org/wiki/Ultimate_Wiktionary_data_design
Thanks,
GerardM
Hi,
I've searched on the meta-wiki and googled for the last two days, but
I can't manage to create a wiki user from an outside script.
My questions are:
1) I'm using the following code to generate the wiki password (as far
as I can tell it is identical to what the wiki source [1.5] does; a
fuller sketch follows after these questions).
$wiki_password = md5($wiki_user_id . "-" . md5($_POST['password1']) );
2) The user_rights table: setting the corresponding id's row to 'user'
or 'sysop' doesn't solve the problem either. The right value for this
is just "user", is it not?
3) Right now, in the user table, I'm setting user_id, user_name,
user_real_name, user_password, and user_email. Do I need to set
user_options as well? Any others?
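To make the question concrete, here is the kind of script I mean,
stripped down (PHP 4 style with the mysql_* extension; the ids,
credentials and exact column set are placeholders and assumptions
based on my reading of tables.sql, not tested code):
<?php
// Sketch only: create a MediaWiki 1.5 user row from an outside script.
// Column names are assumptions from my reading of 1.5's tables.sql.

// Same hash as in question 1): md5("<user_id>-" . md5(<plaintext>)).
function make_wiki_password( $user_id, $plaintext ) {
    return md5( $user_id . '-' . md5( $plaintext ) );
}

$db = mysql_connect( 'localhost', 'wikiuser', 'secret' ); // placeholders
mysql_select_db( 'wikidb', $db );

$user_id  = 42; // placeholder id
$name     = mysql_real_escape_string( 'Example', $db );
$password = make_wiki_password( $user_id, 'hunter2' );

// user_touched is a MediaWiki-format timestamp (YYYYMMDDHHMMSS);
// whether user_options can stay empty is part of my question.
mysql_query( "INSERT INTO user
        (user_id, user_name, user_real_name, user_password,
         user_email, user_options, user_touched)
    VALUES
        ($user_id, '$name', '', '$password', '', '',
         DATE_FORMAT(NOW(), '%Y%m%d%H%i%s'))", $db );

// The rights row from question 2), if extra rights are wanted
// (possibly a plain account needs no row here at all):
mysql_query( "INSERT INTO user_rights (ur_user, ur_rights)
        VALUES ($user_id, 'sysop')", $db );
?>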
Any help much appreciated,
Henno
Hi all,
I received an e-mail from Dwayne Bailey, coordinator of
translate.org.za (free software localisation group for South African
languages), in which he showed interest in the idea of assisting with
MediaWiki localisation in South African languages.
He also suggested using Pootle, about which you can find more at
http://translate.sourceforge.net/ .
There have been a couple of mails on this list, I believe, about
encouraging localisation somewhat independently of projects, so that
for example a Ladino interface translation could be completed without
the existence of a Ladino Wikipedia or other Wikimedia project. Of
course, this is already possible, but it's unlikely to happen. Within
that framework, it could become a prelaunch condition for requests for
new language versions of Wikimedia projects, and would also allow for
more collaborative translation rather than the current system where
most system messages are translated by one or two sysops in the
MediaWiki namespace, while regular users look on without the ability
to make corrections.
While the current preferred method of localisation appears to be to
use the MediaWiki namespace, this doesn't work well with the
recently-introduced ability to choose one's own interface language in
preferences: if I choose on en.wiki to view the interface in, say,
Navajo, or Amharic, or Bengali, nothing shows up because most or all
of the translations for these languages were made in the MediaWiki
namespace.
While this may not seem like a major concern, I think that there are
more than a few editors on the major language versions who speak that
language as their second language and might prefer to view the user
interface in their native language. This may still not seem like a
major concern because, after all, don't all of the "major languages"
of the world have full translations of language.php? Unfortunately,
this is not the case: full translations are limited almost completely
to the languages of Europe, with few exceptions. Bengali, Amharic,
Telugu, Fulfulde, and Armenian are all major world languages with
millions of speakers (Bengali, for example, is the national language
of Bangladesh and a regional language of India, two of the most
populous nations on Earth, in both of which English is a relatively
common second language). Each of them has a widely-translated
interface in the MediaWiki namespace, but none of them has a
LanguageXX.php file, or if they do, it has few or no system messages.
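For anyone who hasn't looked at the files in question, a
LanguageXX.php is roughly shaped like this ("Xx" is a placeholder
language code, the two sample messages stand in for the hundreds a
real file carries, and the structure is my reading of the 1.5-era
files, so treat the details as approximate):
<?php
# Rough skeleton of a languages/LanguageXx.php file in MediaWiki 1.5.
require_once( 'LanguageUtf8.php' );

# Translated system messages, keyed as in Language.php.
$wgAllMessagesXx = array(
    'mainpage' => 'Main Page',   # placeholder "translations"
    'edit'     => 'Edit',
);

class LanguageXx extends LanguageUtf8 {
    function getAllMessages() {
        global $wgAllMessagesXx;
        return $wgAllMessagesXx;
    }
}
?>
Translations made only in the MediaWiki namespace never make it into a
file like this, which is why they are invisible from other wikis.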
Mark
---------- Forwarded message ----------
From: Dwayne Bailey <______(a)______.___>
Date: 23-Aug-2005 03:06
Subject: Re: Mediawiki software localisation
To: Mark Williamson <node.ue(a)gmail.com>
...
This is a great idea, I will investigate how we can translate this. Do
you think you guys would be interested in using Pootle
(http://pootle.wordforge.org) so that other languages can easily
translate?
...
Thanks, Gordon, for letting me know that the Wikimedia servers
currently run PHP 4.3.11.
Are there any plans to move to PHP 5.x?
I am considering writing the next version of blahtex (LaTeX => MathML
converter) in PHP instead of C++, and there are certain language
features (in particular try/catch/throw) only present in PHP 5.x of
which I would like to take advantage.
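For anyone on PHP 4 wondering what that buys, here is a generic
illustration of PHP 5 exceptions; nothing blahtex-specific, and the
class and function are invented for the example:
<?php
// PHP 5 only: structured error handling with try/catch/throw.
class FormulaError extends Exception {} // hypothetical exception type

function parseFormula( $input ) {
    if ( trim( $input ) === '' ) {
        // In PHP 4 this would have to be a magic return value instead.
        throw new FormulaError( 'empty formula' );
    }
    return trim( $input );
}

try {
    print parseFormula( 'x^2' ) . "\n";
    print parseFormula( '' ) . "\n"; // the throw skips this print
} catch ( FormulaError $e ) {
    print 'parse failed: ' . $e->getMessage() . "\n";
}
?>
The error path is handled in one place instead of being threaded
through every return value, which matters for a recursive parser.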
David Harvey