As the lone active administrator at the Vietnamese Wiktionary, I've
received several questions from regular users about the wiki's slowness.
[1] Part of me wants to tell them about the days long ago when Wikipedia
used to go offline for days on end due to demand.
Anyhow, I vaguely recall that, at one point, several Asian-language
wikis were being hosted or mirrored at the yaseo cluster in South Korea.
Is that still the case? At the time, wikt:vi: was still in its infancy,
but it's much larger now -- 13th in number of articles [2]. The idea of a
free, community-edited Vietnamese translation dictionary is pretty popular now,
and we compete against at least two large projects, not to mention the
scores of sites mirroring the same dictionary we imported into our wiki
last year. These other projects are all hosted in Vietnam, close to the
majority of their users, so speed is a real issue for wikt:vi:. It's
what seems to keep many people from using the site on a regular basis.
Users have complained that accessing Wiktionary from Vietnam is pretty
slow, and I believe them, because it takes a relatively long time for me
to load some major Vietnamese sites (and I'm on a very fast university
connection). If Wikimedia still hosts or mirrors any wikis on yaseo,
would it be possible to do the same for wikt:vi:?
[1] In Vietnamese:
http://vi.wiktionary.org/wiki/Th%E1%BA%A3o_lu%E1%BA%ADn_Th%C3%A0nh_vi%C3%AA…
[2] http://meta.wikimedia.org/wiki/Table_of_Wikimedia_Projects_by_Size
--
Minh Nguyen <mxn(a)zoomtown.com>
[[en:User:Mxn]] [[vi:User:Mxn]] [[m:User:Mxn]]
AIM: trycom2000; Jabber: mxn(a)myjabber.net; Blog: http://notes.1ec5.org/
I've noticed that the enwiki download dumps always fail when they get to
the dump of all pages with complete edit history... it seems there hasn't
been a successful release of that in almost a year. Is there any
reason this happens? Is there a way to fix it?
Thanks,
FV
The Adobe DNG (digital negative, i.e. portable raw) format seems to
pass free-content muster at first glance.
The question then is being able to upload the things at all. How
technically necessary is the 20MB limit? If we upped it to 40MB, or
100MB, would the servers melt? What are the practical issues from the
system administrators' point of view?
- d.
---------- Forwarded message ----------
From: Oldak Quill <oldakquill(a)gmail.com>
Date: 21 Nov 2007 15:02
Subject: Re: [Commons-l] Allow Digital Negative (DNG) RAW format on
Commons? (and increase filesize limit)
To: andrew.gray(a)dunelm.org.uk, Wikimedia Commons Discussion List
<commons-l(a)lists.wikimedia.org>
On 21/11/2007, Andrew Gray <shimgray(a)gmail.com> wrote:
> On 21/11/2007, Brianna Laugher <brianna.laugher(a)gmail.com> wrote:
>
> > "There are indeed, some amazing images. I definitely believe that
> > publishers could use this resource if they're in need of (one more
> > image) to complete an existing project. But I'm uncertain about how
> > publishable much of the content is, especially in the absence of
> > higher resolution files (which disqualifies printing). "
> >
> > So although our works are usually sufficient for web use, it seems
> > clear that we cannot present ourselves as a serious kind of archivist,
> > culture-recording project, without introducing a RAW format and
> > encouraging people to use it.
>
> Careful not to jump two steps there :-)
>
> We mainly don't have higher resolution image files because people
> aren't uploading high-resolution image files to start with, not
> because the high-resolution JPGs or TIFFs which we have Just Aren't
> Good Enough(TM).
And people aren't uploading high-resolution image files because they
can't. If the upload limit were increased, there are plenty of US-govt
TIFFs that could be added to Commons and greatly improve our
usability.
The 20MB upload limit has come up a fair few times on these mailing
lists and the only objections I remember relate to whether we have the
resources to handle larger files. Tell me if I'm wrong, but I can't
see any problems in doubling our upload limit to 40MB straight away to
enable more and better quality image and sound files to be uploaded. We
should discuss and investigate the impact of raising the limit further,
so we can start storing reasonable film formats and files (I,
personally, cannot wait until the limit is high enough to allow us to
provide reasonable quality early silent films).
Ideally, the upload limit will eventually be high enough to allow us
to provide lossless data files (with classical music movements already
reaching ~20 MB, this would need to allow up to 100 MB for lossless music
in FLAC format). The feasibility of serving 100 MB files over HTTP still
needs to be discussed, but I can't see how a 40 MB limit would cause us
any problems or cost us significant resources.
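For what it's worth, on a stock PHP installation the cap usually comes from
php.ini rather than from MediaWiki itself -- whether that is also the binding
constraint on the Wikimedia servers, I don't know. Illustrative values only:

  upload_max_filesize = 40M
  post_max_size = 45M    ; must be at least as large as upload_max_filesize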
--
Oldak Quill (oldakquill(a)gmail.com)
Hello,
I am using MediaWiki 1.9.3 and have set up a wiki with a lot of
restrictions.
All articles can only be edited by the administrator, and most of the
articles can be read by anonymous users, but there are a dozen articles
which should only be readable by registered users.
To achieve that, I have the following configuration:
$wgGroupPermissions['*']['createaccount'] = false;
$wgGroupPermissions['*']['edit'] = false;
$wgGroupPermissions['user']['edit'] = false;
$wgGroupPermissions['sysop']['edit'] = true;
$wgGroupPermissions['*']['read'] = false;
$wgWhitelistRead = array(":Main Page",":Article1");
Now, since there are far more articles which should be readable
anonymously, I find this solution not very convenient: I have to add
every new article to the whitelist.
It would fit my needs much better if there were a blacklist to which I
add the articles that should not be readable anonymously:
$wgGroupPermissions['*']['read'] = true;
$wgBlacklistNoRead = array(":ForbiddenArticle1");
Will there be such a feature in the future?
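Or, failing that, could it be approximated with a hook today? Something like
this rough sketch in LocalSettings.php is what I imagine ($wgBlacklistNoRead
and efBlacklistNoRead are made-up names, and I have not tested it):

$wgGroupPermissions['*']['read'] = true;
$wgBlacklistNoRead = array( 'ForbiddenArticle1' );
$wgHooks['userCan'][] = 'efBlacklistNoRead';
function efBlacklistNoRead( $title, $user, $action, &$result ) {
    global $wgBlacklistNoRead;
    // Deny 'read' to anonymous users for blacklisted titles only.
    if ( $action == 'read'
        && !$user->isLoggedIn()
        && in_array( $title->getPrefixedText(), $wgBlacklistNoRead ) ) {
        $result = false;
        return false; // stop further permission checks
    }
    return true; // otherwise fall through to the normal checks
}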
thanks
Uwe Brauer
On 06/11/2007, catrope(a)svn.wikimedia.org <catrope(a)svn.wikimedia.org> wrote:
> Revision: 27267
> Author: catrope
> Date: 2007-11-06 16:14:24 +0000 (Tue, 06 Nov 2007)
>
> Log Message:
> -----------
> APIEDIT BRANCH MERGE: Making redirect creation on page move optional.
This should not be an option whose use we encourage; moving pages
without leaving redirects behind breaks incoming links from all other
sources, and is not a particularly clever idea for any web site.
Rob Church
On 11/22/07, catrope(a)svn.wikimedia.org <catrope(a)svn.wikimedia.org> wrote:
> global $wgEditOwnExcludedNamespaces;
> + if(!is_array($wgEditOwnExcludedNamespaces))
> + // Prevent PHP from whining
> + $wgEditOwnExcludedNamespaces = array();
>
> if($action != 'edit' || $user->isAllowed('editall') || in_array($title->getNamespace(), $wgEditOwnExcludedNamespaces))
Note that using any variable without explicitly initializing it is
dangerous in PHP. If an installation has register_globals enabled,
and has not initialized the variable elsewhere, an attacker can insert
any desired value into the variable by just editing the URL. The
better approach is to initialize the variable in EditOwn.php, and
require users to override it in LocalSettings.php after the
require_once line.
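For example, a rough sketch of that pattern (the namespace values below are
just placeholders):

In EditOwn.php:

  $wgEditOwnExcludedNamespaces = array(); // safe default; register_globals can no longer inject a value

In LocalSettings.php:

  require_once( "$IP/extensions/EditOwn/EditOwn.php" );
  // Override the default only after the require_once line:
  $wgEditOwnExcludedNamespaces = array( NS_TALK, NS_USER_TALK );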
On Tue, 20 Nov 2007 21:26:48 +0000, brion wrote:
> Revision: 27696
> Author: brion
> Date: 2007-11-20 21:26:48 +0000 (Tue, 20 Nov 2007)
>
> Log Message:
> -----------
> Revert r27694 -- if you're seeing this problem, it's probably because
> you're pulling things out of $wgParser->mFunctionHooks and copying it to
> another parser without knowing the internal format has changed. That
> should be fixed wherever that's being done (as it was fixed on the parser
> tests.)
For labeled section transclusion, this was fixed by having the extension
call a recursive parse itself, instead of passing the wikitext back and
letting the parser do its thing, which seems to be pretty broken now; so
the old way at least gives partial functionality.
So we probably won't be able to sort out any of the regressions until we
can figure out where that's being done.
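For reference, the general shape of that change in a tag-hook extension is
roughly as follows (a simplified sketch with a made-up hook function, not the
actual LST code; it assumes Parser::recursiveTagParse is available):

function efMyTagHook( $input, $args, $parser ) {
    // $input stands in for the wikitext the extension produces (simplified);
    // parse it recursively here instead of returning it and relying on
    // the calling parser to finish the job.
    return $parser->recursiveTagParse( $input );
}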
Hi,
I am interested in implementing an extension for web articles in the style of a scientific journal paper -- I am not sure if the extension mechanism can help here.
I just want to be able to write a page, insert BibTeX reference ids, and have MediaWiki take care of numbering the references and displaying the bibliography at the bottom of the page, plus maybe some extra embellishments, like giving a way to copy a citation and a URL to the cited articles, in the way that the <bibtex> and <bibwiki> extensions do.
Example
wikitext:
.... valuable experiments in the NMR arsenal <bibref>shr1971rrn, neu1989fr,mer1982pns</bibref>
<bibliography/>
result
<p>.... valuable experiments in the NMR arsenal <span id="bibref" onmouseover="...">[1-3]</span></p>
<h2>References</h2>
<ol>
<li> J. H. Noggle and R. E. Schirmer, ‘‘The Nuclear Overhauser Effect,’’
Academic Press, New York, 1971.
<li>D. Neuhaus and M. P. Williamson, ‘‘The Nuclear Overhauser Effect
in Structural and Conformational Analysis,’’ Verlag Chemie, New
York, 1989.
<li>J. K. M. Sanders and J. D. Mersh, Prog. NMR Spectrosc. 15, 353–
400 (1982) .
</ol>
When printed, it should look like a real journal paper.
I would like to bypass the somewhat cumbersome approach of having a separate page per citation and then adding links to those pages.
Other halfway solutions can be imagined: if the numbering and the list of references at the end were omitted, then the simple extension mechanism should work, provided there is a way to enter citations that can be accessed by id -- but then it wouldn't look like what I want.
I guess a combo hack of extension + JavaScript (to order the references correctly) + CSS might work, roughly along the lines of the sketch below.
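Something like this for the PHP half (all names are made up, and the actual
lookup and formatting of BibTeX entries by id is left out):

$wgExtensionFunctions[] = 'efBibRefSetup';
$efBibRefIds = array(); // citation ids, in order of first appearance

function efBibRefSetup() {
    global $wgParser;
    $wgParser->setHook( 'bibref', 'efBibRefRender' );
    $wgParser->setHook( 'bibliography', 'efBibliographyRender' );
}

function efBibRefRender( $input, $args, $parser ) {
    global $efBibRefIds;
    $numbers = array();
    foreach ( preg_split( '/\s*,\s*/', trim( $input ) ) as $id ) {
        $pos = array_search( $id, $efBibRefIds );
        if ( $pos === false ) {
            $efBibRefIds[] = $id;
            $pos = count( $efBibRefIds ) - 1;
        }
        $numbers[] = $pos + 1;
    }
    return '<span class="bibref">[' . implode( ',', $numbers ) . ']</span>';
}

function efBibliographyRender( $input, $args, $parser ) {
    global $efBibRefIds;
    $html = "<h2>References</h2>\n<ol>";
    foreach ( $efBibRefIds as $id ) {
        // fetch and format the BibTeX entry for $id here (not shown)
        $html .= '<li>' . htmlspecialchars( $id ) . '</li>';
    }
    return $html . '</ol>';
}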
I also wonder whether there is a middleware-like mechanism in MediaWiki that would allow adding a layer of data processing on input/output.
Is it possible to implement such a layer as an add-on, without breaking the wiki code, so that base-code upgrades stay painless?
Any advice on how I can start?
I would appreciate any suggestions.
Thank you.
Evgeny.