Some of the less common languages on Wikipedia use scripts which are not
supported by default on most operating systems. In many cases, these
scripts may be unavailable even to native speakers of that language. The
same applies to mathematical fonts.
For example, the following languages seem to use scripts currently not
supported by the fonts installed by default on a MacBook running MacOS:
bn, bpy, kn, ml, te, am, arc, dv, km, my, si, bug, got, lo, or, as, cu, ti
Other operating systems may have even fewer scripts supported.
However, all of these scripts appear to have Free font support, as shown
by their all being rendered on my Linux box, after having installed a
variety of Free Software fonts that are part of the Debian distribution
[see note below]. As far as I can see, none of these appear to have
significant font-shaping issues, with only the occasional glitch in some
languages such as Oriya.
This means that, on modern browsers with both @font-face and UTF-8
support, we should now be in a position to support all of these scripts,
even if they are not supported by the user's installed fonts, by adding
@font-face declarations for Free Software fonts supporting these
scripts, and hosting these fonts on WMF servers.
In addition, doing this for mathematical fonts would have immediate
utility for all language versions of Wikipedia.
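Concretely, the mechanism could look something like the sketch below. The font name, URL, selector, and unicode-range are purely illustrative assumptions, not an existing deployment:

```css
/* Hypothetical example: serving a Free font for Malayalam (ml) from a
   WMF server via @font-face. Names and paths are placeholders. */
@font-face {
  font-family: 'FreeMalayalam';
  src: url('//fonts.example.wikimedia.org/FreeMalayalam.woff') format('woff');
  unicode-range: U+0D00-0D7F; /* the Malayalam Unicode block */
}
/* Fall back to this font only for content marked as Malayalam */
[lang='ml'] {
  font-family: 'FreeMalayalam', sans-serif;
}
```

Browsers that already have a suitable local font would simply never download the web font, so the cost falls only on readers who need it.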
Would anyone be interested in helping support this?
Note: the non-mathematical Debian font packages currently installed on
my Linux box are as follows:
ttf-alee ttf-arabeyes ttf-arhangai ttf-arphic-ukai ttf-bengali-fonts
ttf-bitstream-vera ttf-bpg-georgian-fonts ttf-dejavu ttf-dejavu-core
ttf-dejavu-extra ttf-devanagari-fonts ttf-dustin ttf-dzongkha
ttf-farsiweb ttf-freefont ttf-gentium ttf-gujarati-fonts
ttf-indic-fonts ttf-junicode ttf-kacst ttf-kannada-fonts ttf-khmeros
ttf-kochi-gothic ttf-lao ttf-liberation ttf-malayalam-fonts
ttf-mph-2b-damase ttf-opensymbol ttf-oriya-fonts ttf-paktype
ttf-punjabi-fonts ttf-sazanami-mincho ttf-sil-abyssinica
ttf-sil-charis ttf-sil-doulos ttf-sil-gentium ttf-sil-padauk
ttf-sjfonts ttf-tamil-fonts ttf-telugu-fonts ttf-thai-tlwg
ttf-thryomanes ttf-tmuni ttf-unfonts-core ttf-unifont ttf-uralic
During the Hackathon, hexmode gave me access to the Cruise Control
server. I have fixed the build configuration to exclude long tests, and
it seems to be running correctly again:
I think the IRC notifications are still broken.
Do we have an authoritative list of wikis? Although it would be good to
have the list of *all* Wikimedia wikis, just the list of content-project
wikis would be fine.
The page "Complete list of Wikimedia projects" is not authoritative at all.
That is not because it is on a wiki, but precisely because anyone can
add whatever he or she thinks should be there.
Only because of the three-letter code conflict between Swiss German and
Albanian (als) did I realize that somebody had listed Swiss German as
having a Wikinews edition (it is, actually, a news portal at Wikipedia)
and nobody bothered to remove it.
If someone knows the right place to find the authoritative list of
wikis, please let me know.
If such a list doesn't exist, and if someone from the tech staff could run
something like "echo 'show databases' | mysql ... | grep "^wiki" >
<some_public_file>" every 7 days (cron would be helpful) and give
me the location of that file, I could do the rest.
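If such a dump were available, reducing it to content projects would take only a few lines; here is a rough Python sketch (the suffix list and the sample names are my assumptions, not an official naming scheme):

```python
# Sketch: filter a dump of database names down to content-project wikis.
# The suffix list and the sample input are assumptions, not an official scheme.
CONTENT_SUFFIXES = ("wiki", "wiktionary", "wikibooks", "wikinews",
                    "wikiquote", "wikisource", "wikiversity")

def filter_content_wikis(db_names):
    """Keep database names ending in a content-project suffix."""
    return [name for name in db_names if name.endswith(CONTENT_SUFFIXES)]

sample = ["enwiki", "dewiktionary", "centralauth", "alswikinews"]
print(filter_content_wikis(sample))  # ['enwiki', 'dewiktionary', 'alswikinews']
```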
It's been a while since I've stopped by mediawiki.org, and I find it
quite interesting that I have 2,124 new messages! Wow! Unfortunately,
I can't read them, because...
A database query syntax error has occurred. This may indicate a bug in the
software. The last attempted database query was:
(SQL query hidden)
from within function "User::invalidateCache". Database returned error "1205:
Lock wait timeout exceeded; try restarting transaction (10.0.6.49)".
Just thought I'd let you know. This is http://www.mediawiki.org/wiki/Special:NewMessages ;
you can probably check your query logs.
I will soon start the official coding phase for my GSoC project, which
in a nutshell is about making gadgets customizable.
The project scope is slightly different from my original proposal
(which was not gadget-centric and didn't take into account current
plans to improve/rewrite the gadgets system).
I've put some notes on etherpad here:
http://eiximenis.wikimedia.org/uZsCqTmgIr (ResourceLoader2 plans are
relevant, too: http://eiximenis.wikimedia.org/RL2).
Any feedback is welcome, both from devs and from current or potential
gadget developers.
Salvatore Ingala (aka ^Spider^)
Developers at this weekend's GLAMCamp NYC
are developing a data-munging tool, based on pywikipediabot, to aid in
mass uploads. They'll be hacking on it in sprints this weekend,
starting 11am-12:30pm NYC time tomorrow, Saturday the 21st. Join them
in person, or in #glamwiki on Freenode.
See notes from today's preliminary session:
Summary, I believe by Maarten Zeinstra:
> There is a Python library (pywikipediabot) where many of Maarten's
> bots are derived from. The desired outcome of this session is that
> we're turning it into a library that functions as a black box for uploading
> data to Wikimedia Commons. This library will have one external
> function "put(metadata, configuration)". It also needs to include a
> function to check for duplicates.
> configuration will be a dictionary that holds the following keys:
> - configurationTemplate, holds URL to configuration template
> - configurationTitleTemplate, holds configuration of Title Template
> - sourceKey, holds the key of the metadata dict that indicates the url
> of the source
> An extra module will be written to ingest different formats and offer
> its metadata as a dictionary in key-value format (metadata in put(); ).
> This module can be GUI-ed. It is written as a base class which can be
> subclassed or extended, to enable different standards.
The Etherpad also describes the work they're going to do:
> We are going to make 3 modules
> 1. Upload module
> MaartenD is going to make a function
> put(metadata, configuration);
> with metadata as a dict (python for associative array)
> add duplicate checker
> 2. Conversion module / interface module
> Make a metadata conversion module that ingests CSVs/OAI-PMH and
> converts them to the internal dict format; add a GUI that creates the
> 2 dicts necessary (metadata, configuration)
> 3. Develop a configuration standard as an array of keys for a dict.
> draft of standard:
> configurationTemplate: holds template url
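To make the interface concrete, here is a minimal sketch of what put() and the duplicate checker might look like, assuming the draft key names above. The SHA-1 duplicate check and the `uploader` callable are my placeholders for the real pywikipediabot machinery:

```python
import hashlib

# Sketch of the single external function put(metadata, configuration)
# described above. Key names follow the draft configuration standard;
# the SHA-1 duplicate check and the `uploader` callable are assumptions
# standing in for the real pywikipediabot upload code.

def is_duplicate(file_bytes, known_hashes):
    """Return True if this file's SHA-1 is already among the known hashes."""
    return hashlib.sha1(file_bytes).hexdigest() in known_hashes

def put(metadata, configuration, known_hashes=frozenset(), uploader=None):
    """Upload one item unless it is a duplicate; return True on upload."""
    source_url = metadata[configuration["sourceKey"]]
    if is_duplicate(metadata.get("_bytes", b""), known_hashes):
        return False  # skip: already on Commons in this sketch
    if uploader is not None:
        uploader(source_url, metadata, configuration)
    return True

config = {"configurationTemplate": "http://example.org/config-template",
          "configurationTitleTemplate": "{title}",
          "sourceKey": "source"}
item = {"source": "http://example.org/img.jpg", "title": "Example",
        "_bytes": b"imagedata"}
print(put(item, config))  # True: nothing uploaded yet, so not a duplicate
```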
Hope this is a productive weekend and we come out of it with something useful.
Volunteer Development Coordinator
I wrote this today to Thomas, author of Halo ACL...
MW today provides the capability to define one interwiki descriptor which
points to the local wiki - I want to build on that with my aim to avoid the
technical and administrative hassles of extensions like Distributed Semantic
Wiki, which shares common information across physical wikis. There is
another way, though, to share common information: not across physical
wikis but across logical wikis, based on interwiki descriptors.
Consider the fully qualified pagename a:b:c (a=wikispace, b=ns, c=pgnm).
This page can only be accessed when its name is fully qualified. I want to
attach an ACL to all pages in the "a" wikispace. Wikis have one physical
database today managing "b" logical databases. I want a*b logical databases,
and to attach ACLs to a:b wikispace-qualified namespaces.
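As a rough illustration of the idea, name resolution and the ACL lookup might be sketched as follows. The ACL table, action names, and default wikispace are all hypothetical, not an existing MediaWiki or Halo ACL feature:

```python
# Sketch: resolve a fully qualified pagename a:b:c (wikispace, namespace,
# pagename) and check a wikispace-level ACL, as proposed above. The ACL
# table and default wikispace name are illustrative assumptions.

def split_fqpn(name, default_space="local"):
    """Split 'wikispace:namespace:pagename' into its three parts."""
    parts = name.split(":", 2)
    if len(parts) == 3:
        return tuple(parts)                         # fully qualified a:b:c
    if len(parts) == 2:
        return (default_space, parts[0], parts[1])  # b:c in the local wiki
    return (default_space, "", parts[0])            # bare pagename, main ns

ACLS = {  # keyed by (wikispace, namespace)
    ("hr", "Talk"): {"read", "edit"},
    ("hr", "Project"): {"read"},
}

def allowed(action, fqpn):
    space, ns, _page = split_fqpn(fqpn)
    return action in ACLS.get((space, ns), set())

print(allowed("edit", "hr:Talk:Sandbox"))    # True
print(allowed("edit", "hr:Project:Policy"))  # False: read-only in this sketch
```

The point of the sketch is only that once pages carry a wikispace qualifier, ACLs attach naturally to the (wikispace, namespace) pair rather than to categories.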
This would likely eliminate the headache of federating wikis within an
organization, while preserving access to common information resources held
as unqualified pages in the wiki. Replication schemes like DSW would be more
integrated with the MW base software if the base software was aware that
a:b:c pages MIGHT exist in the local wiki, accessing the federated wiki only
when the page could not be resolved in the local wiki.
You ask about brittle Category ACLs. If the category is (inadvertently)
deleted, removed from a page, mistyped or redirected, I'd imagine there'd be
a problem to be resolved by someone (else). And I wonder how receptive folks
are to using the Category namespace both for a folksonomy & for the database
of security tags. And I wonder about maintaining schemes that categorize
security categories... All messy.
But my point is not to bash Category ACLs. They play an important role for
managing page security within a namespace. I dislike their use across
namespaces however so I believe another approach is worthwhile to explore.
Wikispace ACLs make little sense today, I agree, but they would if my
proposal to orient interwikis descriptors as wikispace descriptors is
acceptable to the MW community; indeed my proposal rests on attaching ACLs
to wikispaces (and to namespaces within wikispaces) - whether it's a Halo
ACL or a MW ACL, I can't say I care too much. I'm just hoping to start the
conversation among you.
Is there anyone working on the output of works as EPUB or MOBI for the MediaWiki product?
It is something that would sit well for the output of works from Wikisource. The
technical side is well beyond me, so I thought it would be good to know if anyone is
working on that avenue for output.
Thanks. Regards Andrew