Hoi,
The same logic can be applied to enable both Narayam and WebFonts on Meta.
Thanks,
Gerard
On 19 December 2011 13:41, Niklas Laxstrom <nlaxstrom(a)wikimedia.org> wrote:
> Since we are enabling Translate on mediawiki.org and are planning to
> have user documentation for WebFonts and Narayam there, our team
> decided that it also makes sense to enable Narayam and WebFonts on
> mediawiki.org so that users can read and contribute to the
> translations.
>
> -Niklas
>
> Forwarding to Wikitech also to let MediaWiki developers know about this :)
>
> 2011/12/19 Niklas Laxstrom <nlaxstrom(a)wikimedia.org>:
> > Sorry, I was supposed to send this on Friday evening already.
> >
> > The highlights of the next i18n deployment:
> > * Enabling Translate on mediawiki.org
> > * Updating Narayam and WebFonts to latest versions. Please test them
> > in translatewiki.net
> > ** Menu appears only on click, not on hover
> > ** Menu positions correctly in RTL and does not go offscreen.
> > * Language name for Veps updated
> >
> > Also, a couple of smaller fixes can be found at:
> > https://www.mediawiki.org/wiki/Special:Code/MediaWiki/tag/i18ndeploy
> >
> > Deployment will happen between 1800 and 1900 UTC
> >
> > -Niklas
>
> _______________________________________________
> Mediawiki-i18n mailing list
> Mediawiki-i18n(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
>
Hi everyone,
I'm planning two meetings on IRC today/tomorrow: one at 17:00 UTC (in 2
hours) and one at midnight UTC (in 9 hours). There's no firm agenda just
yet [0]; my major goals are finding the areas where we still disagree and
trying to reach some consensus on those issues. The two sessions will
largely cover the same ground, I think, so feel free to come to just one
(I made two events so as to catch a bunch of timezones).
So if you're interested in discussing this, or just want to read along to
see what everyone else is talking about, feel free to join us in
#wikimedia-dev on IRC.
-Chad
[0] http://etherpad.wikimedia.org/Git
Toni Hermoso Pulido (toniher) now has commit access and aims to work on
the FollowButton, PopUpFile, CheckGroup, and ImageRefer extensions.
Welcome!
As of right now, the commit access queue is clear; we've replied to
everyone who has requested commit access to our Subversion repository.
So if you think you're waiting for a response from us, please check your
spam folder. Thanks.
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Realized I never sent an announcement here. I think you folks'd be
really interested in checking out what we've been working on:
http://localwiki.org
We just did our first general-public software release.
There's lots of cool stuff going on. Having worked on the DavisWiki
(http://daviswiki.org) code for many, many years [which was a fork of
MoinMoin], we really relished the opportunity to design a new, modern
wiki engine without the burden of legacy support.
Cool bits:
* Everything is stored as HTML5. We threw out wiki markup.
* Visual editing! And it's consistent and fun!
* ...using a highly-hacked CKEditor. We're hoping to work on
the new VisualEditor y'all have been working on. Exciting!
* The basis of our work was creating a flexible versioning, diffing
and merging framework for Django, called django-versionutils. With
this, we can easily version any Django model, diff any Django fields,
and create new, custom diff handlers (for, say, GeometryFields or
ImageFields :) This allows us to wiki-fy any sort of structured data,
too.
* this lets us make things like versioned, diffable comments
à la LiquidThreads really easily.
* Really cool map stuff / really-easy map editing :)
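The field-level versioning and pluggable-diff idea above can be sketched roughly like this (a toy illustration only; the class and handler names below are hypothetical and are NOT the actual django-versionutils API):

```python
import copy

# Toy sketch of field-level versioning with pluggable diff handlers.
# Illustrative only -- not the real django-versionutils API.

def default_diff(old, new):
    """Generic field diff: report old/new values when they differ."""
    return None if old == new else {"old": old, "new": new}

class VersionedModel:
    # Optional per-field diff handlers could be registered here
    # (e.g. a custom handler for a geometry or image field).
    diff_handlers = {}

    def __init__(self, **fields):
        self.fields = fields
        self.history = []  # list of saved snapshots

    def save(self):
        self.history.append(copy.deepcopy(self.fields))

    def diff(self, v1, v2):
        """Return per-field changes between two saved versions."""
        old, new = self.history[v1], self.history[v2]
        changes = {}
        for name in old.keys() | new.keys():
            handler = self.diff_handlers.get(name, default_diff)
            d = handler(old.get(name), new.get(name))
            if d is not None:
                changes[name] = d
        return changes

page = VersionedModel(title="Davis Wiki", body="Hello")
page.save()
page.fields["body"] = "Hello, world"
page.save()
print(page.diff(0, 1))  # {'body': {'old': 'Hello', 'new': 'Hello, world'}}
```

The point of routing every field through a handler lookup is that any structured field type can get its own diff renderer without touching the versioning core.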
I made this code walkthrough screencast a few months back which you
might find interesting: http://vimeo.com/25385510 (though it's a bit
long -- sorry!)
& we've got a dev site here: https://dev.localwiki.org
Anyway, would love to work together on anything and everything we can!
(WMF folks: we're in SF, too)
-Philip
(Apologies if you're seeing this email twice. My first send was
apparently silently rejected)
The processes|actions on links we deal with are:
1) in a browser: a) link activating.
2) in a source-editor: a) link-text editing, b) link-target editing.
3) in a wysiwyg-editor: a) link-text editing, b) link-target editing
c) link activating|following.
In VisualEditor (wysiwyg) link-activation is not yet implemented. So
far I have seen 3 methods for this:
i) double clicking
ii) ctrl + clicking
iii) popup the target, where you can activate the link.
I have been using a local hypertext WYSIWYG editor every day for 20 years
now (yes, before HTML!!!) which uses the double-click method. I vote for
this as the fastest method, and one that needs only one hand.
I would like to hear the VisualEditor team's opinions.
--
Kaseluris-Nikos-1959
Synagonism = ALL winners, Antagonism = ONE winner
Folks-
Big congrats on the first general-public visual editor demo. We've
been hoping for a while that someone would work on a
non-designMode/contentEditable editor. This looks -great-!
We're interested in using the editor in LocalWiki. We're currently
using a highly-modified CKeditor, which will probably keep working
well for a year or so, but beyond that it's not very future-proof.
We've done a lot of work to make our editing experience consistent and
fun, but it's a real pain to hack on new editor-specific functionality
because CKeditor is so fickle. We'd love nothing more than to throw
it in a trashbin and set it afire. But such is technology.
The first question I have is: how much trouble will we encounter if
we're not using wiki markup? We store everything as HTML5 everywhere,
and we're converting our old sites to HTML5 on import.
We should hang out and have a little show-and-tell! Maybe early Jan?
Best,
Philip
LocalWiki
http://localwiki.org
(Apologies if you're seeing this email twice. My first send was
seemingly silently rejected)
Hi all,
I would like to mention that I am now working on a rewrite of mw-bot,
called wm-bot (wikimedia bot; it's supposed to serve in various
wikimedia channels). The bot now supports exactly the same functions as
mw-bot, plus some more, and I think it would be good if we replaced the
current mw-bot at some point in the future. The reasons are:
- The old bot is written in Java, and nearly no one has access to the
source code or is managing it. The bot is still running without
problems, largely thanks to the original creator, who did great work
and wrote very stable code, but extending the bot with more features
could be a problem.
- The new bot is in svn (tools/wmib), so anyone can participate in
development and even in operating the bot.
- The new bot is running on WMF Labs, so it should be running on a more
stable server with better connectivity. It is also more accessible to
others: unlike on the toolserver, it's no problem to give access to the
service user account to more devs (anyone with an svn account can get
access there), so more people can operate the bot and patch it.
I converted the current database and the bot is running in
#mediawiki-move, so you can try various commands (like !mediawiki or
!b <id>). Any feedback on this whole idea and on the bot is welcome.
Also, before you start committing changes to the source code, keep in
mind that I am now working on splitting it into more files so that we
avoid conflicts when committing changes; that should be done by today.
Thanks
Hi,
I was debugging a reported problem with the ro.wp Common.js and I
noticed that it wasn't even getting loaded on Special:Preferences.
Have I missed something or is Firebug lying to me? :)
Thanks,
Strainu
Hi there,
we are doing information retrieval research on the Wikipedia history.
Currently we are thinking about including the archive of Deleted
Articles in the analysis.
What is the current policy on access to the Deleted Archive?
According to this page, admins can permit access to single articles on
request:
http://en.wikipedia.org/wiki/Wikipedia:Deletion_policy#Access_to_deleted_pa…
However, is there a way (for researchers) to either:
a) access (API) or download the whole archive of Deleted Articles
b) get statistics or meta data about the Deleted Archive (article
counts, revision meta information, logs)
?
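(For option (a), the MediaWiki API does expose deleted revisions through the `list=deletedrevs` query module, but requests only succeed for accounts with the `deletedhistory` right, which ordinary accounts don't have. A minimal sketch of what such a request would look like, URL construction only; whether it works still depends on rights:)

```python
from urllib.parse import urlencode

# Sketch of a query against the list=deletedrevs API module.
# Note: the server only answers for users with the 'deletedhistory'
# right, so this is illustrative rather than generally usable.
API = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "list": "deletedrevs",
    "drlimit": 50,                     # batch size per request
    "drprop": "revid|user|comment",    # revision metadata to return
    "format": "json",
}
url = API + "?" + urlencode(params)
print(url)
```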
Kind regards,
Katja Mueller
I plan to merge the file backend branch next Monday (PST).
Overview:
FileRepo was refactored to use storage paths instead of file system paths. A
storage path looks like "mwstore://backend/container/rel_path_to_file". This
is somewhat similar to FileRepo virtual URLs (though they are URL-encoded of
course), which look like "mwrepo://repo/zone/rel_path_to_file".
Some functions, like storeBatch(), still allow FS paths as sources. Important
breaking changes are in functions like File::getPath(), which return storage
paths now instead of file system paths. The append-related functions were
removed as we are using concatenate instead (already added in trunk in
r104687).
The main goal is to abstract storage away so that various backends (FS,
Swift, S3, Azure,...) can be supported. Our current NFS usage for thumbnails
is not sustainable short-term, nor is the usage for source files
long-term. Beyond being a single point of failure, it doesn't scale very
well. With new features like chunked uploads and TimedMediaHandler, we hope
to actually have serious video content in the future, which will require a
better storage medium.
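The storage-path abstraction can be pictured with a small sketch (in Python rather than MediaWiki's PHP, and with heavily simplified names; the real FileBackend/FileRepo classes are far more involved):

```python
import re

# Split "mwstore://backend/container/rel_path" into its three parts.
STORAGE_PATH = re.compile(r"^mwstore://([^/]+)/([^/]+)/(.+)$")

def split_storage_path(path):
    m = STORAGE_PATH.match(path)
    if not m:
        raise ValueError(f"not a storage path: {path}")
    return m.groups()  # (backend, container, rel_path)

class FSFileBackend:
    """Toy in-memory stand-in for a filesystem backend."""
    def __init__(self):
        self.objects = {}
    def store(self, container, rel_path, data):
        self.objects[(container, rel_path)] = data
    def get(self, container, rel_path):
        return self.objects[(container, rel_path)]

# Callers never see backend internals -- only storage paths.
backends = {"local-fs": FSFileBackend()}

def store_file(storage_path, data):
    backend, container, rel = split_storage_path(storage_path)
    backends[backend].store(container, rel, data)

def read_file(storage_path):
    backend, container, rel = split_storage_path(storage_path)
    return backends[backend].get(container, rel)

store_file("mwstore://local-fs/thumbs/a/ab/Example.png", b"...")
print(read_file("mwstore://local-fs/thumbs/a/ab/Example.png"))  # b'...'
```

Swapping in a Swift or S3 backend then means registering another class under a different backend name, with no change to calling code.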
Other changes:
* Media handler code was minimally affected, as the transform tools are
based on FS file reads/output anyway. However File::transform() will copy
the output (if any) to the final storage path destination.
* Upload code was minimally affected too. Initial uploads still work with
temp FS source files and call performUpload(). Stash-based uploads still
store virtual URLs in the DB to track the uploaded files (from the initial
attempt). When the user finishes and uploads from the stash, the usual
performUpload() function is called on a local FS copy. Chunked uploads
likewise use keys that determine virtual URLs, which use the
FileRepo::concatenate() function to create a new storage file. The usual
performUpload() function is called on a local FS copy of the file.
Improvements could still be made here.
* Minor changes to img_auth.php/thumb.php were also required.
* Thumb handler code was recently added to /trunk; this can eventually be
used to replace our custom thumb-handler.php script on our NFS thumbnail
cache server.
Breakage:
Typically, the more a module makes use of FileRepo and virtual URLs, the
less likely it is to break. Even calling File::getPath() and using that as a
source to FileRepo::store() will happen to still work. Things like:
a) filemtime( $file->getPath() )
b) copy( $file->getPath(), ... )
c) StreamFile::stream( $file->getPath() )
...will be broken. You will see errors about PHP not finding a wrapper for
'mwstore'.
For example, ConfirmAccount and NSFileRepo will need updating. Since I wrote
the former, it may provide an example for any updates needed. Such
extensions will want to use FileRepo with an FSFileBackend and handle
storage paths properly. If done correctly, the end-user won't notice
anything on upgrade.
All core unit tests pass on my local machine.
End-users:
Once bugs are ironed out, nothing should really change for end-users.
Setup.php will automatically create backwards compatible FSFileBackend
containers for repositories. There aren't really any user facing features in
this rewrite.
--
Sent from the Wikipedia Developers mailing list archive at Nabble.com.