I finally made some updates to the ShortURL builder tool.
- Error pages include a mailto link to report errors.
- I finally updated the output to match the updates made to
Manual:Short_URL/Apache (no Alias, replace %{REQUEST_FILENAME} with
something that works, and update the flags)
- I finally tested out apache2 + fastcgi + php-fpm to see how to handle
short URL config in that case and updated the builder to handle it ( ;)
short story: after 4-5 hours of fighting multiple types of config I finally
got something to work, and found that configuring short URLs for fastcgi
uses exactly the same Apache config as mod_php)
One of these days I'm going to need Windows Apache, IIS, etc... test
environments to determine how to handle short URLs there.
Eventually sometime in the future we may actually have enough knowledge to
include short url configuration inside the installer. ((Though we'll
probably wait till after entrypoint routing -- including builtin 404
thumbnail handling -- is implemented.))
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
As you know, Wikisource needs robust, well-defined data, and there's a
strict, deep relationship between Wikisource and Commons, since Commons
hosts images of books as .djvu or .pdf files. Commons shares both the
images and the contents of the images' information pages, so that any wiki
project can display a view-only "pseudo-page" by accessing a local page
named after the file name on Commons.
While working on self-made data semantization in it.wikisource using a lot
of creative tricks, we discovered that it's hard, almost impossible, to
read the contents of pages of other projects via AJAX calls because of the
well-known same-origin policy, but that local File: pages are considered as
coming from the same origin, so they can be read like any other local page.
This AJAX call, asking for the content of
e.g. File:Die_Judenfrage_in_Deutchland_1936.djvu:
html = $.ajax({url: "http://wikisource.org/wiki/File:Die_Judenfrage_in_Deutchland_1936.djvu", async: false}).responseText;
gives back the HTML text of the local view-only File: page, and this means
that any data stored in the information page on Commons is freely
accessible to a JavaScript script and can easily be used locally. In
particular, data stored in the Information and/or (much better) Book and
Creator templates can be retrieved and parsed.
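As a sketch of that parsing step (the row markup below is hypothetical, a
simplified stand-in for whatever the information templates actually render;
a real File: page's markup will differ, but the approach is the same):

```javascript
// Extract key/value metadata from the HTML of a File: page fetched via the
// same-origin AJAX trick described above. The "fileinfo-key"/"fileinfo-value"
// cell classes here are an assumption for illustration, not the real markup.
function extractTemplateFields(html) {
    var fields = {};
    var re = /<td class="fileinfo-key">([^<]+)<\/td>\s*<td class="fileinfo-value">([^<]+)<\/td>/g;
    var m;
    while ((m = re.exec(html)) !== null) {
        fields[m[1].trim()] = m[2].trim();
    }
    return fields;
}

// In the wiki, the html would come from the synchronous call shown above:
//   var html = $.ajax({url: ".../wiki/File:Example.djvu", async: false}).responseText;
var sampleHtml =
    '<tr><td class="fileinfo-key">Author</td>' +
    '<td class="fileinfo-value">Example Author</td></tr>' +
    '<tr><td class="fileinfo-key">Date</td>' +
    '<td class="fileinfo-value">1936</td></tr>';
console.log(extractTemplateFields(sampleHtml));
```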
Has this been described/used before? It seems a plain, simple way to share
and disseminate good, consistent metadata across any project; and it works
from today, without any change to the current wiki software.
If you like, I'm sharing a practical test use of this trick on
wikisource.org too: you can import User:Alex brollo/Library.js and a lot of
small, original scripts will be loaded; click on the "metadata" button from
any page connected to a File: page (namespaces Index, Page) and you'll see
a result coming from such an AJAX call.
Alex brollo, from it.wikisource
Greetings,
I have developed an offline Wikipedia, Wikibooks, Wiktionary, etc. app for
the iPhone, which does a somewhat decent job of converting the wiki
markup into HTML.
However, there are too many templates for me to program (not to mention,
it's a moving target).
Without converting these templates, many articles are simply unreadable and
useless.
Could you please provide HTML dumps (I mean, with the templates
pre-processed into HTML, everything else the same as now) every 3 or 4
months?
Or alternatively, could you make the template API available so I could
import it in my program?
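For what it's worth, the live wikis' own web API can already expand
templates server-side via action=expandtemplates, which might serve as an
interim "template API". A minimal sketch that just builds such a request
URL (the API base and sample wikitext are illustrative; fetching the URL
and handling the response are left to the app):

```javascript
// Build a request URL for MediaWiki's action=expandtemplates endpoint,
// which returns the given wikitext with all templates expanded.
function expandTemplatesUrl(apiBase, wikitext) {
    return apiBase + "?action=expandtemplates" +
        "&text=" + encodeURIComponent(wikitext) +
        "&format=json";
}

var url = expandTemplatesUrl("https://en.wikipedia.org/w/api.php",
                             "{{convert|1|mi|km}}");
console.log(url);
```

This only helps for online or batch pre-processing, of course; for a fully
offline app the templates would still need to be expanded at dump time.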
Best regards,
Roberto Flores
Hi everyone,
Just letting everyone know that we're having some issues with the Gerrit
server at the moment. Sometime since last night, a little over a dozen
repositories (listed below) have gone wonky and are missing their
branches & tags (the objects are intact).
We're working on recovering everything right now, and I'll be sure to let
everyone know the status when we're done.
In the meantime, if you use any of the following repositories, please
refrain from trying to use them until I give the all-clear (I don't want to
exacerbate any problems):
In mediawiki/extensions/*:
Comments, FacebookOpenGraph, GoogleDocs4MW, Nonlinear,
OnlineStatusBar, Phalanx, RandomImageByCategory,
SemanticImageInput, ShoutWikiAds, SphinxSearch,
TranslationNotifcations
In operations/*:
debs/mysqlatfacebook, debs/wikimedia-lvs-realserver
debs/wikimedia-search-qa, debs/wikistats, software
Also note: operations/mediawiki-config was busted, but we went ahead
and just rebuilt the repo from the live copy so as to not interfere with
site operations.
I thank you for your patience, and please find me on IRC if you have
any questions or find other repositories that are misbehaving.
-Chad
Hi,
I just wanted to announce that, based on a request from a community
member, I created a new feature in wm-bot which may appear to be
unneeded at first sight, but when I was thinking about it, it's not
really so stupid. It collects various information about user activity
in a channel (in XML) and allows it to be rendered in some way. At the
moment I only display the number of messages for each user, for example:
http://bots.wmflabs.org/~wm-bot/db/%23wikimedia-labs.htm (scroll to bottom)
This is happening only in selected channels as an experiment, but I
believe that it could motivate people to be more active and therefore
more helpful in these help channels. It would also let us see how
active bots are compared to people. (For example, on the #mediawiki page
you can see that the most active users are bots.)
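A rough sketch of the kind of per-user counting involved (the "<nick>
message" log-line format here is an assumption, not wm-bot's actual
internal format):

```javascript
// Count messages per nick from IRC-style log lines of the form "<nick> text".
function countMessages(lines) {
    var counts = {};
    lines.forEach(function (line) {
        var m = /^<([^>]+)>/.exec(line);
        if (m) {
            counts[m[1]] = (counts[m[1]] || 0) + 1;
        }
    });
    return counts;
}

var log = ["<petan> hi", "<wm-bot> !log something", "<petan> testing"];
console.log(countMessages(log)); // → { petan: 2, "wm-bot": 1 }
```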
http://eaves.ca/2012/08/30/community-managers-expectations-experience-and-c…
"This is why I keep saying things like code review dashboards matter. I
bet if this user could at least *see* what the average wait time is for
code review he'd have been much, much happier. Even if that wait time
were a month... at least he'd have known *what to expect.* Of course
improving the experience and community culture are harder problems to
solve... but they clearly would have helped as well."
I would love for someone to integrate that kind of wait-time indicator
into https://toolserver.org/~bawolff/gerrit-stats.htm or
http://gerrit-stats.wmflabs.org/ . My suggested stats: the
min/median/average/maximum time between a patchset's submission and its
merge or abandonment, and the min/median/average/maximum time between
patchset submission and any +1 or -1, divided up by Gerrit repository.
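The statistics themselves are straightforward once the timestamps are in
hand; a minimal sketch (the sample wait times in hours are made up, and
getting the real numbers out of Gerrit is the actual work):

```javascript
// Given wait times between patchset submission and first review (or merge)
// for one repository, compute the suggested min/median/average/max summary.
function waitTimeStats(waits) {
    var sorted = waits.slice().sort(function (a, b) { return a - b; });
    var n = sorted.length;
    var sum = sorted.reduce(function (a, b) { return a + b; }, 0);
    var median = n % 2 === 1
        ? sorted[(n - 1) / 2]
        : (sorted[n / 2 - 1] + sorted[n / 2]) / 2;
    return {
        min: sorted[0],
        median: median,
        average: sum / n,
        max: sorted[n - 1]
    };
}

console.log(waitTimeStats([2, 720, 26, 4, 48]));
// → { min: 2, median: 26, average: 160, max: 720 }
```

The median is probably the number to surface on a dashboard, since a few
long-abandoned patchsets can skew the average badly.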
Anyone who works on this will probably also want to join the FLOSS
communities metrics working group:
http://www.theopensourceway.org/mailman/listinfo/metrics-wg
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
CentralAuth has been around for about 5 years now and we still lack an
API to interact with it. There is no
blocking/unblocking/locking/unlocking ability at all; see
https://bugzilla.wikimedia.org/show_bug.cgi?id=23821 . Who do I need to
bribe/torture/put a fire underneath in order to get basic access to
said tools?
John
If anyone is interested I've done some tweaks to Gareth recently (though
it still hasn't got the review functionality).
Firstly, I updated the README, requirements, settings setup, etc... so
that it's relatively easy for someone to jump in and try it out. (Well,
as long as you don't run into gevent installation issues.)
I also started adding STOMP, WebSocket, etc... handling. So fetches are
done in a separate process (as long as you're running the taskrunner)
once you request them. And live progress-alert updates are only a matter of
finishing the STOMP -> (Web)Socket connection and the theme code to update
the DOM.
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
FYI
---------- Forwarded message ----------
From: Philip Chang <pchang(a)wikimedia.org>
Date: Mon, Sep 10, 2012 at 3:41 PM
Subject: Camera users - the Android app is for you!
To: Wiki Loves Monuments Photograph Competition <
wikilovesmonuments(a)lists.wikimedia.org>
Greetings WLM Members,
Version 1.2.3 of the WLM Android App is on Google Play
<https://play.google.com/store/apps/details?id=org.wikipedia.wlm> and
here <http://dumps.wikimedia.org/android/WLM-v1.2.3.apk>. We have made some
improvements and in particular, there is an important enhancement related
to camera photos.
Yes, camera photos. Please let me explain.
The app is a great way to find monuments, but many participants in WLM use
cameras to take the photos they will submit to the contest.
It is great to submit mobile photos to the contest as well, but cameras
generally offer better quality and control. So why not upload mobile photos
as placeholders, to make camera submissions easier?
The following blog post, which will be published tomorrow, explains how
this works.
Using the Wiki Loves Monuments App as a travel log
<http://blog.wikimedia.org/?p=17406>
Posted by Philip Chang <http://blog.wikimedia.org/author/pchang/> on
September 10th, 2012
The Wiki Loves Monuments Android App
<https://play.google.com/store/apps/details?id=org.wikipedia.wlm>
is a great way to take photos and upload them to Wikimedia Commons during
the world’s largest photo contest
<http://blog.wikimedia.org/2012/08/29/kicking-off-wiki-loves-monuments-2012/>
throughout
September. But what if you are shooting with a camera and don’t see
yourself taking too many photos with your mobile phone? You can now use the
app as a convenient travel log and make it much easier to organize your
photos when processing and uploading them at home.
Here’s how you use the app as a travel log. As you walk around finding
monuments nearby to shoot with your camera, use the app on your phone to
find the monuments and take a picture of them to upload, either on the road
or when you get home. Back on your computer, your uploaded mobile photos
will be a convenient record of all the monuments you visited, sitting on
Commons under “My uploads.”
As an added bonus, every mobile upload you add to Commons will include a
link to the Special Upload Wizard that automatically allows you to upload
and categorize more photos of that monument based on its campaign and
reference number. This is similar to clicking the “upload photo” button on
the monument lists in Wikipedia, but it is right there in your travel log.
<http://commons.wikimedia.org/wiki/File:Upload_more_photos.png>
A screenshot of the new travel log feature associated with WLM app photos
uploaded to Commons.
To see your travel log and use this feature, you must log in at
commons.wikimedia.org and
click on “My uploads” at the top. Click on the name of any uploaded photo
and the file page of that photo will open. Scroll down and below the
description you will see the link, “Upload more photos of this monument.”
The travel log can help you in two ways:
1. you will see a sequential list of the monuments you visited, which
helps in identifying the monuments taken on your camera
2. you can submit the photos from your camera for each monument directly
from the travel log
The latest app, version 1.2.3, has been published in the Google Play
<https://play.google.com/store/apps/details?id=org.wikipedia.wlm> store,
and that version has this new feature. Please update the app if you
downloaded it before and do not have auto-updates turned on. If you have
good ideas about photo uploads in general, or improvements next year, feel
free to post feedback
<http://www.mediawiki.org/wiki/Wiki_Loves_Monuments_mobile_application/Feedb…>
or send email <mobile-feedback-l(a)lists.wikimedia.org>.
You may also download the app here
<http://dumps.wikimedia.org/android/WLM-v1.2.3.apk>, or on the F-Droid
market
<http://f-droid.org/repository/browse/?fdfilter=wiki%20loves%20monuments&fdi…>.
We appreciate your support. Happy uploading!
*Phil Chang, Product Manager, Mobile*
- Copyright notes: "Upload more photos"
<http://commons.wikimedia.org/wiki/File:Upload_more_photos.png>
by Philinje <http://commons.wikimedia.org/wiki/User:Philinje>,
under CC-BY-SA 3.0 Unported
<https://creativecommons.org/licenses/by-sa/3.0/legalcode>,
from Wikimedia Commons
--
Phil Inje Chang
Product Manager, Mobile
Wikimedia Foundation
415-812-0854 m
415-882-7982 x 6810