I deleted a test article from huwiki, and this was the result instead of
the usual success message. However, the deletion itself did complete.
A database error has occurred. Did you forget to run maintenance/update.php
after upgrading? See:
https://www.mediawiki.org/wiki/Manual:Upgrading#Run_the_update_script
Query: DELETE FROM `globalimagelinks` WHERE gil_wiki = 'huwiki' AND
gil_page = '929875'
Function: GlobalUsage::deleteLinksFromPage
Error: 1290 The MySQL server is running with the --read-only option so it
cannot execute this statement (10.0.6.61)
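For context, the failing statement is roughly the following (a
reconstruction from the error output above, not GlobalUsage's actual
code; the connection accessor is a stand-in for its shared-database
handle):

    // The article deletion on huwiki itself succeeded; only this
    // cross-wiki cleanup write failed, because the server holding
    // globalimagelinks (10.0.6.61) was in read-only mode.
    $dbw = wfGetDB( DB_MASTER ); // stand-in for GlobalUsage's shared DB connection
    $dbw->delete(
        'globalimagelinks',
        array(
            'gil_wiki' => 'huwiki',
            'gil_page' => 929875,
        ),
        __METHOD__
    ); // MySQL refuses the DELETE with error 1290 while read-only is set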
--
Bináris
Everyone,
I'm pleased to welcome Vibha Bamba, a new member of the Product group.
Vibha is starting today as Interaction Designer and will work mainly
on the Editor Engagement projects. As folks may know, many of the
Editor Engagement projects (such as New Pages Feed) involve complex
workflows for our editing community. Vibha will help us design
interfaces to make these features more user-friendly for our readers
and editors.
Vibha joins us from Yahoo, where she was a Senior Interaction
Designer. At Yahoo, she worked on a variety of projects, including
their engine for online display advertising, location-based mobile
apps, the new personalized Yahoo experiences, and cross-device design.
She worked on features such as sales dashboards, campaign management
and reporting tools, and advertising toolkits, so she brings a solid
background in building interfaces to support intricate and demanding
workflows. While in college, she interned for the United Nations
Hunger Project where she designed tools to help educate women on
microfinance.
She has a BS in Electrical Engineering from the University of Bombay,
as well as an MPS in Interactive Telecommunications from the Tisch
School of the Arts at New York University. For more information on her
background, please see her public profile [1].
Please join me in welcoming Vibha!
Howie
[1] http://www.linkedin.com/pub/vibha-bamba/7/29b/529
While considering adding syntax highlighting to the format=jsonfm output
of the API, I noticed the hackishness of the current API help page
system[1] and started working on a replacement that generates "[...] a
fully-HTML version of the help message"[2] while remaining fully
compatible with existing extensions.
I propose moving API help to a new special page Special:ApiHelp, which
would run a few preg_replace operations and then the parser on
individual portions of the documentation to format them as HTML. The
combined output for all modules would be cached in memcached as in the
old ApiHelp class.
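Concretely, the caching would be along these lines (a sketch only; the
key name, expiry, and build function below are placeholders, not actual
code):

    // Sketch of the proposed caching, mirroring the old ApiHelp class.
    // The key name and one-hour expiry are illustrative placeholders.
    function getApiHelpHtml() {
        global $wgMemc;
        $key = wfMemcKey( 'apihelp-html' );
        $html = $wgMemc->get( $key );
        if ( $html === false ) {
            // Placeholder: run the preg_replace passes and the parser
            // over each module's documentation, then concatenate.
            $html = buildApiHelpHtml();
            $wgMemc->set( $key, $html, 3600 );
        }
        return $html;
    }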
Here are a few questions about the best way to implement this:
1. Some members of ApiBase I need access to are marked "protected", yet
special pages have to subclass SpecialPage, not ApiBase. Which of
these possible solutions is least hackish? (A sketch of the internal
request used by options (a) and (b) follows this list.)
(a) Generating the help page in an API module and then making
an internal API request to execute that module when accessing
the special page. The special page would show the result.
(b) Internally calling action=paraminfo, individually requesting
information on each API module to avoid encountering API limits.
This would avoid duplicating lines 183-200 and 223-227 of
includes/api/ApiParamInfo.php.
(c) Adding an "allmodules" option to action=paraminfo, which would
only be allowed for internal requests because I am unsure of how
to cache the result.[3]
This would have the same advantage as option (b).
2. In bug 26681[1], Sam Reed suggested moving ApiHelp out of core.
I disagree. One of the main uses of the API is for coding bots
and user scripts, which are a quicker and more convenient way to
automate wiki processes than extensions that a server admin must
install. Having accurate, easy-to-read documentation specific to
the MediaWiki version and set of extensions is extremely useful
when coding a bot or user script. So does API help really not
belong in core?
3. Special:ApiHelp would need about ten CSS rules to display properly.
Is creating a separate ResourceLoader module the norm in this
situation? (A registration sketch also follows this list.)
4. To fit as many parameters on screen as possible, Special:ApiHelp
would use a tabular layout similar to the current text-based output
format. Is there any advantage to using definition lists over tables
(or vice-versa), keeping in mind that CSS can style the definition
list to appear in two columns?
5. Certain "tags" can apply to modules (i.e. "Read", "Write",
"Must POST", "Can generate"), which will go in the module's heading.
Therefore, I need to reduce the tags' font size to that of the
body text similar to .editsection. Is there a good alternative to
copying the .editsection code for each skin (or just using the
percentages for Vector), given the limitations of CSS?
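For question 1, options (a) and (b) would both boil down to an internal
API request along these lines (a sketch only; FauxRequest and ApiMain
are core classes, but the paraminfo parameters are just an example):

    // Sketch of an internal API request made from the special page;
    // the module being queried here is only an example.
    $request = new FauxRequest( array(
        'action' => 'paraminfo',
        'modules' => 'help',
    ) );
    $api = new ApiMain( $request );
    $api->execute();
    $data = $api->getResult()->getData();

And for question 3, the separate module would amount to roughly this
(a sketch; the module and file names are placeholders, not final):

    // Registering the ~10 CSS rules as their own ResourceLoader module...
    $wgResourceModules['mediawiki.special.apihelp'] = array(
        'styles' => 'mediawiki.special/mediawiki.special.apihelp.css',
    );
    // ...and loading it from the special page:
    $this->getOutput()->addModuleStyles( 'mediawiki.special.apihelp' );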
I would greatly appreciate your input.
[1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=26681
[2]: quoted from includes/api/ApiFormatBase.php
[3]: https://bugzilla.wikimedia.org/show_bug.cgi?id=26680
--
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand
> From: Aaron Pramana <aaron(a)sociotopia.com>
> To: wikitech-l(a)lists.wikimedia.org
> Subject: Re: [Wikitech-l] Visual watchlist
>
> Thanks again for your feedback - my GSoC project page is available here
> and I'd appreciate any feedback you have:
> http://www.mediawiki.org/wiki/User:Blackjack48/GSOC_proposal_for_watchlist_improvements
>
> I will incorporate your suggestions into my project within the next few
> days.
Thanks, Amir. Aaron, that is awesome; I'd be happy to help you out with
the visual design and CSS.
Multiple Wikimedia communities are waiting[0] for us to deploy the
ShortURL extension[1]. It looks like we want to move forward with
deploying ShortURL; the code was reviewed by Roan and approved. Reedy
says that someone needs to get the Apache rule correct and set up (the
ticket for Apache rewrites is filed[2], but the link structure change
hasn't been finalized), and that there are "a few database tables to
create on target wikis", which someone from ops should handle. Besides
that, he says, "setup on the various wikis is a simple task and would
only take a few minutes to do".
So we need to decide how to structure the Apache rewrites, an issue that
touches ops, internationalisation, and maintenance concerns. There's
additional discussion in the bug comments, but it seems to have stalled
out. We've discussed URL shorteners a few times before on this
list[3],[4]. Please leave your comments at bug 1450 so we can decide
how to write the rewrite rule.
[0] https://bugzilla.wikimedia.org/show_bug.cgi?id=1450
[1] https://www.mediawiki.org/wiki/Extension:ShortUrl
[2] http://rt.wikimedia.org/Ticket/Display.html?id=2121
[3]
http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/59891
[4]
http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/58997/
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
From [[Special:Linksearch]] I can find all the external links,
based on the external links table in the database, which can
be accessed by tools on the German toolserver.
But is there any way to find similar information about links to
Wikisource? For example, what is the total number of links? Which pages
link to a particular Wikisource page?
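One avenue, assuming the goal is a Special:Linksearch equivalent for
interwiki links: since MediaWiki 1.17 the iwlinks table records inline
interwiki links, so a query along these lines should work (a sketch,
not tested; the prefix and target title are just examples):

    // Pages linking to a given Wikisource page via the iwlinks table;
    // 's' is the Wikisource interwiki prefix on Wikipedia, and
    // 'Some_page' is a hypothetical target.
    $dbr = wfGetDB( DB_SLAVE );
    $res = $dbr->select(
        array( 'iwlinks', 'page' ),
        array( 'page_namespace', 'page_title' ),
        array(
            'iwl_from = page_id',
            'iwl_prefix' => 's',
            'iwl_title' => 'Some_page',
        ),
        __METHOD__
    );
    foreach ( $res as $row ) {
        // each row is a page that links to s:Some_page
    }

A COUNT(*) over iwlinks with the same prefix condition would give the
total number of such links.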
--
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
Daniel Zahn and I have been doing a survey of MediaWiki sites and trying
to get older sites running insecure copies of MediaWiki to upgrade.
When we asked Wikimedia India about their site, which is currently at
1.18.1, Dhaval pointed to a bug he had filed earlier
(https://bugzilla.wikimedia.org/36275) that would cause problems for
them if they attempt to upgrade to 1.20 when it is released.
I don't think this prevents an upgrade to 1.19 or 1.18.3, but we still
need to get this fixed for 1.20. I asked Srikanth, who responded on the
bug, if he could fix it, but he didn't have ready access to IE7.
Anyone else want to take this on?
--
http://hexmode.com/
Find peace within yourself and there will be peace on heaven and
earth. -- Abba Isaac
The MWNightly tool has been down for a while (since February, around the migration to Git), so I took the liberty of writing a new tool for this.
https://toolserver.org/~krinkle/mwSnapshots/
Updates hourly, even.
With Git, making packages of a repository is a lot easier, but for those without command-line expertise this is still a convenience.
Source code and issue tracker are online as usual (see links in the tool).
-- Krinkle
It has been requested for quite a while to have the Wikimedia site
configuration for MediaWiki in a public version control system [1]. This is
now done, and can be viewed using gitweb [2].
This essentially means you can handle your own shell requests (mostly). As
with the MediaWiki and Operations Puppet repos, anyone with an account can
push changes to this repository for review. When reviewed and merged, they
can be pulled onto the site and deployed.
No history of these files has been imported.
There are probably a few more configuration files in the common
directory subtree that can be made public in the near future - for now, the
important ones (for most people), e.g. CommonSettings.php and
InitialiseSettings.php, are there.
Sam
[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=17517
[2] https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git