Hi,
I've implemented AJAX patrolling for the MediaWiki core
( https://gerrit.wikimedia.org/r/26440 ) and I would like to gather some
feedback and hopefully get someone to take a look at the code, in the
hope of getting it merged soon (the change has been in Gerrit for more
than a month now).
It acts like this:
When you click a patrol link, a spinner will appear between the
brackets where the patrol link was shown, and the patrol request will be
sent to the API. If it succeeds, the patrol links on the page will
disappear (just like on a normal page view) and the user will get a
notification. If an error occurs, the spinner will be removed and
the old patrol link is shown again, so that the user can give it
another try (on top of a notification, of course).
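The flow above can be sketched as plain logic (a toy model only; the actual change is client-side JavaScript in the Gerrit change linked above, and all names here are hypothetical):

```python
# Toy model of the patrol-link flow described above. `ui` is a dict of
# flags standing in for DOM state; `send_request` stands in for the
# asynchronous API call and returns (ok, message).

def patrol(ui, send_request):
    """Replace the patrol link with a spinner, call the API, and
    update the UI depending on the outcome."""
    ui["spinner"] = True
    ui["link"] = False
    ok, message = send_request()
    ui["spinner"] = False
    if ok:
        # Success: all patrol links on the page disappear.
        ui["links_removed"] = True
    else:
        # Error: restore the link so the user can retry.
        ui["link"] = True
    ui["notification"] = message
    return ui
```

For example, a failed request leaves the link in place so the user can try again, while a successful one removes the links, exactly as described above.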
Thank you in advance
Hi folks,
I know some of you are tired of these wm-bot messages :) but I didn't send
one regarding today's maintenance and now I feel sorry for that.
I did a big maintenance today, replacing the old monolithic core with a
new lightweight core whose features are bundled as dynamic modules
(which makes it much easier to patch and maintain), but unfortunately
things didn't go as well as I expected. This resulted in about 30 minutes
of outage, approximately from 16:20 GMT until 17:00 GMT, which means that
logs from all publicly logged channels may be missing for that period.
It's no problem for me to insert these logs by hand if you need; just
send me the missing part with GMT times and I will do that.
Also, because things are still not fully recovered, you can expect some
additional issues, especially in the relay of WMF wikis' RC feeds to
freenode (note that the WMF RC feed at irc.wikimedia.org is fine; I am
talking only about the relay to freenode provided by wm-bot) and in HTML
dumps, statistics and exports (these are provided by two modules, both of
which currently produce a lot of errors in the log).
I apologize for any inconvenience caused by this unannounced maintenance,
and if you don't mind, next time I will notify this list in advance if I
am about to do something large that could make the bot unavailable for a
longer time. (In fact I was hoping for a quick replacement with no
problems, but that never happens :D)
Thanks, Petr
Hello all,
Is it possible to pause and resume a MediaWiki edit?
To explain, I've written a MW extension that accesses an external
database; this database requires OAuth authentication [1.0, pre-OAuth
wars version], which is a three-step process requiring the user to be
redirected to an external site to allow the extension access to the
external db. If the MW extension already has an access token for the
extDb, all is well. However, if there isn't a token, there is a
problem. This is a tag extension, and is triggered by finding a
certain XML tag in the wiki page, which typically occurs in the
'preview' or 'submit' of an edit, e.g.
http://server.com/wiki/index.php?title=Bibliography&action=submit (the
parser hook is ParserFirstCallInit). The callback URL constructed by
the OAuth code returns you to the page you were editing, but in its
pre-edit state: i.e. you lose all your edits.
How can I resume the edit and not lose my edit data?
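One possible workaround, sketched below as a toy model rather than actual MediaWiki or extension code (the session store, function names and URL layout are all hypothetical): stash the unsaved wikitext in the user's session before redirecting to the OAuth provider, and restore it when the callback brings the user back to the edit form.

```python
# Hypothetical sketch of "pausing" an edit: save the pending wikitext
# before the OAuth redirect, and pop it again on the callback so the
# edit form can be re-populated. `session` stands in for the server-side
# session store (e.g. PHP's $_SESSION in a real extension).

from urllib.parse import urlencode

def begin_oauth(session, page_title, edit_text, oauth_authorize_url):
    """Stash the pending edit, then return the authorization URL
    (with a callback pointing back at the edit form) to redirect to."""
    session["pending_edit"] = {"title": page_title, "text": edit_text}
    callback = f"/wiki/index.php?title={page_title}&action=edit"
    return oauth_authorize_url + "?" + urlencode({"oauth_callback": callback})

def on_oauth_callback(session):
    """Pop the stashed edit so the form can be re-filled with it."""
    return session.pop("pending_edit", None)
```

The key design point is that the edit text never leaves the server: only a reference to the pending edit survives the round-trip through the external OAuth provider.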
Thanks for any help, clues, or workarounds!
--
Amelia Ireland
GMOD Community Support || http://gmod.org
Hi all,
Wikidata aims to centralize structured data from the Wikipedias in
one central wiki, starting with the language links. The main technical
challenge that we will face is to implement the data flow on the WMF
infrastructure efficiently. We invite peer-review on our design.
I am trying to give a simplified overview here. The full description
is on-wiki: <http://meta.wikimedia.org/wiki/Wikidata/Notes/Change_propagation>
There are a number of design choices. Here is our current thinking:
* Every change on the language links in Wikidata is stored in the
wb_changes table on Wikidata
* A script (or several, depending on load), run per wiki cluster, checks
wb_changes, gets a batch of changes it has not seen yet, and
creates jobs for all affected pages on all wikis in the
given cluster
* When the jobs are executed, the respective page is re-rendered and
the local recentchanges table is populated
* For re-rendering the page, the wiki needs access to the data.
We are not sure how to do this best: have it per cluster,
or in one place only?
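The polling-and-dispatch step above can be sketched as a toy model (in-memory stand-ins for the wb_changes table and the job queue; the names and batching logic are illustrative, not the actual Wikibase code):

```python
# Toy model of the dispatcher described above: fetch wb_changes entries
# newer than the last id this cluster has seen, and queue one re-render
# job per affected local page.

def dispatch(wb_changes, last_seen_id, pages_using, job_queue):
    """Queue jobs for a batch of unseen changes; return the new
    last-seen change id so the next run can continue from there."""
    batch = [c for c in wb_changes if c["id"] > last_seen_id]
    for change in batch:
        # One job per local page that uses the changed item,
        # across all wikis on this cluster.
        for wiki, page in pages_using.get(change["item"], []):
            job_queue.append({"wiki": wiki, "page": page,
                              "change_id": change["id"]})
    return max((c["id"] for c in batch), default=last_seen_id)
```

Tracking only a per-cluster "last seen id" keeps the dispatcher stateless apart from one counter, which is what makes running one script per cluster cheap.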
We appreciate comments. A lot. This thing is make-or-break for the
whole project, and it is getting kinda urgent.
Cheers,
Denny
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
Thank you, Erik. Before (or rather than) commenting, I have a single
question below; the rest of the email is just a premise+addendum to it. ;-)
Terry Chay, 07/11/2012 21:04:
>> You aren't the only one. It turns out we use a lot of industry terminology, without realizing that we are poorly communicating what that means to most people. [...]
>> First of all, this will help greatly to the others (you already read it): <http://wikimediafoundation.org/wiki/Staff_and_contractors>.
Thanks for your explanation but personally I'm more confused than before
about the difference between Engineering and Product, also because the
terminology didn't appear internally consistent. :-)
So, to keep it simple, that page has:
2 Engineering and Product Development
2.1 Platform
2.2 Features
2.3 Technical Operations
2.4 Mobile and Special Projects
2.5 Language
2.6 Product
and as a first approximation "Product" would be something like 2.2+2.6 and
"Engineering" something like 2.1+2.3, with 2.4 and 2.5 set aside?
>> [...] On the "Engineering" side, there exists an amalgam of specific focused groups with their own directors. The focused groups are: Language (formerly "i18n and Experimentation"; internationalization/localization/globalization is a cross-cutting concern), and Mobile (formerly "Mobile and Special Projects": the mobile web, the mobile app, also including Wikipedia Zero). The "area" focused ones are: Operations (keeping the lights on), Platform (keeping the code working) and Features (ostensibly new features). [...]
What you call the Engineering side here could, at first glance, seem
like product development, and in fact those two "focused groups"
currently have some members who are under 2.6 (Product). Surely the same
happens for the other areas you mentioned.
Which brings me to my question.
Erik Moeller, 06/11/2012 04:03:
> A split dept structure wouldn’t affect the way we assemble teams --
> we’d still pull from required functions (devs, product, UI/UX, etc.),
> and teams would continue to pursue their objectives fairly
> autonomously.
Could you please elaborate on this?
"The [current] way we assemble teams" is very obscure to me.
Will the members of each team become more or less scattered among
different reporting lines than they currently are?
For instance, if I understand correctly, what Terry called the
Engineering side is distinguished by being "used" by teams in other
areas/departments for "cross-cutting concerns", in addition to having
some product-development-like tasks? Will the mixed functions that
individual people/teams have become more or less clear after the split
into two departments?
Thanks,
Nemo
https://bugzilla.wikimedia.org/userprefs.cgi?tab=email
You can tweak your Bugzilla settings so that, for example, if you're
just cc'd on a bug, you don't get email every time the keywords field
changes, the cc list changes, etc. If you believe you get too much
bugmail, this is a good place to remove some. :)
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
Hi all,
Wikidata is planned as a multilingual resource, and we are using ULS
for switching languages. ULS is pretty cool; if you have not tried it
out yet, you definitely should.
ULS works great if you are a logged in user.
ULS on Wikidata does not work so well right now if you are not logged
in. Due to caching issues, the system sometimes switches your language
randomly.
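The underlying problem is that anonymous users share a page cache, so a page rendered in one user's language can be served to someone who selected another. A toy illustration (not the actual ULS or caching code; the fix shown, varying the cache key on the language, is one of the approaches under discussion):

```python
# Toy illustration of the anonymous-user caching problem: if the cache
# key is the URL alone, the first render "wins" and later visitors get
# that language back regardless of what they selected.

def get_page(cache, url, user_lang, render):
    """Broken: cache keyed by URL only."""
    if url not in cache:
        cache[url] = render(user_lang)
    return cache[url]          # may be in the wrong language!

def get_page_varied(cache, url, user_lang, render):
    """Fixed: vary the cache key on the user's language."""
    key = (url, user_lang)
    if key not in cache:
        cache[key] = render(user_lang)
    return cache[key]
```

The catch for the Wikipedias is cost: varying on language multiplies the number of cached copies per page, which is why it is unclear whether this approach scales beyond Wikidata.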
There is a bug report with quite a bit of discussion going on:
<https://bugzilla.wikimedia.org/show_bug.cgi?id=41451>
There is a changeset to ULS providing one solution:
<https://gerrit.wikimedia.org/r/#/c/32030/>
There seems to be agreement that this solution could fix the issue
for Wikidata for now, but it is unclear whether it will scale to the
Wikipedias. This is a call for input and attention from anyone who
can help resolve the issue.
So far huge thanks to Daniel K., Mark, Niklas, Katie, Siebrand, Faidon
for pushing for solutions.
Cheers,
Denny
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
I'm happy to announce the availability of the first stable release
of the new MediaWiki 1.20 release series.
MediaWiki 1.20 is a large release that contains many new features and
bug fixes. This is a summary of the major changes of interest to users.
You can consult the RELEASE-NOTES-1.20 file for the full list of changes
in this version.
Our thanks go to everyone who helped to improve MediaWiki by testing
the beta release and submitting bug reports.
== What's new? ==
MediaWiki 1.20 brings the usual host of various bugfixes and new features.
* Minimum PHP version is now 5.3.2.
* New diff view with greatly improved clarity, especially for
whitespace and other small changes, and better readability for
color-blind users.
* New special page Special:MostInterwikis.
* New magic word {{PAGEID}} which gives the current page ID.
* The info action has been reimplemented.
Internationalization:
* New languages supported: Emilian (egl), Tornedalen Finnish (fit),
Mizo (lus), Santali (sat), Turoyo (tru)
* New Cyrillic-Latin language converter for Uzbek (uz)
== What's next? ==
=== Next Release ===
Since the Wikimedia Foundation has successfully switched to a biweekly
release cycle for their sites, making releases of MediaWiki available
on a more regular basis makes sense. As of this release, we plan to
release a new version of MediaWiki every six months. This means that
version 1.21 of MediaWiki will be released in April or May 2013.
=== Long Term Support ===
We're working closely with Linux distributors to make sure that the
MediaWiki bundled in Linux is something that we feel more comfortable
supporting. In this vein, MediaWiki 1.19 is being targeted for "long
term support". Since Debian (the Linux distribution with the longest
release cycle) has a two year cycle between each freeze and we've
gotten MediaWiki 1.19 into Wheezy, we'll support MW 1.19 for the next
two years. (Thank you especially to MediaWiki developer Platonides
for his help in working with the Debian developers.)
Full release notes:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob;f=RELEASE…
https://www.mediawiki.org/wiki/Release_notes/1.20
Frequently asked questions about upgrading:
http://www.mediawiki.org/wiki/Manual:FAQ#Upgrading
**********************************************************************
Download:
http://download.wikimedia.org/mediawiki/1.20/mediawiki-1.20.0.tar.gz
GPG signatures:
http://download.wikimedia.org/mediawiki/1.20/mediawiki-1.20.0.tar.gz.sig
Public keys:
https://secure.wikimedia.org/keys.html
- --
http://hexmode.com/
Any time you have "one overriding idea", and push your idea as a
superior ideology, you're going to be wrong. ... The fact is,
reality is complicated -- Linus Torvalds <http://hexm.de/mc>
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.11 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://www.enigmail.net/
iD8DBQFQmaW3c17xCi38v/URAgToAJ0SVYePd+BxCNWToV8LEc5lpcVrrQCeKKXR
GTKxNnWaQIoLoTUoE7BvuOo=
=wR0s
-----END PGP SIGNATURE-----
Hello all
I am Harsh Kothari from Gujarat, India, and I want to contribute. I am most active on Gujarati Wikipedia, where I have more than 3,000 edits, and I have also made some wikibot scripts for doing various work there. I want to fix bugs and create new scripts for Gujarati as well as English. I am experienced in JavaScript and Python and have good knowledge of PHP, JSON, XML, etc. So please guide me: how can I start, and how can I work on bugs?
Thanks in advance
Harsh
---
Harsh Kothari
Research Fellow,
Physical Research Laboratory(PRL).
Ahmedabad.