Hey,
I've seen a lot of changes in the token system, but how it works is still a
mystery to me. So far I've understood it as: you no longer get a revert
token for each edit using intoken, but instead get some kind of global
rollback token using action=query&meta=tokens.
Is this how it's supposed to work? If so, there is a bug in MediaWiki,
because the token returned by this API is randomly invalid: sometimes it
works, and sometimes it returns badtoken, which can only be fixed by either
logging out and back in, or by using the deprecated intoken parameter,
which returns a working token (but is deprecated).
If this is not how it's supposed to work, can someone explain to me how it
does work? Thanks
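For reference, here is a minimal sketch of the two ways of fetching a rollback token, just building the request URLs with Python's standard library (the parameters match the documented action API; no session handling is shown):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # any MediaWiki 1.24+ wiki

# New way: one session-wide rollback token, fetched via meta=tokens.
new_style = urlencode({
    "action": "query",
    "meta": "tokens",
    "type": "rollback",
    "format": "json",
})

# Deprecated way: a per-page token via prop=info&intoken=rollback
# (this emits the "intoken parameter has been deprecated" warning
# seen in the logs below).
old_style = urlencode({
    "action": "query",
    "prop": "info",
    "intoken": "rollback",
    "titles": "Tourism in Jammu and Kashmir",
    "format": "json",
})

print(API + "?" + new_style)
print(API + "?" + old_style)
```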
Here are some logs (read from bottom to top):
Thu Nov 6 13:57:57 2014 DEBUG[1]: Failed to deliver message to 122.176.67.4
Thu Nov 6 13:57:57 2014 Did not revert Tourism in Jammu and Kashmir:
ERROR: Cannot rollback, token
0c59203d7b4d176f28540646c82e264a545b7039+\ is not valid for some
reason (mediawiki bug), please try it once more
Thu Nov 6 13:57:57 2014 DEBUG[1]: Query failed: badtoken details: See
https://en.wikipedia.org/w/api.php for API usage
Thu Nov 6 13:57:57 2014 DEBUG[1]: Rolling back Tourism in Jammu and Kashmir
Thu Nov 6 13:57:57 2014 WARNING: API query (info): The intoken
parameter has been deprecated.
Thu Nov 6 13:57:57 2014 DEBUG[1]: Sending message to user 122.176.67.4
Thu Nov 6 13:57:29 2014 DEBUG[1]: enwiki mediawiki 1.25wmf6
Thu Nov 6 13:57:29 2014 DEBUG[2]: Token for enwiki rollback
0c59203d7b4d176f28540646c82e264a545b7039+\
Hi,
CSSJanus is now[1] included in MediaWiki core through composer, and
maintained as an external library on Github[2].
Running "composer update" will download and autoload the library; if you're
using the mediawiki/vendor repository, a simple "git pull" will work.
-- Legoktm
[1] https://gerrit.wikimedia.org/r/170107
[2] https://github.com/cssjanus/php-cssjanus
There's a long-standing request for Scribunto to provide library functions
for JSON encoding and decoding.
The advantage of this would be improved interoperability with the growing
number of extensions that use JSON (e.g. JsonConfig, Graph).
The disadvantages include:
* People may store data in JSON blobs that must be parsed where a module
using mw.loadData would be more appropriate.
* People may write templates that attempt to bypass the normal MediaWiki
parameter handling mechanism in favor of passing a JSON blob, which would
likely lead to page wikitext that is harder for end users to understand.
So, let's discuss it: do the advantages outweigh the potential
disadvantages? Are there additional advantages or disadvantages not yet
mentioned?
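For illustration, this is the kind of round-trip such library functions would provide, sketched here in Python with the standard json module (the actual Scribunto interface, function names, and semantics are all still to be decided):

```python
import json

# Hypothetical interop case: an extension like Graph stores its
# configuration as a JSON blob that a module would want to inspect.
blob = '{"width": 400, "height": 300, "data": [1, 2, 3]}'

config = json.loads(blob)   # decode: JSON text -> structured data
config["width"] = 800       # manipulate it as ordinary values
out = json.dumps(config)    # encode: structured data -> JSON text

print(out)
```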
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Hi, WikiGrok[0] has been successfully deployed to the English Wikipedia.
Along with it, a first campaign[1] was deployed, and the database is now
being slowly populated with suggestions:
MariaDB [enwiki_p]> select page_title, pp_value from page, page_props where
pp_page=page_id and pp_propname='wikigrok_questions_v1' limit 100;
+-------------------------+------------------------------------------------------------------------------------------------------+
| page_title              | pp_value                                                                                             |
+-------------------------+------------------------------------------------------------------------------------------------------+
| Richard_Branson         | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| Tina_Fey                | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| Jon_Stewart             | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| Bill_Maher              | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| Jeff_Foxworthy          | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| Evadne_Price            | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| Dominic_Guard           | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| Dilsa_Demirbag_Sten     | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| J._Douglas_MacMillan    | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| Carol_Bowman            | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| Lianella_Carell         | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| G._K._Reddy             | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| Liù_Bosisio             | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
| Matilde_Rodríguez_Cabo  | a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";s:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}} |
+-------------------------+------------------------------------------------------------------------------------------------------+
14 rows in set (0.42 sec)
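The pp_value column is a PHP-serialized array. As a rough illustration (a quick regex extraction, not a full PHP unserializer), the property and question IDs can be pulled out of such a blob with Python like this:

```python
import re

blob = ('a:1:{s:6:"author";a:2:{s:8:"property";s:4:"P106";'
        's:9:"questions";a:1:{s:7:"Q482980";s:6:"author";}}}')

# PHP strings serialize as s:<len>:"<value>"; grab every string value.
strings = re.findall(r's:\d+:"([^"]*)"', blob)
print(strings)  # ['author', 'property', 'P106', 'questions', 'Q482980', 'author']

prop = strings[strings.index("property") + 1]       # 'P106'
question = strings[strings.index("questions") + 1]  # 'Q482980'
```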
Pages get updated when edited (a null edit works, but action=purge does
not). According to estimates made with WikiData Query[2], the number of
potentially affected pages is approximately 33,000. If really needed, we
could whip up a script to null-edit these pages from the server side in a
controlled manner, but I would like to have more data on performance and
memory consumption first.
== Monitoring ==
* Graphite: MediaWiki -> WikiGrok
* Exceptions from WikiData: type:mobile in Logstash.
== Firefighting ==
Most of the potentially performance-scary or error-causing code can be
disabled by commenting out $wgWikiGrokSlowCampaigns in
wmf-config/mobile.php. If the shit really hits the fan, the whole extension
can be disabled through the usual means, with $wmgUseWikiGrok.
== Next steps ==
I'm working on DB storage for questions[3], which will let us avoid abusing
page_props and enable features such as "find me pages that could use this
type of fix" and "find me a random page to fix".
----
[0] https://www.mediawiki.org/wiki/Extension:WikiGrok
[1] https://gerrit.wikimedia.org/r/#/c/170453/
[2] http://wdq.wmflabs.org/
[3] https://gerrit.wikimedia.org/r/170263
--
Best regards,
Max Semenik ([[User:MaxSem]])
Hi all!
It seems that during the current efforts to modernize the API, the default
language (as represented by $wgLang) was changed to always be the content
language [1][2] instead of the user language. This may not be a problem for
most of the core modules, but it broke some essential Wikibase API modules
(which relied on the actual user language being the default). We propose to
change the default behavior back to what it was [3].
I think however that it would be good to first discuss how exactly we WANT the
API to behave.
First off, I'd like to point out that there are two distinct use cases for API
calls, or rather, two contexts in which API output is shown to a user:
1) Bots. Error messages are shown on the command line. The bot is potentially
interacting with multiple wikis at once.
2) UI. The web interface uses the API to implement user interactions.
Feedback/error messages are shown to the user directly. In some cases (notably
for Wikibase), the language is also relevant for parsing user input and
formatting output (think dates, numbers).
The "API help" use case is a bit in between the two, but since it's meant for
bot authors, I'd count it under (1); Help shown on Special:ApiSandbox, however,
would count under (2).
Now, let's look at the three options we have for the default language in the API:
a) The user language according to user preferences. This used to be the case
(technically), it's what wikibase relies on, and it's what [3] reverts to.
b) Hardcoded to "en". This was the de-facto standard for error/help messages,
since API messages are/were hard-coded in English.
c) The content language. This is what [1] did.
Option (a) used to be the status quo, and seems intuitive: if I set a default
language, I want that default language used everywhere. The API is an
integral part of the UI and should behave as such, so it's right for (2).
There is something to be said for option (b) for use case (1): if I run a
bot on 100 wikis, I don't want to go and set the user language on all the
wikis. Getting error messages in English seems a good compromise (assuming
people who can program need to understand at least basic English anyway),
and it's what people are used to.
I personally see no advantage in option (c): for use case (1), it means I
have to put the uselang parameter in ALL API requests, otherwise I'll see
errors in the local language of each of the 100 wikis. For use case (2), it
means I get output in the content language instead of my user language;
this is just inconsistent, and particularly annoying on multilingual wikis
like Commons.
The current behavior, option (c), was introduced in the context of localizing
the API self-documentation page. If you consider that to be "content", it makes
a certain amount of sense, but it breaks a ton of other use cases. Was this even
intentional?
I recommend returning to option (a). We can still cover use case (1) by
keeping the English message in its current place in the response, and adding
the localized message under a different key (and perhaps making it
optional). But keep in mind that errors are not the only locale-sensitive
output of the API: some API modules generate HTML for display to the user.
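For use case (1), pinning the language per request is already possible with the uselang parameter; a bot could simply add it to every call. A sketch in Python, only building the query string (the helper name is made up for illustration):

```python
from urllib.parse import urlencode

def api_params(lang="en", **params):
    """Build an API query string that forces a fixed message language,
    regardless of the wiki's content or user language."""
    params.setdefault("format", "json")
    params["uselang"] = lang
    return urlencode(sorted(params.items()))

# The same request sent to any of the 100 wikis now yields
# error messages in English:
qs = api_params(action="edit", title="Sandbox", text="Hello")
print(qs)
```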
What do you think?...
[1] https://gerrit.wikimedia.org/r/#/c/160798/
[2] The goal of the change in question was apparently to support the uselang
parameter for API modules, which is a good thing.
[3] https://gerrit.wikimedia.org/r/#/c/170895/
[Redirecting conversation about VisualEditor to wikitech-l; wikitext-l is
for parser and Parsoid discussion.]
On 3 November 2014 05:18, Andre Klapper <aklapper(a)wikimedia.org> wrote:
> Hi Parsoid & VisualEditor crew,
>
> Google Code-In (GCI) will soon take place again - a contest for 13-17
> year old students to contribute to free software projects.
>
> Wikimedia wants to take part again.
> Last year's GCI results were surprisingly good - see
> https://www.mediawiki.org/wiki/Google_Code-in_2013
>
[Snip]
> * Open VisualEditor tickets created in the last six months (if I got
> your products and components right):
>
> https://bugzilla.wikimedia.org/buglist.cgi?bug_status=UNCONFIRMED&bug_statu…
Corrected link (in front-end we use ASSIGNED to mean "accepted"):
https://bugzilla.wikimedia.org/buglist.cgi?bug_status=UNCONFIRMED&bug_statu…
> * Zero existing VisualEditor "easy" tickets (are they still valid? Are
> they really self-contained, non-controversial issues with a clear
> approach? Could some of them be GCI tasks that you would mentor? If so,
> please tag them as described above!):
>
> https://bugzilla.wikimedia.org/buglist.cgi?bug_status=UNCONFIRMED&bug_statu…
Corrected link, per above:
https://bugzilla.wikimedia.org/buglist.cgi?bug_status=UNCONFIRMED&bug_statu…
...but there are still no such bugs.
> Could you imagine mentoring some of these tasks?
>
Unfortunately, I think the VisualEditor world is too complicated at this
point to break off simple bugs without a lot more documentation (and this
is reflected in the lack of bugs tagged as "easy"). We've talked about
writing up a "crash course" to explain how things work, updating
https://www.mediawiki.org/wiki/VisualEditor/Design/Software_overview and
the like, but we're too far from that to be able to commit to GCI for this
year, sorry.
J.
--
James D. Forrester
Product Manager, Editing
Wikimedia Foundation, Inc.
jforrester(a)wikimedia.org | @jdforrester
Please join us for the following tech talk:
*Tech Talk:* The MediaWiki Content Translation Extension
*Presenter:* Joel Sahleen, Software Engineer, Wikimedia Language Engineering
*Date:* November 3rd
*Time:* 1800 UTC
<http://www.timeanddate.com/worldclock/fixedtime.html?msg=Tech+Talk%3A+Desig…>
Link to live YouTube stream <http://www.youtube.com/watch?v=x7f5Zfvit5E>
*IRC channel for questions/discussion:* #wikimedia-office
Google+ page, another place for questions:
<https://plus.google.com/u/0/b/103470172168784626509/events/ckomd4k0uh1fbraf…>
*Talk description:*
Wikipedia is the largest online encyclopedia in the world. With more than
32 million articles in over 280 languages, Wikipedia's network of
language-specific MediaWiki sites has become an important global resource
for the preservation and distribution of knowledge. If you look at how
Wikipedia's content is distributed across these language-specific sites,
however, you find that much of it is only available in a few “major”
languages, with English having by far the most content. At the same time,
there is also content found only in smaller Wikipedia language editions
that really deserves a wider audience. In order for different language
communities to be able to share content with each other, we need to provide
their multilingual users with a quick and easy way to translate content
found in one Wikipedia language edition and then publish the result in
another Wikipedia language edition. This is where the MediaWiki Content
Translation extension comes in.
The MediaWiki Content Translation (CX) extension provides a community’s
multilingual users with an array of translation tools they can use to
bootstrap the creation of new articles in their target language from
existing articles in related source languages. These tools are powered by a
set of integrated translation services that are potentially adaptable to
other Wikimedia projects. The extension is currently being made available
as a beta feature for a limited set of test languages that includes
Spanish, Catalan and Portuguese. More languages are planned to be added in
the future based on community interest, impact and various technical
criteria.
This talk will give an overview of the MediaWiki Content Translation
extension’s key features and describe its general architecture. The focus
will be on identifying the problems Content Translation is meant to solve,
explaining how we have attempted to solve these problems and discussing the
value the extension can add to Wikipedia and other MediaWiki based projects.