When list=allusers is used with auactiveusers, a property 'recenteditcount'
is returned in the result. In bug 67301[1] it was pointed out that this
property actually counts various other logged actions besides edits, and so
should really be named something like "recentactions".

Gerrit change 130093,[2] merged today, adds the "recentactions" result
property. "recenteditcount" is also returned for backwards compatibility,
but will be removed at some point during the MediaWiki 1.25 development
cycle.
Any clients using this property should be updated to use the new property
name; a rough sketch of such an update follows the footnotes below. The
new property will be available on WMF wikis with 1.24wmf12; see
https://www.mediawiki.org/wiki/MediaWiki_1.24/Roadmap for the schedule.
[1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=67301
[2]: https://gerrit.wikimedia.org/r/#/c/130093/
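
Below is a minimal sketch of such a client-side update (not from the
original announcement), assuming Python with the requests library; the
wiki URL and aulimit are illustrative:

import requests

# Sketch: read each active user's recent-action count, preferring the new
# "recentactions" property and falling back to the deprecated
# "recenteditcount" on wikis that don't have the new code yet.
resp = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "list": "allusers",
        "auactiveusers": "",  # restrict the list to recently active users
        "aulimit": "10",
        "format": "json",
    },
).json()

for user in resp["query"]["allusers"]:
    count = user.get("recentactions", user.get("recenteditcount"))
    print(user["name"], count)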
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Hi,

I find that the result of list=backlinks for page XXX is odd when one of
the backlinks is a redirect to a different page than XXX but also has
content containing a link to XXX.

See for example the backlinks to en:Spin-off:
https://en.wikipedia.org/w/api.php?bltitle=Spin-off&continue=&action=query&…

Among the backlinks, there is for example "User talk:SummerPhD", along with
all the pages linking to that talk page:
- Its content contains a link to the Spin-off page, so it's normal to see
it in the list.
- It's also a redirect (bad practice for a page with content), so all
backlinks to this talk page are listed as well, even though they have no
relation at all to "Spin-off".

Is this behaviour normal?

Nico
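
(A note on reproducing this, not from the original thread: whether
indirect backlinks appear at all depends on whether the query sets the
blredirect option -- the query string above is truncated, so that's an
assumption. A minimal sketch in Python with the requests library that
lists only direct backlinks by omitting blredirect:)

import requests

# Sketch: list direct backlinks to a page. Without the "blredirect"
# option, pages that only reach the target through a redirect should not
# be expanded into the result.
resp = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "list": "backlinks",
        "bltitle": "Spin-off",
        "bllimit": "50",
        "format": "json",
    },
).json()

for link in resp["query"]["backlinks"]:
    print(link["title"])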
Good morning. I would like to know whether there is any way to bulk-load
data that would let me upload my pages without going through the wiki
editor, or some more efficient way, external to the MediaWiki editor, to
update the content of my wiki.

Thanks in advance.
2016-05-07 7:00 GMT-05:00 <mediawiki-api-request@lists.wikimedia.org>:
> Today's Topics:
>
> 1. Re: “rvdiffto=prev” problem, why notcatched?
> (Brad Jorsch (Anomie))
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Fri, 6 May 2016 10:05:40 -0400
> From: "Brad Jorsch (Anomie)" <bjorsch(a)wikimedia.org>
> To: "MediaWiki API announcements & discussion"
> <mediawiki-api(a)lists.wikimedia.org>
> Subject: Re: [Mediawiki-api] “rvdiffto=prev” problem, why
> notcatched?
> Message-ID:
> <
> CAEepRStYYHPT4n-Lv+rdK_bv0_fzdodCgBRNGQnNmgMn37ycLQ(a)mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> On Fri, May 6, 2016 at 3:21 AM, Shiyue Zhang <byryuer@gmail.com> wrote:
>
> > But the results only return the "diff" for the first two revisions; the
> > others are "notcached":
> >
> [...]
>
> > Why? If someone knows, please reply; I really need your help!
> > Thanks a lot!
> >
>
> Because generating diffs is expensive, the API only allows a limited
> number of not-already-cached diffs to be generated per request. The
> solution is to request diffs for fewer old revisions per request (e.g.
> lower your rvlimit).
>
> --
> Brad Jorsch (Anomie)
> Senior Software Engineer
> Wikimedia Foundation
>
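
(A minimal illustration of the advice above, not part of the original
reply, assuming Python with the requests library; the page title is a
placeholder:)

import requests

# Sketch: walk a page's history in small batches so each request asks the
# server to generate only a few uncached diffs.
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Example",
    "rvprop": "ids|timestamp|comment",
    "rvdiffto": "prev",
    "rvlimit": "5",  # small batches: fewer uncached diffs per request
    "format": "json",
    "continue": "",
}

session = requests.Session()
while True:
    resp = session.get("https://en.wikipedia.org/w/api.php", params=params).json()
    for page in resp["query"]["pages"].values():
        for rev in page.get("revisions", []):
            if "notcached" in rev.get("diff", {}):
                print(rev["revid"], "not cached yet; retry this batch later")
            else:
                print(rev["revid"], "diff generated")
    if "continue" not in resp:
        break
    params.update(resp["continue"])  # carry continuation tokens forward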
TL;DR:
----
* All access to Wikimedia production sites/APIs should use https://
URLs, not http:// -- your bot/tool will break in the near future if it
does not!
* 2016-06-12 - insecure access is unsupported; starting on this date
we plan to break (deny with 403) 10% of all insecure requests randomly
as a wake-up call.
* 2016-07-12 - we plan to break all insecure requests.
----
Hi all,
As you may remember, all production Wikimedia wikis switched to
HTTPS-only for all canonical domain names nearly a year ago:
https://blog.wikimedia.org/2015/06/12/securing-wikimedia-sites-with-https/
Since way back then, we've been forcing insecure HTTP requests to our
canonical domains over to HTTPS by using redirects and
Strict-Transport-Security, which is effective for the vast majority of
access from humans using browsers and apps.
In the time since, we've been chasing down various corner-case issues
where loopholes may arise in our HTTPS standards and enforcement. One
of the most difficult loopholes to close has been the "Insecure POST"
loophole, which is discussed in our ticket system here:
https://phabricator.wikimedia.org/T105794
To briefly recap the "Insecure POST" issue:
* Most of our humans using browser UAs are not affected by it. They
start out doing GET traffic to our sites, their GETs get redirected to
HTTPS if necessary, and then any POSTs issued by their browser use
protocol-relative URIs which are also HTTPS.
* However, many automated/code UAs (bots, tools, etc) access the APIs
using initial POST requests to hardcoded service URLs using HTTP
(rather than HTTPS).
* For all of the code/library UAs out there in the world, there is no
universally-compatible way to redirect them to HTTPS. There are
different ways that work for some UAs, but many UAs used for APIs
don't handle redirects at all.
* Regardless of the above, even if we could reliably redirect POST
requests, that doesn't fix the security problem like it does with GET.
The private data has already been leaked in the initial insecure
request before we have a chance to redirect it. If we did some kind
of redirect first, we'd still just be putting off the inevitable
future date where we have to go through a breaking transition to
secure the data.
Basically, we're left with no good way to upgrade these insecure
requests without breaking them. The only way this gets fixed is if all
of our API clients in the world use explicit https:// URLs for
Wikimedia sites in all of their code and configuration, and the only
way we can really force them to do so is to break insecure POST
requests from tools that don't, by returning a 403 error.
Back in July 2015, I began making some efforts to statistically sample
the User-Agent fields of clients doing "Insecure POST" and to track
down the most-prominent offenders. We've been able to find and fix many
clients along the way since then.
A few months ago Bryan Davis got us further when he committed a
MediaWiki core change to let our sites directly warn offending
clients. I believe that went live on Jan 29th of this year (
https://gerrit.wikimedia.org/r/#/c/266958 ). It allows insecure POSTs
to still succeed, but sends the clients a standard warning that says
"HTTP used when HTTPS was expected".
This actually broke some older clients that weren't prepared to handle
warnings at all, and caused several clients to upgrade. We've been
logging offending UAs and accounts which trigger the warning via
EventLogging since then, but after the initial impact the rate
flattened out again; clients and/or users that didn't notice the
warning fairly quickly likely never will.
Many of the remaining UAs we see in logs are simply un-updated. For
example, https://github.com/mwclient/mwclient switched to
HTTPS-by-default in 0.8.0, released in early January, but we're still
getting lots of insecure POST traffic from older mwclient versions
installed out there in the world. Even in cases where the code is up to
date and supports HTTPS properly, bot/tool configurations may still
have hardcoded http:// site config URLs.
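
(A sketch of the corresponding one-line defensive fix on the client side,
not from the original announcement; the API_URL name is hypothetical --
substitute your own tool's configuration key:)

# Sketch: upgrade a hardcoded insecure API URL before any request is made.
API_URL = "http://en.wikipedia.org/w/api.php"  # e.g. a stale configured value

if API_URL.startswith("http://"):
    API_URL = "https://" + API_URL[len("http://"):]

assert API_URL.startswith("https://")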
We're basically out of "soft" ways to finish up this part of the HTTPS
transition, and we've stalled long enough on this.
** 2016-06-12 is the selected support cutoff date **
After this date, insecure HTTP POST requests to our sites are
officially unsupported. This date is:
* A year to the day after the public announcement that our sites are HTTPS only
* ~ 11 months after we began manually tracking down top offenders and
getting them fixed
* ~ 4 months after we began sending warning messages in the response
to all insecure POST requests to the MW APIs
* ~ 1 month after this email itself
On the support cutoff date, we'll begin emitting a "403 Insecure POST
Forbidden - use HTTPS" failure for 10% of all insecure POST traffic
(randomly selected). Some clients will retry around this, and
hopefully the intermittent errors will raise awareness more strongly
than the API warning message and this email did.
A month later (two months out from this email) on 2016-07-12 we plan
to break insecure access completely (all insecure requests get the 403
response).
In the meantime, we'll be trying to track down offending bots/tools
from our logs and to contact owners who haven't seen these
announcements. Our Community team will be helping us communicate this
message more directly to affected bot accounts as well.
Thank you all for your help during this transition!
-- Brandon Black
Sr Operations Engineer
Wikimedia Foundation
We plan to add more RESTBase endpoints to support the new "Explore feeds"
feature in the apps. The currently proposed names are listed in [1]. The
proposal introduces a new top-level hierarchy, called "project".

If you have issues with the current proposal, or ideas to improve it,
please comment on the Phab ticket by Thursday, May 19, 2016.
[1] https://phabricator.wikimedia.org/T132597
Thank you,
Bernd Sitzmann
Android app & Mobile Content Service
Hello,

I have a problem with the API. I want to get the diff between each
revision and its previous revision, so I use the API like this:

"
https://en.wikipedia.org/w/api.php?action=query&format=json&prop=revisions&…
"

But the results only return the "diff" for the first two revisions; the
others are "notcached":
{
"revid": 715123287,
"parentid": 714799929,
"user": "SnoozeKing",
"timestamp": "2016-04-13T21:07:34Z",
"comment": "",
"diff": {
"notcached": ""
}
},
{
"revid": 714799929,
"parentid": 712263801,
"minor": "",
"user": "Smithderek2000",
"timestamp": "2016-04-11T22:43:14Z",
"comment": "Removed unclear pronoun",
"diff": {
"notcached": ""
}
},
Why? If someone knows, please reply; I really need your help!
Thanks a lot!
--
Zhang Shiyue
*Tel*: +86 18801167900
*E-mail*: byryuer@gmail.com, yuer3677@163.com
State Key Laboratory of Networking and Switching Technology
No.10 Xitucheng Road, Haidian District
Beijing University of Posts and Telecommunications
Beijing, China.
Hi there,

I'm wondering if it is possible to use the MediaWiki action API to get the
localized namespace string given the namespace number. For example, given
the namespace number "1" ('Talk'), how do I get the localized namespace
name in Indonesian ('Pembicaraan')?

I'm aware of the existence of "action=query&meta=allmessages", but I
couldn't find the localized message name for namespaces. I also could not
find the mapping from namespace number to message name. So I'm wondering
if there is another way to get what I want.
Thank you very much.
Regards,
Kenrick
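
(A note, not from the original thread: namespace names are not ordinary
interface messages, which is why allmessages doesn't expose them. One way
to get them is meta=siteinfo with siprop=namespaces against a wiki in the
target language -- a sketch in Python with the requests library:)

import requests

# Sketch: fetch the localized namespace table from the Indonesian
# Wikipedia and look up namespace number 1.
resp = requests.get(
    "https://id.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "meta": "siteinfo",
        "siprop": "namespaces",
        "format": "json",
    },
).json()

print(resp["query"]["namespaces"]["1"]["*"])  # expected: "Pembicaraan"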
Hi John,

I have not decided yet whether to do any paging logic in my app (if that's
what you are asking). But I want to get the whole list of images from all
the pages in the query, and then I need to do some filtering.

-Sherry
On Mon, 2 May 2016 at 15:01 <mediawiki-api-request@lists.wikimedia.org>
wrote:
> Today's Topics:
>
> 1. Wrong result (sherry.ummen@gmail.com)
> 2. Re: Wrong result (John)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Sun, 01 May 2016 20:38:10 +0000
> From: "sherry.ummen(a)gmail.com" <sherry.ummen(a)gmail.com>
> To: "mediawiki-api(a)lists.wikimedia.org"
> <mediawiki-api(a)lists.wikimedia.org>
> Subject: [Mediawiki-api] Wrong result
> Message-ID:
> <CABKw4GEpYpT4dw=GdeCw5G=
> SNN7vn2eeGhgtQr48MiCLBvQ2Vg(a)mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> Hello,
>
> If I run the following query:
>
>
> https://en.wikipedia.org/w/api.php?action=query&prop=images&pageids=4529820…
>
> I do not get images for all the pageids. I am not sure what is wrong in
> this query.
>
> If I use a single pageid, I get the expected result.
>
> Please suggest or help!
>
> -Sherry
>
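
(A sketch of the paging logic in question, not from the original thread:
with prop=images and several pageids, each response carries only part of
the result and the rest arrives via API continuation, which is likely what
John's reply referred to. Assuming Python with the requests library; the
pageids below are placeholders for the ones in the truncated query:)

import requests

# Sketch: gather all images for several pages by following API
# continuation until the result set is complete.
params = {
    "action": "query",
    "prop": "images",
    "pageids": "4529820|15580374",  # placeholders
    "imlimit": "max",
    "format": "json",
    "continue": "",
}

images = {}
session = requests.Session()
while True:
    resp = session.get("https://en.wikipedia.org/w/api.php", params=params).json()
    for page in resp["query"]["pages"].values():
        images.setdefault(page["title"], []).extend(
            img["title"] for img in page.get("images", [])
        )
    if "continue" not in resp:
        break
    params.update(resp["continue"])  # carry continuation tokens forward

for title, imgs in sorted(images.items()):
    print(title, len(imgs))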