There is no API call that will just return a list of page names
DebConf18/DayTrip/Registration/A
DebConf18/DayTrip/Registration/B
DebConf18/DayTrip/Registration/C
like one sees in
https://wiki.debconf.org/wiki/Special:PrefixIndex/DebConf18/DayTrip/Registr…
without insisting on adding
pageid="7823" ns="0" to the result
even if one uses prop=info & ininfo=...,
nor one that just prints them in the above plain-text format without
any other junk added, even if one uses format=txt.
The best one can do is:
wget -O z "https://wiki.debconf.org/api.php?action=query&generator=allpages&gapprefix=…"
perl -wnle 'print for m!DebConf18/DayTrip/Registration.+!g;' z
DebConf18/DayTrip/Registration/A
DebConf18/DayTrip/Registration/B
DebConf18/DayTrip/Registration/C
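The post-filtering can equally be done in Python instead of perl. A minimal sketch of stripping the pageid/ns noise from a list=allpages JSON response (the sample response and pageids below are invented for illustration):

```python
import json

def extract_titles(api_response: str) -> list:
    """Pull just the page titles out of a list=allpages JSON response,
    dropping the pageid/ns fields the API insists on including."""
    data = json.loads(api_response)
    return [p["title"] for p in data["query"]["allpages"]]

# Sample response shaped like what list=allpages&apprefix=... returns
# (the pageids here are made up):
sample = json.dumps({
    "query": {"allpages": [
        {"pageid": 7823, "ns": 0, "title": "DebConf18/DayTrip/Registration/A"},
        {"pageid": 7824, "ns": 0, "title": "DebConf18/DayTrip/Registration/B"},
    ]}
})
print("\n".join(extract_titles(sample)))
```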
The API has traditionally ignored values beyond the allowed limit,
returning a warning for this situation since 2008(!). It's long past time
for this error situation to actually raise an error, as requested in
https://phabricator.wikimedia.org/T41936.
This is happening in https://gerrit.wikimedia.org/r/433742. It should be
deployed to Wikimedia wikis with 1.32.0-wmf.6 or later, see
https://www.mediawiki.org/wiki/MediaWiki_1.32/Roadmap for the schedule.
Logs indicate that few clients on Wikimedia wikis are hitting the warning.
You can check your client by seeing if you're receiving a "Too many values
supplied for parameter" warning, or by using Special:ApiFeatureUsage for
your client's user agent and looking for a "too-many-X" code.
If your client is affected, the solution is to divide the values into
batches of the appropriate size. Generally the limit is 50 values for
clients without the apihighlimits right and 500 for clients with that
right. The limits for any particular parameter are documented in the
auto-generated help and are available in machine-readable format via
action=paraminfo.
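The batching described above can be sketched in Python; the 50-value default comes from the paragraph above, and the chunk size should only be raised to 500 for clients that actually hold the apihighlimits right:

```python
def batches(values, limit=50):
    """Split a list of values into chunks no larger than `limit`,
    the default cap for clients without apihighlimits."""
    for i in range(0, len(values), limit):
        yield values[i:i + limit]

titles = ["Page %d" % n for n in range(120)]
# Each chunk can then be joined with "|" for a single API request:
chunks = ["|".join(b) for b in batches(titles)]
print(len(chunks))  # 120 titles -> 3 requests at the 50-value limit
```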
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
Hello maintainers
I was trying to use curl to POST something to an api.php when I got the
following error:
{"error":{"code":"notoken","info":"The \"token\" parameter must be
set.","*":"See https://wiki.octave.org/wiki/api.php for API usage.
Subscribe to the mediawiki-api-announce mailing list at
<https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce> for
notice of API deprecations and breaking changes."}}
My api.php is as follows: https://wiki.octave.org/wiki/api.php
My POSTFIELDs are as follows:
username=myusrname&password=mypswd&rememberMe=1&logintoken=b86c77706c772328b4ece0e5dfde31b25af83fcf%2B%5C&loginreturnurl=http%3A%2F%2Fwiki.octave.org%2Fwiki%2Fapi.php
Note that I'm using libcurl (curl_easy_perform() and the like) for my work.
Please help me see where I'm going wrong. I'd be really thankful.
Looking forward to your reply.
Thanks and Regards
Sahil
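One common cause of a notoken error with parameter names like these (logintoken, loginreturnurl suggest action=clientlogin) is that the action itself never makes it into the POST body, or that the token's trailing "+\" is not percent-encoded. A hedged Python sketch, using the placeholder credentials and token from the message above, showing how urlencode produces a correctly escaped body:

```python
from urllib.parse import urlencode

# Token and credentials are the placeholders from the message above.
# Tokens from the API end in "+\", which must reach the server
# percent-encoded as %2B%5C; urlencode() handles that automatically.
token = "b86c77706c772328b4ece0e5dfde31b25af83fcf+\\"
body = urlencode({
    "action": "clientlogin",   # easy to forget: the action goes in the body too
    "username": "myusrname",
    "password": "mypswd",
    "rememberMe": "1",
    "logintoken": token,
    "loginreturnurl": "http://wiki.octave.org/wiki/api.php",
})
print("%2B%5C" in body)  # the trailing "+\" is escaped
```

Note also that the token must be fetched fresh (via meta=tokens&type=login) in the same cookie session that sends this POST.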
> Date: Tue, 1 May 2018 15:46:20 -0400
> From: "Brad Jorsch (Anomie)" <bjorsch(a)wikimedia.org>
> To: "MediaWiki API announcements & discussion"
> <mediawiki-api(a)lists.wikimedia.org>
> Subject: Re: [Mediawiki-api] MediaWiki API v1.19.7; edit token
> request; empty response
>
>
> On Tue, May 1, 2018 at 3:38 PM, tom schulze <tom.schulze(a)posteo.de> wrote:
>
>
>> however all I get is an empty response.
>>
>> I created a gist
>> <https://gist.github.com/tomschulze/16fcdf8b88f285ab29365e4e558cb5f5>
>> with my python code.
>>
> You're missing parameters in params_edit_token. Try adding 'titles' with at
> least one title (doesn't have to exist) and 'intoken' with value 'edit'.
>
>
That did it! Thanks so much!
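For readers hitting the same wall, Brad's fix can be sketched as follows, assuming the pre-1.24 token scheme he describes (the sample response and token value are invented):

```python
import json

# Parameters for a MediaWiki 1.19-era edit-token request: prop=info
# needs at least one title, and intoken=edit asks for the token.
params_edit_token = {
    "action": "query",
    "prop": "info",
    "intoken": "edit",
    "titles": "Main Page",   # any title; it doesn't have to exist
    "format": "json",
}

def first_edit_token(response_text):
    """Dig the edittoken out of a prop=info&intoken=edit response."""
    pages = json.loads(response_text)["query"]["pages"]
    return next(iter(pages.values()))["edittoken"]

# Sample response shaped like 1.19's output (token value invented):
sample = json.dumps({"query": {"pages": {
    "-1": {"ns": 0, "title": "Main Page", "missing": "",
           "edittoken": "abc123+\\"}
}}})
print(first_edit_token(sample))
```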
Hi everybody,
I am trying to retrieve an edit token from the MediaWiki 1.19 API; however,
all I get is an empty response.
I created a gist
<https://gist.github.com/tomschulze/16fcdf8b88f285ab29365e4e558cb5f5>
with my python code. The script is successfully logging into the wiki
using my admin credentials prior to sending the request for the edit
token. The WRITE_API is enabled.
I read through the API docs and searched on Stack Overflow. However,
I could not find any answers. This
is really bugging me...
What am I doing wrong, why is the server not responding with the edit token?
Kind regards,
Tom
Thanks so much, Brian, for your detailed answer; thanks to you I'm starting to understand!
Viviana
________________________________
From: Mediawiki-api <mediawiki-api-bounces(a)lists.wikimedia.org> on behalf of mediawiki-api-request(a)lists.wikimedia.org <mediawiki-api-request(a)lists.wikimedia.org>
Sent: Sunday, April 29, 2018 14:00
To: mediawiki-api(a)lists.wikimedia.org
Subject: Mediawiki-api Digest, Vol 129, Issue 2
Send Mediawiki-api mailing list submissions to
mediawiki-api(a)lists.wikimedia.org
To subscribe or unsubscribe via the World Wide Web, visit
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api
or, via email, send a message with subject or body 'help' to
mediawiki-api-request(a)lists.wikimedia.org
You can reach the person managing the list at
mediawiki-api-owner(a)lists.wikimedia.org
When replying, please edit your Subject line so it is more specific
than "Re: Contents of Mediawiki-api digest..."
Today's Topics:
1. Wikimedia API redirect 301 (viviana paga)
2. Re: Wikimedia API redirect 301 (bawolff)
----------------------------------------------------------------------
Message: 1
Date: Sat, 28 Apr 2018 14:33:29 +0000
From: viviana paga <viviana.paga(a)hotmail.it>
To: "mediawiki-api(a)lists.wikimedia.org"
<mediawiki-api(a)lists.wikimedia.org>
Subject: [Mediawiki-api] Wikimedia API redirect 301
Message-ID:
<DB6PR06MB3128D1B493601C07A5FF46C5E48C0(a)DB6PR06MB3128.eurprd06.prod.outlook.com>
Content-Type: text/plain; charset="windows-1252"
Hi everyone,
I'm developing an Ajax web service that queries the Wikimedia API, but I'm seeing very strange behavior: sometimes my query works perfectly and sometimes it doesn't work at all (301 internal redirect).
This is the response from the server: X-Cors-Redirect-1: 301 https://commons.wikimedia.…metadata&sroffset=0&callback=?
Could you help me understand why, or what I'm missing in my code?
This is my code:
$.ajaxPrefilter(function (options) {
if (options.crossDomain && jQuery.support.cors) {
const https = (window.location.protocol === 'http:' ? 'http:' : 'https:');
options.url = https + '//cors-anywhere.herokuapp.com/' + options.url;
}
if ( !options.beforeSend) {
options.beforeSend = function (xhr) {
xhr.setRequestHeader('Api-User-Agent', 'OpenArtImages/Beta (http://localhost:8080; viviana.paga(a)hotmail.it; Wikipedia User: Vivsss)');
xhr.setRequestHeader('Content-Type', 'application/json; charset=UTF-8');
xhr.setRequestHeader('Origin', 'http://localhost:8080');
xhr.setRequestHeader('Strict-Transport-Security', 'max-age=106384710; includeSubDomains; preload');
xhr.withCredentials = true;
}
}
});
firstRequest = $.get( 'https://commons.wikipedia.org/w/api.php?origin=*&action=query&list=search&f…:'+inputWord+incategory+'+fileh:>600& &prop=imageinfo|pageids|titles&srnamespace=6&rawcontinue=&srinfo=totalhits|suggestion&srlimit='+limit+'&iiprop=timestamp|user|url|size|sha1|mime|metadata'+offset+'&callback=?',
function (response) {
///// ........ I get Data ......... /////
}
Thank you so much,
Viviana Paga
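One thing worth checking in the snippet above: the request goes to commons.wikipedia.org, while the canonical host is commons.wikimedia.org (the host the X-Cors-Redirect header points at), and a 301 on a cross-origin request is a common way to lose CORS headers. A small sketch, assuming list=search parameters like those in the URL above, that builds the query against the canonical host with urlencode so nothing needs hand-escaping:

```python
from urllib.parse import urlencode

# Canonical host: the 301 in the question redirects commons.wikipedia.org
# here, and redirects are a common way for CORS requests to fail.
API = "https://commons.wikimedia.org/w/api.php"

def search_url(term, limit=10):
    """Build a list=search query URL with properly encoded parameters."""
    params = urlencode({
        "origin": "*",
        "action": "query",
        "format": "json",
        "list": "search",
        "srsearch": term,
        "srnamespace": "6",      # File: namespace
        "srlimit": str(limit),
        "srinfo": "totalhits|suggestion",
    })
    return API + "?" + params

url = search_url("sunflower")   # "sunflower" is an arbitrary example term
print(url.startswith(API))
```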
Cross-post. Please see below.
Thanks!
-Adam
P.S. MWStake will be contacted under separate cover from Cindy.
---------- Forwarded message ----------
From: Edward Galvez <egalvez(a)wikimedia.org>
Date: Wed, Apr 4, 2018 at 5:37 PM
Subject: Wikimedia contributors survey is here: share your feedback
Hi everyone,
The Wikimedia Foundation is asking for your feedback in a survey. We want
to know how well we are supporting your work on and off wiki, and how we
can change or improve things in the future. The opinions you share will
affect the current and future work of the Wikimedia Foundation.
If you are a volunteer developer and have contributed code to any piece of
MediaWiki, gadgets, or tools, please complete the survey. It is available
in various languages and will take between 20 and 40 minutes to complete.
Follow this link to the survey: https://wikimedia.qualtrics.com/jfe/form/SV_5ABs6WwrDHzAeLr?aud=DEV
If you have already seen a similar message on Phabricator, Mediawiki.org,
Discourse, or other platforms for volunteer developers, please don't take
the survey twice.
You can find more information about this survey on the project page
<https://meta.wikimedia.org/wiki/Community_Engagement_Insights/About_CE_>
and see how your feedback helps the Wikimedia Foundation support
contributors like you. This survey is hosted by a third-party service and
governed by this privacy statement
<https://wikimediafoundation.org/wiki/Community_Engagement_Insights_2018_Sur…>.
Please visit our frequently asked questions page
<https://meta.wikimedia.org/wiki/Community_Engagement_Insights/Frequently_as…>
to find more information about this survey.
Feel free to email me directly with any questions you may have.
Thank you!
Edward Galvez
--
Edward Galvez
Evaluation Strategist, Surveys
Learning & Evaluation
Community Engagement
Wikimedia Foundation
Hi!
I know that I'm able to get the "linkshere" prop for multiple pages like
this:
https://en.wikipedia.org/w/api.php?format=json&action=query&prop=linkshere&…
My problem is with the "lhlimit" parameter. I'd like to get 10 results for
each page, but it currently gives me 10 results across ALL of the pages,
which means I have to use "lhcontinue" each time.
Is it possible to get 10 results for each page when asking for multiple
pages?
Or should I make a different call for each article in order to achieve
this?
Thanks,
Tal
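Since lhlimit caps the number of links per request rather than per page, the per-article route the question suggests is indeed the straightforward one. A sketch (the titles are arbitrary examples):

```python
from urllib.parse import urlencode

# lhlimit caps links per *request*, not per page, so one request per
# article is the simplest way to get 10 backlinks for each title.
API = "https://en.wikipedia.org/w/api.php"

def linkshere_url(title, per_page=10):
    """Build a prop=linkshere query URL for a single title."""
    return API + "?" + urlencode({
        "action": "query",
        "format": "json",
        "prop": "linkshere",
        "titles": title,
        "lhlimit": str(per_page),
    })

urls = [linkshere_url(t) for t in ["Albert Einstein", "Physics"]]
print(len(urls))  # one request per article
```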
Hi!
I have a question regarding the random articles generation in the API.
Let's say I'm using this URL to retrieve two random articles:
https://en.wikipedia.org/w/api.php?format=xml&action=query&generator=random…
I got an API response which contains articles A and B in this order:
1. A
2. B
Is there a chance (I know it's very small, but it's just for my
understanding) that I'll get the exact same articles but in the opposite order?
Like this:
1. B
2. A
I'd like to know whether there is a certain order to the random articles in
the API response, or whether that is also random.
Thanks!