Hi everyone,
I'm developing an Ajax web service that queries the Wikimedia API, but I'm seeing very strange behavior: the same query sometimes works perfectly and sometimes fails entirely with a 301 internal redirect.
This is the response from the server: X-Cors-Redirect-1: 301 https://commons.wikimedia.…metadata&sroffset=0&callback=?
Could you help me understand why, or what I'm missing in my code?
This is my code:
$.ajaxPrefilter(function (options) {
    if (options.crossDomain && jQuery.support.cors) {
        const https = (window.location.protocol === 'http:' ? 'http:' : 'https:');
        options.url = https + '//cors-anywhere.herokuapp.com/' + options.url;
    }
    if (!options.beforeSend) {
        options.beforeSend = function (xhr) {
            xhr.setRequestHeader('Api-User-Agent', 'OpenArtImages/Beta (http://localhost:8080; viviana.paga(a)hotmail.it; Wikipedia User: Vivsss)');
            xhr.setRequestHeader('Content-Type', 'application/json; charset=UTF-8');
            xhr.setRequestHeader('Origin', 'http://localhost:8080');
            xhr.setRequestHeader('Strict-Transport-Security', 'max-age=106384710; includeSubDomains; preload');
            xhr.withCredentials = true;
        };
    }
});
firstRequest = $.get(
    'https://commons.wikipedia.org/w/api.php?origin=*&action=query&list=search&f…:' + inputWord + incategory + '+fileh:>600& &prop=imageinfo|pageids|titles&srnamespace=6&rawcontinue=&srinfo=totalhits|suggestion&srlimit=' + limit + '&iiprop=timestamp|user|url|size|sha1|mime|metadata' + offset + '&callback=?',
    function (response) {
        ///// ........ I get the data ......... /////
    }
);
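For context, one likely factor: the request above targets commons.wikipedia.org, which 301-redirects to commons.wikimedia.org (matching the X-Cors-Redirect-1 header shown), and Wikimedia's APIs already answer anonymous cross-origin requests when the query string carries origin=*, with no third-party proxy and no custom Origin header needed. A minimal sketch of building such a request against the canonical host; the search term, limit, and helper name are illustrative, not the poster's code:

```javascript
// Sketch (assumed helper name): build a search URL against the
// canonical commons.wikimedia.org host. With origin=* the API serves
// anonymous cross-origin requests itself, so no cors-anywhere proxy
// and no hand-set Origin header are required.
function buildCommonsSearchUrl(term, limit) {
  const params = new URLSearchParams({
    action: 'query',
    format: 'json',
    origin: '*',                       // anonymous CORS marker
    list: 'search',
    srsearch: term,
    srnamespace: '6',                  // File: namespace
    srinfo: 'totalhits|suggestion',    // as in the original query
    srlimit: String(limit)
  });
  return 'https://commons.wikimedia.org/w/api.php?' + params.toString();
}

console.log(buildCommonsSearchUrl('sunflowers', 10));
```

The resulting URL would then be passed to $.get (or fetch) directly, without the proxy prefilter.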
Thank you so much,
Viviana Paga
Cross-post. Please see below.
Thanks!
-Adam
P.S. MWStake will be contacted under separate cover from Cindy.
---------- Forwarded message ----------
From: Edward Galvez <egalvez(a)wikimedia.org>
Date: Wed, Apr 4, 2018 at 5:37 PM
Subject: Wikimedia contributors survey is here: share your feedback
Hi everyone,
The Wikimedia Foundation is asking for your feedback in a survey. We want
to know how well we are supporting your work on and off wiki, and how we
can change or improve things in the future. The opinions you share will
affect the current and future work of the Wikimedia Foundation.
If you are a volunteer developer and have contributed code to any piece of
MediaWiki, gadgets, or tools, please complete the survey. It is available
in various languages and will take between 20 and 40 minutes to complete.
*Follow this link to the Survey:* https://wikimedia.qualtrics.com/jfe/form/SV_5ABs6WwrDHzAeLr?aud=DEV
If you have already seen a similar message on Phabricator, Mediawiki.org,
Discourse, or other platforms for volunteer developers, please don't take
the survey twice.
You can find more information about this survey on the project page
<https://meta.wikimedia.org/wiki/Community_Engagement_Insights/About_CE_>
and see how your feedback helps the Wikimedia Foundation support
contributors like you. This survey is hosted by a third-party service and
governed by this privacy statement
<https://wikimediafoundation.org/wiki/Community_Engagement_Insights_2018_Sur…>.
Please visit our frequently asked questions page
<https://meta.wikimedia.org/wiki/Community_Engagement_Insights/Frequently_as…>
to find more information about this survey.
Feel free to email me directly with any questions you may have.
Thank you!
Edward Galvez
--
Edward Galvez
Evaluation Strategist, Surveys
Learning & Evaluation
Community Engagement
Wikimedia Foundation
Hi!
I know that I'm able to get the "linkshere" prop for multiple pages like
this:
https://en.wikipedia.org/w/api.php?format=json&action=query&prop=linkshere&…
My problem is with the "lhlimit" parameter. I'd like to get 10 results for
each page, but it currently gives me 10 results across ALL of the pages,
which means I have to use "lhcontinue" each time.
Is it possible to get 10 results for each page when asking for multiple
pages? Or should I make a separate call for each article in order to
achieve this?
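For what it's worth, limits on prop modules like linkshere apply to the request as a whole rather than per input page, so one request per title is the usual workaround. A small sketch of that approach; the helper name and titles are illustrative:

```javascript
// Sketch: issue one linkshere request per title so each page gets its
// own lhlimit quota, instead of sharing one quota across all titles.
function buildLinksHereUrls(titles, perPageLimit) {
  return titles.map(title => {
    const params = new URLSearchParams({
      action: 'query',
      format: 'json',
      prop: 'linkshere',
      titles: title,                  // a single title per request
      lhlimit: String(perPageLimit)   // now effectively per page
    });
    return 'https://en.wikipedia.org/w/api.php?' + params.toString();
  });
}

console.log(buildLinksHereUrls(['Albert Einstein', 'Niels Bohr'], 10));
```

Each URL can then be fetched independently (or in parallel), at the cost of more round trips than a single batched query.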
Thanks,
Tal
Hi!
I have a question regarding the random articles generation in the API.
Let's say I'm using this URL to retrieve two random articles:
https://en.wikipedia.org/w/api.php?format=xml&action=query&generator=random…
Suppose I get an API response that contains articles A and B in this order:
1. A
2. B
Is there a chance (I know it's very small; this is just for my
understanding) that I'll get exactly the same articles but in the opposite order?
Like this:
1. B
2. A
I'd like to know whether there is a definite order to the random articles in
the API response, or whether the order is random as well.
Thanks!
I have posted a similar message on the Wikimedia Developer Support, but
I still can't get my code running, here is my original post:
"I am writing code that uses the MediaWiki web service API to log in to
the system. Based on the online API documentation, I used “action=query”
to get a login token and “action=clientlogin” to post the login request,
but I got an “Invalid CSRF token” error. I am not sure what I missed. Is
there any clientlogin sample code, whether HTML, PHP, or any other format?"
My code is JavaScript running in the browser; let me know if anyone
wants to take a look at my test code.
Regards,
Ren
Hi,
I am trying to get the articles belonging to WikiProject Democratic Republic
of the Congo. I tried both API calls and Quarry (I guess they share the same
data source), but did not get any articles back. Does anyone have a sense of
what's going on with this project in the database? I simply used "Democratic
Republic of the Congo" as the query input. Thank you!
Hi,
I am trying to get the articles that are tagged for a particular project
using the API. My code was working a couple of months ago, but it no
longer does.
Below is the URL request I make; it seems that projectpages is no
longer listed on API:Lists <https://www.mediawiki.org/wiki/API:Lists> either.
I wonder if anyone would know what's going on, or point me to resources?
Thank you!
URL = 'https://en.wikipedia.org/w/api.php?action=query&format=json&list=projectpages&wpplimit=500&wppassessments=1&wppprojects=Military%20history'
There's an inconsistency in the handling of multi-value parameters where if
you pass a value consisting of only whitespace[1] it will be interpreted as
an empty set rather than as a single value consisting of whitespace. For
example, https://www.mediawiki.org/w/api.php?action=query&titles= correctly
handles the query as specifying no titles, but
https://www.mediawiki.org/w/api.php?action=query&titles=%20 is also treated
as specifying no titles rather than specifying a title consisting of a
space character.
With Gerrit change 405609,[2] the behavior will be changed so that the
latter query will behave more like
https://www.mediawiki.org/w/api.php?action=query&titles=_ in reporting that
the supplied title is invalid. The plan is that this will be deployed to
WMF wikis with 1.31.0-wmf.20 on February 6–8, see
https://www.mediawiki.org/wiki/MediaWiki_1.31/Roadmap for the schedule.
Passing an empty string as the value of a multi-valued parameter will
continue to be treated as a set of zero elements rather than a one-element
set containing the empty string.
Most clients should handle this change without major issue, as the
resulting response is consistent with the response when any other invalid
title is submitted. Clients that want to continue treating whitespace-only
input as no input should begin checking for whitespace-only input before
submitting it to the API.
[1]: Specifically: spaces (U+0020), tabs (U+0009), line feeds (U+000A), and
carriage returns (U+000D).
[2]: https://gerrit.wikimedia.org/r/#/c/405609/
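To illustrate the client-side check suggested above, a minimal sketch (the function name is made up) that drops whitespace-only values before they are submitted:

```javascript
// Sketch: filter whitespace-only elements out of a multi-value
// parameter before sending it, matching the characters listed in [1]:
// space (U+0020), tab (U+0009), line feed (U+000A), carriage return (U+000D).
const WS_ONLY = /^[ \t\n\r]+$/;

function stripWhitespaceOnlyValues(values) {
  // Empty strings are already treated as zero elements by the API,
  // so they are dropped here as well.
  return values.filter(v => v !== '' && !WS_ONLY.test(v));
}

console.log(stripWhitespaceOnlyValues(['Main Page', ' ', '\t\n', 'Help:Contents']));
```

The surviving values can then be joined with '|' into the titles parameter as before.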
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce