Hi Mediawiki-api mailing listers!
I'm trying to get the intros of a list of Wikipedia pages using the
"extracts" property with "exintro=True". This works fine for most pages,
but for a few of them the API returns an empty extract field. See for
example:
https://en.wikipedia.org/w/api.php?action=query&prop=extracts&titles=Anthem…
When looking at the page "https://en.wikipedia.org/wiki/Anthem" there
definitely seems to be text before the first section, so I think I should
be getting something. Indeed, without the "exintro" parameter I get the
expected result.
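For reference, this is roughly how I'm making the request (a minimal
Python sketch using the requests library, mirroring the URL above):

    import requests

    # Same query as above: ask for the intro-only extract of one page.
    r = requests.get("https://en.wikipedia.org/w/api.php", params={
        "action": "query",
        "prop": "extracts",
        "exintro": 1,        # flag parameter: request only the lead section
        "titles": "Anthem",
        "format": "json",
    })
    print(r.json()["query"]["pages"])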
Any idea why this occurs?
Best,
Bertel
According to RFC 7231 § 3.1.1.5,[1] a POST request that does not include a
Content-Type header may be interpreted by the server in one of two ways:
1. It may assume application/octet-stream. In this case, PHP and the
Action API will not see the request as having any parameters, and so
will probably serve the auto-generated help page.[2]
2. It may "sniff" the content type. It's likely enough to correctly
guess application/x-www-form-urlencoded in this case, and therefore PHP and
the Action API will see the request as having the intended parameters.
It turns out that HHVM and PHP 7 (at least as used at Wikimedia) differ in
their behaviors: PHP 7 seems to choose option 1, while HHVM chooses option
2.
Thus, clients that have been generating POST requests to Wikimedia wikis'
Action APIs without a Content-Type header will have been receiving expected
results from HHVM but will now start receiving unexpected results as
Wikimedia's migration to PHP 7 proceeds.[3] Affected clients should be
updated to include the Content-Type header in their requests.
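For example, most HTTP libraries set the header automatically when given
form data; here is a minimal Python sketch using the requests library (the
endpoint and parameters are only illustrative):

    import requests

    # Passing a dict as `data` makes requests send the header
    # "Content-Type: application/x-www-form-urlencoded" automatically.
    r = requests.post("https://www.mediawiki.org/w/api.php", data={
        "action": "query",
        "meta": "siteinfo",
        "format": "json",
    })
    print(r.json())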
See https://phabricator.wikimedia.org/T230526 for some details on this
issue.
[1]: https://tools.ietf.org/html/rfc7231#section-3.1.1.5
[2]: As seen for example at https://www.mediawiki.org/w/api.php.
[3]: See https://phabricator.wikimedia.org/T176370 for progress on the
migration.
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
Hello Everyone,
Sorry for posting here. I am Jay Prakash. As part of my
GSoC project, I have developed sample code in PHP, JavaScript, and
MediaWiki JS to demonstrate the use of MediaWiki Action API modules. I
am seeking review, suggestions, and feedback on the format and accuracy
of the sample code. Please share your thoughts at T228549.[1]
Thank you :)
User:Jayprakash12345
Intern, GSoC 2019 with Wikimedia Foundation
[1] https://phabricator.wikimedia.org/T228549
Hello everyone.
Please see this link: https://phabricator.wikimedia.org/T212907
In short, the problem is that some wikis have trouble with certain
articles, especially large ones. When an article exceeds the template
transclusion limit, the templates at the bottom of the page are not
rendered, and the article loads very slowly.
Then I found that if we use the REST API
(https://www.mediawiki.org/api/rest_v1/) to get the content of the pages
that have this problem, they load very quickly and the problem
disappears.
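For example, fetching the rendered HTML like this loads fast (a minimal
Python sketch; the domain and title are just examples):

    import requests
    from urllib.parse import quote

    # Fetch the rendered HTML of one page via the REST API.
    title = quote("Example_page", safe="")  # example title
    url = "https://en.wikipedia.org/api/rest_v1/page/html/" + title
    html = requests.get(url).text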
Aklapper asked me to post this question to this mailing list. See the
Phabricator task for more details.
Best regards.
Ahmed.
Hi,
The API:Etiquette page (https://www.mediawiki.org/wiki/API:Etiquette)
says there is no hard and fast limit on API read requests, but at what
request rates has MediaWiki imposed access restrictions in the past?
Regards,
Kazuma
An upgrade to the timestamp library used by MediaWiki is resulting in two
changes to the handling of timestamp inputs to the action API. There will
be no change to timestamps output by the API.
All of these changes should be deployed to Wikimedia wikis with
1.34.0-wmf.10.
Historically, MediaWiki has ignored timezones in supported timestamp
formats that include them, treating the timestamps as if the timezone
specified were UTC. In the future, specified timezones will be honored
(and converted to UTC).
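For example, an input of "2019-05-22T14:00:00+02:00" was previously read
as 14:00 UTC but will now be converted to 12:00 UTC. A minimal Python
sketch of the new interpretation (the timestamp is only illustrative):

    from datetime import datetime, timezone

    # New behavior: the explicit +02:00 offset is honored and converted to UTC.
    ts = datetime.fromisoformat("2019-05-22T14:00:00+02:00")
    print(ts.astimezone(timezone.utc).isoformat())
    # 2019-05-22T12:00:00+00:00 (previously this input was read as 14:00 UTC)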
Historically some invalid formats were accepted, such
as "2019-05-22T12:00:00.....1257" or "Wed, 22 May 2019 12:00:00 A potato".
Due to improved validation, these will no longer be accepted.
Support for ISO 8601 and other formats has also been improved. See
https://www.mediawiki.org/wiki/Timestamp for details on the formats that
will be supported.
Hello,
I've been trying to let my users sign in remotely through my
Laravel/React site, but I've hit a dead end.
After reading this post from 2017:
https://laracasts.com/discuss/channels/general-discussion/single-sign-on-me…
I've followed the steps for method 2 from:
https://www.mediawiki.org/wiki/API:Login
I requested a login token from the API with a GET call to
"/api.php?action=query&format=json&meta=tokens&type=login"
and received for example:
"97b2edb716fa7b13f2955c79e7f8f0205ceda2c6+\\"
Then I "POST" the result with the formData provided: (I am 100% sure
those credentials are valid as I can use them to login)
1. action: clientlogin
2. username: TestUser
3. password: ********
4. loginrequests:
5. loginmessageformat: wikitext
6. loginreturnurl: http://localhost:3000/
7. logintoken: f90d08a1b279a521d24a4f629b678bb35ceda63d+\
8. format: json
The response that I get back is:
{"warnings":{"main":{"*":"Unrecognized parameters: username,
password."}},"clientlogin":{"status":"FAIL","message":"The supplied
credentials could not be
authenticated.","messagecode":"authmanager-authn-no-primary"}}
What am I missing to correctly log in a user remotely?
Any help would be appreciated.
Daniel
Hi,
For one of my projects, I need to keep the most up-to-date version of
Wikipedia HTML pages for a few languages like en, zh, de, es, fr, etc.
Currently this is done in two steps:
1. Listen for changes on the EventStreams API documented here
<https://wikitech.wikimedia.org/wiki/Event_Platform/EventStreams> and
extract the page titles.
2. For each of the titles, get the latest HTML using the Wikipedia REST API
<https://en.wikipedia.org/api/rest_v1/#/Page%20content/get_page_title__title_>
and persist the HTML.
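Roughly, the pipeline looks like this (a minimal Python sketch of both
steps, assuming the recentchange stream and en.wikipedia.org as an
example; persistence is omitted):

    import json
    import time
    from urllib.parse import quote

    import requests

    STREAM = "https://stream.wikimedia.org/v2/stream/recentchange"

    # Step 1: listen to the server-sent-events stream and extract titles.
    with requests.get(STREAM, stream=True) as resp:
        for line in resp.iter_lines(decode_unicode=True):
            if not line or not line.startswith("data: "):
                continue  # skip SSE keepalives and non-data fields
            event = json.loads(line[len("data: "):])
            if event.get("wiki") != "enwiki":
                continue  # example: follow only English Wikipedia

            # Step 2: fetch the latest HTML from the REST API, throttled.
            title = quote(event["title"].replace(" ", "_"), safe="")
            url = "https://en.wikipedia.org/api/rest_v1/page/html/" + title
            html = requests.get(url).text
            # ... persist `html` here ...
            time.sleep(1)  # stay at or below 1 request per second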
I understand that in order to avoid HTTP 429 (Too Many Requests) errors,
we need to limit API requests to 1 per second. I just wanted to check
whether we can make requests to different languages like
en.wikipedia.org, fr.wikipedia.org, etc. in parallel, or whether those
requests also need to be made serially (1 per second overall) to avoid
the HTTP 429 error.
Please let me know if you need more information.
--
Regards,
Aadithya
With the merge of Icb674095,[1] use of API action=logout will require a
CSRF token. This was considered a security issue, so the usual deprecation
process was not followed. See T25227[2] for details.
Clients that do not use a CSRF token with action=logout will receive a
badtoken error message ***and will not be logged out***.
This change should be deployed to Wikimedia wikis with 1.34.0-wmf.3. See
https://www.mediawiki.org/wiki/MediaWiki_1.34/Roadmap for a schedule.
Overall client impact is expected to be relatively low, as gathered
statistics indicate there are relatively few users of this API call.[3]
Nonetheless, maintainers should check their code for use of action=logout
and update it as necessary to maintain expected operation.
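For illustration, an updated logout call might look like this (a minimal
Python sketch using the requests library, assuming a session that is
already logged in; the endpoint is only an example):

    import requests

    API = "https://en.wikipedia.org/w/api.php"  # example endpoint
    session = requests.Session()  # assumed to be already logged in

    # Fetch a CSRF token, then pass it to action=logout.
    r = session.get(API, params={
        "action": "query", "meta": "tokens", "type": "csrf", "format": "json",
    })
    csrf = r.json()["query"]["tokens"]["csrftoken"]
    session.post(API, data={"action": "logout", "token": csrf, "format": "json"})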
[1]: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/504565
[2]: https://phabricator.wikimedia.org/T25227
[3]: https://phabricator.wikimedia.org/T25227#4902709
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation