Hi,
I'm having trouble fetching data for swimmers.
When I make an allpages request (
https://wiki.swimrankings.net/api.php?action=query&list=allpages&format=json)
I get back only about 4 unimportant pages.
There are no pages like rankingDetail that could contain the data, and I
can't even retrieve such pages to see whether data could be extracted from them.
If you look, for example, at the parse request for Main_Page:
https://wiki.swimrankings.net/api.php?action=parse&page=Main_Page&prop=text…
you can see that it returns HTML content.
Since there are no pages other than the 4 above, I cannot fetch any
swimmer's data.
Can I somehow get data for individual swimmers?
Vladimir
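For reference, the allpages request above can be sketched in Python with continuation handling and an explicit namespace, since content sometimes lives outside the main namespace (0), and on read-restricted wikis the API may simply not list it. This is a hedged sketch, not a tested client; the `fetch` callable is an assumption introduced here so the pagination logic stays separate from HTTP details.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def all_pages(fetch, namespace=0):
    """Yield every title from list=allpages, following the API's
    'continue' protocol. `fetch(params)` performs the HTTP GET and
    returns the parsed JSON response."""
    params = {"action": "query", "list": "allpages",
              "apnamespace": namespace, "aplimit": "max", "format": "json"}
    while True:
        data = fetch(dict(params))
        yield from (p["title"] for p in data["query"]["allpages"])
        if "continue" not in data:
            return
        # Merge the continuation parameters into the next request.
        params.update(data["continue"])

def http_fetch(params, api="https://wiki.swimrankings.net/api.php"):
    """Plain-stdlib fetch for the wiki mentioned above."""
    with urlopen(api + "?" + urlencode(params)) as resp:
        return json.load(resp)
```

Calling `list(all_pages(http_fetch, namespace=n))` for a few values of `n` would show whether anything beyond those 4 pages is exposed through the API at all.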
I am looking to query the API to get back a list of all Wikimedia projects
and, hopefully, some relevant metadata.
My first question: is 'namespace' the term used to describe a Wikimedia
project? And is there a query I can use to retrieve such a list? I'm having
trouble finding this in the documentation.
Thanks in advance,
Ed
Hello,
I am a web developer, and I was tasked with creating a new mobile
app that would allow people to compare different swimmers and
their times in various swimming styles.
My questions are:
1. Can I use your API freely, or is it a paid service?
2. May I store the data your API provides in my own database?
Thank you
Since April 2010,[1] when no lgtoken is passed to the Action API
action=login it will return a "NeedToken" response including the token to
use. While this method of fetching the login token was deprecated in
January 2016,[2] it is still present for the benefit of clients that have
not yet been updated and is not (yet) being removed.
The NeedToken response was also being returned when an lgtoken was supplied
but could not be validated due to session loss. While this made sense back
in 2010 when the NeedToken response was the only way to fetch the login
token, these days it is mainly confusing[3] and a way for clients with
broken cookie handling to wind up in a loop.
With the merge of Gerrit change 586448,[4] the API will no longer return
NeedToken when lgtoken was supplied. If the token cannot be validated due
to session loss, a "Failed" response will be returned with a message
referring to session loss as the problem.
This change should be deployed to Wikimedia sites with 1.35.0-wmf.28 or
later, see https://www.mediawiki.org/wiki/MediaWiki_1.35/Roadmap for a
schedule.
Note that the change HAS NOT been deployed to Wikimedia sites as of the
time of this email. If your client's ability to log in broke on 6 April
2020, the cause is most likely an unrelated change to Wikimedia's
infrastructure that caused some HTTP headers to be output with HTTP/2
standard casing, i.e. "set-cookie" rather than the traditional
"Set-Cookie". See https://phabricator.wikimedia.org/T249680 for details and
further discussion of that situation.
[1]: https://www.mediawiki.org/wiki/Special:Code/MediaWiki/64677
[2]:
https://lists.wikimedia.org/pipermail/mediawiki-api-announce/2016-January/0…
[3]: https://phabricator.wikimedia.org/T249526
[4]: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/586448
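For clients updating away from the deprecated dance, here is a minimal sketch of the recommended flow, fetching the login token via meta=tokens so NeedToken is never involved. The `get`/`post` callables are placeholders for an HTTP client that preserves session cookies:

```python
def login(get, post, username, password):
    """Log in to the Action API without relying on NeedToken.
    `get`/`post` perform the HTTP requests and return parsed JSON."""
    # Step 1: fetch a login token the non-deprecated way.
    data = get({"action": "query", "meta": "tokens",
                "type": "login", "format": "json"})
    token = data["query"]["tokens"]["logintoken"]
    # Step 2: supply lgtoken up front. After the change above, session
    # loss is reported as "Failed" rather than a second "NeedToken".
    result = post({"action": "login", "lgname": username,
                   "lgpassword": password, "lgtoken": token,
                   "format": "json"})["login"]
    if result["result"] != "Success":
        raise RuntimeError(result.get("reason", result["result"]))
    return result
```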
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
Hi All,
I'm writing an API module that I'd like to take 'complex' data for one of
the parameters, e.g.
{
  "action": "example",
  "token": "<some csrf token>",
  "data": [
    {"key": "value1", "otherkey": "othervalue1"},
    {"key": "value2", "otherkey": "othervalue2", "optionalkey": "optionalvalue"},
    {"key": "value3", "otherkey": "othervalue3"}
  ]
}
The list in data can grow to several hundred elements all of which contain
the same 2 keys with an optional third. For reference, I'm developing on
mw1.31, but I can swap to something more recent if that's where new
possibilities become available.
This module takes POST requests, so I'd just send the data as part of the
request body; but I'm not sure what to deserialise it to, as the
parameter-type machinery seems to assume primitive values (strings,
integers, etc.). I'm not averse to declaring the parameter a string and
deserialising it myself, but my attempts so far haven't been successful. I
also tried writing a PHPUnit test, which showed a different behaviour (it
seems to have quietly lost the value), and an integration/end-to-end test
using the mwapi Python library, which produced a third behaviour by quietly
joining the list's elements into a single string. I get the feeling that
what I'm trying to do is at the very least non-standard for MediaWiki.
Does anyone have experience in trying to do this or know of prior art that
does something similar?
Cheers,
Matt
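One workaround for the situation described above (a sketch, not an established MediaWiki convention): declare the parameter as a plain string on the PHP side and JSON-decode it yourself, while the client JSON-encodes and validates before sending. A hypothetical client-side helper, reusing the example payload from the question:

```python
import json

REQUIRED = {"key", "otherkey"}
OPTIONAL = {"optionalkey"}

def encode_data_param(items):
    """Validate the structured list and JSON-encode it so it can travel
    as a single string-typed API parameter in the POST body."""
    for item in items:
        missing = REQUIRED - item.keys()
        extra = item.keys() - REQUIRED - OPTIONAL
        if missing or extra:
            raise ValueError(f"bad element: missing={sorted(missing)}, "
                             f"unexpected={sorted(extra)}")
    return json.dumps(items, separators=(",", ":"))

payload = {
    "action": "example",           # hypothetical module from the question
    "token": "<some csrf token>",
    "data": encode_data_param([
        {"key": "value1", "otherkey": "othervalue1"},
        {"key": "value2", "otherkey": "othervalue2",
         "optionalkey": "optionalvalue"},
    ]),
}
```

Sending one opaque string also sidesteps the multi-value joining that the mwapi list handling seemed to trigger.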
Just out of curiosity, is there a recommended way to work out which parameter went wrong? For example, we previously had "unknown_action"; now it has been merged into "badvalue", and the response looks like this:
{
  "error": {
    "code": "badvalue",
    "info": "Unrecognized value for parameter \"action\": test.",
    "*": "See https://test2.wikipedia.org/w/api.php for API usage. Subscribe to the mediawiki-api-announce mailing list at <https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce> for notice of API deprecations and breaking changes."
  },
  "servedby": "mw1361"
}
Do I have to parse the .error.info text to learn the parameter name? Could its content change (for example, be localized) in certain situations?
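For what it's worth, one fragile approach is to regex the human-readable text; the wording is not a stable contract and can be localized, though pinning the request language (e.g. with uselang=en) should keep it predictable. A hypothetical helper:

```python
import re

def offending_parameter(error):
    """Best-effort extraction of the parameter name from the info text
    of a backward-compatible-format Action API error. Fragile: the
    wording can change between releases and languages."""
    m = re.search(r'parameter "([^"]+)"', error.get("info", ""))
    return m.group(1) if m else None

error = {"code": "badvalue",
         "info": 'Unrecognized value for parameter "action": test.'}
print(offending_parameter(error))  # → action
```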
Thanks,
Xinyan
-----Original Message-----
From: Mediawiki-api <mediawiki-api-bounces(a)lists.wikimedia.org> On Behalf Of mediawiki-api-request(a)lists.wikimedia.org
Sent: Wednesday, February 5, 2020 8:00 PM
To: mediawiki-api(a)lists.wikimedia.org
Subject: Mediawiki-api Digest, Vol 150, Issue 4
Today's Topics:
1. [Mediawiki-api-announce] BREAKING CHANGE: Parameter
validation error codes (Brad Jorsch (Anomie))
2. [Mediawiki-api-announce] BREAKING CHANGE: Stricter validation
of integer-type parameters (Brad Jorsch (Anomie))
3. Re: [Mediawiki-api-announce] BREAKING CHANGE: Parameter
validation error codes (Furkan Gözükara)
----------------------------------------------------------------------
Message: 1
Date: Tue, 4 Feb 2020 13:24:32 -0500
From: "Brad Jorsch (Anomie)" <bjorsch(a)wikimedia.org>
To: mediawiki-api-announce(a)lists.wikimedia.org
Subject: [Mediawiki-api] [Mediawiki-api-announce] BREAKING CHANGE:
Parameter validation error codes
The error codes that may be changing are some of those representing invalid values for API parameters. Notably, the following will change:
- "noX", indicating that a required parameter was not specified, becomes
"missingparam".
- "unknown_X", indicating that an unrecognized value was specified for
an enumerated-value parameter, becomes "badvalue".
- "too-many-X", indicating that too many values were supplied to a
multi-valued parameter, becomes "toomanyvalues".
- "baduser_X", "badtimestamp_X", and so on become "baduser",
"badtimestamp", and so on.
Note this is not a comprehensive list, other codes may be changing as well.
These changes make the error codes more predictable, at the expense of not indicating in the code which parameter specifically triggered the error. If you have a use case where knowing which parameter triggered the error is needed, please let us know (by replying to this message or by filing a request in Phabricator) and we'll add the necessary error metadata.
The human-readable text is also changing for some of these errors (with or without error code changes), and for a few the error metadata may be changing (e.g. "botMax" changes to "highmax" for limit-type warnings in non-back-compat error formats).
This change will most likely go out to Wikimedia wikis with 1.35.0-wmf.19.
See https://www.mediawiki.org/wiki/MediaWiki_1.35/Roadmap for a schedule.
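For clients that still branch on the old codes, a hypothetical shim keyed off the shapes listed above may help during migration. It is best-effort only: the list is explicitly not comprehensive, and unrelated codes that merely start with "no" would need explicit handling.

```python
import re

def normalize_error_code(code):
    """Map legacy per-parameter validation codes onto the new generic
    ones described above. Unrecognized codes pass through unchanged."""
    if code.startswith("unknown_"):
        return "badvalue"
    if code.startswith("too-many-"):
        return "toomanyvalues"
    m = re.fullmatch(r"(baduser|badtimestamp)_\w+", code)
    if m:
        return m.group(1)          # drop the parameter-name suffix
    if re.fullmatch(r"no[a-z]+", code):
        return "missingparam"
    return code
```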
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
------------------------------
Message: 2
Date: Tue, 4 Feb 2020 13:24:34 -0500
From: "Brad Jorsch (Anomie)" <bjorsch(a)wikimedia.org>
To: mediawiki-api-announce(a)lists.wikimedia.org
Subject: [Mediawiki-api] [Mediawiki-api-announce] BREAKING CHANGE:
Stricter validation of integer-type parameters
Various unusual values for integer-type parameters to the Action API will no longer be accepted. Acceptable values will consist of an optional sign (ASCII `+` or `-`) followed by 1 or more ASCII digits.
Values that were formerly allowed, and will now result in a "badinteger"
error, include:
- Values with extraneous whitespace, such as " 1".
- "1.9", formerly interpreted as "1".
- "1e1", formerly interpreted as either "1" or "10" at various times.
- "1foobar", formerly interpreted as "1".
- "foobar", formerly interpreted as "0".
Most clients should already be using correct formats, unless they are taking direct user input without validation.
This change will most likely go out to Wikimedia wikis with 1.35.0-wmf.19.
See https://www.mediawiki.org/wiki/MediaWiki_1.35/Roadmap for a schedule.
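The accepted format above is easy to pre-validate client-side; a small sketch:

```python
import re

# Optional ASCII sign followed by one or more ASCII digits, nothing else.
VALID_INT = re.compile(r"[+-]?[0-9]+")

def is_valid_api_integer(value):
    """Return True iff `value` survives the stricter validation."""
    return VALID_INT.fullmatch(value) is not None

# Formerly-tolerated values that now draw a "badinteger" error:
for bad in (" 1", "1.9", "1e1", "1foobar", "foobar"):
    assert not is_valid_api_integer(bad)
assert is_valid_api_integer("+42")
assert is_valid_api_integer("-7")
```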
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
------------------------------
Message: 3
Date: Tue, 4 Feb 2020 23:52:58 +0300
From: Furkan Gözükara <monstermmorpg(a)gmail.com>
To: "MediaWiki API announcements & discussion"
<mediawiki-api(a)lists.wikimedia.org>
Subject: Re: [Mediawiki-api] [Mediawiki-api-announce] BREAKING CHANGE:
Parameter validation error codes
How can I get the full file path from a file template?
For example: {{audio|en|en-us-dictionary.ogg|Audio (US, California)}}
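One hypothetical approach: extract the file name from the template with a narrow regex, then ask the API for the file's URL via prop=imageinfo with iiprop=url. The helper below is a sketch, not a real wikitext parser:

```python
import re

def audio_file(wikitext):
    """Pull the file name out of an {{audio|lang|file|label}} call."""
    m = re.search(r"\{\{audio\|[^|{}]*\|([^|{}]+)", wikitext, re.IGNORECASE)
    return m.group(1).strip() if m else None

name = audio_file("{{audio|en|en-us-dictionary.ogg|Audio (US, California)}}")
print(name)  # → en-us-dictionary.ogg
# The full URL can then be requested with something like:
#   api.php?action=query&titles=File:<name>&prop=imageinfo&iiprop=url&format=json
```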
------------------------------
Subject: Digest Footer
_______________________________________________
Mediawiki-api mailing list
Mediawiki-api(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api
------------------------------
End of Mediawiki-api Digest, Vol 150, Issue 4
*********************************************
Hi folks.
*This is my first time using this mailing list*, so if this is not the
right place to ask this kind of question, please let me know how I should
proceed.
*Question*
I have downloaded a lot of mathematics-related pages through the MediaWiki
API. Some of them are *duplicates of the same article*, differing only in
their titles: a different way of naming the same subject, a letter that
differs between one and the other, and so on.
One example I can show you right away:
- "Adição_de_*s*egmentos", and
- "Adição_de_*S*egmentos",
both written in Portuguese (my native language). The only difference
between the titles is the lowercase versus uppercase letter "s". Testing
the URLs, it seems that *they are both the same article, redirecting from
different links to the official title*.
Keeping those kinds of duplicates in mind, when I started *to analyse the
view statistics of a specific article*, I expected the following structure
in the data:
- The old (deprecated) titles would hold views until some day X, and
after that would have nothing further to count;
- The up-to-date titles would have data starting from day X and running
through the last day I want to analyse.
Nothing too surprising to expect from the database. But that is not what
happened. *Plenty of articles are still receiving views even though they
all redirect to another article*. At first I thought people were reaching
the articles' content through different links available in search engines
such as Google, so the views would be independent of one another. The
problem is that after trying different Google searches for *the same
Wikipedia article, I can only reach the up-to-date titles, never the old
ones.*
1. How is this possible?
2. More importantly for me: are all accesses to the deprecated titles
made by bots or through old links on other sites' pages?
3. Are the view counts for the different titles independent of one another?
4. If so, how can I track all possible accesses to a particular subject
in order to make an effective study of it?
Anyway, this is (if I remember correctly) the fourth time I'm trying to
get a proper answer to my question, and I'm hoping I'll get it soon.
Thanks!
Marco Antonio
Undergraduate in Pure Mathematics at USP | Science Communicator
<https://www.facebook.com/ViaSaber> <https://www.linkedin.com/in/magcastro/>
<https://www.instagram.com/marcoantoniograziano/>
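Regarding the title-variant question above: one way to compare the two counts is the Wikimedia Pageviews REST API, which, as far as I can tell, counts a redirect title and its target separately. The endpoint shape below is an assumption worth double-checking against the REST documentation:

```python
from urllib.parse import quote

def pageviews_url(project, title, start, end):
    """Build a per-article Pageviews REST URL (daily granularity,
    all access methods, human agents only)."""
    return ("https://wikimedia.org/api/rest_v1/metrics/pageviews/"
            f"per-article/{project}/all-access/user/"
            f"{quote(title, safe='')}/daily/{start}/{end}")

# One URL per title variant; comparing the two series shows whether the
# redirect title is still accumulating views of its own.
for title in ("Adição_de_segmentos", "Adição_de_Segmentos"):
    print(pageviews_url("pt.wikipedia.org", title, "20200101", "20200131"))
```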