Hello,
I am writing a Java program to extract the abstract of a Wikipedia page
given the title of the page. I have done some research and found
out that the abstract will be in rvsection=0.
So, for example, if I want the abstract of the "Eiffel Tower" wiki page, I
query the API in the following way:
http://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=Eiffel…
I then parse the XML response and take the wikitext inside the <rev
xml:space="preserve"> tag, which represents the abstract of the Wikipedia page.
But this wikitext also contains the infobox data, which I do not need. I
would like to know if there is any way to remove the infobox data
and get only the wikitext for the page's abstract, or if there is an
alternative method by which I can get the abstract of the page directly.
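One possible alternative (assuming the wiki has the TextExtracts extension installed, as English Wikipedia does): prop=extracts with exintro returns only the lead section, already free of infobox markup, and explaintext additionally strips wiki markup. A minimal Java sketch of building such a query URL (fetching and JSON parsing omitted):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class AbstractQuery {
    // Build a TextExtracts query: exintro limits the result to the lead
    // section (the "abstract"), explaintext strips the wiki markup.
    static String buildUrl(String title) {
        String encoded = URLEncoder.encode(title, StandardCharsets.UTF_8);
        return "https://en.wikipedia.org/w/api.php"
             + "?action=query&prop=extracts&exintro=1&explaintext=1"
             + "&format=json&titles=" + encoded;
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("Eiffel Tower"));
    }
}
```

The response then carries the plain-text lead section in the "extract" field of each page object, so no infobox stripping is needed.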
Looking forward to your help.
Thanks in Advance
Aditya Uppu
When list=allusers is used with auactiveusers, a property 'recenteditcount'
is returned in the result. In bug 67301,[1] it was pointed out that this
property includes various other logged actions, and so should really be
named something like "recentactions".
Gerrit change 130093,[2] merged today, adds the "recentactions" result
property. "recenteditcount" is also returned for backwards compatibility,
but will be removed at some point during the MediaWiki 1.25 development
cycle.
Any clients using this property should be updated to use the new property
name. The new property will be available on WMF wikis with 1.24wmf12, see
https://www.mediawiki.org/wiki/MediaWiki_1.24/Roadmap for the schedule.
[1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=67301
[2]: https://gerrit.wikimedia.org/r/#/c/130093/
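Clients that must work across wiki versions during the transition can read the new property and fall back to the old one. A minimal Java sketch (the map stands in for one decoded user record from the JSON response):

```java
import java.util.Map;

public class ActiveUserCount {
    // Prefer the new "recentactions" property; fall back to the deprecated
    // "recenteditcount" on wikis that have not been updated yet.
    static long recentActions(Map<String, ? extends Number> user) {
        Number v = user.get("recentactions");
        if (v == null) v = user.get("recenteditcount"); // deprecated fallback
        return v == null ? 0L : v.longValue();
    }

    public static void main(String[] args) {
        System.out.println(recentActions(Map.of("recentactions", 7)));
        System.out.println(recentActions(Map.of("recenteditcount", 3)));
    }
}
```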
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
TL;DR: I'm proposing deprecating a bunch of parameters. See
https://phabricator.wikimedia.org/T164106.
In the action API, there are two ways to parse a page/revision: using
action=parse, or using the rvparse parameter to action=query&prop=revisions.
Similarly, there are two ways to get a diff: using action=compare, or using
parameters such as rvdiffto to action=query&prop=revisions. And then
there's action=expandtemplates versus the rvexpandtemplates parameter to
prop=revisions. This is a somewhat annoying bit of code duplication.
Further, the prop=revisions versions of these features have somewhat
strange behavior. rvparse forces rvlimit=1. rvdiffto and related parameters
will sometimes output "notcached" with no way to directly handle the
situation.
So, I propose deprecating all of these parameters. The parameters that
would be deprecated are the 'rvdifftotext', 'rvdifftotextpst', 'rvdiffto',
'rvexpandtemplates', 'rvgeneratexml', 'rvparse', and 'rvprop=parsetree'
parameters to prop=revisions, and the similarly named parameters to
prop=deletedrevisions, list=allrevisions, and list=alldeletedrevisions.
Following the normal action API deprecation policy, they'd output warnings
but would continue to function until usage drops sufficiently or until it
becomes too much trouble to fix them, and they wouldn't receive new feature
development.
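For clients migrating off the deprecated parameters, the replacement calls can be sketched as query URLs (a Java sketch; the endpoint, page name, and revision ids are placeholders):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class ParseMigration {
    // Replacement for rvparse: action=parse renders one page directly and is
    // not subject to the forced rvlimit=1 quirk of prop=revisions.
    static String parseUrl(String endpoint, String page) {
        return endpoint + "?action=parse&format=json&prop=text&page="
             + URLEncoder.encode(page, StandardCharsets.UTF_8);
    }

    // Replacement for rvdiffto and friends: action=compare diffs two
    // revisions by id, instead of attaching diff output to prop=revisions.
    static String compareUrl(String endpoint, long fromRev, long toRev) {
        return endpoint + "?action=compare&format=json"
             + "&fromrev=" + fromRev + "&torev=" + toRev;
    }

    public static void main(String[] args) {
        String api = "https://en.wikipedia.org/w/api.php";
        System.out.println(parseUrl(api, "Eiffel Tower"));
        System.out.println(compareUrl(api, 100, 101));
    }
}
```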
If anyone would object to this plan, please reply at
https://phabricator.wikimedia.org/T164106, or here if you really hate
Phabricator. If there aren't major objections, I'll probably do the
deprecation in the next week or two. Thanks.
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
Hello everyone,
I am trying to log in using action=clientlogin via a JavaScript Ajax call
(and from the backend as well), but for some reason I always get the same
error: { "code": "badtoken", "info": "Invalid CSRF token.", ... }. If I use
the same parameters and call from Postman, it works and I get logged in.
What could be wrong? Am I missing something that's not required by a call
from Postman? I also posted the same question on the API talk page
<https://www.mediawiki.org/wiki/API_talk:Login#action.3Dclientlogin>
This is the link to the code for reference
Link to github java code
<https://github.com/Institute-Web-Science-and-Technologies/okb-toollabs/blob…>
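For what it's worth, this symptom often means the login token and the clientlogin POST were issued under different sessions: Postman keeps cookies between requests automatically, while an Ajax call must carry the session cookies itself (same origin, or withCredentials set). A hedged Java sketch of building the clientlogin POST body (parameter names follow the clientlogin module; the return URL is a placeholder):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class ClientLogin {
    // The token from meta=tokens&type=login is only valid for the session
    // (cookies) it was issued under, so the token request and this POST must
    // share one cookie jar -- a fresh session per call yields "badtoken".
    static String loginBody(String user, String pass, String token) {
        Map<String, String> p = new LinkedHashMap<>();
        p.put("action", "clientlogin");
        p.put("format", "json");
        p.put("username", user);
        p.put("password", pass);
        p.put("logintoken", token);                      // same-session token
        p.put("loginreturnurl", "https://example.org/"); // placeholder
        return p.entrySet().stream()
                .map(e -> e.getKey() + "="
                        + URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8))
                .collect(Collectors.joining("&"));
    }

    public static void main(String[] args) {
        System.out.println(loginBody("Bot", "secret", "abc+\\"));
    }
}
```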
Thank you,
Mujtaba Rafiq
Hi everyone,
I need an API request which lists the user names of all users who made an
edit (or upload) within the last 6 months.
Is that somehow possible? Sadly, "Active Users" does not provide an
option for a range / starting date. Or is this a configuration
in LocalSettings?
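One possible workaround, assuming the wiki runs MediaWiki 1.27 or later (which added list=allrevisions): enumerate revisions in the six-month window with arvprop=user and collect the distinct user names client-side. A Java sketch of the query URL (the endpoint is a placeholder; continuation handling and uploads via list=logevents are omitted):

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;

public class ActiveEditorsQuery {
    // With the default arvdir=older, arvstart is the newer bound and arvend
    // the older one; ISO 8601 timestamps are accepted.
    static String buildUrl(String endpoint, Instant now) {
        Instant start = now.minus(180, ChronoUnit.DAYS); // roughly 6 months
        return endpoint + "?action=query&list=allrevisions&arvprop=user"
             + "&arvstart=" + now + "&arvend=" + start
             + "&arvlimit=max&format=json";
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("https://wiki.example.org/w/api.php",
                                    Instant.parse("2017-04-21T00:00:00Z")));
    }
}
```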
Regards,
Dennis Roczek
Hello. I am writing a crawler and a parser to obtain structured information
from Wiktionary pages.
However, I saw that a site called yourdictionary already has structured
information from Wiktionary.
So I wonder: is there any API etc. that provides structured information?
Or have they written their own custom crawler and parser?
For example,
page: https://en.wiktionary.org/wiki/game
some structured information from yourdictionary.com:
http://imgur.com/a/FZHGW
So I don't want to waste my time if there is an easier way to obtain this,
because I believe I would otherwise be writing a lot of special cases.
To sum up, my question is: is there any way to obtain a machine-readable
format of the pages? Or do I have to code my own custom parser?
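One machine-readable entry point worth checking is action=parse with prop=sections, which returns the entry's section tree (language, Etymology, Noun, ...) as JSON; each section's wikitext can then be fetched by the index it reports. A Java sketch of the two query URLs (this still leaves the wikitext itself to parse, so it is only a partial answer):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class WiktionarySections {
    // List the section tree of an entry as JSON.
    static String sectionsUrl(String page) {
        return "https://en.wiktionary.org/w/api.php"
             + "?action=parse&format=json&prop=sections&page="
             + URLEncoder.encode(page, StandardCharsets.UTF_8);
    }

    // Fetch the wikitext of one section by the index reported above.
    static String sectionUrl(String page, int index) {
        return "https://en.wiktionary.org/w/api.php"
             + "?action=parse&format=json&prop=wikitext&section=" + index
             + "&page=" + URLEncoder.encode(page, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(sectionsUrl("game"));
        System.out.println(sectionUrl("game", 2));
    }
}
```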
Dear list members,
yesterday I fiddled around with my MW API (v1.27.2) and a registered
bot user (Special:BotPassword), using the Httpful PHP client library [0].
My goal is to edit and create pages with my bot.
[0] <https://github.com/nategood/httpful/>
Making GET queries works very well, but I can’t log in. − So I need
some advice from you, please.
As I understand it, I have to make 4 requests:
1. = GET to "api.php?action=query&meta=tokens&type=login"
2. = POST to "api.php?action=login".
3. = POST to "api.php?action=query&meta=tokens&type=csrf"
4. = POST to "api.php?action=edit&[…]"
I made a small code sample that you can find here [1].
[1] <https://pastebin.com/AX9fuxRX>
With the 1st request I save the login token and the cookie (from the
header). − This works well.
Making the 2nd request I have to send "lgname", "lgpassword" and
"lgtoken" in the body and the cookie in the header. − But then I get the
API warning:
"Fetching a token via action=login is deprecated. Use
action=query&meta=tokens&type=login instead."
The response includes the result ("NeedToken") and "token",
"cookieprefix", "sessionid".
If I use the body parameters as URL parameters I get the API warning:
"The following parameters were found in the query string, but must be
in the POST body: lgpassword, lgtoken".
So I think I need to know which steps/requests I have to make, and with
which HTTP method, URL, parameters, and body data.
(Steps 3 and 4 aren’t possible for me.)
Excuse me if these questions are rather noobish! − I read the
documentation and some mailing-list posts, but I couldn’t find any hint
of the 'big picture' (of the process).
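The four steps above can be sketched as form-encoded POST bodies (a Java sketch; the helper names are made up for illustration). A "NeedToken" result at step 2 usually means the API did not receive lgtoken in the POST body, so the body must be sent as application/x-www-form-urlencoded and the session cookies from step 1 must accompany every later request:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class BotLoginFlow {
    // Form-encode a POST body; lgpassword and lgtoken must travel in the
    // body (application/x-www-form-urlencoded), not in the query string.
    static String formEncode(Map<String, String> params) {
        return params.entrySet().stream()
                .map(e -> e.getKey() + "="
                        + URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8))
                .collect(Collectors.joining("&"));
    }

    // Step 2 body: action=login with the token and cookie from step 1.
    static String loginBody(String name, String password, String loginToken) {
        Map<String, String> p = new LinkedHashMap<>();
        p.put("action", "login");
        p.put("format", "json");
        p.put("lgname", name);
        p.put("lgpassword", password);
        p.put("lgtoken", loginToken);
        return formEncode(p);
    }

    // Step 4 body: action=edit with the CSRF token from step 3 (step 3
    // itself can be a plain GET, reusing the logged-in session cookies).
    static String editBody(String title, String text, String csrfToken) {
        Map<String, String> p = new LinkedHashMap<>();
        p.put("action", "edit");
        p.put("format", "json");
        p.put("title", title);
        p.put("text", text);
        p.put("token", csrfToken);
        return formEncode(p);
    }

    public static void main(String[] args) {
        System.out.println(loginBody("MyBot@task", "botpassword", "tok+/="));
        System.out.println(editBody("Sandbox", "hello world", "csrf+\\"));
    }
}
```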
Thanks a lot (in advance for a solution) and best regards
Kai
Looks like REL1_28 is broken. When you attempt to clone the extensions
directory, 2ColConflict prompts for a password. (I ran into a similar
issue in REL1_27, and it was because the extension had been removed without
the proper backporting of the removals.)