The tags query module has a tgprop parameter to specify which properties of
the tag should be returned. One of these properties was 'name', but the
name was always included in the response regardless of whether this
property was included in tgprop.
Since most uses don't specify 'name', and we can assume clients depend on
the name being included (there's otherwise no way to identify which tag is
which), we've removed the nonfunctional 'name' as a valid option for
tgprop. Those few clients that do specify 'name' there will now receive a
warning that 'name' is not a valid value for tgprop.
This change will not affect the functionality of any clients unless the new
warning somehow breaks them. It should be deployed to WMF wikis with
1.31.0-wmf.18 or later. Clients can safely stop specifying 'name' in tgprop
immediately, though, since it doesn't do anything.
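As an illustration, a client that assembles its tgprop value could drop
'name' with a small helper like the following (a hypothetical sketch, not
code from MediaWiki itself):

```python
def strip_name_prop(tgprop):
    """Remove the now-invalid 'name' value from a tgprop parameter string.

    Tag names are still always included in the response, so dropping
    'name' here loses nothing and avoids the new warning.
    """
    props = [p for p in tgprop.split("|") if p != "name"]
    return "|".join(props)


# A client previously sending tgprop=name|displayname|description
# would now send tgprop=displayname|description.
print(strip_name_prop("name|displayname|description"))
```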
For further information, see https://phabricator.wikimedia.org/T185058.
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
Hello everyone,
I am trying to use the MediaWiki API to create a dictionary based on
categories or lists on Wikipedia. I would like to be able to select a
category, or perhaps a list page, and get all members of that list.
I've done some reading of the API, and implemented a prototype. It works a
little bit but only when the data is structured just perfectly for my
purposes. For example, I can easily get a list of all of the
English-language films. I'm using the action=query and list=categorymembers
for this. I end up with 500 films at a time, and I can continue as needed
to get all 60k or so. This is because there is a category that is tagged to
each English-language film's individual page.
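The continuation loop I'm using looks roughly like this (a sketch; `fetch`
is a hypothetical wrapper around an HTTP GET to
https://en.wikipedia.org/w/api.php that returns the decoded JSON):

```python
def iter_category_members(fetch, category):
    """Yield member titles of a category, following API continuation.

    `fetch` takes a dict of query parameters and returns the decoded
    JSON response. The loop merges the 'continue' block returned by
    the API back into the parameters until no continuation remains.
    """
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmlimit": "500",
        "format": "json",
    }
    while True:
        data = fetch(params)
        for member in data["query"]["categorymembers"]:
            yield member["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])
```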
On the other hand, if I want to get a list of all National Hockey League
(NHL) players, this is a lot more difficult. The category "Category:Lists
of National Hockey League players" exists, but it's a category of lists of
players. Much of the categorization of Wikipedia turns out to be in lists,
not categories. I could write a web scraper for this, but that would
probably be very unreliable.
Is there a standardized way to deal with lists and sublists that I might
have missed? I don't mind writing a bunch of code to recursively crawl
sublists and expand them, but I would like to avoid anything as
nonstandard as web scraping the content, since it would be very fragile.
Thank you for the help,
-mike
Hello!
I hope this is the correct place to ask this question.
My name is Tal, and I'm developing my own version of the Wiki Game
<https://en.wikipedia.org/wiki/Wikipedia:Wiki_Game> (or Wiki Race, some may
say) for Android devices. There is a great multiplayer version
<https://thewikigame.com/>of it for the iPhone and Web, but I think there
should be a worthy version to the Android.
My version is still in progress, and I put a lot of effort into it,
including the gameplay experience (single-player and multiplayer) and the
design.
_________________________
I'm writing this email because I'd like to consult with some wiki experts
about one of the main things on my mind: my usage of the MediaWiki API.
I'd like to launch the game and upload it to the Play Store in a few
months, but I can't keep working on it knowing that I might put my
efforts in danger if I do so, because of unfair usage of the Wiki API.
Well, the game is based on generating two random articles: the first
article (the starting point) and the target article (the end point),
which the user should reach using only the hyperlinks inside the
articles. The thing is that, unlike the currently existing Wiki Game
<https://thewikigame.com/> version, which is played by many users together
at the same time (by joining "game rooms") or uses pre-generated
articles, I'm generating the articles live for each user individually.
Each user can play in single-player mode and generate his own pair of
articles, his own game, a thing which is gonna keep the game infinite for
all of the players.
My generator generates two articles before each game, and the user has 15
seconds to decide whether he would like to start the game or not. After
the 15 seconds are up, the generator generates a new pair of articles.
While the time is running, the user may refresh the articles manually and
ask to generate a new pair.
So why am I worried about my app's usage of the API?
Well, not all of the target articles are reachable through the hyperlinks
in the mobile version of Wikipedia, because the templates view in the
mobile version is very limited and there are articles without references
from other articles, and I don't want the users to get stuck.
In order to overcome this, and make sure that the target article is
actually reachable, I need to make about 5 requests to the REST API for
each articles generation (one request after another, not together). The
algorithm took me a very long time of research, and I think it's
efficient for the purpose.
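To give an idea, the core reachability check can be sketched like this (a
simplified illustration rather than my actual code, expressed against the
Action API's prop=linkshere; `fetch` is a hypothetical wrapper returning
decoded JSON):

```python
def has_incoming_links(fetch, title):
    """Return True if `title` has at least one incoming link from
    the main namespace, i.e. it is plausibly reachable as a target.

    Asking for lhlimit=1 keeps the request cheap: we only need to
    know whether any incoming link exists, not list them all.
    """
    data = fetch({
        "action": "query",
        "prop": "linkshere",
        "titles": title,
        "lhlimit": "1",
        "lhnamespace": "0",
        "format": "json",
    })
    pages = data["query"]["pages"]
    page = next(iter(pages.values()))
    return bool(page.get("linkshere"))
```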
So, I assume that the average player is gonna play about an hour a day,
so he is gonna refresh his pair about 15 times, which is 75 requests to
the API in a given day for one user. Let's say (hopefully and
hypothetically) the game goes pretty viral - well, there are gonna be a
lot of requests to the API at the same time, and this makes me unsure
about my way of action.
I don't mind sending my random articles generator code (as long as it is
not copied :P ) for a review, and working together with somebody to make
the game better and fair, and I'm also willing to donate a part of the
game's income (if there is any) to the Wikimedia Foundation, but there is
one thing I'm not gonna do, and that's launching the game before making
sure I'm using the API fairly. I know Wikipedia takes care of millions
and billions of requests every day, and still - I can't move on without
knowing everything is fine.
This information <https://www.mediawiki.org/wiki/API:Etiquette> about the
usage limits is not very explicit, and it doesn't draw the line between
good and bad usage, and maybe I'm using the API badly. I will accept
every answer respectfully.
Thanks in advance,
Tal
Hi All!
First of all, I would like to thank you for your awesome work on the
MediaWiki code.
I'm currently working on ApiEditPage.php of the MediaWiki API.
Indeed, my aim is to create wiki pages directly from a mobile phone using
mediawiki API (1.29) and the GeoData Extension (1.29).
Unfortunately, I found out that when inserting a GeoData tag
({{#coordinates:xxx}}) in a page using ApiEditPage.php, the coordinates
are not saved in the 'geo_tags' database table as they should be.
But they are saved correctly into the db when I open the created page in
a browser and save it manually.
So I tried to find a basic solution, like adding an equivalent "manual
save command", but no luck for the moment.
I also tried to replace APIEditBeforeSave with EditFilterMergedContent
within the Hooks::run command, unsuccessfully.
Would you have any idea to perform a correct "manual save command" to force
mediawiki to process the geotags written inside the page created by API?
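For reference, the requests involved can be sketched like this
(hypothetical helper names, not my actual code; I've read that
action=purge with forcelinkupdate re-runs the links update that writes
secondary data, so it may be a workaround worth testing):

```python
def edit_params(title, text, token):
    """action=edit parameters to create or replace a page via the API."""
    return {
        "action": "edit",
        "title": title,
        "text": text,          # wikitext, e.g. including {{#coordinates:...}}
        "token": token,        # CSRF token from meta=tokens
        "format": "json",
    }


def force_link_update_params(title):
    """action=purge with forcelinkupdate, which re-runs the links
    update pass that extensions such as GeoData use to populate
    secondary tables like geo_tags."""
    return {
        "action": "purge",
        "titles": title,
        "forcelinkupdate": "1",
        "format": "json",
    }
```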
Thanks in advance,
JC
(+mediawiki-api)
Hi Chris,
I think many of us may be having trouble answering this because it's not
quite clear what you're trying to do. Can you be more concrete about what
categories (or what category scheme) you have in mind?
Wikipedia doesn't have a single, overarching hierarchy of categories. A
page may be associated with any number of categories (including zero).
Some of these categories may be subcategories of other categories. Editors
may freely create and remove categories, and add and remove page
associations with these categories.
The REST API currently doesn't expose any category information, but you can
obtain category information through follow up requests to the Action API (
https://en.wikipedia.org/w/api.php). For example:
To get all categories associated with the page "Marfa, Texas" on English
Wikipedia:
https://en.wikipedia.org/w/api.php?action=query&prop=categories&titles=Marf…
To get all pages associated with the category "Category:Cities in Presidio
County, Texas":
https://en.wikipedia.org/w/api.php?action=query&list=categorymembers&cmtitl…
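The JSON shape is straightforward: each page object in the response
carries a `categories` list. A small helper to pull out just the category
names could look like this (a hypothetical sketch, not from any library):

```python
def category_titles(response):
    """Collect category names from a prop=categories API response.

    Pages with no categories simply have no 'categories' key, so we
    default to an empty list for them.
    """
    titles = []
    for page in response["query"]["pages"].values():
        for cat in page.get("categories", []):
            titles.append(cat["title"])
    return titles
```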
Best,
Michael
On Wed, Oct 25, 2017 at 5:13 PM, Christopher Smyth <chris(a)inflighto.com>
wrote:
> Hello,
>
> We’re a small app development company that has integrated Wikipedia
> content into a geo-locating iOS app. The app is working well and the Wiki
> content is displaying correctly. However, we’d like to categorise the
> Wikipedia content into three categories rather than just one.
>
> Is there a way to filter and categorise Wikipedia content that is accessed
> through the REST API? We only use content that is geo-coded (ie has
> latitude and longitude) information associated with each article.
>
> How should we go about configuring our API integration so that we can
> split Wikipedia content according to its top-level categories? Is there a
> way to do this?
>
>
>
> Many thanks for your assistance with this request.
>
>
>
> Regards,
>
> Chris Smyth
>
>
>
>
>
> Christopher Smyth
>
> Director
>
> Inflighto
>
> chris(a)inflighto.com
>
> +61 (0)417 298 598 <+61%20417%20298%20598>
>
>
>
>
>
>
>
>
> _______________________________________________
> Services mailing list
> Services(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/services
>
>
Hi Team,
I am stuck while implementing AutoLogin. Any help would be great.
I am using MediaWiki for my purpose. I have a website and would like to link to the MediaWiki URL; when a user clicks a MediaWiki link, it should redirect to the MediaWiki URL with AutoLogin.
Currently, I have it set up on my localhost. I am using the login API for this and am able to create a new user in MediaWiki. When I use the API for login, I get a login success response from the API, but when it redirects to the MediaWiki localhost links, I can't see the logged-in user.
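For reference, the login flow I'm using looks roughly like this (a
sketch, not my actual code; `post` is a hypothetical callable that POSTs
form data to api.php, returns decoded JSON, and preserves cookies between
calls, e.g. a wrapper around a requests.Session):

```python
def api_login(post, username, password):
    """Two-step action=login dance used by MediaWiki 1.26.

    The first request returns result=NeedToken plus a token; the
    second repeats the credentials with lgtoken. The session cookies
    that make the login "stick" live in whatever HTTP client `post`
    wraps - not automatically in the end user's browser.
    """
    first = post({"action": "login", "lgname": username,
                  "lgpassword": password, "format": "json"})
    if first["login"]["result"] != "NeedToken":
        return first["login"]["result"]
    second = post({"action": "login", "lgname": username,
                   "lgpassword": password,
                   "lgtoken": first["login"]["token"],
                   "format": "json"})
    return second["login"]["result"]
```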
Below Environment I am using for development:
MediaWiki version: 1.26
System: Ubuntu 14.04
Server: Apache2
Language: PHP/MySQL
Looking forward to hearing back from you!
Thank you,
Sudhir Gupta.
Previously, if none of the values supplied for the pltitles, tltemplates,
clcategories, or imimages parameters to prop=links, prop=templates,
prop=categories, or prop=images were valid titles, the parameter would be
ignored and all links, templates, categories, or images would be returned.
With the merge of Gerrit change 347879,[1] this situation will result in no
links, templates, categories, or images being returned, as none match the
invalid titles supplied.
Note that submitting an empty value for one of these parameters will
continue to ignore the parameter and return all links, templates,
categories, or images, since this seems to be relatively common practice.
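As an illustration of the client-side pattern this interacts with, a
hypothetical helper that only sends pltitles when there is something to
filter by (a sketch, not MediaWiki code):

```python
def links_params(page, targets=None):
    """Build prop=links query parameters.

    Omitting `targets` (or passing an empty list) leaves pltitles out
    entirely, so all links are returned. A non-empty list filters to
    those titles; after this change, a list containing only invalid
    titles matches nothing instead of being silently ignored.
    """
    params = {"action": "query", "prop": "links",
              "titles": page, "format": "json"}
    if targets:
        params["pltitles"] = "|".join(targets)
    return params
```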
[1]: https://gerrit.wikimedia.org/r/#/c/347879/
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
Hi,
I am trying to fetch the deleted revisions of a page using the API
<https://www.mediawiki.org/wiki/API:Deletedrevisions>, but I get an error
saying "You don't have the permission to view a page's deleted history".
I wonder if this is caused by an inappropriate operation on my part, or
whether the feature is simply restricted on Wikipedia? Thank you!
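For reference, the request I'm building looks roughly like this (a
sketch; the drvprop values are ones I chose, and the permission note
reflects my reading of the documentation):

```python
def deleted_revisions_params(title):
    """Query parameters for prop=deletedrevisions.

    Note: viewing deleted revisions requires the 'deletedhistory'
    user right, which on Wikipedia is normally limited to
    administrators - an anonymous or ordinary logged-in request
    gets a permission error regardless of the parameters.
    """
    return {
        "action": "query",
        "prop": "deletedrevisions",
        "titles": title,
        "drvprop": "ids|user|comment",
        "format": "json",
    }
```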