Hi,
I am trying to compile a list of duplicate images on Wikimedia Commons. I
am iterating through the list of images using the generator=allimages API
and using the continue option to get the next set. But the API gets stuck
at 𪎥-seal.svg: it returns neither the next set nor the continue option.
Here is the URL I am using:
https://commons.wikimedia.org/w/api.php?action=query&generator=allimages&pr…
Can anyone help me with this? If there is an alternative, that would be
great.
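For reference, here is a minimal sketch of the continuation handling I am using (the helper name `iterate_all_pages` and the injectable `fetch` callable are my own; in real use `fetch` would wrap requests.get against https://commons.wikimedia.org/w/api.php). It merges every token from the `continue` object back into the next request, which is the documented way to page through generator=allimages:

```python
def iterate_all_pages(fetch, params):
    """Yield pages from action=query, following 'continue' tokens.

    `fetch` is any callable that takes a params dict and returns the
    parsed JSON response.
    """
    params = dict(params)
    while True:
        data = fetch(params)
        yield from data.get("query", {}).get("pages", [])
        if "continue" not in data:
            return
        # Merge *all* continuation tokens into the next request.
        params.update(data["continue"])
```

If the loop still stalls at the same file, inspecting the raw response at that point (for warnings or errors) might show why no `continue` object comes back.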
Thanks,
Sreejith Kulamgarath.
Hi,
I've posted this also on the API:Edit talk page
https://www.mediawiki.org/wiki/API_talk:Edit#How_does_%22recreate%22_parame…
Apparently, without this "recreate" parameter, an article can still be
recreated even if it was deleted since the retrieval of the basetimestamp.
Is this expected behavior? From the current description (*Override any
errors about the page having been deleted in the meantime*) I understand
that you have to set it to true if you want to be able to recreate the
page, so not giving this parameter should raise an error. Can you clarify
the documentation and the behavior?
Discussion on enwiki :
https://en.wikipedia.org/w/index.php?title=User_talk:GoingBatty&diff=995218…
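To make the question concrete, here is a sketch of how the edit parameters are built (the helper name and values are hypothetical). My reading of the docs is that the request built with recreate=False should fail with a "page has been deleted" error if the page was deleted after basetimestamp was fetched, yet apparently it succeeds:

```python
def build_edit_params(title, text, basetimestamp, token, recreate=False):
    """Build action=edit parameters, omitting 'recreate' unless asked.

    With basetimestamp set and no 'recreate', my understanding of the
    docs is that the API should refuse to recreate a page that was
    deleted in the meantime.
    """
    params = {
        "action": "edit",
        "title": title,
        "text": text,
        "basetimestamp": basetimestamp,
        "token": token,
        "format": "json",
    }
    if recreate:
        # Documented to override "page has been deleted" errors.
        params["recreate"] = "1"
    return params
```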
Thanks
Nico
Hi all,
is it true that the English Wikipedia page name is always the same as the
DBpedia and Wikimedia Commons name?
As an example: https://en.wikipedia.org/wiki/Paris =
http://dbpedia.org/page/Paris = https://commons.wikimedia.org/wiki/Paris
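In general the titles often coincide (as with Paris) but are not guaranteed to: Commons page names and DBpedia resource names can diverge from the English Wikipedia title. One way to check programmatically is to compare a page's Wikidata sitelinks instead of assuming the titles match. A sketch (the helper is mine, and it assumes the "sitelinks" object from a wbgetentities response on www.wikidata.org):

```python
def same_title(sitelinks, a="enwiki", b="commonswiki"):
    """Return True if two Wikidata sitelinks carry the same page title.

    `sitelinks` is the "sitelinks" object from a wbgetentities
    response; either link may be missing, in which case the titles
    cannot be compared.
    """
    ta = sitelinks.get(a, {}).get("title")
    tb = sitelinks.get(b, {}).get("title")
    return ta is not None and ta == tb
```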
Thank you.
Best,
Luca
---------- Forwarded message ---------
From: Jon Robson <jrobson(a)wikimedia.org>
Date: Fri, Sep 11, 2020 at 12:49 PM
Subject: INFO: Announcing removal of noimages parameter from deprecated
action=mobileview API
To: <mediawiki-api-announce(a)lists.wikimedia.org>
The action=mobileview API was marked as deprecated at the end of 2019 [1].
As we refactor code relating to this API, we have decided to remove support
for its `noimages` parameter, which allowed users to request a page with
img tags replaced with spans.
Note: The `noimages` parameter also appeared in the action=parse API [2],
but it was a no-op there. As a result of this change, it will no longer
appear on action=parse requests.
Clients wanting to remove img tags will need to do so in their own code;
the PHP code is provided for your reference [3].
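As a rough illustration only (a regex-based Python sketch, not the PHP implementation linked in [3]), the replacement the parameter used to perform looks like:

```python
import re

def strip_images(html):
    """Replace each <img ...> tag with an empty <span></span>.

    A crude client-side stand-in for the removed `noimages` behavior;
    a real HTML parser would be more robust than this regex.
    """
    return re.sub(r"<img\b[^>]*>", "<span></span>", html)
```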
If you have any questions or concerns about this change, please feel free
to follow up on the Phabricator ticket [4].
[1] https://phabricator.wikimedia.org/T210808
[2]
https://en.wikipedia.org/wiki/Special:ApiSandbox#action=parse&format=json&p…
[3]
https://gerrit.wikimedia.org/r/c/mediawiki/extensions/MobileFrontend/+/6265…
[4] https://phabricator.wikimedia.org/T262580
My question is: where is my token, and how can I use it for API calls?
I am getting this error when I POST to
https://en.wiktionary.org/w/api.php?action=edit&section=4&format=json&conti…
ERROR:
See https://en.wiktionary.org/w/api.php for API usage. Subscribe to
the mediawiki-api-announce mailing list at
<https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce>
for notice of API deprecations and breaking changes.
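For what it's worth, the usual token flow is: fetch a CSRF token via action=query&meta=tokens, then send it as the `token` field in the POST body of the edit. A minimal sketch (the helper names are mine; `session` is assumed to be an authenticated requests.Session, since cookies matter):

```python
API = "https://en.wiktionary.org/w/api.php"

def extract_csrf_token(data):
    """Pull the CSRF token out of a meta=tokens response."""
    return data["query"]["tokens"]["csrftoken"]

def edit_section(session, title, section, text):
    """Fetch a token and POST the edit in the same (cookie-bearing) session."""
    data = session.get(API, params={
        "action": "query", "meta": "tokens", "format": "json",
    }).json()
    return session.post(API, data={
        "action": "edit",
        "title": title,
        "section": section,
        "text": text,
        "token": extract_csrf_token(data),  # token goes in the POST body
        "format": "json",
    }).json()
```

Anonymous sessions get the placeholder token "+\", so a real edit needs a logged-in session (e.g. via a bot password or OAuth) first.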
I'm looking into the Wikiloop Battlefield data
http://battlefield.wikiloop.org/feed/mix and trying to query the data via
the API using revision id.
My attempts return "badrevids".
Am I using the API incorrectly?
Here is my sample Python code:

import requests

S = requests.Session()
URL = "https://www.mediawiki.org/w/api.php"
PARAMS = {
    "action": "query",
    "prop": "revisions",
    "revids": 961090023,
    "rvprop": "ids|timestamp|user|comment|content",
    "rvslots": "main",
    "formatversion": "2",
    "format": "json",
}
R = S.get(url=URL, params=PARAMS)
DATA = R.json()
PAGES = DATA["query"]
for page in PAGES:
    print(page)
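If it helps: revision IDs are only meaningful on the wiki that issued them, so a revid taken from English Wikipedia has to be queried against https://en.wikipedia.org/w/api.php rather than www.mediawiki.org; querying the wrong wiki may be what is producing the badrevids here. A small sketch (the helper name is mine) of handling a formatversion=2 response either way:

```python
def parse_revision(data):
    """Return the first revision from a formatversion=2 query response,
    or None when the wiki reports the id under query.badrevids."""
    query = data.get("query", {})
    if "badrevids" in query:
        return None
    return query["pages"][0]["revisions"][0]
```

With the same PARAMS but URL pointed at the wiki that actually hosts the revision, the badrevids branch should no longer trigger.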
This message is a notice about forthcoming removal of a response field in
the REST API’s /page/summary endpoint. [1] This message was posted to
wikitech-l and is being cross-posted on mediawiki-api-announce.
The endpoint is used in the Page Previews (“hovercards”) functionality on
the classic web (desktop) and Android & iOS experiences for Wikipedia, in
addition to numerous external experiences. We don't anticipate impacts on
the mainline Wikipedia experiences from this forthcoming field removal, as
the endpoint will still be operational.
The REST API's /page/summary endpoint provides an api_urls field in its
response with links to several other REST API endpoints supported by the
Page Content Service.
Three of the api_urls subfields refer to experimental endpoints that have
been removed from the REST API altogether in favor of newer API endpoints.
1. media (/page/media)
2. references (/page/references)
3. metadata (/page/metadata).
api_urls also contains subfields referring to stable endpoints that
continue to exist in the REST API:
1. summary (which is self referential)
2. edit_html (/page/html - the Parsoid HTML)
3. talk_html (/page/html/Talk:<title> - the Parsoid HTML for the
corresponding Talk page)
Wikimedia’s Product Infrastructure team intends to remove the api_urls
field of the /page/summary response.
A cursory review at https://codesearch.wmflabs.org/ and Wikimedia Git
mirrors suggests api_urls isn’t in use in consuming code.
Review of web logs suggests the following traffic for these endpoints:
- The media endpoint is at about 5% of its pre-decommission traffic, which
appears to come from old Wikipedia for Android clients. This is expected.
- The references endpoint sees no apparent Wikipedia for Android traffic;
its remaining traffic appears to come from non-user application software,
based on User-Agent header components.
- The metadata endpoint's traffic seems to have all but stopped.
This change is being announced in advance because the endpoint is
advertised as stable. [2]
Please update your clients if you rely on the presence of the api_urls
field. If this change poses a problem for your clients, please do let us
know as soon as possible at the tracking task:
https://phabricator.wikimedia.org/T247991
We plan to remove the api_urls field described here on or after 14 July
2020.
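For clients that do read the field, a defensive access pattern keeps working both before and after the removal (a sketch; the helper name is mine):

```python
def get_api_urls(summary):
    """Return the api_urls object from a /page/summary response,
    or an empty dict once the field has been removed."""
    return summary.get("api_urls", {})
```

After the removal date, any needed REST URLs should be built from the page title directly rather than read from the summary response.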
Thank you.
Adam Baso
Director of Engineering
Wikimedia Foundation
[1]
https://en.wikipedia.org/api/rest_v1/#/Page%20content/get_page_summary__tit…
[2] Refer to
https://www.mediawiki.org/wiki/API_versioning#End_point_stability and
https://www.mediawiki.org/wiki/Wikimedia_Product/Wikimedia_Product_Infrastr…
for more information on stability designations.
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
Hello,
I'm trying to add Meta as a wiki in my tool, WPCleaner.
Meta differs from the usual language wikis; for example, a page can have
translations. Such translations are handled by Extension:Translate, which
prevents updating the translated pages through API:Edit (you get the error
tpt-target-page when trying to update such a page).
How can I know, before calling API:Edit, that it will not work?
If possible, when retrieving basic information about the page (like
API:Query:Info).
For the moment, my need is to skip such pages (avoid getting their content
and working on them).
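A fallback, until there is a way to detect this from API:Query:Info, would be to attempt the edit and treat the tpt-target-page error as a skip signal. A sketch (`post_edit` is any callable performing the action=edit POST and returning parsed JSON):

```python
def edit_or_skip(post_edit, params):
    """Attempt an edit; return None (skip) on the tpt-target-page error
    raised for pages managed by Extension:Translate."""
    result = post_edit(params)
    if result.get("error", {}).get("code") == "tpt-target-page":
        return None
    return result
```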
Nico