<a href="http://higssoftware.com/thesis-writing-services.php">HIGS </a>- thesis writing service offers the best quality filled writing services for your thesis or dissertation.<a href="http://higssoftware.com/thesis-writing-services.php">HIGS offers customized and genuine thesis report writing services for PhD research scholars.</a> We do not only offer report writing, but we also help you in entire thesis writing and we provide complete support in explaining the research objective and outcome of research
Hey,
This API call works when I do it in Python using requests:
requests.get("https://en.wikipedia.org/w/api.php",
params={"action":"parse","page":"Aphrodisiac","prop":"wikitext","format":"json"})
but when I do it with wget or curl like this:
wget
https://en.wikipedia.org/w/api.php?action=parse&page=Aphrodisiac&prop=wikit…
it doesn’t return pure JSON; it returns some kind of HTML, and I don’t
see the desired data inside.
But that URL works in the browser.
Why does it behave this way? How do curl/wget differ from a browser
requesting the same URL?
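For reference, the most likely difference is shell parsing rather than anything curl/wget do to the request: an unquoted & ends the command and backgrounds it, so wget only receives the URL up to the first &. A quoted form keeps the URL intact (the full query string here is reconstructed from the Python parameters above, so treat it as an assumption):

```shell
# Quote the URL so the shell does not treat "&" as a command separator.
# The query string is taken from the Python requests call above.
url='https://en.wikipedia.org/w/api.php?action=parse&page=Aphrodisiac&prop=wikitext&format=json'

# wget -O result.json "$url"   # actual network call, shown for illustration
echo "$url"
```

A browser never splits the URL this way, which is why the same string works there.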
Thank you,
Julius
Hey,
This seems to be the homepage for the "REST API":
https://www.mediawiki.org/wiki/Wikimedia_REST_API
but I don't see any documentation there for API parameters.
For example, someone gave me this example API call:
https://en.wikisource.org/api/rest_v1/page/html/Little_Essays_of_Love_and_V…
But the first link above doesn't say that you should append "page" and
"html" to the Wikisource API endpoint.
Is there documentation for this API?
Also, is this API meant to replace the old one? How does it differ from
the MediaWiki Action API and the MediaWiki REST API? For example, do
they fetch different kinds of information?
Also, would the "parse" module in the Action API be a good way to get
plaintext from a Wikisource book?
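For concreteness, a minimal sketch of what such an Action API call might look like, built but not sent (whether parse with prop=wikitext is actually the best route to plain text is exactly the open question; the page title is taken from the example link above):

```python
from urllib.parse import urlencode

# Sketch only: construct an Action API URL requesting a page's wikitext.
# Wikitext is markup, not plain text, so further stripping would be needed.
params = {
    "action": "parse",
    "page": "Little_Essays_of_Love_and_Virtue",
    "prop": "wikitext",
    "format": "json",
}
url = "https://en.wikisource.org/w/api.php?" + urlencode(params)
print(url)
```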
Thank you,
Julius
Hey,
Does anyone think it would be possible, or has it been done before, to take
some kind of downloaded dictionary data from a corpus and try to
automatically upload it to Wiktionary?
For example, if you had a very large dictionary of words in a language
and their parts of speech, could you check whether each word was already
present in Wiktionary and skip it if so, and otherwise create entries
for all remaining words with their part of speech?
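The skip-if-present step can be sketched as a plain filter. This models only the filtering logic; checking existence against a live wiki (e.g. via the Action API's query module) and the bot-policy side of mass creation are left out, and all names here are illustrative:

```python
# Minimal sketch of the "skip words that already have an entry" logic.
# existing_titles stands in for the result of a real existence check.
def entries_to_create(dictionary, existing_titles):
    """Return (word, part_of_speech) pairs with no existing entry."""
    return [(w, pos) for w, pos in dictionary.items() if w not in existing_titles]

corpus = {"snark": "noun", "galumph": "verb", "run": "verb"}
present = {"run"}
print(entries_to_create(corpus, present))  # -> [('snark', 'noun'), ('galumph', 'verb')]
```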
Thank you,
Julius
Hey,
It seems relatively easy to export any section of a Wikisource book with
the "download" button, but there is no way to export just the preface /
title page. If you press "download" on the title page, the section name is
just the book name, so you download the entire book instead of just the
preface.
For example: https://en.wikisource.org/wiki/Little_Essays_of_Love_and_Virtue
Is this intentional? Is there any good way to export the plaintext of just
the preface, via some command / functionality / API?
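One avenue worth checking (untested, and assuming the Action API's parse module exposes the front matter as a numbered section at all): pass a section index alongside prop=wikitext, rather than using the download button:

```python
from urllib.parse import urlencode

# Sketch: request a single section of the page. section=0 is the lead;
# whether the preface maps onto a section index is an assumption to
# verify, e.g. in ApiSandbox.
params = {
    "action": "parse",
    "page": "Little_Essays_of_Love_and_Virtue",
    "prop": "wikitext",
    "section": "0",
    "format": "json",
}
url = "https://en.wikisource.org/w/api.php?" + urlencode(params)
print(url)
```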
Thank you,
Julius
Yesterday, the MobileView Action API was removed from all Wikimedia
production servers [1].
The API was originally built to serve our apps, but over time it has
been replaced by the Page Content Service [2]. The original service was
not being maintained, and usage was very low. If you were using the
MobileView API, we urge you to check out the Page Content Service, as it
is a more powerful, better-cached alternative.
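For callers migrating, a Page Content Service request is path-based rather than parameter-based. A sketch (mobile-html is used as an example endpoint and is an assumption here; see [2] for the authoritative list):

```python
from urllib.parse import quote

# Sketch: build a Page Content Service URL for a page.
# endpoint="mobile-html" is illustrative; consult the PCS docs [2].
def pcs_url(domain, title, endpoint="mobile-html"):
    return f"https://{domain}/api/rest_v1/page/{endpoint}/{quote(title, safe='')}"

print(pcs_url("en.wikipedia.org", "Aphrodisiac"))
```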
We marked the API as hard-deprecated in December 2019 [3], after which
it was flagged as deprecated in ApiSandbox and in API responses, and
support has been dwindling since then; for example, the noimages
parameter was dropped in September 2020 [4].
In March 2022 we removed the remaining production blocker for removing
the code: language variant support [5]. The Page Content Service was
previously using the MobileView API for language variant views but has
now been rewired to use the better-supported core action=parse API. If
you need to support language variant views and previously couldn't
because of the lack of support in the Page Content Service, this should
no longer be an obstacle.
Since April 2022, users of the API have been seeing an inline banner
warning them of the upcoming breakage [6]. We got no feedback from these
banners, so we were comfortable with pushing forward.
Impact on user scripts was judged to be low [7], with only 14 scripts
affected, so this has not been announced in Tech News.
If you are impacted by this change, I apologise that previous
communications have failed you, and I'd love to hear from you about how
we could have done this deprecation better. If I can support you in any
way in making sure your apps/gadgets/scripts are working again, please
feel free to reply to this email, either privately or publicly, or to
raise a topic on the MobileFrontend talk page [8].
Finally, thanks to the engineers who helped make this deprecation possible.
Jon
[1] https://phabricator.wikimedia.org/T186627
[2] https://www.mediawiki.org/wiki/Page_Content_Service
[3] https://phabricator.wikimedia.org/T210808
[4] https://phabricator.wikimedia.org/T262580
[5] https://phabricator.wikimedia.org/T236733
[6] https://phabricator.wikimedia.org/T286836
[7]
https://global-search.toolforge.org/?q=%5B%27%22%5C%3D%5Dmobileview&regex=1…
[8]
https://www.mediawiki.org/wiki/Extension_talk:MobileFrontend?tableofcontent…
_______________________________________________
Mediawiki-api-announce mailing list -- mediawiki-api-announce(a)lists.wikimedia.org
To unsubscribe send an email to mediawiki-api-announce-leave(a)lists.wikimedia.org