Action=parse can take multiple titles, and you can get
other page metadata in addition to just HTML output.
Not to mention you can bundle it into one request with
action=parse|query.
-Chad
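A minimal Python sketch of building such an action=parse request (the endpoint and the particular prop values are assumed for illustration; multiple prop values are pipe-separated):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # assumed endpoint

def parse_request_url(page, props=("text", "categories", "links")):
    """Build an action=parse URL returning HTML plus extra page metadata."""
    params = {
        "action": "parse",
        "page": page,
        "prop": "|".join(props),  # pipe-separated list of parse properties
        "format": "json",
    }
    return API + "?" + urlencode(params)

url = parse_request_url("U2")
```

Fetching that URL then yields the rendered HTML together with the requested metadata in one response.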
On Feb 23, 2009 3:14 PM, "Michael Dale" <mdale(a)wikimedia.org> wrote:
it would be really nice if we could get html output from the api query
... this would avoid issuing dozens of action=parse requests
separately.
It appears to be mentioned pretty regularly... does anyone know if a bug
to that end has been filed? I will plop one in there if none exists
(did not find any within my quick search)
--michael
Bryan Tong Minh wrote:
> On Fri, Feb 20, 2009 at 6:45 PM, marco tanzi <tanzi.marco(a)gmail.com>
wrote:
>
>> I received a correct JSON object, but the content of the revision is
>> full of data I do not need, like {{....}}, [[...]], etc. I would like to
>> get only the clean description, only text (like the one visible on
>> the wiki website).
>>
>>
>
> Run the parsed text (action=parse) through an HTML parser that strips
> all the tags.
>
>
> Bryan
>
> _______________________________________________
> Mediawiki-api mailing list
> Mediawiki-api(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-api
>
Is it possible to use the API to query by geographical coordinates? E.g.
articles near a LONG/LAT? If so, could you point me to any examples?
Thanks
Julio
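On wikis running the GeoData extension (an assumption about the target wiki; it is not part of core MediaWiki), list=geosearch answers exactly this. A sketch of building such a request:

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # assumed endpoint

def geosearch_url(lat, lon, radius_m=10000, limit=10):
    """Build a list=geosearch query for pages near a coordinate."""
    params = {
        "action": "query",
        "list": "geosearch",
        "gscoord": f"{lat}|{lon}",  # latitude|longitude
        "gsradius": radius_m,       # search radius in metres
        "gslimit": limit,
        "format": "json",
    }
    return API + "?" + urlencode(params)

url = geosearch_url(52.5186, 13.3762)
```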
Hi folks,
I am trying to work with the Wikipedia API and I am having some small
problems :-)
I can fetch the main description of the topic I am looking for using:
http://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=conten…
I received a correct JSON object, but the content of the revision is
full of data I do not need, like {{....}}, [[...]], etc. I would like to
get only the clean description, only text (like the one visible on
the wiki website).
How can I do that? Is there some parser to clean up my JSON object?
I hope someone can help me out!
kind regards
Marco
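One approach, as Bryan suggests elsewhere in this thread, is to fetch rendered HTML with action=parse and then strip the tags. A minimal sketch using Python's standard-library HTMLParser (the canned sample stands in for a real API response):

```python
from html.parser import HTMLParser

class TagStripper(HTMLParser):
    """Collect only text content, dropping all HTML tags."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

    def text(self):
        return "".join(self.chunks)

def strip_tags(html):
    stripper = TagStripper()
    stripper.feed(html)
    return stripper.text()

# In real use this HTML would come from an action=parse response:
sample = '<p><b>U2</b> are an Irish <a href="/wiki/Rock_music">rock</a> band.</p>'
print(strip_tags(sample))  # U2 are an Irish rock band.
```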
Hello!
I'm developing a new pywikipedia bot that will parse the lonely pages
and images (especially images), but I don't trust the HTML code too
much, so I prefer to use the API instead. Could you please point me to
how to get a list of lonely images and/or pages? I've tried to find it
on my own but I haven't succeeded. If it doesn't exist, would someone
be so kind as to add this feature?
Thanks for the help,
Filnik.
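In current MediaWiki versions the QueryPage special pages (including Special:Lonelypages and Special:Unusedimages) are exposed through list=querypage; whether the wiki you target supports it is an assumption here. A sketch of building such requests:

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # assumed endpoint

def querypage_url(special_page, limit=50):
    """Build a list=querypage request exposing a Special page listing."""
    params = {
        "action": "query",
        "list": "querypage",
        "qppage": special_page,  # name of the QueryPage special page
        "qplimit": limit,
        "format": "json",
    }
    return API + "?" + urlencode(params)

lonely_pages = querypage_url("Lonelypages")
unused_images = querypage_url("Unusedimages")
```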
Hi guys,
I am writing a Ruby application to retrieve Wikipedia data: the
main description and the main image (the one in the box on the left
side).
As a parameter I have the curid of the wiki page, so I call the wiki API
to get the data. Now the problems start:
- Main description:
I call the following link to retrieve the json object with the data of
the main description
http://en.wikipedia.org/w/api.php?action=query&pageids=52780&prop=revisions…
the object is well formed, but the text is in wikitext format.
How is it possible to convert it into plain text (without {{ }}, [[ ]]
and <ref>)?
Is it possible to get plain text directly?
- Main image (if present)
My second problem is finding the right image to show after a search.
I have tried to fetch the main image of a wiki page using the
following link:
http://en.wikipedia.org/w/api.php?action=query&pageids=52780&prop=images&fo…
but the object I receive contains all the images of the page without
specifying where these images are used.
How is it possible to know exactly which image is used in the box on
the left of the wiki page?
Can anyone help me?
Kind regards
Marco
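On wikis running the PageImages extension (an assumption; it is not part of core MediaWiki), prop=pageimages returns the page's designated lead image, which usually matches the infobox image rather than every image on the page. A sketch:

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # assumed endpoint

def pageimage_url(pageid):
    """Build a query for the lead ("main") image of a page."""
    params = {
        "action": "query",
        "pageids": pageid,
        "prop": "pageimages",
        "piprop": "original",  # ask for the full-size original image URL
        "format": "json",
    }
    return API + "?" + urlencode(params)

url = pageimage_url(52780)  # the U2 article's curid from this thread
```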
Hi folks,
I would like to know if it is possible to retrieve from a Wikipedia
page only the main description and, if it is available, the image.
For example, I would like to get only the main description and the
image of the U2 band from http://en.wikipedia.org/wiki/index.html?curid=52780.
How can I do this? I looked at the Wikipedia API
(http://en.wikipedia.org/w/api.php) but I haven't found anything that
fits my needs.
It would be great if there were a web service that retrieves an
XML/JSON object with this data.
Hope to hear from you soon!
regards,
Marco
I need some help: I've implemented the API using CodeIgniter for a
client's site, and everything works on the site except for the wiki
pages, which are sluggish.
Here is my controller which lists the function calls being made and in
what order:
http://pastebin.com/m4021c1a7
And here is my model where the actual API calls are made:
http://pastebin.com/dd7c504c
The api.php file exists on the same server that is making the calls, so
it should be near-instant, but the pages are very slow, and only in the
instance of using the MW API.
Any help or insight anybody could give would be MUCH appreciated.
- Mark
Hello!
Referring to
http://fr.wikipedia.org/w/api.php?action=query&generator=exturlusage&geuque…
:
I would expect an extlinks element nested in each page element, since
each page has at least one *.yu link. Here, only a single <page>
element provides the extlinks element.
Am I missing something big here, or is this a nice bug?
Thanks,
--
Nicolas Dumazet — NicDumZ [ nɪk.d̪ymz ]
In r46845 [1], the issue raised in bug 11430 [2] a year and a half ago
was finally addressed: when the API was asked to produce huge amounts of
data (for instance the content of 500 revisions at 280 KB each), it
would run out of memory trying to store and process it. To prevent this
from happening, the amount of data the API can return is now limited.
This means that the behavior of requests that used to run out of memory
has changed: they will return fewer results than the limit, even though
there are more results available (they'll still set query-continue
right, though). For instance, the aforementioned request would return
about 300 revisions and set a query-continue for the rest.
Roan Kattouw (Catrope)
[1] http://www.mediawiki.org/wiki/Special:Code/MediaWiki/46845
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=11430
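A client is expected to keep following query-continue until it stops appearing. A minimal sketch of that loop, assuming the classic query-continue response shape; `fetch` is any callable mapping a params dict to a decoded API response (in real use it would perform the HTTP request):

```python
def fetch_all(fetch, params):
    """Accumulate query results, following query-continue to the end."""
    results = []
    params = dict(params)  # copy so the caller's dict is not mutated
    while True:
        response = fetch(params)
        results.extend(response.get("query", {}).get("pages", {}).values())
        cont = response.get("query-continue")
        if not cont:
            break
        for module in cont.values():
            params.update(module)  # e.g. {"rvstartid": 12345}
    return results
```

With the new size limit, a single response may carry fewer items than the requested limit, but this loop still collects everything because the query-continue values are set correctly.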
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce