Hello,
I am writing a Java program to extract the abstract of a Wikipedia page
given its title. I have done some research and found out that the abstract
will be in rvsection=0.
So, for example, if I want the abstract of the 'Eiffel Tower' wiki page, I
query the API in the following way:
http://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=Eiffel…
I then parse the XML data we get back and take the wikitext inside the <rev
xml:space="preserve"> tag, which represents the abstract of the Wikipedia
page. But this wikitext also contains the infobox data, which I do not
need. I would like to know if there is any way to remove the infobox data
and get only the wikitext related to the page's abstract, or if there is an
alternative method by which I can get the abstract of the page directly.
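For what it's worth, here is a minimal Java sketch of the infobox-stripping approach, assuming the infobox appears as one or more {{...}} template blocks at the start of the section-0 wikitext (the class name and sample text are illustrative, not taken from the API):

```java
public class AbstractCleaner {

    // Strip leading {{...}} template blocks (typically the infobox) from
    // section-0 wikitext, matching nested double braces.
    public static String stripLeadingTemplates(String wikitext) {
        String s = wikitext.trim();
        while (s.startsWith("{{")) {
            int depth = 0;
            int i = 0;
            while (i < s.length() - 1) {
                if (s.charAt(i) == '{' && s.charAt(i + 1) == '{') {
                    depth++;
                    i += 2;
                } else if (s.charAt(i) == '}' && s.charAt(i + 1) == '}') {
                    depth--;
                    i += 2;
                    if (depth == 0) {
                        break;
                    }
                } else {
                    i++;
                }
            }
            if (depth != 0) {
                break; // unbalanced braces: leave the text alone rather than mangle it
            }
            s = s.substring(i).trim();
        }
        return s;
    }

    public static void main(String[] args) {
        String wikitext = "{{Infobox tower\n| name = Eiffel Tower\n}}\n"
                + "The '''Eiffel Tower''' is an iron lattice tower in Paris.";
        System.out.println(stripLeadingTemplates(wikitext));
        // prints: The '''Eiffel Tower''' is an iron lattice tower in Paris.
    }
}
```

As an alternative, if the TextExtracts extension is enabled on the wiki (it is on Wikipedia), prop=extracts with the exintro parameter returns the lead section directly, without any wikitext parsing on your side.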
Looking forward to your help.
Thanks in Advance
Aditya Uppu
Hi guys,
When I query for categories using the API, I would like to get only the
categories that are listed on the wiki page. For example, when I query the
Brocoli page, I get this:
<categories>
<cl ns="14" title="Category:All articles with unsourced statements"/>
<cl ns="14" title="Category:Articles containing Italian-language text"/>
<cl ns="14" title="Category:Articles with 'species' microformats"/>
<cl ns="14" title="Category:Articles with unsourced statements from January
2013"/>
<cl ns="14" title="Category:Brassica oleracea"/>
<cl ns="14" title="Category:Commons category with local link same as on
Wikidata"/>
<cl ns="14" title="Category:Cultivars"/>
<cl ns="14" title="Category:Edible plants"/>
<cl ns="14" title="Category:Inflorescence vegetables"/>
<cl ns="14" title="Category:Pages with citations having bare URLs"/>
</categories>
But I am only interested in these (you'll see them at the bottom of this
page: http://en.wikipedia.org/wiki/Brocoli):
<cl ns="14" title="Category:Brassica oleracea"/>
<cl ns="14" title="Category:Cultivars"/>
<cl ns="14" title="Category:Edible plants"/>
<cl ns="14" title="Category:Inflorescence vegetables"/>
I was wondering if there is any workaround for this?
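One workaround worth trying: prop=categories accepts clshow=!hidden, which filters out the hidden maintenance categories (the ones marked with __HIDDENCAT__), leaving roughly the set shown at the bottom of the article. A small Java sketch that just builds such a query (class name and title are illustrative):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class CategoryQuery {

    // prop=categories with clshow=!hidden asks the API to return only the
    // visible categories, i.e. the ones displayed at the bottom of the page.
    public static String buildUrl(String title) {
        return "https://en.wikipedia.org/w/api.php"
                + "?action=query&prop=categories&clshow=!hidden&cllimit=500"
                + "&format=xml&titles="
                + URLEncoder.encode(title, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("Broccoli"));
    }
}
```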
Thanks
Chux
Is it possible to create datasets within the MediaWiki software so that
multiple pages can have unique tables displaying data from a single source?
I am helping maintain a wiki and one problem we are currently having is we
need to display the same data on multiple pages with slightly different
ordering of columns, omission of columns, and so on. Our current method is
to manually edit multiple tables whenever we need to add or remove data,
which is extremely inefficient, to say the least.
If this is not currently possible, would there be a way I can formally
suggest the implementation of json/xml/whatever datasets to be used in this
manner?
Hi.
Recently I've been toying with writing code to automatically list
articles in certain wikiprojects based on certain criteria. An example
would be usage of certain templates or spelling error detection. To
enable the code to detect when articles have been "fixed" reasonably
quickly, I'd need to keep the database updated more frequently than the
XML dumps can provide. Then I thought of the MediaWiki API. What methods
do you think are the most suited to the task?
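One option is to poll list=recentchanges between dump imports and re-check only the changed pages. A Java sketch that just builds such a query (the class name and parameter choices are mine; pick the rcprop fields you actually need):

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;

public class RecentChangesPoller {

    // Build a list=recentchanges query asking for edits and page creations
    // newer than `since`; results come back newest first, so rcend is the
    // older bound of the window.
    public static String buildUrl(Instant since) {
        return "https://en.wikipedia.org/w/api.php"
                + "?action=query&list=recentchanges"
                + "&rcprop=title%7Cids%7Ctimestamp"
                + "&rctype=edit%7Cnew&rclimit=500&format=xml"
                + "&rcend=" + since; // the API accepts ISO 8601 timestamps
    }

    public static void main(String[] args) {
        Instant oneHourAgo = Instant.now().minus(1, ChronoUnit.HOURS);
        System.out.println(buildUrl(oneHourAgo));
    }
}
```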
With regards,
Svavar Kjarrval
It looks to me like I was blacklisted for EN Wikipedia API requests. My
website has been using the API for a while now, but suddenly I'm getting
403 errors back. I've tried to follow all the rules, but if I've missed
something I'm happy to make changes. What disturbs me most is that there
was no attempt to contact me before I was blacklisted to let me know there
was a problem. Where should I go to find out what the problem is?
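One common cause of sudden 403s is the User-Agent header: Wikimedia's User-Agent policy asks clients to identify themselves, and requests with a blank or generic library default are sometimes blocked. A Java sketch of setting a descriptive one (the bot name, URL, and contact address are placeholders to replace with your own):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class ApiClient {

    // Placeholder identity: substitute your real site URL and contact address.
    static final String USER_AGENT =
            "MySiteBot/1.0 (https://example.com/; admin@example.com)";

    // Open a connection with an explicit, descriptive User-Agent instead of
    // the generic "Java/x.y" default.
    public static HttpURLConnection open(String url) {
        try {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestProperty("User-Agent", USER_AGENT);
            return conn;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        HttpURLConnection conn = open("https://en.wikipedia.org/w/api.php");
        System.out.println(conn.getRequestProperty("User-Agent"));
    }
}
```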
Thanks,
Robert
Hello all.
I want to ask: is it possible to know the total number of rows when I use a
list query? For example, I want to know how many editors there are before
making this kind of request:
api.php?action=query&list=allusers&augroup=editor
and then repeatedly continuing with the query-continue parameter aufrom.
And is it possible to sort the query result, for example by user ID or
registration time?
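If I remember the siteinfo module correctly, meta=siteinfo with siprop=usergroups and sinumberingroup=1 reports the number of users in each group, which would give you the total up front instead of paging through list=allusers. A small Java sketch building that query (illustrative only; please verify the parameter against your wiki's api.php help):

```java
public class UserCountQuery {

    // meta=siteinfo with siprop=usergroups and sinumberingroup=1 reports,
    // for each user group, how many users it contains.
    public static String buildUrl() {
        return "https://en.wikipedia.org/w/api.php"
                + "?action=query&meta=siteinfo"
                + "&siprop=usergroups&sinumberingroup=1&format=xml";
    }

    public static void main(String[] args) {
        System.out.println(buildUrl());
    }
}
```

As for sorting: as far as I know, list=allusers only sorts by name (with audir controlling the direction); sorting by user ID or registration time isn't offered.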
Thank you.
Regards,
William
hello everybody:
could you please point me to the CSS styling of the mobile wiki?
I can't find it; it is loaded from PHP, but it looks like a container is
missing (the one embedding the page #container table).
I'd like to style the JSON object.
--
Luigi Assom
Skype contact: oggigigi
https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)#The_Wikipe…
Jake (User:Ocaasi) needs some help and has written up a spec -- I
include part of it here:
Background: The Wikipedia Adventure (TWA) is an onboarding game--a
guided tour to teach new editors how to contribute to Wikipedia. In the
game players are invited to help out at a hypothetical article (Earth),
and along the way they learn skills while interacting with simulated peers.
Goal: Make TWA players feel like they are actually receiving messages
from other editors, when in fact they are just sending messages to
themselves.
Method: Use the MediaWiki Edit API. Have a button (or a link) on a
Wikipedia subpage of Wikipedia:TWA/ use the API to add target text to a
target page. Because different messages are received at different points
in the game, the ability to customize the target text and target page is
critical.
Implementation: Build a JavaScript userscript stored in the user’s
common.js page. The beta-version of the game will later deliver this
script as a gadget (set in user preferences, turned on by default, and
only active from the Wikipedia:TWA/ subspace).
There's more at the Village Pump page. Anyone have a little time to help?
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
Hi,
With the new extension, Disambiguator [1], I was wondering how to retrieve
the list of all disambiguation pages in a wiki.
Is there a way to retrieve the list of pages that have the disambiguation
property set?
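If I understand the extension correctly, Disambiguator tags disambiguation pages with the "disambiguation" page property, so list=pageswithprop (available since MediaWiki 1.21) should enumerate them. A Java sketch building that query (the continuation handling is an assumption on my part):

```java
public class DisambiguationPages {

    // Disambiguator marks disambiguation pages with the "disambiguation"
    // page property, which list=pageswithprop can enumerate.
    public static String buildUrl(String continueFrom) {
        String url = "https://en.wikipedia.org/w/api.php"
                + "?action=query&list=pageswithprop"
                + "&pwppropname=disambiguation&pwplimit=500&format=xml";
        if (continueFrom != null) {
            url += "&pwpcontinue=" + continueFrom; // continuation value from the previous response
        }
        return url;
    }

    public static void main(String[] args) {
        System.out.println(buildUrl(null));
    }
}
```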
Nico
[1] http://lists.wikimedia.org/pipermail/wikitech-l/2013-July/070277.html