I have been able to get the backlinks for a particular page using
api.php:
http://en.wikipedia.org/w/api.php?action=query&list=backlinks&bltitle=Delhi…
I have used the php format so that it can be easily parsed. Is there any way
to get all the backlinks in one go? Moreover, I am only concerned with the
number of backlinks. Is it possible to get that?
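There is no parameter that returns only a count, so one approach is to page through the list and count on the client side. A minimal Python sketch, assuming the decoded response shape shown below (the exact continuation key, `query-continue`/`blcontinue`, should be checked against your MediaWiki version):

```python
def count_backlinks(batches):
    """Sum backlinks across paged API responses.

    `batches` is an iterable of decoded list=backlinks responses.
    In real use you would fetch each batch with bllimit=500 and pass
    the blcontinue value from the previous response until no
    continuation token remains.
    """
    total = 0
    for batch in batches:
        total += len(batch["query"]["backlinks"])
    return total

# Two simulated pages of results:
batches = [
    {"query": {"backlinks": [{"title": "A"}, {"title": "B"}]},
     "query-continue": {"backlinks": {"blcontinue": "0|C"}}},
    {"query": {"backlinks": [{"title": "C"}]}},
]
print(count_backlinks(batches))  # 3
```

Setting bllimit as high as your account allows keeps the number of round trips small, but the counting itself still has to happen client-side.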
--
Ankuj Gupta
Computer Engineering
NSIT , India
I have gone through it, and I worked on the example it provides, but I am
still confused about how it can be used.
On Sun, Jan 18, 2009 at 11:49 PM, Nicolas Dumazet <nicdumz(a)gmail.com> wrote:
> mmm you're probably not on mediawiki-api, sorry for this
>
>
> ---------- Forwarded message ----------
> From: Nicolas Dumazet <nicdumz(a)gmail.com>
> Date: 2009/1/18
> Subject: Re: [Mediawiki-api] How to use mediawiki api
> To: MediaWiki API announcements & discussion <mediawiki-api(a)lists.wikimedia.org>
>
>
> ah, I was pretty sure I saw your name somewhere before. A GSoCer =)
>
> Have you tried simply reading http://en.wikipedia.org/w/api.php ? It
> gives a simple overview of most of the API's possibilities. It also
> links to... http://www.mediawiki.org/wiki/API the API documentation
> =)
>
> Regarding your question, searching for "backlinks" on the first page
> links you to a good example:
> http://en.wikipedia.org/w/api.php?action=query&list=backlinks&bltitle=Main%…
>
> GSoC is a lot about looking for documentation by yourself, about being
> independent. Come on, dig a bit more, the answer wasn't so hard to
> find =)
>
> 2009/1/18 Ankuj Gupta <ankuj2004(a)gmail.com>:
>> Hi
>>
>> I am working on my Wikipedia project. I need to know the number of
>> backlinks for a page. For this I am planning to use the MediaWiki
>> API, but I need help regarding how it can be used. Can the API be
>> used offline?
>> Thanks
>>
>> --
>> Ankuj Gupta
>> Computer Engineering
>> NSIT
>> India
>>
>> _______________________________________________
>> Mediawiki-api mailing list
>> Mediawiki-api(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/mediawiki-api
>>
>
>
>
> --
> Nicolas Dumazet — NicDumZ [ nɪk.d̪ymz ]
>
>
>
> --
> Nicolas Dumazet — NicDumZ [ nɪk.d̪ymz ]
>
--
Ankuj
Hi there,
I just started using the MediaWiki API today. I'm writing a script to
insert about 500 new articles into the wiki. I'm using PHP5 and MediaWiki
1.13.3.
I've decided to use CURL to communicate with the APIs on my local machine.
So far I can login, receive my lgtoken and request an edit token - which is
all good. What I'm noticing however is when I use the examples on the site
to send an action=edit I get this returned back:
Array
(
[error] => Array
(
[code] => help
[info] =>
[*] =>
Plus the entire manual after it. What's even stranger is that when I read
the manual more closely I don't see any references to action=edit being
available. I have, however, found the same manual on other sites which list
it accordingly. I'm completely baffled. Apparently 1.13.3 *does* have
support for API editing.
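That help-page response is what the API returns when it does not recognise the action, which typically means the edit module isn't enabled ($wgEnableWriteAPI) or the request arrived as a GET instead of a POST. A sketch of how the POST body for an edit is usually assembled (the title, text, and token values are placeholders; check the exact parameter set against your version's API help):

```python
from urllib.parse import urlencode

def build_edit_post(title, text, token):
    # action=edit must be sent as a POST body, not in the URL.
    # Putting the token last is the conventional safeguard: a
    # truncated request then fails instead of half-applying.
    return urlencode({
        "action": "edit",
        "format": "php",
        "title": title,
        "text": text,
        "token": token,
    })

body = build_edit_post("Sandbox", "Hello", "abc123+\\")
print(body)
```

With cURL in PHP, the equivalent is setting CURLOPT_POST and passing this string as CURLOPT_POSTFIELDS, along with the cookies from the login step.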
I've attached a copy of the script I'm using as well just in case.
Thanks in advance,
Dave
Gentlemen, let's say one is afraid the forces of evil will one day
confiscate one's small wiki, so one wants to encourage all loyal users to
keep a backup of the whole wiki (current revisions are fine, no need for
full history).
OK, we want this to be as simple as possible for our loyal users, just
one click needed. (So forget Special:Export!)
And, we want this to be as simple as possible for our loyal
administrator, me. I.e., use existing facilities: no cronjobs running
dumpBackup.php (or even mysqldump, which would give away too much
information) and then offering a link to what they produce.
The format desired is for later making a new wiki via Special:Import,
so indeed the Special:Export or dumpBackup.php --current outputs are
the desired format.
I just can't figure out the right
http://www.mediawiki.org/wiki/API URL recipe...
api.php ? action=query & generator=allpages & format=xmlfm & ...?
Could it be that the API lacks the "bulk export of XML formatted data"
capability of Special:Export?
If one click is not enough, then at least one click per namespace. I
would just have the users back up Main: and Category:, for example.
Embedding the API URL would be no problem, I would just use
[{{SERVER}}/api.php?... Backup this whole site to your disk]
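If your MediaWiki version has the export/exportnowrap flags on action=query (an assumption to verify against your api.php help page, since older releases only offered this through Special:Export), a one-click link per namespace could be built like this; the server path is hypothetical:

```python
from urllib.parse import urlencode

def export_url(server, namespace):
    """Build a one-click export URL for one namespace.

    Assumes action=query supports the `export`/`exportnowrap`
    flags, which produce the same XML that Special:Import accepts.
    Namespace 0 is Main, 14 is Category.
    """
    params = urlencode({
        "action": "query",
        "generator": "allpages",
        "gapnamespace": namespace,
        "gaplimit": "max",
        "export": "1",
        "exportnowrap": "1",
    })
    return f"{server}/api.php?{params}"

print(export_url("http://www.mywiki.example.com/w", 0))
```

The resulting URL would slot straight into the [{{SERVER}}/api.php?... ] link syntax above, one link per namespace to back up.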
Hi,
I'm doing some experimentation with a webapp and thought the online API
could be a good way to deliver documentation to the users. Not all of my
webapp is open source, though, so how does the GPL license of the original
source code apply when you're using MediaWiki as a service?
Is it possible to parse multiple pages at once or to get your search
results as html?
for example I run something like:
http://commons.wikimedia.org/w/api.php?format=jsonfm&action=query&generator…
And I get all the results ("revisions": ["*"]) as wikitext... Ideally I
could get those results as HTML. Is there any way to do that?
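action=parse does return rendered HTML, but it takes one page per request, so a batch has to loop over titles. A sketch of building those requests (the title list is just an example):

```python
from urllib.parse import urlencode

def parse_urls(api, titles):
    """One action=parse request URL per title.

    action=parse renders a single page per call, so fetching HTML
    for many pages means one request each.
    """
    return [
        f"{api}?{urlencode({'action': 'parse', 'format': 'json', 'page': t})}"
        for t in titles
    ]

for url in parse_urls("http://commons.wikimedia.org/w/api.php",
                      ["Foo", "Bar"]):
    print(url)
```

Each response carries the rendered page under parse/text, rather than the raw wikitext that rvprop=content gives you.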
--michael
hi,
I have a question regarding the parsed internal links. They have the format:
<a href="/wiki/Foo" ...
Is it possible to have the hostname shown too, i.e.
www.mywiki.example.com/wiki/...
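The parser emits those hrefs relative, so one workable option is to prefix them yourself after the fact. A small sketch, using the example hostname from the question:

```python
import re

def absolutize_links(html, server):
    # Rewrite relative /wiki/... hrefs to absolute URLs on `server`.
    return re.sub(r'href="/(wiki/[^"]*)"',
                  lambda m: f'href="{server}/{m.group(1)}"',
                  html)

html = '<a href="/wiki/Foo" title="Foo">Foo</a>'
print(absolutize_links(html, "http://www.mywiki.example.com"))
```

Whether the API itself can be told to emit absolute URLs depends on the version; post-processing like this works regardless.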
Best regards,
dian
hi,
As described on the MW API help pages [1], the parse method should
return parsed wikitext. But when I test the method, the result includes:
<!--
NewPP limit report
Preprocessor node count: 2/1000000
Post-expand include size: 40/2097152 bytes
Template argument size: 0/2097152 bytes
Expensive parser function count: 0/100
-->
This was on my test wiki, but the result is the same directly against
MediaWiki itself.
Any ideas how to fix this?
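The limit report is just an HTML comment the parser appends, so if your version has no setting to turn it off, stripping it from the returned HTML is a workable fix:

```python
import re

def strip_limit_report(html):
    # Remove the "NewPP limit report" HTML comment the parser
    # appends to its output; everything else is left untouched.
    return re.sub(r"<!--\s*NewPP limit report.*?-->", "", html,
                  flags=re.DOTALL).rstrip()

html = ("<p>Hello</p>\n"
        "<!--\nNewPP limit report\n"
        "Preprocessor node count: 2/1000000\n-->")
print(strip_limit_report(html))  # <p>Hello</p>
```

Since it is a comment, browsers never display it anyway; stripping mainly matters if you post-process or re-serve the HTML.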
Best regards,
dian
[1]: http://www.mediawiki.org/wiki/API:Expanding_templates_and_rendering