Hi,
I am trying to update/create articles on our wiki via the MediaWiki API
(using the latest version found at http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/includes/api).
Login phase (action=login) works correctly and returns <LoginToken> and
<LoginUserId> information.
Then I try to retrieve the token of one page using query action with
following parameters:
<WIKIURL>/api.php?action=query&titles=<PageName>&prop=info&intoken=edit&lgtoken=<LoginToken>&lgusername=<MyLogin>&lguserid=<LoginUserId>
Unfortunately, I always get the following response (XML format),
whatever login I use (even if the user has sysop rights):
<?xml version="1.0" encoding="utf-8"?><api><error
code="inpermissiondenied" info="Action 'edit' is not allowed for the
current user" /></api>
I don't understand this error and can't find any documentation about it.
Do I need to configure a specific parameter on the wiki to allow "edit"?
Thanks for your help.
Marion Leclerc
I propose applying the same limits (500 on slow and 5000 on fast queries) to
all users for all actions (except perhaps content requests). Reasons:
1. It is cheaper to fetch 500 (or 5000) rows in *one query*
than to run the *same query 10 times*.
2. It is usually very hard to determine whether you may use the extended
limits (currently that means the "sysop" group (which, believe it or not,
may not exist on a wiki) or the "bot" right). I propose at least adding a
new user right called "apihighlimits". Either way, it forces the user to
make an extra query.
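To illustrate reason 1 from the client's side, here is a rough sketch of the
paging loop a caller has to run when its per-request limit is low;
fetch_batch is a made-up stand-in for one API round trip, not a real API call:

```php
<?php
// Hypothetical stand-in for one API round trip: returns up to $limit
// rows starting at $offset, plus the next offset (or null when done).
function fetch_batch(array $rows, int $offset, int $limit): array {
    $batch = array_slice($rows, $offset, $limit);
    $next  = ($offset + $limit < count($rows)) ? $offset + $limit : null;
    return array($batch, $next);
}

// A client capped at 50 rows per request needs 10 round trips for 500
// rows; with a 500-row limit the same data arrives in a single request.
$rows = range(1, 500);
$requests  = 0;
$collected = array();
$offset    = 0;
do {
    list($batch, $offset) = fetch_batch($rows, $offset, 50);
    $collected = array_merge($collected, $batch);
    $requests++;
} while ($offset !== null);
echo "$requests requests for " . count($collected) . " rows\n";
```

Each of those 10 requests repeats the same query planning and connection
overhead on the server, which is the cost asymmetry the proposal is about.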
--VasilievVV
I noticed that the live wikimedia servers now have a version of the API
that respects the maxlag parameter, which is great. Thanks to the people
who got that working.
The correct action on receiving a maxlag error is to sleep for at least
as long as the current lag. A nice feature of the HTTP maxlag error
produced by index.php is that it includes a 'Retry-After' HTTP header
that directly tells the client how long to wait.
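A sketch of that client-side behaviour (all names here are my own, not part
of the API): retry on a maxlag error, sleeping for whatever Retry-After says,
with a conservative fallback when the header is missing. The server is
simulated by a callback so the sketch is self-contained:

```php
<?php
// Hypothetical response shape: 'ok' plus an optional 'retry_after'
// value in seconds, as carried by the Retry-After header.
function request_with_maxlag_retry(callable $doRequest, int $maxTries = 5) {
    for ($try = 1; $try <= $maxTries; $try++) {
        $resp = $doRequest();
        if ($resp['ok']) {
            return $resp;
        }
        // Honour Retry-After if present; otherwise guess conservatively.
        $delay = isset($resp['retry_after']) ? (int)$resp['retry_after'] : 5;
        sleep($delay);
    }
    return null; // gave up after $maxTries attempts
}

// Simulated server: lagged for the first two requests, then healthy.
$calls = 0;
$resp = request_with_maxlag_retry(function () use (&$calls) {
    $calls++;
    return $calls < 3
        ? array('ok' => false, 'retry_after' => 0) // 0 so the demo is fast
        : array('ok' => true, 'body' => '<api/>');
});
```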
I looked at the API code this afternoon to see if I could add another
field to the error object that would directly include the lag, so that
I don't have to parse it out of the text of the info field. I discovered
that the error objects at the moment can only have two fields, 'code'
and 'info'.
That explained to me why bug 11404 was resolved the way it was. In that
bug, there was no easy way to distinguish between different sorts of
internal errors without parsing the info field. It was resolved by
changing the error code to directly reflect the type of internal error
encountered.
I don't mind writing a patch to permit different instances of
UsageException to carry extra information in addition to 'code' and
'info', but I don't want to invest the coding time unless other people
think it's worthwhile.
- Carl
Hi,
First off, nice work on the API :-) Nice structure, quite simple to extend.
I'm building an API module for the wikidata extension, which will
basically provide an XML interface to wikidata terms. In doing so I ran
into a couple of issues that I hope somebody on this list can help me with.
My first question is: how do I place an API module in an extension?
Wikidata has its own branch, so I could simply put my API module and its
formatters in the includes/api directory and modify both
includes/api/ApiMain.php and includes/AutoLoader.php. But I'd much
rather keep my code in the Wikidata extension and not modify anything in
the branch. I found an svn commit that suggests this is possible,
but I fail to see how.
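For what it's worth, the pattern I was hoping for would look something like
the following in the extension's setup file. I'm assuming a registration
global along the lines of $wgAPIModules is honoured by ApiMain (I haven't
verified that it is in this branch); 'ApiWikidata' and the file name are
placeholders of my own:

```php
<?php
// Hypothetical extension setup file (class and file names are placeholders).
// Autoload the module class from the extension's own directory ...
$wgAutoloadClasses['ApiWikidata'] = dirname(__FILE__) . '/ApiWikidata.php';

// ... and register it under its action name, assuming a registration
// global like $wgAPIModules is merged into ApiMain's module list.
$wgAPIModules['wikidata'] = 'ApiWikidata';
```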
My second question is: how can a custom printer learn about the format
parameter passed in the request?
My module uses its own formatter. I created one for several reasons,
one being that I had trouble getting a big data structure into the
ApiResult object. Maybe I just didn't understand the interface. That
wasn't the only reason, though; I felt I needed a different internal
representation. Anyway, there's a nice hook for creating a custom
printer: the getCustomPrinter method. Unfortunately, that method
doesn't get the value of the format parameter passed in. My module can
format different types of output, but right now I have to solve this by
adding my own format parameter (which gets prefixed with 'wd' for
wikidata). Is there any way I can find out what that parameter was?
Many thanks,
Maarten
--
*Maarten van Hoof*
maarten.vanhoof(a)edia.nl
*Edia* - Educatie Technologie
Asterweg 19D12 | 1031 HL Amsterdam
*T* 020 716 36 12 | *F* 020 716 36 13 | *M* 06 245 392 15 | www.edia.nl
<http://www.edia.nl>
I know there was some discussion about changing the parameter, so maybe this
is just the code being worked out, but the query for backlinks seems to be
broken - even the example on the API help returns nothing.
--
----------------------------
Ian Cabell
RadiantWeb Services
( 877 ) 406 - 6272 [toll free]
ian(a)radiantweb.net
http://www.radiantweb.net
"There is more than one right way .. to make it perfect!"
Hi,
I am new here and I have one question: is there an API client for PHP
available? Maybe a stupid question, but I need one to communicate
(read & write) with MediaWiki from another project written in PHP ;-)
Also, why don't you use SOAP or XML-RPC?
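In case it's useful, a bare-bones read-only client can be sketched with
file_get_contents and format=php; the endpoint in the commented example is a
placeholder, and there is no error handling or cookie support here:

```php
<?php
// Build an api.php URL with format=php appended, so the response can be
// decoded with unserialize(). Pure string work, no network involved.
function build_api_url(string $endpoint, array $params): string {
    $params['format'] = 'php';
    return $endpoint . '?' . http_build_query($params);
}

// Fetch and decode one API response (requires a reachable wiki).
function api_get(string $endpoint, array $params) {
    return unserialize(file_get_contents(build_api_url($endpoint, $params)));
}

// Example call (endpoint and title are placeholders):
// $info = api_get('http://example.org/w/api.php',
//                 array('action' => 'query', 'prop' => 'info',
//                       'titles' => 'Main Page'));
```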
Greetings from Germany
Christian Thiele
--
Freiberuflicher Softwareentwickler
Christian Thiele
August-Bebel-Straße 48
04275 Leipzig
Tel. 0178 4860018
Thanks. I was just about to reply and say that I got it working
with cookies. Interesting that I can't get it to work with the tokens
alone; it must consider the session ended when the request is made
without cookies.
---Working PHP Edit Code---
echo "<pre>";
print_r($_POST);
$apiURL = "http://".$_SERVER['HTTP_HOST']."/phpapps/wiki/api.php";
$loginURL = array("action"=>"login", "lgname"=>"jkuter", "lgpassword"=>"",
    "lgdomain"=>"ad.mathworks.com", "format"=>"php");
$getLoginResponse = unserialize(getURL($apiURL, $loginURL));
print_r($getLoginResponse);
$titleURLParams = array("action"=>"query", "prop"=>"info",
    "titles"=>"BatPresubmitTest", "intoken"=>"edit", "format"=>"php");
print_r($titleURLParams);
$getTitleResponse = unserialize(getURL($apiURL, $titleURLParams));
print_r($getTitleResponse);
//$newPageURLParams = array("action"=>"query", "prop"=>"info",
//    "titles"=>"BatPresubmitTest", "intoken"=>"edit",
//    "lgtoken"=>$getLoginResponse['login']['lgtoken'],
//    "lgusername"=>$getLoginResponse['login']['lgusername'],
//    "lguserid"=>$getLoginResponse['login']['lguserid'],
//    "format"=>"php");
//$makeNewPage = unserialize(getURL($apiURL, $newPageURLParams));
//print_r($makeNewPage);
echo "</pre>";

function getURL($url, $vars) {
    // include_once, not include: getURL() is called more than once, and
    // including Snoopy.class.php twice would redeclare the class.
    include_once "Snoopy.class.php";
    $snoopy = new Snoopy;
    // Forward the wiki's session cookies so the API sees us as logged in.
    $snoopy->cookies["inside_wikiToken"]    = $_COOKIE["inside_wikiToken"];
    $snoopy->cookies["inside_wikiUserID"]   = $_COOKIE["inside_wikiUserID"];
    $snoopy->cookies["inside_wikiUserName"] = $_COOKIE["inside_wikiUserName"];
    $snoopy->cookies["inside_wiki_session"] = $_COOKIE["inside_wiki_session"];
    $snoopy->submit($url, $vars);
    $snoopy->setcookies();
    return $snoopy->results;
}
---End Working PHP Code---
-----Original Message-----
From: Roan Kattouw [mailto:roan.kattouw@home.nl]
Sent: Tuesday, November 13, 2007 12:23 PM
To: Jason Kuter
Subject: Re: [Mediawiki-api] FW: Problems Using API Via PHP
Jason Kuter schreef:
> Understood, I just wish it worked without an additional class. Using
> Snoopy works but my user is not being authenticated. It must be
> another issue.
>
> Thanks for the reply,
>
> jason
Use $snoopy->setCookies(); after doing action=login to force Snoopy to
remember the authentication cookies.
Roan Kattouw (Catrope)
I understand that this part of the API is not fully implemented, etc.
However, I am having an issue I can't get around, which is the
following:
&prop is converted by PHP to the "proportional" character when sent via
POST or GET using curl, fopen, or file_get_contents. The whole process
works just fine if I execute it via the URL in the browser, so I know it
can work. Is there a way around using &prop=info to get a token to
complete the page-creation process? Why is &prop being used when "prop"
is a reserved HTML entity name? It seems silly to be blocked by such a
little thing, but I can't seem to get around it, and I would rather not
do this whole thing in JavaScript, since this script will have a lot of
work to do when it's complete. My assumption is that many people will
run into this when the API becomes more popular, because who doesn't
love PHP?
Thanks,
jason
--Info---
* MediaWiki: 1.12alpha
* PHP: 5.1.2 (apache2handler)
* MySQL: 5.0.18-log
---Begin PHP Code---
echo "<pre>";
print_r($_POST);
$apiURL = "http://".$_SERVER['HTTP_HOST']."/phpapps/wiki/api.php";
$loginURL = "action=login&lgname=jkuter&lgpassword=&lgdomain=ad.mathworks.com&format=php";
$getLoginResponse = unserialize(getURL($apiURL, $loginURL));
print_r($getLoginResponse);
$titleURL = "action=query&prop=info&titles=BatPresubmitTest&intoken=edit".
    "&lgtoken=".$getLoginResponse['login']['lgtoken'].
    "&lgusername=".$getLoginResponse['login']['lgusername'].
    "&lguserid=".$getLoginResponse['login']['lguserid'].
    "&format=php";
$getTitleResponse = unserialize(getURL($apiURL, $titleURL));
print_r($getTitleResponse);
$newPageURL = "action=query&prop=info&titles=BatPresubmitTest&intoken=edit".
    "&lgtoken=".$getLoginResponse['login']['lgtoken'].
    "&lgusername=".$getLoginResponse['login']['lgusername'].
    "&lguserid=".$getLoginResponse['login']['lguserid'].
    "&format=php";
//$makeNewPage = unserialize(getURL($apiURL, $newPageURL));
//print_r($makeNewPage);
echo "</pre>";

function getURL($url, $vars) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $vars);
    $content = curl_exec($ch);
    curl_close($ch);
    return $content;
}
---End PHP Code---
---Begin output with &prop---
Array
(
[RCAlink] => Insert link here
[ReqDesRevDoclink] => Insert link here
[LEGORevlink] => Insert link here
[APIClasslink] => Insert link here
[CRChecklistLink] => Insert link here
[CRList] => Insert list here
[TestsAddedDescription] => Insert description here
[RunSuiteList] => Insert list here
[InteractiveBashingList] => Insert list here
[BaTStages] => Insert list here
[ForeignFilesMissing] => Insert link here
[BuildPlatformsList] => Insert list here
[TestPlatformsList] => Insert list here
[TestingBottleneck] => YNTestingBottleneck
[FinalMergeCheck] => YNFinalMergeCheck
[SubmitJob] => YNSubmitJob
[JobNo] => dsfdsf
[Submit_Checklist] => Submit
)
Array
(
[login] => Array
(
[result] => Success
[lguserid] => 1004
[lgusername] => Jkuter
[lgtoken] => 4ee852dbbe6ad67022af0873aaa21098
[cookieprefix] => inside_wiki
[sessionid] =>
)
)
Array
(
[error] => Array
(
[code] => inpermissiondenied
[info] => Action 'edit' is not allowed for the current user
)
)
--End Output With &prop---
---Begin Output Without &prop---
Array
(
[RCAlink] => Insert link here
[ReqDesRevDoclink] => Insert link here
[LEGORevlink] => Insert link here
[APIClasslink] => Insert link here
[CRChecklistLink] => Insert link here
[CRList] => Insert list here
[TestsAddedDescription] => Insert description here
[RunSuiteList] => Insert list here
[InteractiveBashingList] => Insert list here
[BaTStages] => Insert list here
[ForeignFilesMissing] => Insert link here
[BuildPlatformsList] => Insert list here
[TestPlatformsList] => Insert list here
[TestingBottleneck] => YNTestingBottleneck
[FinalMergeCheck] => YNFinalMergeCheck
[SubmitJob] => YNSubmitJob
[JobNo] => dsfdsf
[Submit_Checklist] => Submit
)
Array
(
[login] => Array
(
[result] => Success
[lguserid] => 1004
[lgusername] => Jkuter
[lgtoken] => 4ee852dbbe6ad67022af0873aaa21098
[cookieprefix] => inside_wiki
[sessionid] =>
)
)
Array
(
[query] => Array
(
[pages] => Array
(
[-1] => Array
(
[ns] => 0
[title] => BatPresubmitTest
[missing] =>
)
)
)
)
--End Output Without &prop---
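For what it's worth, one likely culprit is hand-concatenating the POST body:
if that string ever passes through HTML-entity decoding, "&prop=" can be
misread via the entity "&prop;" (the proportional-to character). Building the
body with http_build_query(), or handing cURL the parameter array directly,
sidesteps the problem entirely. A sketch of the idea (not a verified
diagnosis of this particular setup):

```php
<?php
// Build the POST body from an array so every '&' and '=' is produced
// (and every value percent-encoded) by PHP itself.
$params = array(
    'action'  => 'query',
    'prop'    => 'info',
    'titles'  => 'BatPresubmitTest',
    'intoken' => 'edit',
    'format'  => 'php',
);
$body = http_build_query($params);

// With cURL, passing the array directly also works (cURL then sends it
// as multipart/form-data instead of a urlencoded string):
// curl_setopt($ch, CURLOPT_POSTFIELDS, $params);
```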
Searching the list, I gather that the search API is currently broken? If
so, I guess I can/should use the "list=allpages" API as a temporary hack.
Any god-beings out there who can advise?
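In case it helps anyone doing the same workaround, a title-prefix lookup via
list=allpages can be sketched as below (apprefix and aplimit are the allpages
module's own parameters; the endpoint and prefix are placeholders):

```php
<?php
// Poor man's title search while list=search is down: prefix matching
// against all pages. Endpoint and prefix are placeholders.
$params = array(
    'action'   => 'query',
    'list'     => 'allpages',
    'apprefix' => 'Foo',
    'aplimit'  => '50',
    'format'   => 'xml',
);
$url = 'http://example.org/w/api.php?' . http_build_query($params);
// $url can then be fetched with file_get_contents() or cURL.
```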
Ben Srour wrote:
>> I'm sorry, I don't know what that means. :)
>>
>> What action should I take as a consumer of the API?
>>
>> Thanks
>> Ben
>
> As a consumer? Wait for the god-beings to fix it.
> Or you may become a developer and fix it ;)
Hi All:
I have blown 8 hours over two days trying to figure out what is wrong with my PHP program.
I am trying to build a simple PHP program that gets the recent changes on a wiki I administer.
Here is the code, with a couple of things anonymized (surrounded by << >>):
========= begin code ===========
<?php
//login and get the correct security values to pass in on subsequent queries
$wikiURL = "<<mydomain>>/mediawiki";
$logincmd = $wikiURL."/api.php?action=login&lgname=Sgg&lgpassword=<<mypwd>>&format=json";
$handle = fopen($logincmd,"rb");
$contents = stream_get_contents($handle);
fclose($handle);
$loginVals = json_decode($contents, true);
if ($loginVals["login"]["result"] != "Success") {
    echo "Failed to login, result returned: ", $loginVals["login"]["result"];
    exit(-1);
}
//set up the security parm values and create query
$userid = $loginVals["login"]["lguserid"];
$username = $loginVals["login"]["lgusername"];
$token = $loginVals["login"]["lgtoken"];
$query = $wikiURL."/api.php?action=query&list=recentchanges&format=xml&lgtoken=$token&lgusername=$username&lguserid=$userid";
echo "\nQuery: ", $query, "\n";
//execute the get and print the results
$handle = fopen($query,"rb");
$contents = stream_get_contents($handle);
fclose($handle);
print_r("\nQuery results\n");
print_r($contents);
print_r("\nend query results\n");
?>
===============end code ================
The login works fine and returns reasonable-looking tokens. The query results, however, are empty.
When I copy the query that was produced (note the echo statement) and paste it into a browser that I have previously used to log in to the wiki, I get some query results back. When I paste that query into a browser I rarely use (i.e. one I haven't logged into the wiki with), I get an empty result set.
So, am I misusing the login parms somehow on the query URL? Are the login parms not working? Should I use cookie manipulation instead?
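If cookies turn out to be the missing piece (as the browser behaviour
suggests), one possible fix is to capture the Set-Cookie headers from the
login response and replay them on the query. A sketch, with helper names of
my own invention:

```php
<?php
// Extract name=value cookie pairs from raw response headers (e.g. from
// $http_response_header after an fopen() call).
function cookies_from_headers(array $headers): array {
    $cookies = array();
    foreach ($headers as $h) {
        if (stripos($h, 'Set-Cookie:') === 0) {
            // Keep only the name=value part, dropping path/expiry attributes.
            $pair = trim(explode(';', substr($h, 11), 2)[0]);
            list($name, $value) = explode('=', $pair, 2);
            $cookies[$name] = $value;
        }
    }
    return $cookies;
}

// Build a single Cookie: request header from the collected pairs.
function cookie_header(array $cookies): string {
    $parts = array();
    foreach ($cookies as $name => $value) {
        $parts[] = "$name=$value";
    }
    return 'Cookie: ' . implode('; ', $parts);
}

// Usage sketch: after the login fopen(), $http_response_header holds the
// raw headers; replay the cookies on the query via a stream context:
// $cookies = cookies_from_headers($http_response_header);
// $ctx = stream_context_create(array('http' => array(
//     'header' => cookie_header($cookies))));
// $handle = fopen($query, 'rb', false, $ctx);
```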
Please help
sgg