Hi All:
I have blown 8 hours over 2 days trying to figure out what is wrong with a PHP program of mine that issues a simple query.
I am trying to build a simple PHP program to get the contents of recent changes on a wiki I admin.
Here is the code, with a couple of things "anonymized" (surrounded by << >>):
========= begin code ===========
<?php
// Log in and capture the security values to pass on subsequent queries.
$wikiURL = "<<mydomain>>/mediawiki";
$logincmd = $wikiURL . "/api.php?action=login&lgname=Sgg&lgpassword=<<mypwd>>&format=json";
$handle = fopen($logincmd, "rb");
$contents = stream_get_contents($handle);
fclose($handle);
$loginVals = json_decode($contents, true);
if ($loginVals["login"]["result"] != "Success") {
    echo "Failed to login, result returned: ", $loginVals["login"]["result"], "\n";
    exit(-1);
}

// Set up the security parm values and build the query.
$userid   = $loginVals["login"]["lguserid"];
$username = $loginVals["login"]["lgusername"];
$token    = $loginVals["login"]["lgtoken"];
$query = $wikiURL . "/api.php?action=query&list=recentchanges&format=xml&lgtoken=$token&lgusername=$username&lguserid=$userid";
echo "\nQuery: $query\n";

// Execute the GET and print the results.
$handle = fopen($query, "rb");
$contents = stream_get_contents($handle);
fclose($handle);
echo "\nQuery results:\n";
print_r($contents);
echo "\nend query results\n";
?>
===============end code ================
The login works fine and returns reasonable-looking tokens. The query results, however, are empty.
When I copy the query that was produced (note the echo statement) and paste it into a browser that I have previously used to log in to the wiki, I get some query results back. When I paste the same query into a browser I rarely use (i.e. one that hasn't logged into the wiki), I get an empty result set.
So, am I misusing the login parms somehow on the query URL? Are the login parms not working? Should I use cookie manipulation instead?
Please help
sgg
I've been lurking on the mailing list for a while, and I'd like to
contribute to API development. I'm not new to PHP programming or
MediaWiki, but I am new to SVN, and I'm having trouble checking out
the source to start playing with it. I apologize greatly for such a
question on the list, but is there a doc about accessing MediaWiki's
svn server I can scan somewhere? Or, just the address to access the
server? I've used ViewVC to look at the branches and trunk, but when I
copy and paste that address into TortoiseSVN, I get the following error:
Error: PROPFIND of '/mediawiki/branches/ApiEdit_Vodafone': 405 Method
Not Allowed (http://svn.wikimedia.org)
Thanks for the help, and I look forward to contributing!
Eddie
I know that I can find what categories a page is in using
api.php?action=query&titles=Pagename&prop=categories
but how do I get all the pages that are members of a particular
category? I'm guessing the format would be something like
api.php?action=query&titles=Category:Foo&property=SOMEPROPERTY,
assuming it's implemented.
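For illustration, here is a sketch of how such a query might be assembled, assuming a list=categorymembers module with cmtitle and cmlimit parameters exists (the module and parameter names are guesses to be checked against the live API):

```python
# Sketch: build a category-members query URL. The module name
# (list=categorymembers) and its cmtitle/cmlimit parameters are
# assumptions -- verify them against the API before relying on this.
from urllib.parse import urlencode

def category_members_url(api_base, category, limit=50):
    """Return a query URL for the members of Category:<category>."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": "Category:" + category,
        "cmlimit": str(limit),
        "format": "xml",
    }
    return api_base + "?" + urlencode(params)
```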
Thanks,
Timothy
Hi,
I am thinking about starting to write a standard Python
wrapper-interface thingy to the MediaWiki API. I thought I would write
here first and see what people think and if anyone knows of any
similar work done already (perhaps in pywikipediabot?).
The only relevant thing I found so far is
http://en.wikipedia.org/wiki/User:Yurik/Query_API/User_Manual#Python
The reason I am thinking of doing this is that whenever I do serious
API queries (like checking a day's worth of logs) I always use Python
to manipulate whatever I get back, and also because the API generally
only lets you get 500 items and I haven't found a good way to loop
through the log items by timestamp, if you know what I mean. So each
time I do a query I basically rewrite the same thing, and I'm sick of
it; if I'm doing this, maybe other people are too.
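The "loop through by timestamp" problem is essentially query continuation, which is exactly the kind of thing a wrapper could hide. A rough sketch of such a loop — the query-continue/rcstart response shape is an assumption about the current API, and fetch is a caller-supplied function:

```python
# Sketch of a continuation loop a wrapper could provide. The
# query-continue element and the rcstart parameter name are assumptions
# about the API's response format; fetch(params) must return the
# decoded JSON response for one request.
def iter_recentchanges(fetch):
    """Yield recentchanges items, following continuation across pages."""
    params = {"action": "query", "list": "recentchanges",
              "rclimit": "500", "format": "json"}
    while True:
        data = fetch(dict(params))
        for item in data["query"]["recentchanges"]:
            yield item
        cont = data.get("query-continue", {}).get("recentchanges")
        if not cont:
            return
        # Resume the next request where this one left off.
        params["rcstart"] = cont["rcstart"]
```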
So, thoughts?
cheers,
Brianna
--
They've just been waiting in a mountain for the right moment:
http://modernthings.org/
WebSnail wrote:
>
> I'm trying to use the API in line with phpBB2 in the login process so that
> when the phpBB2 user logs in they automatically get logged into the wiki
> as well. Overall it seems to be working ok except for one small snag...
>
> It doesn't remember the login and whilst the echo output indicates that
> login was successful, as soon as I go to the wiki I find the old
> "Login/create account" link all over again.
>
The PHPBB extension can do this. See also
http://www.mediawiki.org/wiki/Extension:PHPBB/Users_Integration
Roan Kattouw
Sent from the WikiMedia API mailing list archive at Nabble.com.
Hi,
I want to use the API to list all pages that are in any of the categories
which are in Category:Category_redirects on commons. So I tried
http://commons.wikimedia.org/w/api.php?action=query&generator=categorymembe…
but apparently it's not that easy...
IMHO "list=categorymembers" should not insist on "cmcategory", but
rather try to fall back to the generator list.
Additionally, the 500 limit could be imposed on the actual output; as
the generator is internal, it could have 5000 or unlimited (depending
on context).
Cheers,
Magnus
Ok, thanks Roan. I thought so. Perhaps a note should be added to the wiki pages documenting the API, saying that login parms on a query don't work.
I tried messing with the cookies using the php_http extension, and I eventually gave up when this error appeared:
HTTP/1.1 500 Internal Server Error
Date: Mon, 22 Oct 2007 21:02:05 GMT
Server: Apache/2.2.3 (Unix) DAV/2 mod_ssl/2.2.3 OpenSSL/0.9.8d PHP/5.2.0 mod_apreq2-20051231/2.5.7 mod_perl/2.0.2 Perl/v5.8.7
X-Powered-By: PHP/5.2.0
Content-Length: 79
Connection: close
Content-Type: text/html

register_globals security paranoia: trying to overwrite superglobals, aborting.
How the heck did this happen?
Can someone write a simple sample PHP app that logs in, issues a query, and echoes the results? I thought this would be simple, but evidently not.
Is there some additional setting/config I need to make on the wiki itself in order to allow a PHP program to issue a query just like a human using a browser?
sgg
---- Roan Kattouw <roan.kattouw(a)home.nl> wrote:
> sggraham(a)nc.rr.com schreef:
> > So, am I misusing the login parms somehow on the query url? Are the login parms not working? Should I use cookie manipulation instead?
> >
> Yes, you should use cookies. You'll get a cookie sent to you on
> action=login, which you should send back at every next request. In PHP,
> Snoopy [1] makes handling cookies easier. Login parameters the way you
> tried to use them currently don't work. Maybe they will in the future,
> who knows.
>
> Catrope
>
> [1] http://snoopy.sourceforge.net/
>
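For reference, a minimal sketch of the cookie-based flow Catrope describes, in Python rather than PHP for brevity: log in once, let a cookie jar capture the session cookie, and reuse the same opener for the query so the cookie is sent back automatically. The endpoint, credentials, and response fields here are placeholders/assumptions, not a tested client:

```python
# Minimal sketch of cookie-based API login, assuming a MediaWiki-style
# action=login endpoint that sets a session cookie. Placeholders
# throughout; not tested against a live wiki.
import json
import urllib.request
from http.cookiejar import CookieJar
from urllib.parse import urlencode

def make_opener():
    """Build an opener that stores and resends cookies across requests."""
    jar = CookieJar()
    return urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(jar))

def api_url(base, **params):
    """Assemble an api.php URL from keyword parameters."""
    return base + "?" + urlencode(params)

def login_and_query(base, user, password):
    opener = make_opener()
    # Login: the response sets the session cookie in the jar.
    login = json.load(opener.open(api_url(
        base, action="login", lgname=user, lgpassword=password,
        format="json")))
    if login["login"]["result"] != "Success":
        raise RuntimeError("login failed: " + login["login"]["result"])
    # Query: same opener, so the session cookie rides along.
    return opener.open(api_url(
        base, action="query", list="recentchanges",
        format="xml")).read()
```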
Somehow I forgot to notify the list of this; however, there has recently
been a breaking change in the recentchanges query. As of revision
26615[1], instead of specifying, for instance, type="0" for each entry
in the list, it will specify type="edit". Other variants include "move,"
"log," "new," and "move over redirect." This change was implemented in
response to bug 11632[2]; it is not yet live on Wikimedia
wikis but will be soon. Please update any code using this query
accordingly, and if there are any questions or concerns, please do not
hesitate to contact me or the list.
Thanks.
[1]http://svn.wikimedia.org/viewvc/mediawiki?view=rev&revision=26615
[2]http://bugzilla.wikimedia.org/show_bug.cgi?id=11632
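Code that still sees the old numeric types during the transition could normalize both forms; the numeric mapping in this sketch is an assumption based on MediaWiki's RC_* constants and should be checked against the revision before use:

```python
# Normalize a recentchanges type attribute to the new string form.
# The numeric mapping is an ASSUMPTION (guessed from MediaWiki's RC_*
# defines) -- verify it against revision 26615 before relying on it.
ASSUMED_RC_TYPES = {
    "0": "edit",
    "1": "new",
    "2": "move",
    "3": "log",
    "4": "move over redirect",
}

def normalize_rc_type(value):
    """Map an old numeric type code to its name; pass new names through."""
    return ASSUMED_RC_TYPES.get(value, value)
```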
--
Daniel Cannon (AmiDaniel)
http://amidaniel.com
cannon.danielc(a)gmail.com