Hello Geeks,
I have a custom application and would like to integrate MediaWiki with
it. My question is how to integrate user registration; I couldn't find
any API for it. My platform is written in PHP on the Symfony framework.
I am currently trying to integrate using the cURL library, but POSTing
with cURL to the action page,
http://myhost.com/mediawiki/index.php?title=Special:UserLogin&action=su…
is not working properly.
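For reference, here is roughly what I am attempting via api.php's
action=login instead of the Special page (a sketch only, with
placeholder credentials; the lgtoken round trip may differ across
MediaWiki versions):

<?php
// Sketch: log in through api.php instead of the Special:UserLogin form.
// The cookie jar keeps the session alive between requests.
$apiUrl = 'http://myhost.com/mediawiki/api.php';

function apiPost($url, array $fields) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/mw_cookies.txt');
    curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/mw_cookies.txt');
    $body = curl_exec($ch);
    curl_close($ch);
    return unserialize($body);
}

// The first call returns "NeedToken" plus a token; the second confirms it.
$login = apiPost($apiUrl, array(
    'action' => 'login', 'lgname' => 'SomeUser',
    'lgpassword' => 'secret', 'format' => 'php',
));
if (isset($login['login']['token'])) {
    $login = apiPost($apiUrl, array(
        'action' => 'login', 'lgname' => 'SomeUser',
        'lgpassword' => 'secret',
        'lgtoken' => $login['login']['token'], 'format' => 'php',
    ));
}
echo $login['login']['result'], "\n"; // "Success" when it worked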
Please help me integrate this with my application.
Thanks
--
Sanil S
Technology Consultant
Co-Founder: http://www.mobme.in
Blog: http://www.iamatechie.com
Linked In: http://www.linkedin.com/in/sanilchandran
Hi folks. I'm looking through the API reference documentation (
http://wiki.endocardial.com/wiki/api.php). I was hoping that there would be
API support for uploading a file to a wiki, but I don't see any such
function. Is it possible? Is there any interest in making such a feature
available?
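In the meantime I have been experimenting with a multipart POST straight
at Special:Upload; roughly this (an untested sketch: the wp* field names
are read off the upload form's HTML and may differ between versions, and
the form may also want an edit token):

<?php
// Untested sketch: multipart POST to Special:Upload, since I can't find
// an upload action in api.php. Field names (wpUploadFile, wpDestFile,
// wpUploadDescription, wpUpload) are guesses taken from the upload form.
$ch = curl_init('http://wiki.endocardial.com/wiki/index.php?title=Special:Upload');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/mw_cookies.txt'); // logged-in session
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'wpUploadFile'        => new CURLFile('/path/to/image.png'), // '@/path/...' on PHP < 5.5
    'wpDestFile'          => 'Image.png',
    'wpUploadDescription' => 'Uploaded via script',
    'wpIgnoreWarning'     => '1',
    'wpUpload'            => 'Upload file',
));
$html = curl_exec($ch);
curl_close($ch);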
Thanks,
--Steve
I'm interested in the maximum number of revisions I can get through a
single request.
The documentation at http://en.wikipedia.org/w/api.php says that
rvlimit is the parameter to pass if you want to fetch revisions of an
article and set how many you receive:
rvlimit - limit how many revisions will be returned (enum)
No more than 500 (5000 for bots) allowed.
I tried that myself and only got at most 50 revisions per request.
Can I get more than 50 revisions per request?
I understand that the program I'm running must be identified as a bot.
Do I have to make an account for it and notify Wikipedia that it is a bot?
How can this notification be made?
What other 'benefits' does a bot have?
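For reference, the request I am making looks roughly like this (a
sketch; I am assuming rvlimit accepts 'max' for "as many as my rights
allow", and I wonder whether asking for rvprop=content is what caps the
batch at 50):

<?php
// Sketch of the request; rvlimit=max should mean "as many revisions as
// this account's rights allow" (500, or 5000 for bots).
$url = 'http://en.wikipedia.org/w/api.php?' . http_build_query(array(
    'action'  => 'query',
    'prop'    => 'revisions',
    'titles'  => 'Language',
    'rvlimit' => 'max',
    'rvprop'  => 'ids|timestamp|user|comment', // adding 'content' lowers the cap
    'format'  => 'php',
));
$result = unserialize(file_get_contents($url));
$page = reset($result['query']['pages']);
echo count($page['revisions']), " revisions fetched\n";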
Thank you
I'm searching for a way to get diffs of revisions starting from some
date and limiting the number of results,
or to get all the diffs from the current revision back to the oldest one.
The diffs pair consecutive revisions: diff(Newest, AlmostNew),
diff(AlmostNew, AlmostAlmostNew), and so forth.
action=history seems to return the first 5 by default:
http://en.wikipedia.org/w/index.php?title=Language&feed=rss&action=history
I also found a way to get the first 50:
http://en.wikipedia.org/w/index.php?title=Language&feed=rss&action=history&…
But can I get them all? Can I get all the diffs?
Or diffs starting from a date?
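What I have been trying, for the record, is roughly this (a sketch; it
assumes prop=revisions on this API version supports rvdiffto=prev, which
would pair each revision with its predecessor exactly as above):

<?php
// Sketch: fetch revisions starting at a date, each diffed to the one
// before it. Assumes the rvdiffto parameter is available on this wiki.
$url = 'http://en.wikipedia.org/w/api.php?' . http_build_query(array(
    'action'   => 'query',
    'prop'     => 'revisions',
    'titles'   => 'Language',
    'rvlimit'  => 50,
    'rvdiffto' => 'prev',                 // diff(Newest, AlmostNew), ...
    'rvstart'  => '2009-01-01T00:00:00Z', // newest point to start from
    'rvdir'    => 'older',                // walk back toward the oldest
    'format'   => 'php',
));
$result = unserialize(file_get_contents($url));
// Repeat with the query-continue value to page back through all of them.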
Thank you
Could we add some way for this query to get the combined edits of two
or more users on the same page? I know you can query several pages at
once, but I'd want something like
&users=Tom|Dick|Harry&titles=The_weather_in_London if possible.
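For now I am stitching this together client-side, one request per user,
roughly like so (a sketch; as far as I can tell rvuser only accepts a
single name):

<?php
// Workaround sketch until something like &users=Tom|Dick|Harry exists:
// one prop=revisions request per user, merged and re-sorted client-side.
$users = array('Tom', 'Dick', 'Harry'); // placeholder names
$all = array();
foreach ($users as $user) {
    $url = 'http://en.wikipedia.org/w/api.php?' . http_build_query(array(
        'action'  => 'query',
        'prop'    => 'revisions',
        'titles'  => 'The weather in London',
        'rvuser'  => $user,   // one user per request
        'rvlimit' => 'max',
        'format'  => 'php',
    ));
    $result = unserialize(file_get_contents($url));
    foreach ($result['query']['pages'] as $page) {
        if (isset($page['revisions'])) {
            $all = array_merge($all, $page['revisions']);
        }
    }
}
// Combined edits of all the users, newest first.
usort($all, function ($a, $b) { return strcmp($b['timestamp'], $a['timestamp']); });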
Thanks.
—C.W.
Hey,
Anyone know of a simple way to get a list of links in a section of a page
(prop=links for a section)?
A user wanted it for AWB, and I seem to remember either speaking to
Roan about it or someone else having a way to do it. As I recall,
the suggested way was quite long-winded.
http://en.wikipedia.org/wiki/Wikipedia_talk:AutoWikiBrowser/Feature_requests#To_list_the_links_in_a_section_of_a_page
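The only reasonably direct route I can think of is action=parse, which
takes a section index (an untested sketch; assumes page= and section=
can be combined):

<?php
// Untested sketch: parse a single section of a page and read its links.
$url = 'http://en.wikipedia.org/w/api.php?' . http_build_query(array(
    'action'  => 'parse',
    'page'    => 'Language',
    'section' => 1,        // section index; 0 is the lead section
    'prop'    => 'links',
    'format'  => 'php',
));
$result = unserialize(file_get_contents($url));
foreach ($result['parse']['links'] as $link) {
    echo $link['*'], "\n"; // link target title
}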
Thanks
Sam Reed (Reedy)
Hi,
I'm trying to get hold of the Wikipedia dump, in particular
enwiki-latest-pages-meta-history.xml.bz2.
On the page where it's supposed to be
(http://download.wikipedia.org/enwiki/latest/) it weighs in at 0.6 KB,
whereas it used to be 147 GB.
What happened to the data, and where did it go?
Also, on the Wikipedia database page
(http://en.wikipedia.org/wiki/Wikipedia_database) I read:
"As of January 17, 2009, it seems that all snapshots of
pages-meta-history.xml.7z hosted at http://download.wikipedia.org/enwiki/
are missing. The developers at Wikimedia Foundation are working to
address this issue
(http://lists.wikimedia.org/pipermail/wikitech-l/2009-January/040841.html).
There are other ways to obtain this file."
I checked the other ways of obtaining the file that they describe; none
worked.
Why did the dumps vanish, and how can I download a copy of them?
Thank you
Hi all,
The following is part of my page-edit program.
There is no response when I use it to modify the content of some pages,
though other pages work fine.
Could you please help me check whether something is wrong?
Thanks.
.....
// Values must be URL-encoded: unencoded '&', '=' or '+' in $newtext or
// $summary silently corrupts the request (likely why only some pages fail).
$post = "title=" . urlencode($name) . "&action=edit"
      . "&basetimestamp=" . urlencode($ts)
      . "&starttimestamp=" . urlencode($sts)
      . "&token=" . urlencode($token)
      . "&summary=" . urlencode($summary) . $extra // $extra: pre-encoded params
      . "&text=" . urlencode($newtext);
$response = $this->postAPI($wiki, 'api.php', $post);
......
private function postAPI($wiki, $url, $postdata = "")
{
    $wiki = $this->wiki($wiki);
    $url = $wiki . $url;
    if ($postdata !== "") {
        $postdata .= "&";
    }
    // Ask for PHP-serialized output and respect the wiki's maxlag setting.
    $postdata .= "format=php&maxlag=" . $this->max_lag;
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_POST, 1);
    // One shared cookie file keeps the login session across requests.
    curl_setopt($ch, CURLOPT_COOKIEJAR, 'cookie.txt');
    curl_setopt($ch, CURLOPT_COOKIEFILE, 'cookie.txt');
    curl_setopt($ch, CURLOPT_POSTFIELDS, $postdata);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array(
        'Content-Type: application/x-www-form-urlencoded;charset=UTF-8'
    ));
    curl_setopt($ch, CURLOPT_HEADER, false);
    $response = curl_exec($ch);
    if (curl_errno($ch)) {
        // Grab the error message before closing the handle.
        $error = curl_error($ch);
        curl_close($ch);
        return $error;
    }
    curl_close($ch);
    return unserialize($response);
}