I filed Debian bug #527536 (http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=527536), a Request For Packaging for libmediawiki-api-perl, a replacement for libmediawiki-perl, in the hope that somebody puts it into Debian.
Hello,
Is it possible to save directly to disk the XML file produced by the API command below:
api.php?action=query&generator=allpages&gapnamespace=0&gaplimit=1000&format=xml
for example using lynx -dump (I don't know how to do it), or some other batch-mode procedure?
Best,
Francois Colonna
On Wed, May 13, 2009 at 11:12:06AM +0200, Colonna Francois wrote:
Is it possible to save directly to disk the XML file produced by api.php?action=query&generator=allpages&gapnamespace=0&gaplimit=1000&format=xml, for example using lynx -dump or some other batch-mode procedure?
Use wget, curl, or any other program that downloads a URL and saves it to disk.
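For reference, a minimal sketch of what this looks like in practice, assuming a generic wiki endpoint and an output filename chosen here purely for illustration (the quoting around the URL turns out to matter, as the rest of the thread shows):

# Save the API response straight to a file; wget's -O names the output file,
# curl's -o does the same. The single quotes keep the shell from interpreting
# the & characters in the query string.
wget -O allpages.xml 'http://example.org/w/api.php?action=query&generator=allpages&gapnamespace=0&gaplimit=1000&format=xml'
curl -o allpages.xml 'http://example.org/w/api.php?action=query&generator=allpages&gapnamespace=0&gaplimit=1000&format=xml'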
On Thursday, May 14, 2009 at 09:19 -0400, Brad Jorsch wrote:
Use wget, curl, or any other program that downloads a URL and saves it to disk.
I tried wget
http://localhost/~wiki/mediawiki/api.php?action=query&generator=allpages...
It results in:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<title>MediaWiki API Result</title>
</head>
<body>
<br/>
<small>
You are looking at the HTML representation of the XML format.<br/>
HTML is good for debugging, but probably is not suitable for your application.<br/>
See <a href='http://www.mediawiki.org/wiki/API'>complete documentation</a>, or <a href='/~wiki/mediawiki/api.php'>API help</a> for more information.
</small>
<pre>
<span style="color:blue;"><?xml version="1.0"?></span>
<span style="color:blue;"><api /></span>
</pre>
</body>
</html>
<!-- Served in 0.184 secs. -->
What did I miss? Thanks,
Francois Colonna
2009/5/14 Colonna Francois colonna@lct.jussieu.fr:
I tried wget
http://localhost/~wiki/mediawiki/api.php?action=query&generator=allpages...
It results in:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
Are you sure you got the format= parameter right? Also, try querying en.wikipedia.org; does that work?
Roan Kattouw (Catrope)
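As a quick sanity check of the kind Roan suggests, the same query can be pointed at the public Wikipedia API endpoint; the small gaplimit here is only to keep the test output short:

# Fetch a handful of main-namespace page titles from en.wikipedia.org as XML.
# If this prints XML with a <pages> list, the query string itself is fine.
curl 'http://en.wikipedia.org/w/api.php?action=query&generator=allpages&gapnamespace=0&gaplimit=10&format=xml'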
On Thu, May 14, 2009 at 2:58 PM, Colonna Francois colonna@lct.jussieu.fr wrote:
I tried wget
http://localhost/~wiki/mediawiki/api.php?action=query&generator=allpages...
it results in:
[snip]
The & character has a special meaning on the command line. Try putting quotation marks around the URL, like so:
wget 'http://localhost/~wiki/mediawiki/api.php?action=query&generator=allpages...'
Sam
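Without the quotes, the shell treats each & as an instruction to run the preceding command in the background, so wget only ever sees the URL up to action=query; the API then falls back to the HTML-wrapped debug output shown above and, with no query parameters, returns an empty <api />. A minimal sketch combining Sam's fix with writing to disk (the -O flag and the output filename are additions here; the full query string is the one from the first message):

# Quoted URL: the whole query string reaches api.php.
# -O writes the XML response to allpages.xml instead of a name derived from the URL.
wget -O allpages.xml 'http://localhost/~wiki/mediawiki/api.php?action=query&generator=allpages&gapnamespace=0&gaplimit=1000&format=xml'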
2009/5/14 Sam Korn smoddy@gmail.com:
The & character has a special meaning on the command line. Try putting quotation marks around the URL, like so:
wget 'http://localhost/~wiki/mediawiki/api.php?action=query&generator=allpages...'
Gah, of course; ignore my post.
Roan Kattouw (Catrope)
It was kinda silly, but I had the same confusion too, Roan.
On Thu, May 14, 2009 at 10:04 PM, Roan Kattouw roan.kattouw@gmail.com wrote:
Gah, of course; ignore my post.
Roan Kattouw (Catrope)
Thanks Sam!
On Thu, May 14, 2009 at 10:03 PM, Sam Korn smoddy@gmail.com wrote:
The & character has a special meaning on the command line. Try putting quotation marks around the URL, like so:
wget 'http://localhost/~wiki/mediawiki/api.php?action=query&generator=allpages...'
Sam
-- Sam PGP public key: http://en.wikipedia.org/wiki/User:Sam_Korn/public_key
On Thursday, May 14, 2009 at 15:03 +0100, Sam Korn wrote:
The & character has a special meaning on the command line. Try putting quotation marks around the URL, like so:
wget 'http://localhost/~wiki/mediawiki/api.php?action=query&generator=allpages...'
Sam
Thanks Sam, it works.
Regards, F.C.