Hi!
I'm new to this list (and have not yet found a way to search the archives), so maybe this topic has been discussed to death before. If so, I'd appreciate a hint (when, subject, etc., so I can download the proper month from the archives) and ask you to accept my apologies.
I'm working on a project with a lot of (generated) pages on one side and a MediaWiki 1.5.1 installation on the other. The administrator of the wiki is a bit overly careful, so I cannot use the import functionality (which works without a problem on my test wiki at home). Nonetheless, I have about 2,500 wiki pages to upload... "Open page - copy from editor - paste into Firefox - save - open next page - and so on" was bearable for the test pages until everything worked like a charm, but by now all the converting and generating is just a "make all" and a few minutes of runtime - and uploading by hand would take weeks...
Is there a (Linux command line) tool out there that accepts a file with wiki content, a wiki address and page name, and a username and password (or a cookie file path), and uploads the file's contents to the wiki? Alternatively, is it possible to achieve this somehow with wget? Programming and shell skills are available.
I found a bunch of media uploaders, but this is about uploading normal pages, not media - there are some pictures in the project, but they are few and can be handled manually.
Yours, Christian Treczoks
There are lots of different libraries for writing scripts that interact with MediaWiki; check out http://en.wikipedia.org/wiki/Wikipedia:CREATEBOT#Programming_languages_and_libraries for a list.
It would be fairly easy to write a script to create pages from a database, XML, or text files, etc.
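A minimal sketch of what such a script could look like, in Python for illustration. Everything here is an assumption, not tested against a real wiki: it presumes the target exposes the action API at api.php (introduced in MediaWiki releases well after 1.5.1), and the login/edit-token handling and the `send()` helper in the outlined loop are left out. Only the two helpers that shape the request are shown in full:

```python
import os
import urllib.parse

def title_from_filename(path):
    """Turn a generated file name like 'pages/Main_Page.wiki' into the
    wiki page title 'Main Page': strip the directory and extension,
    and replace underscores with spaces."""
    name = os.path.splitext(os.path.basename(path))[0]
    return name.replace("_", " ")

def edit_post_body(title, text, token):
    """Build the urlencoded POST body for an 'edit' call to the action
    API; the caller still has to send it over a logged-in session
    (e.g. urllib.request with a cookie jar)."""
    return urllib.parse.urlencode({
        "action": "edit",
        "title": title,
        "text": text,
        "token": token,   # edit token, fetched from the API after login
        "format": "json",
    })

# Batch loop (outline only) - API_URL, token, and send() are hypothetical:
# for path in sorted(glob.glob("pages/*.wiki")):
#     with open(path, encoding="utf-8") as f:
#         body = edit_post_body(title_from_filename(path), f.read(), token)
#     send(API_URL, body)
```

The bot libraries linked above already handle the login and token dance for you, so in practice one of them is the better starting point; the sketch just shows how little is actually needed per page.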
-- Chris
On Mon, Jan 24, 2011 at 4:19 PM, ct@braehler.com wrote:
MediaWiki-l mailing list MediaWiki-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
mediawiki-l-bounces@lists.wikimedia.org wrote on 24.01.2011 14:07:12:
There are lots of different libraries for writing scripts that interact with MediaWiki; check out http://en.wikipedia.org/wiki/Wikipedia:CREATEBOT#Programming_languages_and_libraries for a list.
Ahh, that is exactly what I was looking for. I hadn't thought of "bot" as the search term - that was the missing link. Thank you for your help! This will ease the burden and speed up my project!
It would be fairly easy to write a script to create pages from a database, XML, or text files, etc.
I had a look at the Perl module MediaWiki::API, which looks like exactly what I'm looking for.
Thanks again for the tip!
Yours, Christian Treczoks