Hello,
Since I installed and am trying out a new wiki system (v1.4.0), one question remains: how can I export articles or other content out of the system?
In the database I found the "old" table, which contains the first version of the articles.
Is there a smart way to get the raw content of the most recent version, or do I have to reapply every change that was made each time?
I need to export the content to exchange it with other database-backed CMS / news systems.
Thanks for replying.
GW
On Mon, 21 Mar 2005 14:50:59 +0100, Gerhard Wendebourg gw@web-hh.de wrote:
Since I installed and am trying out a new wiki system (v1.4.0), one question remains: how can I export articles or other content out of the system?
There are a number of options, depending on your exact needs.
1) Use the Special:Export page to generate either the current version or the full history of one or many pages, wrapped in XML with metadata such as title, timestamp, etc.
2) Add "?action=raw" (or "&action=raw", if it's already a "?..." type URL) to the URL of a page to return the raw wikitext of that page (with a "text/x-wikimarkup" content type to avoid IE sniffing for HTML).
3) Use an appropriate MySQL utility to dump or examine the 'cur' and/or 'old' tables - the latest revision of each article is stored in 'cur', and all others in 'old' (this will change in 1.5, but you don't need to worry about that). Note that the namespace is stored numerically in a separate field, so an entry with cur_namespace=0 and cur_title='Foo' is the page [[Foo]], while one with cur_namespace=1 and cur_title='Foo' is [[Talk:Foo]] (see "includes/Defines.php" for the full list). The rest of the structure is probably self-explanatory, give or take a bit of experimentation.
+) For more complex interactions, see the "Python Wikipedia Robot Framework", at http://pywikipediabot.sf.net
++) And, seeing as you said "or other content", Special:Recentchanges, Special:Newpages, and possibly others, have RSS and Atom feeds (see links in the "toolbox")
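The XML that Special:Export (option 1 above) produces can also be consumed programmatically. A minimal Python sketch, using a simplified sample document rather than real export output - the real export wraps everything in a <mediawiki> element that declares an XML namespace, so the tag lookups below would need to account for that:

```python
import xml.etree.ElementTree as ET

# Simplified sample of Special:Export-style output (illustrative only;
# the real format declares an XML namespace and carries more metadata).
SAMPLE = """\
<mediawiki>
  <page>
    <title>Foo</title>
    <revision>
      <timestamp>2005-03-21T14:50:59Z</timestamp>
      <text>Raw ''wikitext'' of the page.</text>
    </revision>
  </page>
</mediawiki>
"""

def extract_pages(xml_text):
    """Return a list of (title, wikitext) pairs from an export dump."""
    root = ET.fromstring(xml_text)
    pages = []
    for page in root.findall('page'):
        pages.append((page.findtext('title'),
                      page.findtext('revision/text')))
    return pages

for title, text in extract_pages(SAMPLE):
    print(title, '->', text)
```

From there the (title, wikitext) pairs can be inserted into whatever other database the target CMS uses.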
HTH
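The "?" versus "&" rule in option 2 is easy to get wrong when scripting. A small helper along these lines (the function name and example.org URLs are hypothetical, for illustration only):

```python
def raw_url(page_url):
    """Build an action=raw URL from a page URL, using '&' when the
    URL already contains a query string and '?' otherwise."""
    sep = '&' if '?' in page_url else '?'
    return page_url + sep + 'action=raw'

# Pretty URL without a query string:
print(raw_url('http://example.org/wiki/Main_Page'))
# index.php-style URL that already has a "?..." query string:
print(raw_url('http://example.org/index.php?title=Main_Page'))
```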
Rowan Collins wrote:
On Mon, 21 Mar 2005 14:50:59 +0100, Gerhard Wendebourg gw@web-hh.de wrote:
Since I installed and am trying out a new wiki system (v1.4.0), one question remains: how can I export articles or other content out of the system?
There are a number of options, depending on your exact needs.
- use the Special:Export page to generate either the current version
or history of one or many pages, wrapped in some XML with things like title, timestamp, etc
- add "?action=raw" (or "&action=raw", if it's already a "?..." type
URL) to the URL of a page to return the raw wikitext of that page (with a "text/x-wikimarkup" content type to avoid IE sniffing for HTML)
- use an appropriate MySQL utility to dump or examine the 'cur'
and/or 'old' tables
Thank you very much for this hint and your extended explanations. I hadn't realized the function of the 'cur' table; it seems to be the appropriate interface for exchanging content, since I mostly work with MySQL database applications.
GW
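When querying the 'cur' table directly, the full page name still has to be rebuilt from cur_namespace and cur_title, as Rowan described. A sketch of that mapping, with a few namespace numbers hard-coded here for illustration:

```python
# A few namespace prefixes, hard-coded for illustration; the
# authoritative number-to-name list lives in includes/Defines.php.
NAMESPACE_PREFIXES = {0: '', 1: 'Talk:', 2: 'User:', 3: 'User talk:'}

def full_title(cur_namespace, cur_title):
    """Rebuild the human-readable page name from the two database
    fields. MediaWiki stores titles with underscores for spaces."""
    prefix = NAMESPACE_PREFIXES.get(cur_namespace, '')
    return prefix + cur_title.replace('_', ' ')

print(full_title(0, 'Foo'))        # Foo
print(full_title(1, 'Foo'))        # Talk:Foo
print(full_title(3, 'Some_User'))  # User talk:Some User
```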
(Unrelated to issue at hand) On Mon, 21 Mar 2005 16:13:24 +0000, Rowan Collins rowan.collins@gmail.com wrote:
On Mon, 21 Mar 2005 14:50:59 +0100, Gerhard Wendebourg gw@web-hh.de wrote: 2) add "?action=raw" (or "&action=raw", if it's already a "?..." type URL) to the URL of a page to return the raw wikitext of that page (with a "text/x-wikimarkup" content type to avoid IE sniffing for HTML)
Has this changed? I thought it was "text/x-wiki"
--
Jamie
-------------------------------------------------------------------
http://endeavour.zapto.org/astro73/
Thank you to JosephM for inviting me to Gmail! Has lots of invites.
On Mon, 21 Mar 2005 17:16:00 -0500, Jamie Bliss astronouth7303@gmail.com wrote:
(with a "text/x-wikimarkup" content type to avoid IE sniffing for HTML)
Has this changed? I thought it was "text/x-wiki"
Er, whatever - I meant to say "something like ...", it was just from memory. No change that I know of.
mediawiki-l@lists.wikimedia.org