This is a very nice script, but little is known about how it works, especially for people like me...
I have some questions:
1. How does it log in to the wiki? I didn't give it any credentials and didn't set up AdminSettings.php. Does it take the info from LocalSettings.php?
2. How does it add the pages to the wiki? Does it check every page, compare, and then update it?
3. Is it normal that it works for some number of pages, then exits without any errors but also without finishing the dump? When it starts again, it goes quickly through the pages up to the number where it stopped, then slows down, works for a bit, and then exits again...
4. What happens if you have a database loaded from an old dump and then you start it with a new dump?
Thanks!
Guys, did you get that email? Or did you get it but don't know the answers? Or are you ignoring me because my questions are so ridiculous that I should go RTFM? If so, where?
On 4/25/07, Mohamed Magdy <mohamed.m.k@gmail.com> wrote:
> Guys, did you get that email? Or did you get it but don't know the answers? Or are you ignoring me because my questions are so ridiculous that I should go RTFM? If so, where?
You could start from http://meta.wikimedia.org/wiki/Data_dumps
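The basic invocation is something like php maintenance/importDump.php < pages_articles.xml, run from your wiki directory; if the dump is still compressed, you can pipe it in with bzip2 -dc pages_articles.xml.bz2 | php maintenance/importDump.php. Adjust the file names to whatever you actually downloaded.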
HTH, IRL.
On 4/24/07, Ivan Lanin <ivanlanin@gmail.com> wrote:
> You could start from http://meta.wikimedia.org/wiki/Data_dumps
> HTH, IRL.
I knew that it existed from this page ;)
I have figured out a couple of things:
1. If you use it with the uncompressed XML, it is (obviously) faster than with the bzipped file.
2. It doesn't matter if it exits before finishing, because it doesn't reinsert everything again. I think it just checks whether there is a difference between the records in the database and the XML file, because it goes through the previously inserted records quickly and then slows down when it starts to insert completely new records. That is great, IMO, but wouldn't it be nice if we could specify which record it starts from?
3. I haven't tried it under Linux yet, but under Windows it takes almost all of the computer's resources; it even makes other programs close... it just ends them... weird.
4. I think it doesn't need database details because it uses Special:Import, which goes through MediaWiki; that is also why it is slower than mysqlimport.
Can someone add a feature: the ability for it to sleep for x seconds after importing y pages, with both values configurable?
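Something along these lines would do it; this is just a sketch of the idea, not the script's real structure, and the names ($pages, importOnePage, $batchSize, $sleepSecs) are made up:

    // Hypothetical throttle: after every $batchSize pages, pause
    // for $sleepSecs seconds so the machine gets some air.
    $batchSize = 100; // "y" pages per batch
    $sleepSecs = 5;   // "x" seconds to sleep
    $count = 0;
    foreach ( $pages as $page ) {
        importOnePage( $page ); // stand-in for the real per-page import
        $count++;
        if ( $count % $batchSize == 0 ) {
            sleep( $sleepSecs );
        }
    }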
What is "HTH, IRL." anyway?
Another thing: what happens when I have a newer dump? Do I have to clear the database for it to be updated, or will importDump.php just update the existing pages from the new dump and add the newly created pages? Thanks
In my experience, it adds a new revision if the timestamp on the dump XML is more recent than the timestamp on the existing page. Otherwise it ignores the import.
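In rough pseudo-PHP, the behavior I see is something like this (the names are invented for illustration; this is not the actual MediaWiki code):

    // For each revision in the dump, compare against the newest
    // revision the wiki already has for that page.
    if ( $dumpRev->getTimestamp() <= $existingPage->getLatestTimestamp() ) {
        // Same age or older than what's in the database: skip it.
        return;
    }
    // Newer than the current revision: append it to the history.
    $existingPage->addRevision( $dumpRev );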
Jim
On Apr 28, 2007, at 7:46 AM, Mohamed Magdy wrote:
> Another thing: what happens when I have a newer dump? Do I have to clear the database for it to be updated, or will importDump.php just update the existing pages from the new dump and add the newly created pages? Thanks
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
Jim Hu wrote:
> In my experience, it adds a new revision if the timestamp on the dump XML is more recent than the timestamp on the existing page. Otherwise it ignores the import.
Ah, thanks a lot! So, say, after importing 6 dumps (each containing only the newest revision of each page), I will have 6 revisions in the page history, right?
Is there an SQL command to 'clean up' and delete all revisions of all pages except the last one?
Mohamed
Mohamed Magdy wrote:
> Ah, thanks a lot! So, say, after importing 6 dumps (each containing only the newest revision of each page), I will have 6 revisions in the page history, right?
> Is there an SQL command to 'clean up' and delete all revisions of all pages except the last one?
maintenance/deleteOldRevisions.php
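If I remember right, running it bare only reports what it would remove; you need to pass --delete to actually delete the old revisions, i.e. php maintenance/deleteOldRevisions.php --delete. Double-check with --help before running it on a database you care about.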