Thanks all, but my question is still there :) Let me rephrase: suppose we have the following dump list
http://dumps.wikimedia.org/enwiki/20130403/

There are a lot of files there with names like enwiki-20130403-pages-meta-current27.xml-p029625001p039009132.bz2 that appear to hold the same data, as well as different SQL files for different tables.
Q1: The SQL files clearly show which database table they map to, but the .bz2 files say nothing about a database mapping.
Q2: There are multiple .bz2 files with the same kind of name; should we take the largest one?

Please let me know.
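
For reference, a minimal Python sketch of how I currently read the part-file
names, assuming the pNNN...pNNN suffix encodes the page-ID range each part
covers (my assumption from the names, not from documentation):

import re

# Split a dump part-file name into its components. The regex and the
# page-ID-range reading are assumptions based on the names above.
PART_RE = re.compile(
    r"^(?P<wiki>\w+)-(?P<date>\d{8})-pages-meta-current(?P<part>\d+)"
    r"\.xml-p(?P<first>\d+)p(?P<last>\d+)\.bz2$"
)

def parse_part(name):
    m = PART_RE.match(name)
    if m is None:
        return None
    return (int(m.group("part")),
            int(m.group("first")),
            int(m.group("last")))

print(parse_part(
    "enwiki-20130403-pages-meta-current27.xml-p029625001p039009132.bz2"
))
# -> (27, 29625001, 39009132)

If that reading is correct, the numbered parts would be complementary
page-ID ranges rather than duplicates, so all of them would be needed; I
would appreciate confirmation.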


Regards,
Imran


On Thu, Apr 11, 2013 at 1:20 PM, Ariel T. Glenn <ariel@wikimedia.org> wrote:
On 11-04-2013 (Thu) at 08:42 +0200, Federico Leva (Nemo) wrote:
> Imran Latif, 11/04/2013 08:26:
> > Thanks for replying; your reply makes sense. I just need to confirm
> > that if I use the following dump
> > http://dumps.wikimedia.org/fiwiki/20130323/
> >
> > and download all the SQL and XML files and populate my tables using some
> > utility, then the whole Wikipedia data is configured?
> > I mean to say: does this dump provide me the whole data of Wikipedia,
> > including content, revision history, etc., or do I need something more?
>
> Well, importDump.php is horrible and we have a thread "making imports
> suck less" 20 min before yours. :)
> But Ariel made some nice pages:
> <https://meta.wikimedia.org/wiki/Data_dumps/Tools_for_importing#Converting_to_SQL_first>
> is the way and
> <https://meta.wikimedia.org/wiki/Data_dumps/Import_examples> has a
> viable tutorial.
> In some cases you may prefer to avoid some stuff you don't need, or even
> to populate some tables on your own... add your examples, as the page
> says. ;-)
>
> Nemo
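
As one illustration of the "populate some tables on your own" route Nemo
mentions: a minimal sketch that streams page titles out of a compressed
dump without running a full import, assuming Python, a local copy of the
dump under the hypothetical file name below, and the 0.8 export namespace
(the namespace varies by dump version):

import bz2
import xml.etree.ElementTree as ET

DUMP = "fiwiki-20130323-pages-articles.xml.bz2"  # hypothetical local file
NS = "{http://www.mediawiki.org/xml/export-0.8/}"  # assumed; check your dump

# Stream the compressed XML and print each page title without loading
# the whole dump into memory.
with bz2.BZ2File(DUMP) as f:
    for event, elem in ET.iterparse(f):
        if elem.tag == NS + "title":
            print(elem.text)
        elem.clear()  # discard parsed elements to keep memory flat

From there, inserting the extracted fields into tables of your own is a
matter of whichever database driver you prefer.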
Uh oh, I gotta update those now that the new tools went out.

Thanks for reminding me :-D

Ariel



_______________________________________________
Xmldatadumps-l mailing list
Xmldatadumps-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l