Hello everyone,
I am very new to the MediaWiki API. I am planning to use it to extract geographic information about places, and I have been following this ScraperWiki tutorial: https://blog.scraperwiki.com/2011/12/how-to-scrape-and-parse-wikipedia/. However, I am not sure whether I should be using the API or the offline dump, and what the difference between the two data sets is. I need to parse a lot of data (the articles themselves, as well as the geo-coordinates of places). Please help me out with this. Thank you!
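To make the question concrete, here is a rough sketch of what I have in mind for the API route. I am assuming the `prop=coordinates` query module (from the GeoData extension, which English Wikipedia runs); the endpoint, parameter names, and the shape of the sample response are my assumptions from the API documentation, not something I have verified end to end, and the sample below is hand-written rather than a live fetch:

```python
import json
from urllib.parse import urlencode

# Assumed endpoint; adjust for whichever wiki you are querying.
API_ENDPOINT = "https://en.wikipedia.org/w/api.php"

def coordinates_query_url(titles):
    """Build an API URL asking for the coordinates of the given page titles."""
    params = {
        "action": "query",
        "prop": "coordinates",   # GeoData extension's coordinates module
        "titles": "|".join(titles),
        "format": "json",
    }
    return API_ENDPOINT + "?" + urlencode(params)

def parse_coordinates(response_text):
    """Extract {title: (lat, lon)} from a prop=coordinates JSON response."""
    pages = json.loads(response_text)["query"]["pages"]
    result = {}
    for page in pages.values():
        # Pages without coordinates simply lack the "coordinates" key.
        for coord in page.get("coordinates", []):
            result[page["title"]] = (coord["lat"], coord["lon"])
    return result

# Hand-written sample response in the shape I believe the API returns:
sample = json.dumps({
    "query": {"pages": {"22989": {
        "pageid": 22989, "title": "Paris",
        "coordinates": [{"lat": 48.8567, "lon": 2.3508,
                         "primary": "", "globe": "earth"}],
    }}}
})

print(coordinates_query_url(["Paris"]))
print(parse_coordinates(sample))
```

Is this roughly the right approach for a small number of pages, or does the offline dump make more sense at the volume I described?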