One "typical" approach for a data set this type and size is Mix'n'match:
https://tools.wmflabs.org/mix-n-match/

If you get a list of IDs and names, let me know.
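For illustration, here is a minimal sketch of turning such an ID/name list into a tab-separated file of the kind Mix'n'match's catalog import generally takes. The column layout (ID, name, description) and the sample row are assumptions for illustration, not real JewishGen data; check the import page for the current format.

```python
# Hypothetical sketch: convert (id, name, description) tuples into a
# tab-separated catalog file for Mix'n'match. The column layout is an
# assumption; the sample row below is an invented placeholder.
import csv
import io

def to_mixnmatch_tsv(entries):
    """entries: iterable of (entry_id, name, description) tuples."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    for entry_id, name, description in entries:
        writer.writerow([entry_id, name, description])
    return buf.getvalue()

sample = [
    ("524980", "Example Town", "town in Example Province"),  # invented row
]
print(to_mixnmatch_tsv(sample), end="")
```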


On Tue, Sep 6, 2016 at 7:17 PM Brill Lyle <wp.brilllyle@gmail.com> wrote:
Hi Wikidatans,

After going past my 500th edit on Wikidata (#Whee!), I was hoping to dip my toe into something larger scale, like adding database information to Wikidata.

There's a database I use all the time that is excellent, rich, deep, and well-deployed -- at JewishGen.org

main search page: http://www.jewishgen.org/Communities/Search.asp
example page: http://data.jewishgen.org/wconnect/wc.dll?jg~jgsys~community~-524980

I started a Property proposal here:

https://www.wikidata.org/wiki/Wikidata:Property_proposal/Place#JewishGen_Locality_ID_English

I have also contacted the folks over at JewishGen to ask if they might provide me with raw data, initially even just the locality page IDs, then hopefully the richer, fuller data that's in the database.

I was wondering

(a) if this is the typical approach people use when importing data,
(b) if you have any advice / best practices to share, and
(c) if I should try a wget to scrape this data (if that's even possible)? Do people do this to grab data?
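On (c), a hedged sketch of what "grabbing the data" might look like once pages have been saved locally (e.g. with wget plus a polite delay): extract the locality name from a downloaded page using Python's standard-library HTML parser. The markup pattern here is invented; the real JewishGen pages should be inspected first, and the site's terms of use checked before any bulk fetch.

```python
# Hypothetical sketch: pull the page title out of locally saved HTML.
# The sample markup is invented; inspect the real pages before relying
# on any particular structure.
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html):
    parser = TitleExtractor()
    parser.feed(html)
    return parser.title.strip()

# Invented sample, standing in for a saved locality page:
page = "<html><head><title>Example Town</title></head><body></body></html>"
print(extract_title(page))  # prints "Example Town"
```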

I envision this information being used as part of a unique identifier that could be built into infoboxes, and perhaps even as a sort of templatized box (although I don't love the way that restricts / redirects editing away from Wikipedia). But I would really like to see a pathway for this information into Wikipedia. I think it would improve a lot of these town pages, many of which are stubs.

Best -- and thanks in advance for any advice,

Erika


Erika Herzog
Wikipedia User:BrillLyle
_______________________________________________
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata