Hi!
I'm trying to extract all mappings from Wikidata to the GND
authority file, along with the corresponding Wikipedia pages,
expecting roughly 500,000 to 1M triples as a result.
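For reference, a query along these lines could be paginated with LIMIT/OFFSET so no single request has to return the whole result set. This is only a minimal sketch: it assumes GND IDs are stored in property P227 and that sitelinks are modeled via schema:about, and the page size and endpoint handling are illustrative, not a recommendation.

```python
import urllib.parse
import urllib.request

# Public WDQS SPARQL endpoint.
WDQS = "https://query.wikidata.org/sparql"

def build_query(limit, offset):
    """Build one page of a SPARQL query for items with a GND ID (P227),
    optionally joined to their English Wikipedia article."""
    return f"""
    SELECT ?item ?gnd ?article WHERE {{
      ?item wdt:P227 ?gnd .
      OPTIONAL {{
        ?article schema:about ?item ;
                 schema:isPartOf <https://en.wikipedia.org/> .
      }}
    }}
    LIMIT {limit} OFFSET {offset}
    """

def fetch_page(limit=10000, offset=0):
    """Fetch one page of results as JSON text (network call; a sketch,
    the User-Agent string here is a made-up placeholder)."""
    url = WDQS + "?" + urllib.parse.urlencode(
        {"query": build_query(limit, offset), "format": "json"})
    req = urllib.request.Request(url, headers={"User-Agent": "gnd-export-sketch/0.1"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

Looping over offsets until a page comes back empty would walk the whole mapping, though at this scale each page can still time out, which is why the dump route below tends to be more reliable.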
As a starting note, I don't think extracting 1M triples is the best
way to use the query service. If you need to do processing that returns
such big result sets - in the millions - then processing the dump, e.g.
with Wikidata Toolkit at
https://github.com/Wikidata/Wikidata-Toolkit, would
be a better idea.
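To illustrate the dump route: Wikidata Toolkit itself is a Java library, but the plain JSON dump can also be streamed line by line in any language. A rough sketch (assuming the standard dump layout, where each line is one entity document inside a JSON array, and GND IDs sit under the P227 key of "claims"):

```python
import json

def gnd_mappings(lines):
    """Yield (item_id, gnd_id, sitelinks) for every entity in a Wikidata
    JSON dump that carries a GND ID (P227) claim.

    `lines` is an iterable of dump lines; the array brackets and
    trailing commas of the dump format are stripped here.
    """
    for line in lines:
        line = line.strip().rstrip(",")
        if not line or line in ("[", "]"):
            continue
        entity = json.loads(line)
        for claim in entity.get("claims", {}).get("P227", []):
            snak = claim.get("mainsnak", {})
            if snak.get("snaktype") == "value":
                yield (entity["id"],
                       snak["datavalue"]["value"],
                       {site: link["title"]
                        for site, link in entity.get("sitelinks", {}).items()})
```

Since this never holds more than one entity in memory, the full dump can be processed on modest hardware, and there is no result-size or timeout limit to hit.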
However, with various calls I get far fewer triples (about 2,000 to
10,000). The output seems to be truncated in the middle of a statement, e.g.
It may be some kind of timeout because of the quantity of data being
sent. How long does such a request take?
--
Stas Malyshev
smalyshev(a)wikimedia.org