Thank you for making the measurements. Can you estimate the time for item Q183 specifically? Since the sample was 1000 entities weighing ~19 MB, the entities averaged about 19 KB each. Germany, on the other hand, is much larger, and it makes me wonder how the deserialization scales to that size.
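To make the scaling question concrete: assuming per-item deserialization time grows roughly linearly with serialized size (which is exactly the part I am not sure about), a quick estimate in PHP would look like the snippet below. The 500 KB input is a hypothetical size, not Q183's actual one.

<?php
// Back-of-envelope estimate only: assumes per-item deserialization time
// scales linearly with serialized size, using the ~19 KB average and
// 0.008 s per item reported below. The 500 KB example size is hypothetical.
function estimateSeconds( $sizeKb, $avgKb = 19.0, $avgSecondsPerItem = 0.008 ) {
	return $avgSecondsPerItem * ( $sizeKb / $avgKb );
}

echo estimateSeconds( 500 ), " seconds\n"; // ~0.21 s under the linear assumption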
On Oct 7, 2014 6:47 AM, "Jeroen De Dauw" <jeroendedauw@gmail.com> wrote:

Hey,

I got some data to share. Walking through the dump for the first 1000 entities (~19 MB) took 0.008 seconds per item, where for each item the following things were done:

* read a line from the file
* json_decode the line
* use the EntityDeserializer to turn the array into DataModel objects

Given that these entities are on average a lot bigger than the typical one found in Wikidata, it looks like the average deserialization time is a few milliseconds. So now I really wonder why people are blaming DataModel 1.0. Everything seems to indicate most time is spent in Wikibase.git and MediaWiki.
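Roughly, the loop looks like this (a sketch, not the exact script I ran; the factory wiring and the dump-line handling are assumed from the DataModel Serialization component and the standard JSON dump layout, so details may differ):

<?php
// Sketch of the per-item loop above. Assumes one entity per line with
// trailing commas and surrounding brackets, as in the JSON dumps.
use DataValues\Deserializers\DataValueDeserializer;
use Wikibase\DataModel\DeserializerFactory;
use Wikibase\DataModel\Entity\BasicEntityIdParser;

$factory = new DeserializerFactory(
	new DataValueDeserializer( array(
		// map of data value type => implementing class goes here
	) ),
	new BasicEntityIdParser()
);
$deserializer = $factory->newEntityDeserializer();

$handle = fopen( 'dump.json', 'r' );
$count = 0;
$start = microtime( true );

while ( $count < 1000 && ( $line = fgets( $handle ) ) !== false ) {
	$line = rtrim( trim( $line ), ',' );
	if ( $line === '' || $line === '[' || $line === ']' ) {
		continue; // skip blank lines and the dump's array brackets
	}
	$data = json_decode( $line, true );            // step 2: decode the line
	$entity = $deserializer->deserialize( $data ); // step 3: array -> DataModel objects
	$count++;
}

printf( "%.4f seconds per item\n", ( microtime( true ) - $start ) / $count );
fclose( $handle );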
Cheers

Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3
_______________________________________________
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech