Appears we are also hitting an uncaught exception when trying to view Q183.
We have narrowed down where this occurs. For most items it does not happen; it could be that, due to the size of the item, we are hitting some PHP bug or limitation.
details: https://bugzilla.wikimedia.org/show_bug.cgi?id=71519#c24
Cheers, Katie
On Tue, Oct 7, 2014 at 4:23 PM, Denny Vrandečić vrandecic@gmail.com wrote:
Thank you for making the measurements. Can you estimate the time for item Q183 specifically? Since it was 1000 entities weighing 19 MB, the entities were on average 19 KB. Germany, on the other hand, is much larger, and it makes me wonder how it scales to that size.
On Oct 7, 2014 6:47 AM, "Jeroen De Dauw" jeroendedauw@gmail.com wrote:
Hey,
I got some data to share. Walking through the dump for the first 1000 entities (~19 MB) took 0.008 seconds per item, where in each step the following things were done (a rough sketch of the loop is included below):
- read line from file
- json_decode the line
- use the EntityDeserializer to turn the array into DataModel objects
Given that these entities are on average a lot bigger than the typical one found in Wikidata, it looks like the average deserialization time is a few milliseconds. So now I really wonder why people are blaming DataModel 1.0. Everything seems to indicate most time is spent in Wikibase.git and MediaWiki.
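For reference, here is a rough sketch of the measurement loop (not the exact script: the dump file name is a placeholder, and the deserializer setup assumes the usual wikibase/data-model-serialization factory):

<?php
// Rough sketch of the benchmark loop: read one JSON entity per line,
// json_decode it, and turn the resulting array into DataModel objects.
// The factory setup below is assumed from wikibase/data-model-serialization;
// adjust it to your local configuration.
use DataValues\Deserializers\DataValueDeserializer;
use Wikibase\DataModel\DeserializerFactory;
use Wikibase\DataModel\Entity\BasicEntityIdParser;

require __DIR__ . '/vendor/autoload.php';

$deserializer = ( new DeserializerFactory(
	new DataValueDeserializer(),
	new BasicEntityIdParser()
) )->newEntityDeserializer();

$handle = fopen( 'dump.json', 'r' ); // placeholder path to a line-per-entity dump
$count = 0;
$start = microtime( true );

while ( $count < 1000 && ( $line = fgets( $handle ) ) !== false ) {
	$data = json_decode( trim( $line ), true );
	if ( !is_array( $data ) ) {
		continue; // skip lines that are not entity JSON
	}
	$entity = $deserializer->deserialize( $data );
	$count++;
}

fclose( $handle );

printf( "%d entities, %.4f seconds per entity\n",
	$count, ( microtime( true ) - $start ) / max( 1, $count ) );

The actual run did those three steps per item as described above; this sketch only reports the aggregate per-item time.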
Cheers
-- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3