Hi!
Which script, please, and which dump? (The conversation was not forwarded so I don't have the context.)
Sorry, the original complaint was:
I apologize if I missed something, but why is the current JSON dump ~25 GB
when a week ago it was ~58 GB? (see https://dumps.wikimedia.org/wikidatawiki/entities/20190617/)
But looking at it now, I see that wikidata-20190617-all.json.gz is comparable in size to last week's, so it looks like it's fine now?