> How can I speed up the queries processing even more?
IMHO: drop the unwanted data as early as you can (aggressive pre-filtering; don't import what you won't query).
> Any suggestion will be appreciated.
In your case I would:
- write a custom pre-filter for the 2 million parameters (simple text parsing, e.g. in Go using multiple cores, or any other fast code)
- and load only the filtered results into PostgreSQL
I have good experience with this: parsing and filtering the Wikidata JSON dump (gzipped) and loading the result into a PostgreSQL database. I can run the full pipeline on my laptop, and the result in my case is ~12 GB in PostgreSQL.
The biggest problem is the memory requirement of the "2 million parameters" lookup set, but you can choose a fast key-value store like RocksDB for that.
But there are other low-tech parsing solutions as well ...
Regards,
Imre