I'm very curious whether you can run at Wikipedia scale with such a trie in memory on a normal computer (e.g. one with only tens of GiB of memory). Please let us know if you actually get this into production (or just submit the script for inclusion in the framework; it sounds really useful).


On Friday, 12 July 2019, Lucas Werkmeister <mail@lucaswerkmeister.de> wrote:

You probably want to use a trie for this – I found several available Python implementations, but I don’t know what their advantages or disadvantages are, so I’ll just list them in alphabetical order:
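To illustrate the trie idea itself: a minimal sketch of a bitwise trie over IP address bits, built only on the stdlib `ipaddress` module. This is a toy, not one of the packaged implementations, and the class name and example addresses (RFC 5737 documentation ranges) are made up for illustration:

```python
import ipaddress


class CIDRTrie:
    """Toy bitwise trie mapping CIDR prefixes to cached data.

    Each node is a dict with optional 0/1 child nodes and an
    optional "value" entry marking the end of a stored prefix.
    """

    def __init__(self):
        self.root = {}

    def insert(self, cidr, value):
        net = ipaddress.ip_network(cidr)
        bits = int(net.network_address)
        node = self.root
        # Walk the prefix bits from most significant to least.
        for i in range(net.max_prefixlen - 1,
                       net.max_prefixlen - net.prefixlen - 1, -1):
            bit = (bits >> i) & 1
            node = node.setdefault(bit, {})
        node["value"] = value

    def lookup(self, ip):
        """Return the value of the longest matching prefix, or None."""
        addr = ipaddress.ip_address(ip)
        bits = int(addr)
        node = self.root
        best = None
        for i in range(addr.max_prefixlen - 1, -1, -1):
            best = node.get("value", best)
            bit = (bits >> i) & 1
            if bit not in node:
                return best
            node = node[bit]
        return node.get("value", best)
```

A lookup only ever walks at most 32 (or 128, for IPv6) trie levels, regardless of how many ranges are cached, which is the property that makes the trie attractive here.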


On 12.07.19 04:43, Huji Lee wrote:
Hi all,

I am working on a bot that fetches a list of anonymous editors on fawiki, uses WHOIS to retrieve more info about that IP, and uses a number of online APIs to check if the IP is a proxy or not.[1]

I would like to improve the code by implementing a CIDR cache, so that if I run WHOIS on some IP and determine the ASN range it belongs to, and then encounter another IP from that same range in the next iteration of my for loop, I can quickly determine that this IP also belongs to the same range and skip the WHOIS lookup for it.

The search space would consist of IP ranges, each given by its beginning and end IP address. Obviously, we can convert these IPs to numbers (e.g. hex) to make comparisons easier. Given an IP, we need the object to efficiently determine whether it already holds a range that encompasses that IP; if so, it should return the previously cached details for that range. If not, we will store the new range in the cache.
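Converting each address to an integer makes the containment test a plain numeric comparison; the stdlib `ipaddress` module does the conversion directly. A small sketch, with made-up example addresses from the RFC 5737 documentation range:

```python
import ipaddress

# Hypothetical cached range and a probe address (illustrative only).
start = int(ipaddress.ip_address("203.0.113.0"))
end = int(ipaddress.ip_address("203.0.113.63"))
probe = int(ipaddress.ip_address("203.0.113.17"))

# Range membership becomes a plain chained integer comparison.
in_range = start <= probe <= end
```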

The part that I am not fully clear about is the following: how can I avoid having to loop through every range in the cache? Is there a way to create a hash function, or some other structure, that handles the two inequality comparisons efficiently?
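One way to avoid the linear scan, assuming the cached ranges never overlap, is to keep the ranges sorted by their start address and binary-search with the stdlib `bisect` module; lookup then costs O(log n) instead of O(n). A sketch (the class and method names are made up for illustration):

```python
import bisect
import ipaddress


class RangeCache:
    """Cache of non-overlapping IP ranges, kept sorted by start address."""

    def __init__(self):
        self.starts = []   # sorted integer start addresses
        self.entries = []  # parallel list of (end, data) tuples

    def add(self, first_ip, last_ip, data):
        start = int(ipaddress.ip_address(first_ip))
        end = int(ipaddress.ip_address(last_ip))
        i = bisect.bisect_left(self.starts, start)
        self.starts.insert(i, start)
        self.entries.insert(i, (end, data))

    def lookup(self, ip):
        """Return cached data for the range containing ip, or None."""
        addr = int(ipaddress.ip_address(ip))
        # Rightmost cached range whose start is <= addr.
        i = bisect.bisect_right(self.starts, addr) - 1
        if i >= 0:
            end, data = self.entries[i]
            if addr <= end:
                return data
        return None
```

Because the ranges are disjoint, only the one candidate range found by the bisection can possibly contain the address, so a single comparison against its end finishes the lookup.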



pywikibot mailing list