and to add my own view:
>Regarding large RAM requests: keep in mind that on modern OSes, the OS only grants a percentage of real RAM plus a percentage of virtual memory; I believe the ratio is roughly 40/60 on Windows, as it is on Red Hat Linux. I rarely create a hash with fewer than 9 million elements, yet the app rarely uses more than 32 MB of RAM.
>Secondly, with some pre-processing you can convert a hash table into a lookup table - like a foreign key in a database. For this I use an SQLite in-memory database, since I can then mix in my SQL statements. I'll post some pseudo code for this later.
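The poster promises pseudo code later; in the meantime, here is a minimal sketch of the idea in Python (using the stdlib `sqlite3` module, since PureBasic code can't be shown runnable here). The table and column names are hypothetical, not from the original post:

```python
import sqlite3

# Pre-process key/value pairs into an in-memory SQLite table so lookups
# can be mixed with ordinary SQL statements.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE lookup (key TEXT PRIMARY KEY, value INTEGER)")
db.executemany("INSERT INTO lookup VALUES (?, ?)",
               [("alpha", 1), ("beta", 2), ("gamma", 3)])

# A plain key lookup, equivalent to a hash-table access:
row = db.execute("SELECT value FROM lookup WHERE key = ?",
                 ("beta",)).fetchone()
print(row[0])  # 2

# The same table used in a larger SQL statement - something a plain
# hash table cannot do directly:
total = db.execute(
    "SELECT SUM(value) FROM lookup WHERE key != 'alpha'").fetchone()[0]
print(total)  # 5
```

The `PRIMARY KEY` on `key` gives SQLite an index, so single-key lookups stay fast while aggregate queries come for free.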
>Third, for efficiency you can use just one hash table for all your hashing needs, provided you use a generic hash function (like CRC32) that gives few collisions; CRC32 is quite good at that. I use mixed C/PB code to further increase hashing efficiency, but the gain is minor - about 5%.
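As a sketch of the single-table idea (my illustration, not the poster's code), here is a bucket table keyed by CRC32 in Python, using the stdlib `zlib.crc32`; the bucket count and helper names are assumptions:

```python
import zlib

# One shared table of buckets for all hashing needs, with CRC32 as the
# generic hash function.
NUM_BUCKETS = 1024
table = [[] for _ in range(NUM_BUCKETS)]

def crc32_bucket(key: bytes) -> int:
    # CRC32 spreads typical keys well, so collisions stay rare.
    return zlib.crc32(key) % NUM_BUCKETS

def put(key: bytes, value):
    bucket = table[crc32_bucket(key)]
    for i, (k, _) in enumerate(bucket):
        if k == key:
            bucket[i] = (key, value)  # overwrite an existing entry
            return
    bucket.append((key, value))

def get(key: bytes):
    for k, v in table[crc32_bucket(key)]:
        if k == key:
            return v
    return None

put(b"user:42", "Alice")
put(b"color:7", "red")
print(get(b"user:42"))  # Alice
```

Because CRC32 is one fixed function, keys from different "logical" tables can share the buckets as long as the keys themselves don't clash (e.g. by prefixing them as above).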
http://www.purebasic.fr/english/viewtop ... macro+hash
But there is a final twist for "small" hash tables:
There was a post once about creating small hashes using macros - it was a question I asked, in fact, but I can't find it. By removing the overhead of calling functions (using macros instead) and using as simple a hash as you can get away with, you get even faster hashing.
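The lost thread used PureBasic macros; Python has no macros, so the sketch below only illustrates the principle: a hash simple enough that its expression can be pasted inline at the call site, skipping the function call entirely. The multiplicative constant and table size are arbitrary choices for illustration:

```python
TABLE_SIZE = 256  # power of two, so a bitmask replaces the modulo

def tiny_hash(s: str) -> int:
    # A deliberately simple multiplicative hash (hypothetical example).
    h = 0
    for ch in s:
        h = (h * 31 + ord(ch)) & 0xFFFFFFFF
    return h & (TABLE_SIZE - 1)

# "Macro-style" usage: the same expression written out inline where the
# slot is needed, with no function-call overhead.
key = "abc"
h = 0
for ch in key:
    h = (h * 31 + ord(ch)) & 0xFFFFFFFF
slot = h & (TABLE_SIZE - 1)

print(slot == tiny_hash(key))  # True
```

In PureBasic (or C) the inline form would be wrapped in a `Macro` (or `#define`) so it reads like a function call but expands in place; for very small tables, that call overhead can dominate the cost of the hash itself.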