I'm writing a hashing algorithm that generates arbitrarily long hashes. The problem is that if I process the input as one long string/list of data, the computer runs out of RAM. One of the few efficient ways to generate the hash is the Merkle-Damgård construction (the other methods may be patent-encumbered, so I'd rather not use them). However, that involves breaking the data up into many pieces, called blocks, and then running each block through the actual algorithm. The question is: how big should the blocks be? I want them to be at least B (output length in nibbles) × 4 bits long, but not too large. However, the last block of data could be much smaller than the size returned by the function I'm looking for, causing errors.
Blocks too big → too much time spent running through each block, and RAM runs out.
Blocks too small → the end result is too small (wrong output size), and too much time is wasted running through the numerous blocks.
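For reference, the block-by-block processing described above can be sketched like this. This is a minimal Python sketch, not your algorithm: `md_hash`, `toy_compress`, and the 0x80-marker/length padding layout are illustrative assumptions. Note that in the standard Merkle-Damgård construction the short final block is handled by padding, independently of whatever block size you pick:

```python
import struct

def md_hash(chunks, block_size, compress, state):
    """Merkle-Damgard-style driver (sketch): feed fixed-size blocks to a
    compression function, keeping at most one block's worth of data in RAM.
    `compress` is a stand-in for the real per-block algorithm."""
    buf = b""
    total_bytes = 0
    for chunk in chunks:  # chunks: any iterable of byte strings
        buf += chunk
        total_bytes += len(chunk)
        while len(buf) >= block_size:
            state = compress(state, buf[:block_size])
            buf = buf[block_size:]
    # MD strengthening: 0x80 marker, zero fill, then 8-byte bit length.
    # This padding (not block-size tuning) is what deals with a short
    # final block.
    buf += b"\x80"
    if len(buf) + 8 > block_size:
        state = compress(state, buf + b"\x00" * (block_size - len(buf)))
        buf = b""
    buf += b"\x00" * (block_size - 8 - len(buf)) + struct.pack(">Q", total_bytes * 8)
    return compress(state, buf)

def toy_compress(state, block):
    # Placeholder mixing step, NOT cryptographically secure.
    for b in block:
        state = (state * 33 + b) & 0xFFFFFFFF
    return state
```

Because the driver only buffers up to one block at a time, the same digest comes out no matter how the input is chunked, which is what lets you hash data far larger than RAM.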
Given A = input size in bytes (any value ≥ 0) and B = output size in nibbles (any value > 0; 1 nibble = 4 bits), is there a good function f(A) or f(A, B) that will tell me how large a block should be?
program page: http://calccrypto.wikidot.com/hash-2x-v2
Visit calccrypto.wikidot.com for detailed descriptions of algorithms and other crypto-related material (not much yet, so help would be appreciated).