I have asked this question on several other forums and never got an answer, so let me test this forum out.
I am not used to working with binary data and I am stumped here.
I have a program that stores member passwords in an MS SQL 2000 database. The plain-text limit of the password is 30 characters. That plain text is hashed with SHA512 using a salt that is 8-16 bytes long; the size of the salt is chosen randomly by the Random() function each time a new password is hashed. Now I want to store the password, which is now hashed, in one varbinary field in the table and the salt in a separate varbinary field; both fields are in the same table. What I need to know is what size to make my varbinary fields. I don't want them too large, which just wastes space, and I don't want them too small so that the maximum password (30 characters, hashed using the maximum salt of 16 bytes) won't fit in the field. I also need to make sure that the salt value will fit in its field.

I have been told to just make the size 200, because the field will automatically size itself and will only use what is required to store the data, and 200 should be large enough. Basically they are just guessing. I need to know exactly how big to make this field. If it were just an nvarchar field I would have no problem: 30 characters of plain text is nvarchar(30) in the database. But what does 30 characters of plain text, hashed with a maximum salt of 16 bytes using SHA512 (which is stored in a byte array in C#), convert to as far as a binary field goes? Does the binary field still store just 0s and 1s, so the most the binary field would hold is 30 0s and 1s in some order? If that's the case I should make my binary field size 30. I'm not even sure whether SHA512 and the salt add additional length to the password, so that a 30-byte password ends up being 500 bytes after it's been hashed. It's hard for me to test this because the hash is stored directly into an array which is saved to a binary field.
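For what it's worth, the question can be answered empirically: SHA-512 always produces a 64-byte digest, no matter how long the password or the salt is. A quick sketch in Python (the same fixed digest length holds in C# or any other language; the password/salt values here are just stand-ins):

```python
import hashlib
import os

# SHA-512 output is a fixed 64 bytes regardless of input length.
# Try the shortest and longest cases from the question: 1- and
# 30-character passwords with 8- and 16-byte salts.
for pw_len, salt_len in [(1, 8), (30, 16)]:
    password = b"x" * pw_len      # stand-in for the plain-text password
    salt = os.urandom(salt_len)   # random salt, 8-16 bytes
    digest = hashlib.sha512(salt + password).digest()
    print(pw_len, salt_len, len(digest))  # digest is always 64 bytes
```

So, assuming the salt is stored separately from the hash (as described above), the hash column only ever needs to hold 64 bytes, and the salt column at most 16.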
Can someone please help me so I can finish my program? All I need to know is how to calculate this, maybe using a calculator. I need to know how to find the size myself too, for future reference. If you post the size I need, that's great, but can you please also post how you would calculate it so I can figure it out on my own next time?
hash algorithms, such as the MD and SHA series, output fixed-length data regardless of the size of the input data. either consult the SQL Server manual to find out how many bytes the function returns, or make the field very large and test it out.
for example, mysql returns 32 characters for MD5 and 40 characters for SHA1.
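those character counts are hex encodings of the underlying fixed digest sizes, which you can list in any language; a quick check in Python, for illustration:

```python
import hashlib

# each algorithm has a fixed digest size; the "32 characters" MySQL
# returns for MD5 is the 16-byte digest written out as hex, at two
# hex characters per byte
for name in ["md5", "sha1", "sha512"]:
    h = hashlib.new(name, b"anything")
    print(name, h.digest_size, len(h.hexdigest()))
    # md5    -> 16 bytes, 32 hex chars
    # sha1   -> 20 bytes, 40 hex chars
    # sha512 -> 64 bytes, 128 hex chars
```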
So if the output from the function is 40 bytes for SHA512, do I need to make the varbinary field size 40? Or is the size in binary different than the size in characters? The SHA512 hash code is written in C#, not in SQL… not sure if that makes a difference or not.
Thanks for the reply. I registered a few weeks ago but this is my first post.
if you were using a VARCHAR field, then the size in bytes = # of characters, provided you're using a single-byte character set. i'm not sure of the exact behavior of a VARBINARY field. maybe rudy could shed some light on that?
An MD5 hash actually returns 16 bytes. Hex encoded, that turns into the 32 characters one gets from MySQL or PHP.
Not certain off the top of my head what SHA1 does, but figuring it out should not be too hard. Just set a breakpoint in your code and see how long the byte array is when you get the return from the crypto function.
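(For the record, SHA-1 returns 20 bytes. And instead of a breakpoint, you can just print or assert the array length; a one-line equivalent in Python, purely for illustration:)

```python
import hashlib

# the length of the returned byte array IS the digest size
hashed = hashlib.sha1(b"my password").digest()
print(len(hashed))  # SHA-1 returns a 20-byte digest
```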
On the SQL end, you should use a binary field, not varbinary, since the length is fixed no matter what you do. Just make certain to pick the correct length.
True, but most are concerned with retrieval and write sizes… anything over an SHA-3 is not going to be stored in one "row" for 2k and 2/5k if you store it as binary. You also have to add n+4 for storage.