I have an application that generates an AES key (using System.Security.Cryptography). I take that AES key, convert it to a string and put it in a cookie like this:
string keyToSend = Encoding.UTF8.GetString(CurrentKey);
HttpCookie sessionKeyCookie = new HttpCookie("SessionKey", JsonConvert.SerializeObject(keyToSend));
keyToSend looks like this: "���K��Ui ����&��Ӂ*��()".
Then I want to take that key back and use it to decrypt something, so I do this:
string keyString = JsonConvert.DeserializeObject<string>(context.Cookies["SessionKey"].Value);
byte[] ascii = Encoding.ASCII.GetBytes(keyString);
byte[] utf8 = Encoding.UTF8.GetBytes(keyString);
byte[] utf32 = Encoding.UTF32.GetBytes(keyString);
And my browser cookie looks like this: "�\u0010��K��Ui �\u0010�\u000f�\u001f\u0005�\u0012\u0018&��Ӂ*��()\u001e"
The initial key should have 256 bits, so 32 entries in that array, but all my variables (ascii, utf8, utf32) have different lengths. Why is that, and how can I retrieve the cookie and convert it back to a byte[32] array?
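For what it's worth, here is a minimal repro of the length mismatch, with RandomNumberGenerator standing in for my actual key generation:

    using System;
    using System.Security.Cryptography;
    using System.Text;

    class Repro
    {
        static void Main()
        {
            // Stand-in for CurrentKey: a random 256-bit key.
            byte[] currentKey = new byte[32];
            using (var rng = RandomNumberGenerator.Create())
            {
                rng.GetBytes(currentKey);
            }

            // Decoding arbitrary bytes as UTF-8 replaces invalid
            // sequences with U+FFFD ('\uFFFD'), so data is lost here.
            string keyToSend = Encoding.UTF8.GetString(currentKey);

            // None of these round-trip back to the original 32 bytes:
            Console.WriteLine(Encoding.ASCII.GetBytes(keyToSend).Length); // '?' per char: 1 byte each
            Console.WriteLine(Encoding.UTF8.GetBytes(keyToSend).Length);  // U+FFFD is 3 bytes in UTF-8
            Console.WriteLine(Encoding.UTF32.GetBytes(keyToSend).Length); // 4 bytes per char
        }
    }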
It sounds like CurrentKey is arbitrary binary data, not a UTF-8 encoded string. If you've got arbitrary binary data which you need to represent as a string (e.g. an image, or encrypted or compressed data), you're usually best off using Base64 or hex encoding. Base64 is pretty easy:
string keyToSend = Convert.ToBase64String(CurrentKey); // 32 bytes -> 44 ASCII chars
...
byte[] recoveredKey = Convert.FromBase64String(keyString); // back to the original 32 bytes
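For completeness, here's a minimal sketch of the full round trip, using the same HttpCookie/Json.NET calls as in the question (the key is generated with RandomNumberGenerator purely for illustration):

    using System;
    using System.Security.Cryptography;
    using System.Web;               // HttpCookie (classic ASP.NET)
    using Newtonsoft.Json;

    class CookieRoundTrip
    {
        static void Main()
        {
            // Stand-in for CurrentKey: a random 256-bit key.
            byte[] currentKey = new byte[32];
            using (var rng = RandomNumberGenerator.Create())
            {
                rng.GetBytes(currentKey);
            }

            // Encode: Base64 output is pure ASCII, so nothing is lost.
            string keyToSend = Convert.ToBase64String(currentKey);
            var sessionKeyCookie = new HttpCookie("SessionKey", JsonConvert.SerializeObject(keyToSend));

            // Decode: read the cookie value back and reverse the steps.
            string keyString = JsonConvert.DeserializeObject<string>(sessionKeyCookie.Value);
            byte[] recoveredKey = Convert.FromBase64String(keyString);

            Console.WriteLine(recoveredKey.Length); // 32
        }
    }

One thing to watch: Base64 output can contain '+', '/' and '=', which are significant characters in cookie values, so the value may need URL-encoding on top of the JSON serialization shown here.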