
digitalmars.D.learn - Strange Bug

Chris M. <chrismohrfeld comcast.net> writes:
I have no idea if this is an issue with D, or OpenSSL, or if I'm 
just doing something completely wrong. I'm writing a program that 
will either encrypt or decrypt a string using AES in ECB mode 
(for a school assignment) and it's giving me a very strange bug.

encrypt and decrypt are both bools, either one or the other is 
set based on command-line arguments passed to the program. aesKey 
is a 128-bit value hashed from user input, and is (obviously) 
used as the key for encryption. inputStr is either the string to 
be encrypted or decrypted (also passed in as a command-line 
argument). I grabbed the OpenSSL bindings from Deimos to do this.

if(encrypt)
{
     AES_KEY encKey;
     auto encOut = new ubyte[inputStr.length];

     // Encrypt and convert to base64
     AES_set_encrypt_key(aesKey.ptr, aesKey.sizeof * 8, &encKey);
     AES_ecb_encrypt(inputStr.ptr, encOut.ptr, &encKey, AES_ENCRYPT);
     writeln(Base64.encode(encOut));
}
else if(decrypt) // I'd leave this as else, but it's here for explanation purposes
{
     AES_KEY decKey;
     auto decLength = Base64.decodeLength(inputStr.length);
     auto decB64 = new ubyte[decLength], decOut = new ubyte[decLength];

     // convert back from base64 and decrypt
     decB64 = Base64.decode(inputStr); // Yes I checked, and decB64 has exactly the same contents as encOut from the if block
     AES_set_decrypt_key(aesKey.ptr, aesKey.sizeof * 8, &decKey);
     AES_ecb_encrypt(decB64.ptr, decOut.ptr, &decKey, AES_DECRYPT);
     writeln(cast(char[]) decOut);
}

However, this isn't working: it spits back garbage instead of the string 
I originally encrypted.

Here's the problem. I tried running this without the if-else 
statements (i.e. encrypting and decrypting all in one run of the 
program, code below). If I leave in the base64 encoding and 
decoding, and use decB64 as the input to decrypt, it still 
doesn't work. However, if I decrypt with encOut directly, or 
assign encOut to decB64, it somehow works.

AES_KEY encKey;
auto encOut = new ubyte[inputStr.length];

// Encrypt and convert to base64
AES_set_encrypt_key(aesKey.ptr, aesKey.sizeof * 8, &encKey);
AES_ecb_encrypt(inputStr.ptr, encOut.ptr, &encKey, AES_ENCRYPT);

auto decLength = Base64.decodeLength(Base64.encode(encOut).length);
AES_KEY decKey;
auto decB64 = new ubyte[decLength], decOut = new ubyte[decLength];

// convert back from base64 and decrypt
decB64 = Base64.decode(Base64.encode(encOut));
// doesn't work unless I uncomment the following line, or just use encOut directly
//decB64 = encOut;
AES_set_decrypt_key(aesKey.ptr, aesKey.sizeof * 8, &decKey);
AES_ecb_encrypt(decB64.ptr, decOut.ptr, &decKey, AES_DECRYPT);
writeln(cast(char[]) decOut);

tl;dr: The decryption doesn't work unless I pass it the exact same 
buffer (not just the same contents, but apparently the exact same 
memory address) that I used to receive output from encryption.

Does anyone have any idea where the issue may lie, or how I could 
fix this?

Here's the full program if you want to take a look
http://pastebin.com/KyY103Ac
Jan 20
Timothee Cour via Digitalmars-d-learn <digitalmars-d-learn puremagic.com> writes:
This and some other recent posts (`Is this a bug?`, `Hopefully a simple
question...`) have rather undescriptive subjects. If you want help (and want
to help other people who search for similar issues), could you please make
the subject more descriptive?



On Fri, Jan 20, 2017 at 12:19 AM, Chris M. via Digitalmars-d-learn <
digitalmars-d-learn puremagic.com> wrote:

 I have no idea if this is an issue with D, or OpenSSL, or if I'm just
 doing something completely wrong. I'm writing a program that will either
 encrypt or decrypt a string using AES in ECB mode (for a school assignment)
 and it's giving me a very strange bug.

 [...]
Jan 20
Rene Zwanenburg <renezwanenburg gmail.com> writes:
On Friday, 20 January 2017 at 08:19:57 UTC, Chris M. wrote:
 I have no idea if this is an issue with D, or OpenSSL, or if 
 I'm just doing something completely wrong. I'm writing a 
 program that will either encrypt or decrypt a string using AES 
 in ECB mode (for a school assignment) and it's giving me a very 
 strange bug.
 
 [...]
 
     auto encOut = new ubyte[inputStr.length];

     // Encrypt and convert to base64
     AES_set_encrypt_key(aesKey.ptr, aesKey.sizeof * 8, &encKey);
     AES_ecb_encrypt(inputStr.ptr, encOut.ptr, &encKey, 
 AES_ENCRYPT);
 
 Here's the problem. I tried running this without the if-else 
 statements (i.e. encrypting and decrypting all in one run of 
 the program, code below). If I leave in the base64 encoding and 
 decoding, and use decB64 as the input to decrypt, it still 
 doesn't work. However, if I decrypt with encOut directly, or 
 assign encOut to decB64, it somehow works.
My guess: The encrypted output will be a bit longer than your input. You're 
not getting an out-of-bounds exception during encryption since OpenSSL only 
has the pointer to write to, not a buffer length it can check. The memory 
behind your buffer is apparently committed, and will be written to by OpenSSL.

This is why using the same buffer to decrypt works: it continues to read 
after the end of the buffer. (In case it's not clear, writing and reading 
past the end of a buffer is really bad.)

I expect OpenSSL to have a helper function to calculate the required buffer 
size for a given input length. Use that to allocate the buffer.
Jan 20
Chris M. <chrismohrfeld comcast.net> writes:
On Friday, 20 January 2017 at 11:58:39 UTC, Rene Zwanenburg wrote:
 On Friday, 20 January 2017 at 08:19:57 UTC, Chris M. wrote:
 [...]
 My guess: The encrypted output will be a bit longer than your input. [...]
Turns out that was the issue. I needed to reserve more space for the 
encrypted string with this:

cipherLen = (clearLen/16 + 1) * 16;

The fun of working with C libraries. Anyway, thanks for the help.
Jan 20