
digitalmars.D - New encryption block...

reply Era Scarecrow <rtcvb32 yahoo.com> writes:
  Something coming to mind is the idea of making a small algorithm 
to be used with other already existing encryption functions to 
extend the blocksize of encryption with minimal complexity 
growth. In theory this would extend a blocksize of 4 or 8 bytes, 
to say 64 bytes. Not only that, each stage could use a different 
encryption type (or all the same).

  Let's assume a very small blocksize of 2 bytes, and extend it to 
8. So logically it could be (numbers refer to original offsets):

  12 34
  56 78

  Each block (4) is encrypted normally. Then it's rearranged to 
say:

  35 28
  71 64

  Then encrypted again. Preferably the rearranging is complex 
enough that several steps are done before calling it finished.
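A minimal sketch of that permute-and-re-encrypt idea in Python. The 
2-byte XOR "cipher" and the key are placeholders for illustration only, 
not a real design; the shuffle reproduces the exact offsets above:

```python
# Toy demonstration: encrypt 2-byte blocks, shuffle bytes across the
# whole 8-byte group by their original offsets, then encrypt again.
# The XOR "cipher" and key are stand-ins, not real cryptography.

# Offsets from the post: "12 34 / 56 78" rearranged to "35 28 / 71 64"
SHUFFLE = [3, 5, 2, 8, 7, 1, 6, 4]  # 1-based original offsets

def toy_encrypt_block(block, key):
    """Stand-in 2-byte block cipher: XOR with a 2-byte key."""
    return bytes(b ^ k for b, k in zip(block, key))

def encrypt_group(data, key):
    """Encrypt each 2-byte block of an 8-byte group independently."""
    return b"".join(toy_encrypt_block(data[i:i + 2], key)
                    for i in range(0, 8, 2))

def shuffle(data):
    """Rearrange the 8 bytes according to their original offsets."""
    return bytes(data[i - 1] for i in SHUFFLE)

def wide_encrypt(data, key, rounds=2):
    """Alternate encrypt and shuffle for a few rounds."""
    for _ in range(rounds):
        data = shuffle(encrypt_group(data, key))
    return data

# The shuffle alone reproduces the layout described in the post:
print(shuffle(b"12345678"))  # b'35287164'
```

Since XOR and the byte permutation are both invertible, the whole 
construction stays decryptable by running the steps in reverse.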

  Lastly there could be an optional salt block. Such a block would 
allow multiple instances/results of the same encrypted data and 
plenty of dummy data too, helping to hide identically encrypted 
data up to the combinations of the size of the salt.
Oct 09 2016
next sibling parent reply sarn <sarn theartofmachinery.com> writes:
On Sunday, 9 October 2016 at 20:33:29 UTC, Era Scarecrow wrote:
  Something coming to mind is the idea of making a small 
 algorithm to be used with other already existing encryption 
 functions to extend the blocksize of encryption with minimal 
 complexity growth. In theory this would extend a blocksize of 4 
 or 8 bytes, to say 64 bytes. Not only that each stage could use 
 a different encryption type (or all the same).
Do you have an application in mind? There are plenty of good algorithms to choose from today that already have a block size that's big enough (unlike, e.g., DES, which was too small).
  Then encrypted again. Preferably the rearranging is complex 
 enough to do several steps before calling it done.
Remember, you have a few baselines to compare against for speed 
and security:

1. Encrypting once with something well accepted like AES-GCM or 
AES-CBC
2. Encrypting multiple times using other algorithms

End users won't want to permute+encrypt multiple times unless 
you can show there's a benefit in the speed/security tradeoff, 
so you'll need to think about that in the design.
  Lastly there could be an optional salt block. Such a block 
 would allow multiple instances/results of the same encrypted 
 data and plenty of dummy data too, helping to hide identically 
 encrypted data up to the combinations of the size of the salt.
This sounds like the IV in off-the-shelf block modes:
https://en.wikipedia.org/wiki/Block_cipher_mode_of_operation

It's really not optional, though, because there are many attacks 
on encryption with poor IVs.

If you're interested in this stuff, I strongly recommend these 
challenges:
http://cryptopals.com/
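To make the "not optional" point concrete, here's a toy Python sketch. 
The keyed-SHA-256 "block cipher" is purely an illustrative stand-in (a 
PRF, not a real, decryptable cipher): without an IV, identical plaintext 
blocks always encrypt identically, which leaks structure to an observer.

```python
import hashlib
import os

def toy_block_cipher(key, block):
    """Illustrative stand-in for a block cipher (a PRF, not decryptable)."""
    return hashlib.sha256(key + block).digest()[:16]

key = b"0" * 16
block = b"ATTACK AT DAWN!!"  # one 16-byte plaintext block

# No IV: the same plaintext block always maps to the same ciphertext,
# so repeated content is visible in the ciphertext.
print(toy_block_cipher(key, block) == toy_block_cipher(key, block))  # True

def with_iv(key, block):
    """Mask the block with a fresh random IV first, ship IV + ciphertext."""
    iv = os.urandom(16)
    masked = bytes(b ^ i for b, i in zip(block, iv))
    return iv + toy_block_cipher(key, masked)

# With a fresh IV each time, identical plaintexts encrypt differently.
print(with_iv(key, block) == with_iv(key, block))  # False
```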
Oct 09 2016
parent reply Era Scarecrow <rtcvb32 yahoo.com> writes:
On Monday, 10 October 2016 at 03:15:07 UTC, sarn wrote:
 End users won't want to permute+encrypt multiple times unless 
 you can show there's a benefit in the speed/security tradeoff, 
 so you'll need to think about that in the design.
  The largest portion would be that, much like a hash, one small 
change will change the entire thing rather than a smaller portion 
(with the original blocksize). The multiple re-arranging and 
encryption steps are to ensure small changes affect every other 
block it was part of.

  Just thinking that if someone makes a database of, say, the 
first 4 bytes expected in a file format (like gzip, bzip2, 
others, etc.) then they can map most of the keys and immediately 
know how to decrypt it (assuming it's of a particular file/stream 
type). The larger block size also allows for multiple keys, so 
you could push far past the limitations of a single block cipher.

  As for a specific example, not really. Something fairly small, 
so personal documents and the like or archives, unlike say 
multi-media which doesn't contain any personal data (probably). 
The only other idea is another multi-step process for generating 
hashes/keys or the like, to slow down or make it annoyingly 
difficult to brute-force passwords from a hashfile.

  Alternatively, with the salting, having it for encrypted 
communication would help hide sentences/replies where you reply 
the same thing over and over again:

 Y> Do you have the stuff?
 M> Yes
 Y> Did you stash it in the place?
 M> Yes
 Y> Do you like Lasagna?
 M> Yes

or something like that :P

  Oh well. My question was mostly an idea; having something to 
look over for block ciphers will be an interesting read (when I 
get to it).
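The known-header worry above is real for weak schemes. A toy Python 
sketch (a repeating 2-byte XOR "cipher" as the deliberately weak 
stand-in; the gzip magic bytes 1f 8b are the real file signature) shows 
how a predictable file header hands the key straight to an attacker:

```python
# Toy known-plaintext attack: a repeating 2-byte XOR "cipher" and a
# file whose first bytes are the well-known gzip magic (1f 8b).
GZIP_MAGIC = b"\x1f\x8b"

def xor_cipher(data, key):
    """Weak stand-in cipher: repeating-key XOR (self-inverse)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = b"\xa7\x42"            # unknown to the attacker
ciphertext = xor_cipher(GZIP_MAGIC + b"...rest of file...", secret_key)

# The attacker knows the plaintext header, so the key falls out directly:
recovered = bytes(c ^ p for c, p in zip(ciphertext, GZIP_MAGIC))
print(recovered == secret_key)  # True
```

Real block ciphers resist this particular shortcut, but the example 
shows why predictable headers plus a deterministic scheme (no IV/salt) 
are a dangerous combination.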
Oct 10 2016
parent sarn <sarn theartofmachinery.com> writes:
On Monday, 10 October 2016 at 09:54:32 UTC, Era Scarecrow wrote:
  The largest portion would be that much like a hash, one small 
 change will change the entire thing rather than a smaller 
 portion (with the original blocksize). The multiple 
 re-arranging and encryption steps are to ensure small changes 
 affect every other block it was part of.
With CBC block mode, for example, all blocks later in the data stream are changed if one block is changed. Earlier blocks aren't changed because CBC processes data in a single pass (which is an important practical requirement for a lot of applications). If you wanted all the blocks to change, two passes would be enough.
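A toy Python sketch of that CBC property (using a keyed-SHA-256 stand-in 
for the block function, which is illustrative only since real CBC needs 
an invertible cipher): flipping one bit in one plaintext block leaves 
every earlier ciphertext block untouched and changes every later one.

```python
import hashlib

BS = 16  # block size in bytes

def toy_block(key, block):
    """Stand-in block function (a PRF; real CBC needs an invertible cipher)."""
    return hashlib.sha256(key + block).digest()[:BS]

def cbc_encrypt(key, iv, plaintext):
    """CBC chaining: each plaintext block is XORed with the previous
    ciphertext block before being fed to the block function."""
    blocks, prev = [], iv
    for i in range(0, len(plaintext), BS):
        x = bytes(a ^ b for a, b in zip(plaintext[i:i + BS], prev))
        prev = toy_block(key, x)
        blocks.append(prev)
    return blocks

key, iv = b"k" * BS, b"\x00" * BS
pt1 = bytearray(b"A" * (4 * BS))
pt2 = bytearray(pt1)
pt2[2 * BS] ^= 1  # flip one bit in block index 2

c1 = cbc_encrypt(key, iv, bytes(pt1))
c2 = cbc_encrypt(key, iv, bytes(pt2))
print([a == b for a, b in zip(c1, c2)])  # [True, True, False, False]
```

Blocks 0 and 1 are unchanged; block 2 and everything after it differ, 
exactly the single-pass forward propagation described above.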
  Just thinking that if someone makes a database of say the 
 first 4 bytes expected in a file format (like gzip, bzip2, 
 others, etc) then they can map most of the keys and immediately 
 know how to decrypt it (assuming it's of a particular 
 file/stream type).
Yep, this is one of the many reasons all secure block modes must 
use an IV (or equivalent).

BTW, if anyone's interested, here's an explanation of a real 
attack on short-block-size ciphers that doesn't assume background 
knowledge:
https://blog.cryptographyengineering.com/2016/08/24/attack-of-week-64-bit-ciphers-in-tls/

(The defence is to stop using crypto that was looking bad in the 
90s.)
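The linked attack is a birthday-bound problem. A quick back-of-the-
envelope in Python (using the standard birthday approximation, which is 
my framing rather than anything from the linked post) shows why 64-bit 
blocks fail at modern data volumes while 128-bit blocks don't:

```python
import math

def fifty_percent_collision_bytes(block_bits):
    """Approximate data volume (in bytes) for a ~50% chance of a
    repeated ciphertext block, via the birthday approximation
    n ~= sqrt(2 * ln(2) * 2**block_bits)."""
    n_blocks = math.sqrt(2 * math.log(2) * 2 ** block_bits)
    return n_blocks * (block_bits // 8)

# 64-bit blocks (DES, 3DES, Blowfish): a likely collision after only
# tens of gigabytes, well within reach of a long-lived TLS session.
print(fifty_percent_collision_bytes(64) / 2 ** 30)   # ~37.7 GiB

# 128-bit blocks (AES): roughly 300 EiB before the same risk.
print(fifty_percent_collision_bytes(128) / 2 ** 60)  # ~301 EiB
```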
Oct 10 2016
prev sibling parent reply Era Scarecrow <rtcvb32 yahoo.com> writes:
On Sunday, 9 October 2016 at 20:33:29 UTC, Era Scarecrow wrote:
  Something coming to mind is the idea of making a small 
 algorithm to be used with other already existing encryption 
 functions to extend the blocksize of encryption with minimal 
 complexity growth.
For fun I'm experimenting with this. So far it seems fairly 
promising, although I'm sure I'm writing it very insecurely. 
Maybe it would be better suited to random number generation than 
secure encryption? Not sure.

Anyways: 16-bit replacement, extending to 64-bit via reordering, 
and 8 unique XORs between stages. Once I get the 576 block 
finished (1 for salt) I'll probably publish my ugly code for 
consideration and to be torn apart for security issues.
Oct 12 2016
next sibling parent Era Scarecrow <rtcvb32 yahoo.com> writes:
On Wednesday, 12 October 2016 at 10:34:52 UTC, Era Scarecrow 
wrote:
  Anyways, 16bit replacement, extending to 64bit via reordering, 
 and 8 unique xor's between stages. Once I get the 576 block 
 finished (1 for salt) I'll probably publish my ugly code for 
 consideration and to be torn apart for security issues.
https://github.com/rtcvb32/Side-Projects/blob/master/block_shuffle.d

Here's my ugly code, doing 2-byte, 8-byte, and 72-byte blocks 
(it could do a 648-byte block as well using the same pattern, or 
520 bytes), and it includes an RNG. Rather than 8 passes I 
reduced it to 3 passes.

Currently no decided license, so... if anyone wants to glance it 
over.
Oct 15 2016
prev sibling parent reply Kagamin <spam here.lot> writes:
On Wednesday, 12 October 2016 at 10:34:52 UTC, Era Scarecrow 
wrote:
 Maybe it would be better for random number generation rather 
 than secure encryption? Not sure.
It's used in the Windows CRNG to compute a big hash of a big 
amount of entropy.

BTW, if you encrypt something twice, isn't it decryption?
Oct 17 2016
parent reply Era Scarecrow <rtcvb32 yahoo.com> writes:
On Monday, 17 October 2016 at 08:20:23 UTC, Kagamin wrote:
 On Wednesday, 12 October 2016 at 10:34:52 UTC, Era Scarecrow 
 wrote:
 Maybe it would be better for random number generation rather 
 than secure encryption? Not sure.
 It's used in windows CRNG to compute a big hash of big amount 
 of entropy. BTW if you encrypt something twice, isn't it 
 decryption?
No, not unless they are mapped in a way where x becomes y and y 
becomes x. Like ROT13, where encrypting again results in 
decryption, or XOR applied twice, which decrypts (or is 
redundant). Trust me, what I'm doing is considerably more 
complex; encrypting multiple times won't get you the original 
value (not for a very, very long time, unless you have a very 
poor set of RNG inputs).

You know, I can't help but remember the last time I got into 
encryption with Semeir, where it would create a number of 
circular chains of different lengths, and the rotation of where 
it would move to for encrypting would change based on not only 
the last character it encrypted but how far along in the stream 
it was. (Originally stream encryption, not block encryption.) 
Alas, breaking large data down into smaller and larger blocks (6 
bit, 12 bit, 20 bit, etc.) and encrypting each with a different 
set of keys was... time consuming, and it really was over my 
head... I doubt anyone uses it, although if they do I can almost 
guarantee no one can break it due to its obscurity :P
Oct 17 2016
parent Kagamin <spam here.lot> writes:
On Monday, 17 October 2016 at 08:46:36 UTC, Era Scarecrow wrote:
 encrypting multiple times won't get you the original value
My impression is different. This is how decryption looks for 
chacha:

void ECRYPT_decrypt_bytes(ECRYPT_ctx *x,const u8 *c,u8 *m,u32 bytes)
{
  ECRYPT_encrypt_bytes(x,c,m,bytes);
}
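That works because ChaCha is a stream cipher: it XORs a keystream into 
the data, and XOR is its own inverse, so one function serves both 
directions. A toy Python sketch of the same shape (the keystream here is 
counter-mode SHA-256 purely for illustration; the real ChaCha keystream 
generator is entirely different):

```python
import hashlib

def keystream(key, nonce, length):
    """Illustrative keystream: counter-mode SHA-256 (NOT real ChaCha)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(
            key + nonce + counter.to_bytes(8, "little")).digest()
        counter += 1
    return out[:length]

def chacha_like(key, nonce, data):
    """One function for both directions, like ECRYPT_encrypt_bytes:
    XOR the data with the keystream."""
    ks = keystream(key, nonce, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

key, nonce = b"k" * 32, b"n" * 8
msg = b"encrypting twice gets the original back"
ct = chacha_like(key, nonce, msg)
print(chacha_like(key, nonce, ct) == msg)  # True
```

So "encrypting twice" does decrypt a stream cipher, but that's a 
property of the XOR construction, not of ciphers in general.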
Oct 17 2016