[Botan-devel] Small file ciphering speed

Christophe Meessen meessen at cppm.in2p3.fr
Thu Sep 18 07:55:14 EDT 2008


Mr Diggilin wrote:
> Thank you for your reply. Unfortunately I have to confess I'm rather
> new to this, so for the most part you lost me.
> One thing I didn't make clear is that the database
> fields are completely independent of one another. Random access is the
> rule, and this is where the problem is. I am making a new pipe & key and
> all that for every single 50-byte field. My question is what my options
> are for making such a system faster (or perhaps someone has a completely
> different approach to recommend?).
>   
Unfortunately I'm not an expert on Botan, since I don't currently use it.
I subscribed to this mailing list to keep following its development, and
I might go back to it later.

So I can suggest a cryptographic method, but I won't be able to make a
sound suggestion on how to implement it with Botan.

The trick is to generate a unique key for each record.

To do so, use a unique global key (e.g. 256-bit). Then generate a random
sequence of bytes of the same length. Save both in a secure location.

Now you have to generate the specific key for each record. To do so, XOR
together the random byte sequence and the unique record id, which I hope
you have access to. If not, use some data from the record (e.g. the
primary key), but make sure it is unique. Then encrypt this value with the
global key, and use the resulting ciphertext as the key to encrypt and
decrypt the record.

In fact, the CFB chaining mode does something very similar. So a simpler
implementation would be to use the previously generated XORed result
(random bytes XORed with unique record data) as the IV of an AES/CFB or
Twofish/CFB cipher you run on your record data.
The effect is as if each record were encrypted with a unique key.


> I've already put forth the possibility of reusing the key and reusing
> the pipe...
> As for the key reuse, I understood the security problem you brought up,
> but I didn't understand the solution, perhaps because you thought I was
> doing something with the database that I'm not.
> As for the pipe reuse... is that recommended, or is there another way to
> deal with a whole bunch of little encryption jobs?
>
> Thanks again.
> -mr diggilin
>
> On Thu, 2008-09-18 at 08:27 +0200, Christophe Meessen wrote:
>   
>> Hello,
>>
>> I don't know how well this fits your needs or Botan, but here are my
>> suggestions.
>>
>>
>> Some information is missing, though. Is decoding always sequential and
>> in the same order?
>> Could there be missing/deleted/inserted records in the sequence?
>>
>> A single global key is enough if the amount of data is limited and the
>> encrypted data is not much exposed. Security can be increased by
>> using a different key for each record, but you'll pay for it with
>> increased processing time.
>>
>> The key to be used for each record would be a concatenation of a
>> sequence of random bytes common to all records (a salt) and the record
>> identifier. It is assumed that the record identifier is constant for a
>> given record. Encrypt this constructed key with a global key, and save
>> the random bytes, so the record key can be regenerated while remaining
>> secured and hidden by the global key. It is probably more secure to
>> insert the record identifier in the middle rather than at either end
>> of the record key.
>>
>> For the record itself I would recommend CTR, or better CFB, as the
>> chaining mode, because they don't require adding padding bytes and are
>> quite fast. The trick used to generate the secondary key is the same
>> working principle used in CTR.
>>
>> With this method you will be able to decode any record with random
>> access. Records can be added or deleted. The only constraint is the
>> constant record identifier.
>>
>> I don't understand the S2K with 100 hash rounds. If this is done for
>> each record, it is indeed not very efficient.
>>
>>
>>
>> Mr Diggilin wrote:
>>     
>>> I'm trying to use Botan for encryption and decryption of thousands of
>>> database rows with very little data (~50 bytes) in them. What I'm doing
>>> currently takes about 1/4 of a second per field, which is *much* too
>>> slow to be practical. I have a few thoughts on how to optimize, but I
>>> thought I'd ask here first to find out:
>>> a. What's a good way of doing this with Botan?
>>> b. What kind of speed can I expect?
>>> c. I'm using an S2K with 100 hash rounds for each operation, which isn't
>>> much. Would there be any security concerns if I reused the same key
>>> (with more rounds) over all of the thousands of entries?
>>>
>>> My current de/ciphering routine is as follows:
>>>
>>> auto_ptr<Pipe> Crypto::RunCipher(string Passphrase, wxInputStream * In,
>>> SecureVector<byte>& Salt)
>>> {
>>>   Cipher_Dir Dir = (Salt == NULL) ? ENCRYPTION : DECRYPTION;
>>>   KeyAndIV = 100 round s2k;
>>>   SymmetricKey Key(KeyAndIV, KeySize); //256 key
>>>   InitializationVector IV(KeyAndIV + KeySize, IVSize); //128 IV
>>>   auto_ptr<Pipe> Out(new Pipe(get_cipher("Twofish/EAX", Key, IV,
>>> Dir)));
>>>   Out->start_msg();
>>>   *In >> *Out;
>>>   Out->end_msg();
>>>   return Out;
>>> }
>>>
>>> _______________________________________________
>>> botan-devel mailing list
>>> botan-devel at randombit.net
>>> http://lists.randombit.net/mailman/listinfo/botan-devel
>>>   
>>>       
>
>   



