Data is sporadically generated and appended to a file (consider mbox or a log file).
I want to store only an encrypted version.
For a single "batch" of data, this sort of thing would work:
    $ echo hello | gpg -e -r key >file1
    $ <file1 gpg -qd
    hello
    $
    $ ( echo hello; sleep 10; echo world ) | gpg -e -r key >file1
    $ <file1 gpg -qd
    hello
    world
    $

Is there a syntax to allow multiple separate batches of data to be incrementally appended to the encrypted file?
    $ echo hello | gpg -e -r key [...] >file2
    $ sleep 10
    $ echo world | gpg -e -r key [...] >>file2
    $ <file2 gpg -qd [...]
    hello
    world
    $

I didn't notice any relevant options in the gpg manpage.
I see there is gpgtar, but that creates independent files.
The encryption output (third example above) seems to be just multiple PGP messages concatenated together, so presumably, if there is a way to detect the boundaries, one could split the input and invoke gpg separately on each message, although I guess that would be quite inefficient.
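For what it's worth, here is a rough sketch of that splitting idea. It is not from the gpg documentation, just an illustration: it assumes each batch was encrypted with ASCII armor (-a), so message boundaries are visible as BEGIN lines, and it relies on GNU csplit.

    # Split an ASCII-armored concatenation into one file per PGP message,
    # then decrypt each piece separately.
    # Assumes every batch was written with: ... | gpg -a -e -r key >>file2
    csplit -z -f msg_ file2 '/^-----BEGIN PGP MESSAGE-----$/' '{*}'
    for m in msg_*; do
        gpg -qd "$m"
    done
    rm -f msg_*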
I see that there is significant overhead for encrypting small amounts of data (598 bytes were added to each chunk in my test), so perhaps there is a better approach.
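That overhead figure can be reproduced with something like the following (the recipient name "key" is just a placeholder; the exact number depends on the key type and options):

    # Compare the plaintext size with the size of one encrypted message.
    plain=$(printf 'hello\n' | wc -c)
    enc=$(printf 'hello\n' | gpg -e -r key | wc -c)
    echo "per-message overhead: $((enc - plain)) bytes"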
From the comments:

- The manpage documents --chunk-size and --force-aead, which indicate that gpg encrypts data in chunks that a receiver of an encrypted stream can progressively check for errors. You would have to test whether appending encrypted chunks to an existing encrypted file still decrypts as desired when you run gpg over the whole file.
- dd conv=sync was also mentioned, but that would seem horrendously inefficient. It is also not clear it would help, since the output would still be a concatenation of separate messages. (Note that the idea is to invoke gpg multiple times, once for each arbitrarily-sized batch of input, not to have a single gpg process running permanently.)
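If someone wants to try the experiment suggested in the first comment, a minimal untested sketch might look like this. The option names come from the gpg(1) manpage, "key" is a placeholder recipient, and whether the concatenated file decrypts cleanly is exactly the open question; behaviour may vary between gpg versions.

    # Encrypt two batches with AEAD chunking, appending the second to the first,
    # then attempt a single decryption pass over the combined file.
    echo hello | gpg --force-aead --chunk-size 10 -e -r key  >file3
    sleep 10
    echo world | gpg --force-aead --chunk-size 10 -e -r key >>file3
    <file3 gpg -qd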