Yes, your reasoning is correct: tar doesn't sort files by extension (which could help a lot in achieving higher compression ratios), and gzip is a very old compression algorithm with a relatively small dictionary of just 32 KB.

Please try using xz or p7zip instead.
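For example, to keep the familiar tar workflow but swap gzip for xz at its strongest setting (the paths and archive name here are placeholders for illustration):

```shell
# Create a tar archive compressed with xz at maximum level plus
# "extreme" mode; XZ_OPT passes options through tar's -J (xz) filter.
XZ_OPT='-9e' tar -cJf archive.tar.xz /path/to/files

# Or pipe explicitly, which makes the xz invocation visible:
tar -cf - /path/to/files | xz -9e > archive.tar.xz
```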

Here's the command line that gives me the highest compression ratio under Linux:

7za a -mx=9 -myx=9 -mfb=273 -bt -slp -mmt4 -md=1536m -mqs archive.7z [list of files] 

This requires a ton of memory (at the very least 32 GB of RAM). If you remove -mmt4 and reduce the dictionary size to, say, 1024m, 16 GB is enough.

Speaking of sorting files for tar: I wrote a script that does just that a few years ago: https://github.com/birdie-github/useful-scripts/blob/master/tar_sorted
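The core idea of that script can be sketched as a single pipeline (a simplified illustration, not the actual script; it assumes GNU tar, a directory named src, and filenames without newlines):

```shell
# Sort the file list by extension so similar files end up adjacent in
# the archive, then let tar read the list from stdin (-T -).
# --no-recursion prevents tar from re-adding directory contents.
find src -type f \
  | awk -F. '{print $NF, $0}' \
  | sort \
  | cut -d' ' -f2- \
  | tar -czf archive.tar.gz --no-recursion -T -
```

Grouping files this way gives gzip's (or xz's) dictionary longer runs of similar data, which is where the extra compression comes from.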
