Timeline for Which file compression software for Linux offers the highest size reduction for source code?
Current License: CC BY-SA 3.0
19 events
| when | what | action | by | license | comment |
|---|---|---|---|---|---|
| Feb 9 at 17:28 | comment | added | Pooya Estakhri | | I tried lrzip -z with a JSONL file of around 19 GB. It took 2.6 hours to compress the file, and the result was about the same size as xz. |
| Nov 6, 2023 at 8:42 | comment | added | TheKitMurkit | | What about the "aggressive second-pass algorithm"? Where can one find info about it? |
| Jun 16, 2017 at 15:36 | comment | added | ierdna | | I submitted a bug report. Unfortunately I can't provide the PDF file because it contains confidential medical information. |
| Jun 16, 2017 at 15:26 | comment | added | Alexander Riccio | | @andrei turn on verbose/debug mode - it may help here. The place we're gonna submit the bug is: github.com/ckolivas/lrzip/issues |
| Jun 16, 2017 at 15:10 | comment | added | Alexander Riccio | | @andrei well it sounds like you hit a bug. What version is installed, and if it's the latest then we'll report it. Lemme grab the link. |
| Jun 16, 2017 at 15:02 | comment | added | ierdna | | I installed lrzip on Arch Linux, then ran lrzip -z document.pdf (the 56 MB file); it started working (counting 'chunks'), counted to 18, then threw that error. |
| Jun 16, 2017 at 14:20 | comment | added | Alexander Riccio | | @andrei can you elaborate? |
| Jun 16, 2017 at 12:19 | comment | added | ierdna | | On Arch Linux (with 2 GB RAM), while trying to compress a 56 MB PDF, it gets to 18 chunks, then: Illegal instruction (core dumped) |
| Jan 17, 2017 at 9:56 | comment | added | Astara | | Do you know what the difference would be between lrzip and rzip? rzip looks like it was released in 1998, designed to do best on very large files with long-distance redundancy, so it sounds similar to lrzip -- just wondering if lrzip was derived from rzip? (rzip from rzip.samba.org) |
| Oct 29, 2016 at 14:02 | comment | added | Denys Vitali | | Feels like Pied Piper! |
| S Apr 23, 2016 at 11:55 | history | edited | Jakuje | CC BY-SA 3.0 | added an example; formatting code |
| S Apr 23, 2016 at 11:55 | history | suggested | insign | CC BY-SA 3.0 | added an example |
| Apr 23, 2016 at 11:37 | review | Suggested edits | | | S Apr 23, 2016 at 11:55 |
| Nov 2, 2015 at 11:07 | comment | added | mitchus | | @Franki by 'contest', do you mean 'attest'? |
| Jan 20, 2015 at 12:04 | comment | added | fnl | | I've tried lrzip and pixz on a 19 GB text file. Both took about half an hour to compress it (on a hexa-core machine), but the lrz file was half the size of the xz file (2.7 vs. 4.4 GB). So, another vote for this answer instead. |
| Nov 27, 2014 at 7:11 | comment | added | Franki | | I also can contest that lrzip also works really great for backups of tar/cpio/pax'ed system file trees, because those usually contain lots of long-range redundancies, something that lrzip is really good at compressing. |
| S Mar 13, 2014 at 2:49 | review | Late answers | | | Mar 13, 2014 at 5:10 |
| S Mar 13, 2014 at 2:49 | review | First posts | | | Mar 13, 2014 at 6:46 |
| Mar 13, 2014 at 2:31 | history | answered | Alexander Riccio | CC BY-SA 3.0 | |