6 events
when | what | by | license | comment
Dec 18, 2023 at 15:47 comment added FABBRj This does not solve my problem.
Nov 14, 2022 at 0:06 comment added Stephen Harris What you're asking for isn't supported by "native" Unix tooling. You'd need a tool that would evaluate all the files to be archived, potentially optimise them in terms of size, and then create <n> independent archives. There's nothing "out of the box" that does this. I could see a scripted solution to this, and I'm sure someone somewhere has solved it. But I'm not sure it's a useful thing. Typically we archive things so we can restore them; cat * | tar... makes it easier to recover file1234.jpg. If it's split over 10 different files then it becomes harder to do the restore!
Nov 13, 2022 at 22:34 comment added Paul_Pedant @Martin You would have to make lists of each group of files of the approximate size you need to split, and fake the file numbering yourself (a scripted sketch of this idea appears after the timeline below). Split works on byte count or line count, but there is no way it can be aware of the tar file boundaries. As noted in another comment, most image formats (like jpeg) have integral compression, which is probably more effective than generic gzip because they can take advantage of typical image features. Further attempts at compression just waste CPU cycles.
Nov 13, 2022 at 18:15 comment added Martin Müller Thanks, indeed that fixed it for me! Do you know of a way to end up with complete workable splits?
Nov 13, 2022 at 18:09 vote accept Martin Müller
Nov 13, 2022 at 18:01 history answered Stephen Harris CC BY-SA 4.0
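The scripted approach the commenters describe (grouping files by approximate total size and creating one independent tar archive per group, so each part can be extracted on its own) is never spelled out, so here is a minimal sketch of one way to do it. The 1 GiB limit, the part-N.tar naming, and the *.jpg glob are illustrative assumptions, not anything specified in the comments.

#!/bin/bash
# Sketch: build several independent tar archives, each holding files whose
# combined size stays under a chosen limit. Unlike running `split` on a
# single tar stream, every part-N.tar produced here is extractable on its own.
# LIMIT, the part-N naming, and the *.jpg glob are illustrative choices only.

LIMIT=$((1024 * 1024 * 1024))   # target size per archive, in bytes (1 GiB)
part=1
total=0
list=$(mktemp)

for f in *.jpg; do
    size=$(stat -c %s "$f")     # GNU stat; use `stat -f %z` on BSD/macOS
    if (( total + size > LIMIT )) && (( total > 0 )); then
        tar -cf "part-$part.tar" -T "$list"   # archive the current group
        part=$((part + 1))
        total=0
        : > "$list"                           # start a new file list
    fi
    printf '%s\n' "$f" >> "$list"
    total=$((total + size))
done

# archive whatever is left in the final group
if (( total > 0 )); then
    tar -cf "part-$part.tar" -T "$list"
fi
rm -f "$list"

Each archive can then be restored independently with a plain `tar -xf part-1.tar`, which addresses the concern above about recovering a single file from a set of split pieces.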