Long ago I wrote a backup script for our site, and I've updated it ever since. Occasionally, though, things go wrong, and some of the older backups are now broken.
In days gone by I used the zipinfo utility in our automated scripts to figure out whether a previous backup was bad and, if so, re-attempt the backup. While we still have some zip archives lying around, there are (at least) two issues.
First, zip has fundamental limitations, so we've used tar for larger backups.
And second, zip doesn't capture as much metadata as tar does, so we prefer tar for certain kinds of things.
Further, we've shifted from zip to gzip, and we're compressing our tars as well...
Our backups are now huge, and I'm trying to figure out what to remove and what to keep; there's no point in keeping broken files. So I'm writing a script that merges our various backup directories (on-site, off-site, etc.), and I really need to check the validity of each file, because sometimes one copy gets corrupted while the other is fine.
I looked for a gzip equivalent of zipinfo but didn't find one. And I've never heard of such a thing for tar, but I may just be ignorant!
I sure don't want to have to resort to expanding into disk space!
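For what it's worth, here's the kind of check I've been experimenting with: `gzip -t` verifies a gzip file's internal CRC without writing anything to disk, and listing a compressed tar to /dev/null forces a full read of every member header. This is just a sketch; the filenames are made up, and in practice you'd point these commands at your real backups.

```shell
# Create a tiny sample backup so the commands below have something to test
# (stand-in for a real backup file).
echo "hello" > sample.txt
tar -czf backup.tar.gz sample.txt

# gzip -t decompresses in memory and checks the CRC/length trailer;
# nothing is extracted to disk.
gzip -t backup.tar.gz && echo "gzip layer OK"

# Listing the archive to /dev/null makes tar decompress and parse every
# member header, again without extracting any files.
tar -tzf backup.tar.gz > /dev/null && echo "tar structure OK"
```

Both commands exit non-zero on a damaged file, so they seem scriptable, but I'm not sure how thorough they really are.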
I'm especially worried about tar, because tar doesn't perform any kind of data checksumming internally.
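To show the shape of the sweep I have in mind, here's a rough sketch of looping over a merged backup directory and flagging bad files. The directory name, filenames, and the deliberately corrupted sample are all invented for illustration:

```shell
# Build a toy merged directory with one good and one broken "backup"
# (purely for demonstration; real backups would already be in place).
mkdir -p merged_backups
echo "data" > f.txt
tar -czf merged_backups/good.tar.gz f.txt
printf 'not really gzip' > merged_backups/bad.tar.gz

# Check each file: the gzip CRC must verify AND tar must be able to
# walk the whole archive. Neither step extracts anything to disk.
for f in merged_backups/*.tar.gz; do
    if gzip -t "$f" 2>/dev/null && tar -tzf "$f" > /dev/null 2>&1; then
        echo "OK      $f"
    else
        echo "BROKEN  $f"
    fi
done
```

Since the gzip layer carries a CRC-32 of the uncompressed data, I'm hoping this catches corruption even though tar itself has no checksums; corrections welcome if that assumption is wrong.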