I have looked through several topics about calculating checksums of files in Python, but none of them answered the question of getting one sum from multiple files. I have several files in subdirectories and would like to determine whether there was a change in one or more of them. Is there a way to generate one sum from multiple files?
EDIT: This is the way I do it to get a list of sums:
checksums = [(fname, hashlib.md5(open(fname, 'rb').read()).digest()) for fname in flist]
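A slightly safer variant of the same per-file idea (sketch only): using a context manager so each file handle is closed, and reading in chunks so large files are not loaded into memory at once. The helper name `file_md5` and the chunk size are illustrative, not from the original post:

```python
import hashlib

def file_md5(fname, chunk_size=8192):
    # Stream the file through the hash so large files need not fit in memory.
    h = hashlib.md5()
    with open(fname, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.digest()

# checksums = [(fname, file_md5(fname)) for fname in flist]
```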
Comment (answerer): You could feed every file to a single hash object via its .update method with the bytes of each file. But why bother? Simply hash each file separately, and see if any of the hashes have changed. That way, you also get the identity of which file(s) changed. But if you really want a multi-file hashing program, try writing it, and if you get stuck, post your code and I'll be happy to help.

Comment (OP): I think I should use .update() instead of .digest(), but I am not sure how. Do you mean calculate the hash for the first file like this: hash_obj = hashlib.md5(open(fname, 'rb').read()) and after that do hash_obj.update(fname)? Will that calculate the hash from the file contents or just from the filename string?

Comment (answerer): Use the .update method to supply extra data to the hashlib object. The .digest and .hexdigest methods are simply output methods that give the digest of the data that's been fed so far to the hashlib object. I don't have time right now to go into further details or write any code. But I recommend that you don't try to do this all in a one-line list comprehension: it might save a tiny bit of time, but it makes the code hard to work with and hard to read.
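A minimal sketch of both approaches the comments describe, assuming `flist` is a list of file paths. The function names are illustrative. Note that a combined digest depends on the order the bytes are fed in, so the files are sorted first to make the result independent of the list order:

```python
import hashlib

def combined_md5(paths):
    """One digest over the contents of many files, fed in a fixed order."""
    h = hashlib.md5()
    for p in sorted(paths):  # fixed order so the sum doesn't depend on list order
        with open(p, 'rb') as f:
            for chunk in iter(lambda: f.read(8192), b''):
                h.update(chunk)  # feed file bytes, not the filename
    return h.hexdigest()

def per_file_md5(paths):
    """Separate digest per file, so you can tell which file changed."""
    sums = {}
    for p in paths:
        h = hashlib.md5()
        with open(p, 'rb') as f:
            h.update(f.read())
        sums[p] = h.hexdigest()
    return sums
```

Comparing two snapshots from per_file_md5 tells you which files changed; combined_md5 gives the single sum the question asks for, at the cost of that per-file information.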