Comparing the MD5 results of split files against the MD5 of the whole

https://www.devze.com 2023-02-22 16:57 Source: web

I have a situation where I have one VERY large file that I'm breaking into smaller parts with the Linux "split" command. Later I use the Linux "cat" command to bring the parts all back together again.

In the interim, however, I'm curious...

If I take an MD5 fingerprint of the large file before splitting it, and later take MD5 fingerprints of each of the independent parts produced by the split command, is there some way to combine the parts' fingerprints (a sum, an average, or whatever you like to call it) and deduce the fingerprint of the single large file?

By (very) loose example...

bigoldfile.txt MD5 = 737da789
smallfile1.txt MD5 = 23489a89
smallfile2.txt MD5 = 1238g89d
smallfile3.txt MD5 = 01234cd7

someoperator(23489a89,1238g89d,01234cd7) = 737da789 (the fingerprint of the original file)


You likely can't do that: MD5 runs the data through an internal chaining state, so the digest of the whole file depends on every byte in order, not on anything you can recover from the parts' independent digests.

You could instead generate "incremental" hashes: the hash of the first part, the hash of the first plus second parts, and so on.
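A minimal sketch of those incremental hashes using the same command-line tools (the file contents and part names here are assumptions for the demo; split names its pieces part_aa, part_ab, ... when given the part_ prefix):

```shell
# Demo: build a small file and split it into three parts
printf 'The quick brown fox jumps over the lazy dog' > bigoldfile.txt
split -b 15 bigoldfile.txt part_   # -> part_aa, part_ab, part_ac

# Incremental hashes: each digest covers everything up to that part
cat part_aa                 | md5sum   # first part only
cat part_aa part_ab         | md5sum   # first + second
cat part_aa part_ab part_ac | md5sum   # all parts == the whole file
md5sum bigoldfile.txt                  # same digest as the line above
```

Each digest here certifies a prefix of the file, so a mismatch tells you where corruption begins, at the cost of re-reading earlier parts for each check.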


Not exactly but the next best thing would be to do this: cat filepart1 filepart2 | md5sum or cat filepart* | md5sum

Be sure to cat them back together in the correct order. By piping the output of cat you don't have to worry about creating a combined file that is too large.
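For example, a sketch of that verification workflow (the file contents and the part2_ prefix are assumptions; note that the shell expands part2_* in lexicographic order, which matches the aa, ab, ac suffix order split generates, so the glob reassembles the parts correctly):

```shell
# Record the fingerprint before splitting
printf 'some very large file contents' > bigoldfile2.txt
orig=$(md5sum bigoldfile2.txt | awk '{print $1}')

# Split, then verify by piping the parts straight into md5sum --
# no combined temporary file is ever written to disk
split -b 10 bigoldfile2.txt part2_
joined=$(cat part2_* | md5sum | awk '{print $1}')

[ "$orig" = "$joined" ] && echo "parts reassemble to the original"
```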
