I’ve recently revamped my entire backup system. A datahoarder/archivist like me has tons of data, with different tiers of importance, spread across multiple storage locations. Though I’ve never lost anything important to corruption, I absolutely understand the importance of verifying that your data is what it says it is.

Therefore, I set out to find a utility with a little more intelligence than a basic md5sum, but nothing overbearing. I looked at hashdeep, cfv, and rhash. I needed the ability to easily create, update (detect new files), and audit/verify checksums in a minimally invasive, easy-to-use package.

I decided on rhash!

Here are the commands I use to achieve a pretty effective system for handling file integrity. All of these are executed in the current working directory, the place where I want all files/dirs to be hashed. I’m using CRC32, but many other hash algorithms are available.
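For reference, an SFV file is just plain text: one filename/CRC32 pair per line, with lines starting with ";" treated as comments. Here's a minimal Python sketch of how one such line is produced (illustrative only; the function name is mine, not anything from rhash):

```python
import zlib

def sfv_line(path):
    """Return an SFV-style entry: the filename followed by its CRC32
    as eight uppercase hex digits. The file is read in chunks so large
    files don't need to fit in memory."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            crc = zlib.crc32(chunk, crc)  # running checksum across chunks
    return f"{path} {crc & 0xFFFFFFFF:08X}"
```

The create command below writes one line like this for every file it finds under the given directory.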

Create:

rhash -rCv ./ -o crc32.sfv

Update:

rhash -rCv -u ./crc32.sfv .
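The heart of the update step is detecting files that aren't in the checksum list yet. A rough Python equivalent of that detection (a sketch under my own assumptions, not rhash's actual logic):

```python
import os

def new_files(root, sfv_path):
    """Return relative paths under `root` that have no entry in the SFV file."""
    listed = set()
    if os.path.exists(sfv_path):
        with open(sfv_path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith(";"):
                    # SFV format: everything before the final space is the filename
                    listed.add(line.rsplit(" ", 1)[0])
    found = []
    for dirpath, _dirs, files in os.walk(root):
        for name in sorted(files):
            rel = os.path.relpath(os.path.join(dirpath, name), root)
            # skip already-listed files and the checksum file itself
            if rel not in listed and rel != os.path.basename(sfv_path):
                found.append(rel)
    return found
```

The update command then hashes just these new files and appends the results, leaving existing entries untouched.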

Audit:

rhash -crv crc32.sfv
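Conceptually, the audit recomputes each file's CRC32 and compares it against the stored value. A simplified Python version of that check (a hedged sketch; the function name and return shape are my own, not how rhash reports results):

```python
import os
import zlib

def audit_sfv(sfv_path, root="."):
    """Return {filename: 'OK' | 'MISMATCH' | 'MISSING'} for every SFV entry."""
    results = {}
    with open(sfv_path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith(";"):
                continue  # skip blanks and comments
            name, expected = line.rsplit(" ", 1)
            path = os.path.join(root, name)
            if not os.path.isfile(path):
                results[name] = "MISSING"
                continue
            crc = 0
            with open(path, "rb") as fh:
                for chunk in iter(lambda: fh.read(65536), b""):
                    crc = zlib.crc32(chunk, crc)
            actual = f"{crc & 0xFFFFFFFF:08X}"
            results[name] = "OK" if actual == expected.upper() else "MISMATCH"
    return results
```

A mismatch here is the signal that a file silently changed since the last hash run, which is exactly what this whole setup exists to catch.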

Mario Loria is a builder of diverse infrastructure with modern workloads on both bare-metal and cloud platforms. He's traversed roles in system administration, network engineering, and DevOps. You can learn more about him here.