Merging directories without loss of data

A problem I often have is two directories that are probably mostly the same, but not entirely: some of the files in one may be newer (edited) versions of those in the other.

For example, take two directories A and B, each containing a handful of files, including a `document.txt` on each side.

Now, I want to merge A and B. With only a small number of files, I could easily check by hand whether document.txt is the same on both sides, &c. In a large directory, however, this becomes impossible, so I wrote up a small utility to do it:

mergedirs B A

will go through all of the files in B and check whether an equivalent file exists in A. If so, it checks the contents (and flags, depending on the command line arguments used): a file in B that is an exact duplicate of its counterpart in A is redundant and can be removed, but the tool refuses to remove any file for which you do not have an identical copy.
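The core logic can be sketched roughly like this (a minimal Python sketch of the idea, not the actual mergedirs implementation; it compares file contents only and ignores flags):

```python
import hashlib
import os


def file_hash(path):
    """Return the MD5 hex digest of a file's contents."""
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(65536), b''):
            h.update(chunk)
    return h.hexdigest()


def merge_dirs(source, target):
    """Walk `source`; remove each file that has an identical counterpart
    (same relative path, same contents) in `target`.

    Files that differ, or that have no counterpart, are left untouched,
    so nothing is ever removed without an identical copy surviving."""
    for dirpath, _dirnames, filenames in os.walk(source):
        for name in filenames:
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(src, source)
            dst = os.path.join(target, rel)
            if os.path.isfile(dst) and file_hash(src) == file_hash(dst):
                os.remove(src)  # safe: an identical copy exists in target
```

After a run like this, whatever remains in the source directory is exactly the set of files you still need to look at by hand.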

Another cute thing it can do is compute a hash of a directory with all its files:

mergedirs --mode=hash

Prints out (for a directory called merge):

merge                    4a44a8706698da50f41fef5fdcffd163

This can be useful to check whether two directories on different computers are exactly the same (in terms of file contents, flags &c).

It’s mostly a tool I wrote to scratch my own itch. I have no plans to develop it beyond my needs, but it might be useful for others too.

