I am currently working on a large project (a PhD thesis) using git for version control. My source files are processed by LaTeX and are plain text, so they are naturally suited to version control.
The images in the project, however, are binary files (pdf/ps/png/tiff) that do not lend themselves well to line-based diffing and lead to an ever-increasing repository size.
For these types of data, my university provides me with a cloud service (Box) that is simply mounted as an ordinary folder on my computer.
The question is the following: can a shell script be written that selectively copies the latest version of all files of only a specific type (e.g. pdf/png) from the repository into the mounted cloud folder?
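Such a selective copy could be sketched roughly as follows. This is only a sketch: the function name `sync_binaries`, the chosen extensions, and the paths are my assumptions, and `cp -u`/`cp -t` are GNU coreutils options (not available on stock macOS `cp`).

```shell
#!/bin/sh
# Sketch: sync_binaries SRC DEST copies the newest version of selected
# binary file types into the cloud-mounted folder.
# Note: cp -u (copy only if source is newer) and -t (target directory)
# are GNU extensions; this also flattens the directory structure,
# since every matched file is copied directly into DEST.
sync_binaries() {
    src="$1"
    dest="$2"
    mkdir -p "$dest"
    find "$src" -type f \
        \( -name '*.pdf' -o -name '*.png' -o -name '*.tiff' \) \
        -exec cp -u -t "$dest" {} +
}
```

Called as e.g. `sync_binaries ~/thesis ~/Box/thesis-images`, it could be wired into a git `post-commit` hook or a cron job so the copy happens without manual steps.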
Does anyone know of a way to manage such binary files alongside a VCS across multiple machines? git-annex does not work for me. I do want the (limited) version control that services like Box/Dropbox provide.
Maybe the answer is as simple as hard-linking? But I do not want to manually hard-link each file as it gets generated, renamed, or deleted; I would like the process to be seamless.
rsync, look into that. – msb Jul 08 '19 at 17:09