How do I copy an entire directory into a directory of the same name without replacing the content in the destination directory? (instead, I would like to add to the contents of the destination folder)
-
Yes, it is annoying when a useful program/utility does not have that one "if only it could ..." option! In this case "--noclobber"! – MikeW Feb 07 '18 at 10:57
9 Answers
Use rsync, and pass -u if you want to only update files that are newer in the original directory, or --ignore-existing to skip all files that already exist in the destination.
rsync -au /local/directory/ host:/remote/directory/
rsync -a --ignore-existing /local/directory/ host:/remote/directory/
(Note the / on the source side: without it rsync would create /remote/directory/directory.)
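If you are unsure which files either form would touch, a dry run previews the transfer without copying anything (this assumes a standard rsync that understands -n/--dry-run and -v; the host and paths are the same placeholders as above):
rsync -aunv /local/directory/ host:/remote/directory/
rsync -anv --ignore-existing /local/directory/ host:/remote/directory/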

-
@Anthon I don't understand your comment and I don't see an answer or comment by chandra. --ignore-existing does add without replacing, what data loss do you see? – Gilles 'SO- stop being evil' Nov 27 '13 at 09:59
-
Sorry, I only looked at your first example; that is where you can have data loss (and is IMHO not what the OP asked for). If you include --ignore-existing, data loss should not happen. – Anthon Nov 27 '13 at 10:08
-
This does not help if the remote system does not have rsync easily available... (Like Win32-OpenSSH) – Gert van den Berg Oct 25 '16 at 08:00
-
@GertvandenBerg rsync is pretty easy to install on Windows, no harder than SSH. – Gilles 'SO- stop being evil' Oct 25 '16 at 11:51
-
@Gilles: True, but all of the options seem to involve Cygwin DLLs... (The current state of the MS port of OpenSSH is such that enabling compression on scp is enough to break SCP...) (Getting rsync functional over Win32-OpenSSH also seems non-trivial - hopefully that improves over time) (Solaris 10 is the other example, where a third-party package and --rsync-path is needed) – Gert van den Berg Oct 25 '16 at 13:01
Just to be sure, this (first line) only updates one way: the newer files in the original directory replace files in the destination, but newer files in the destination do not update the source, right? I was surprised that it said it had both sent and received files (received was really small, so maybe it is only some data used during the transfer?) – Kvothe Jun 12 '18 at 08:28
-
@Kvothe When rsync tells you how much data it's sent and received, that's total data, not just file contents. Data flows both ways: the side that's sending file contents needs to know what to send. – Gilles 'SO- stop being evil' Jun 12 '18 at 08:55
-
Note that this assumes that you want to copy the files from your local machine to a remote machine. If you instead want to copy the files from the remote machine to your local machine, the last two arguments need to be swapped. – HelloGoodbye Apr 24 '23 at 09:08
scp will overwrite the files if you have write permissions to them.
In other words:
You can make scp effectively skip said files by temporarily removing the write permissions on them (if you are the files' owner, that is).
-
To make sure you copy the files back, add a * to do so. Example: scp -r user@server.com:/location/of/files/* /local/location/ – Rick May 27 '15 at 19:16
-
In the uncommon case that only some of the files in the directory are expected to be overwritten (if not "protected"), and file permissions are not uniform across files, this solution may not work. Otherwise, it is quite simple and effective. – sancho.s ReinstateMonicaCellio May 20 '18 at 12:55
-
To do this on Windows, select all the files, right click -> Properties -> Read-only – BenJammin Sep 06 '18 at 15:17
-
Nice trick. You can add 2>/dev/null at the end to disregard the complaints. I was copying over 19200 remote files while ignoring the existing ones, so the screen was a mess. – David Jung Nov 19 '18 at 04:46
-
Strange, on SLES 15.1 scp didn't complain that it cannot overwrite and copied all the files, even though they didn't have the w permission. – Peter VARGA Apr 05 '20 at 14:06
-
Had to combine the tips from @BenJammin and @david-jung to do it from Windows, because without the error redirection the copy stopped at the first file that already existed with the read-only flag. – 40detectives Sep 23 '21 at 07:46
You can copy only new files by date. Use find:
scp `find /data/*.gz -type f -mtime -7` USER@SERVER:/backup/
From the manpage (-atime is for last accessed time, but the principle is the same):
-atime n
File was last accessed n*24 hours ago. When find figures out how many 24-hour periods ago the file was last accessed, any fractional part is ignored, so to match -atime +1, a file has to have been accessed at least two days ago.
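Note that the backtick substitution above breaks on filenames containing whitespace. A workaround sketch, assuming GNU find and xargs and reusing the same placeholder paths (it runs one scp per file, so it is slower but safe):
# copy each matching file in its own scp invocation
find /data -name '*.gz' -type f -mtime -7 -print0 | xargs -0 -I{} scp {} USER@SERVER:/backup/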
If you can make the destination file contents read-only:
find . -type f -exec chmod a-w '{}' \;
before running scp (it will complain about and skip the existing files).
And change them back afterward (chmod +w to get the umask-based value). If the files do not all have write permission according to your umask, you would somehow have to store the permissions so that you can restore them.
(Gilles' answer overwrites existing files if they are locally newer; I lost valuable data that way. I do not understand why that wrong and harmful answer has so many upvotes.)
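One way the store-and-restore step could look, assuming you can run commands in the destination directory and the acl tools (getfacl/setfacl) are installed there; the paths are only illustrative:
cd /path/to/destination
getfacl -R . > /tmp/perms.acl           # record the current permission bits
find . -type f -exec chmod a-w '{}' \;  # make every file read-only so scp skips it
# ... run the scp from the source machine here ...
setfacl --restore=/tmp/perms.acl        # put the original permissions back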
-
I don't get it: how did rsync --ignore-existing cause you to lose data? – Gilles 'SO- stop being evil' Nov 27 '13 at 10:01
-
I got the error find: missing argument to '-exec' using this command, and instead had to use: find . -type f -exec chmod a-w {} \; . My linux is bad, ymmv. – wpearse Apr 06 '15 at 00:16
-
This is a better and safer answer. The correct command to change the permission is as @wpearse mentioned: find . -type f -exec chmod a-w {} \; – Amir Oct 17 '17 at 16:41
To copy a whole bunch of files, it's faster to tar them. By using -k you also prevent tar from overwriting files when unpacking it on the target system.
tar -c <source-dir> | ssh <name>@<host> 'tar -kxzf - -C <target-dir>'
Note: If your tar -c command creates a POSIX tar archive (GNU), you will have to run the extract as tar -kxvf - instead, otherwise you'll get error messages like "gzip: stdin: not in gzip format".

-
It does make a remote connection. First it tars the source, pipes it into the ssh connection and unpacks it on the remote system. – huembi Aug 22 '16 at 21:17
I had a similar task. In my case I could not use rsync, csync, or FUSE because my storage has only SFTP. rsync could not change the date and time for the file, and some other utilities (like csync) showed me other errors: "Unable to create temporary file Clock skew detected".
If you have access to the storage server, just install openssh-server or launch rsync as a daemon there.
In my case I could not do this, and the solution was lftp. lftp's usage for synchronization is below:
lftp -c "open -u login,password sftp://sft.domain.tld/; \
mirror -c --verbose=9 -e -R -L /src/folder /rem/folder"
/src/folder is the folder on my PC, /rem/folder is sftp://sft.domain.tld/rem/folder.
You can find the man pages at http://lftp.yar.ru/lftp-man.html
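If you only want to add files and never delete anything on the destination, a gentler variant is to drop -e (which removes files that are missing on the source) and, if your lftp is recent enough to support it, use -n/--only-newer (a sketch with the same placeholder credentials and paths):
lftp -c "open -u login,password sftp://sft.domain.tld/; \
mirror -n --verbose=9 -R -L /src/folder /rem/folder"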
-
Fabulous! While it's not using scp (the binary) but sftp (the same protocol), it helps to achieve a synchronization when sftp is the only protocol available - hence no ssh and therefore no rsync, ls, tar or any of the other proposed solutions. Thank you! Btw, if you want to mirror the remote folder to a local folder, just drop the "-R" flag. – Martin Rüegg Jun 14 '20 at 10:59
-
Awesome, this worked for me as well. Just some notes. Careful about using -R, that deletes items if they're not present (believe it's similar to the --delete flag in rsync). The /rem/folder is a relative path. So if you wanted to copy something to the base path, you'd do: sftp://stf.domain.tld/; ...... /src/folder ../../ – Sean Breckenridge Jun 29 '20 at 01:31
-
@SeanBreckenridge it's -e that causes delete, -R means put instead of get, just that. – Vesper Nov 23 '23 at 11:11
Another way to achieve this is to do an ls on the destination folder:
On remote destination folder:
ls | awk '{print "mv " $1 " ../copied_data/"}' > mv_copied_data
scp mv_copied_data user@source.server.com:/path/to/source/folder
On source:
cd /path/to/source/folder
chmod 777 mv_copied_data
./mv_copied_data
On destination:
scp -r user@source.server.com:/path/to/source/folder /path/to/destination/folder

-
Please don't do this. 1. Don't parse ls. 2. The answer is dangerous if any filename contains -, >, space, newline or other special characters. 3. Hard-code a for-loop in a python script instead. Generating a program at runtime is hard and tricky (even when done by professional programmers), for a lot of reasons. Generating a shell script is even worse - an awful idea. – ignis Jun 01 '19 at 21:12
Another option not using rsync (perhaps for portability) is using sftp.
get -a -r FILES will attempt to resume copies and will only bother to copy differences.
This is in case you want to avoid copying duplicate files, but it will not protect files that are different and need to remain different.
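For example, an interactive session pulling a whole remote tree while attempting to resume files that already exist locally might look like this (a sketch; the host and paths are placeholders, and get -a needs a reasonably recent OpenSSH sftp):
sftp user@host
sftp> get -a -r /remote/directory /local/directory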

scp does overwrite files and there's no switch to stop it doing that, but you can copy things out of the way, do the scp and then copy the existing files back. Examples:
Copy all existing files out of the way
mkdir original_files ; cp -r * original_files/
Copy everything using scp
scp -r user@server:dir/* ./
Copy the original files over anything scp has written over:
cp -r original_files/* ./
-
This method doesn't help when you're trying to pull files over from a remote and pick up where you left off, i.e. if the whole purpose is to save time. – Oliver Williams Dec 01 '16 at 17:58