17

I'd like to run an scp transfer to download the latest (newest) file in a certain directory to my local directory.

Something like this:

  • source: root@rimmer.sk:/home/rimmer/backups/
  • destination: /home/rimmer/backups/

But it should fetch only the newest file in backups, not all of them.

Frantisek
  • 415

3 Answers

21

Assuming you have the variables server and dir defined, you can do

$ dir="~"
$ server="user@server.com"
$ scp "$server:$dir/$(ssh $server "ls -t $dir | head -1")" .

This first looks up the newest file on the remote side and then copies it.

Note: I have not checked this for edge cases (e.g. the newest entry being a directory).
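
If you want to guard against that case, one possible variant (assuming the remote ls supports -p, which appends a slash to directory names; newest is just a throwaway variable used here for readability) is to filter directories out before taking the first entry:

$ newest=$(ssh $server "ls -tp $dir | grep -v '/$' | head -1")
$ scp "$server:$dir/$newest" .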

Bernhard
  • 12,272
  • If it's a directory, that won't scp anything. You need to use scp -r. – slm Jul 10 '13 at 08:06
  • @slm Correct, but he might not want to copy that. – Bernhard Jul 10 '13 at 08:10
  • If the scp gets interrupted it will have to restart the copying from the beginning; rsync can resume from where it left off. – slm Jul 10 '13 at 08:24
  • 1
    For bash, I needed to replace the single quotes with double quotes in the argument to ssh. – Johannes Bittner Mar 14 '19 at 17:21
  • 1
    You can remove the first $dir by doing the following (added a /* to the end of the second $dir): scp $server:$(ssh $server 'ls -t $dir/* | head -1') . – Jesse Apr 23 '19 at 03:20
4

scp is dumb in the sense that it just blindly copies files from source to destination. If you want something that's more intelligent about copying files you'll need to use a tool such as rsync.

$ rsync -avz root@rimmer.sk:'$(find /home/rimmer/backups/ -ctime -1)' /home/rimmer/backups/

This will copy only the files from rimmer.sk's backups directory that are missing locally or whose status changed within the last day (-ctime -1) into your local backups directory.

-ctime n
   File's  status  was last changed n*24 hours ago.  See the comments for 
   -atime to understand how rounding affects the interpretation of file 
   status change times.
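
If you want find to match only regular files, so that a directory in its output does not drag an entire subtree along, a small variant is to add find's -type f (the single quotes still defer the $(...) expansion to the remote shell, exactly as in the command above):

$ rsync -avz root@rimmer.sk:'$(find /home/rimmer/backups/ -type f -ctime -1)' /home/rimmer/backups/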

slm
  • 369,824
  • The problem with rsync is the same. If I've been offline for 2 days, it tries to copy all backups from all days, which I don't need at all and is huge. – Frantisek Jul 10 '13 at 07:54
  • @RichardRodriguez - OK please update your question and be more specific what you're after. – slm Jul 10 '13 at 07:57
  • 2
    @slm: I think the question is all right as it is. I just want to download the latest file in a directory. What more can I explain about it? :) – Frantisek Jul 10 '13 at 07:58
  • @RichardRodriguez - what files are in this directory? A sample listing would be helpful. – slm Jul 10 '13 at 07:59
  • @RichardRodriguez - here's why I'm saying this - if you've been offline for 2 days and you connect back, what's the test that I would do as the script, to know oh skip these other days and only upload today's files and/or directories. It's difficult to imagine a script or one-liner without knowing what the cutoff is. What does the term "newest" mean to you? – slm Jul 10 '13 at 08:02
  • @RichardRodriguez - see update, this will find all the files that have changed in the last day and will only rsync those. – slm Jul 10 '13 at 08:09
  • @slm With find you can add -type f. Removed my downvote, as I think it now better reflects the question. (just rsync was a too easy answer to me) – Bernhard Jul 10 '13 at 08:13
  • @Bernhard - thank you for undoing that. I agree, but remember that a lot of people are unfamiliar with the differing technologies so what's easy to us, might be a foreign concept to others 8-). Yeah the -type f would exclude directories so that's why I left it out. – slm Jul 10 '13 at 08:15
  • 1
    Why the down vote? – slm Jul 10 '13 at 14:41
  • @slm I removed all but one, I think that is relevant to keep. – Bernhard Jul 10 '13 at 14:55
  • @Bernhard - thanks. I'm thinking someone saw our exchange and just downvoted without reading everything. I hate when ppl downvote and don't say why. – slm Jul 10 '13 at 14:58
  • @slm I agree! You always want to improve your post, and I never avoid a discussion. Anyhow, now I upvoted, as this answer does not deserve to have a negative score! – Bernhard Jul 10 '13 at 17:52
1

A bit late to the party, but perhaps a solution with ssh and rsync will work for some:

#!/bin/bash

source_host="yourhost.com"
source_dir="/a/dir/on/yourhost.com/"
target_dir="/the/dir/where/last_backup/will/be/placed"

# ask the remote host for the newest entry in the source directory
last_backup=$(ssh "user@${source_host}" "ls -t ${source_dir} | head -1")

if [ -z "${last_backup}" ]; then
    echo "ERROR: didn't find a backup, cannot continue!"
else
    echo "the last backup is: ${last_backup}"
    rsync -avzh "user@${source_host}:${source_dir}/${last_backup}" "${target_dir}"
fi
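
As mentioned in the comments on the previous answer, one nice property of rsync is that an interrupted transfer does not have to start over. An optional tweak is to add --partial (or -P, which is --partial plus --progress) so a partially transferred backup is kept and a rerun can build on it:

rsync -avzh --partial "user@${source_host}:${source_dir}/${last_backup}" "${target_dir}"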
DRAD
  • 61