I'm writing a bash script that needs to fetch all *_out.csv files from a directory on a remote server. These files sit several directories deep inside another directory; say, for instance, that directory is called ox_20190404/. I can find all my files by running:
find ox_20190404/assessment/LWR/validation -type f -name "*_out.csv"
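Since the files live on the remote server, the same find can be run over ssh; a minimal sketch, assuming the server and base path introduced below:

$ ssh myusername@server 'find /projects/ox/git/ox_20190404/assessment/LWR/validation -type f -name "*_out.csv"'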
This question answers part of my question, but since I don't want to copy the directory in its entirety, I need to figure out how to incorporate the find command above. Suppose I start with this:
$ dir="/projects/ox/git"
$ server="myusername@server"
$ scp "$server:$dir/$(ssh "$server" "ls -t $dir | head -1")" .
How would I grab the files I need from there?
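I imagine something like the following sketch might work (untested; it feeds each remote path that find prints into its own scp call):

newest=$(ssh "$server" "ls -t $dir | head -1")
# list the matching files on the remote, then copy each one locally;
# </dev/null keeps scp from swallowing the rest of the list on stdin
ssh "$server" "find $dir/$newest/assessment/LWR/validation -type f -name '*_out.csv'" |
while IFS= read -r f; do
    scp "$server:$f" . </dev/null
done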
The last part of my question is whether there is a way to take all the copied files and place them in the same paths and directory structure they had on the remote server.
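If rsync is available on both ends, a sketch of that last part: --files-from reads the file list from stdin and recreates each file's path, relative to the source directory, under the local destination (reusing $server, $dir, and $newest from above):

# list paths relative to $dir so rsync rebuilds the same tree under .
ssh "$server" "cd $dir && find $newest/assessment/LWR/validation -type f -name '*_out.csv'" |
rsync -av --files-from=- "$server:$dir" .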
Comments:

… ox_20190404 in the lead-up, so it's not clear how you selected it. – Jeff Schaller Apr 05 '19 at 20:11

ls -t /projects/ox/git | head -1; then ox_20190404 is the directory that is returned. I then want to go inside that folder and get the files from there. – dylanjm Apr 05 '19 at 20:13

Is zsh available on $server? – Jeff Schaller Apr 05 '19 at 20:14

So the scp command would explicitly list all of the *_out.csv files underneath the most recent directory under $dir in order to be copied locally? – Jeff Schaller Apr 05 '19 at 20:20

… ox_20190404/assessments/LWR/validation/. That's where I want to find all my *_out.csv files. – dylanjm Apr 05 '19 at 20:22

… ssh access. Alternatively, maybe an sshfs mount? – Jeff Schaller Apr 05 '19 at 20:24
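Following up on the sshfs suggestion in the last comment, a minimal sketch (assuming sshfs and FUSE are available locally; the mount point and destination path are placeholders):

mkdir -p ~/ox_mount
sshfs myusername@server:/projects/ox/git ~/ox_mount
# once mounted, ordinary local tools work on the remote tree;
# GNU cp --parents recreates each file's relative path under the destination
cd ~/ox_mount
find ox_20190404/assessments/LWR/validation -type f -name "*_out.csv" \
    -exec cp --parents {} /path/to/local/dest \;
cd ~ && fusermount -u ~/ox_mount   # unmount when finished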