You're all assuming the directory contains only files. If there are subdirectories, they end up in the exported list too, so you'll have to filter those out somehow as well.
One could use ls -l and filter out lines starting with d:
sftp -q user@host:/dir <<<"ls -l *.txt" | grep -v -E "^sftp|^d"
but that will still leave symlinks, so perhaps:
sftp -q user@host:/dir <<<"ls -l *.txt" | grep -v "^sftp"| grep -- ^-
where
grep -v -E "^sftp|^d"
filters out the echoed sftp command and ^d (directories), but not ^l (symlinks)... and what else?
grep -- ^-
keeps only regular files (lines starting with -).
However, ls -l prints redundant info, so that also has to be cut off, using cut or awk as the direct approach:
sftp -q user@host:/dir <<<"ls -l *.txt" | grep -v "^sftp"| grep -- ^- | tr -s ' '|cut -d' ' -f9
sftp -q user@host:/dir <<<"ls -l *.txt" | grep -v "^sftp"| grep -- ^- | awk -F' ' '{print $9}'
sftp -q user@host:/dir <<<"ls -l *.txt" | grep -v "^sftp"| grep -- ^- | cut -c57-
tr -s ' '
squeezes repeated blanks, so the fields split cleanly for further processing.
cut -d' ' -f9
keeps only field #9 (usually the filename, but check on your system).
awk -F' ' '{print $9}'
also outputs only the 9th field.
cut -c57-
simply removes characters 1-56 and leaves the filename, on my system. This fragile cutting is the only option I've found when filenames contain whitespace...
So it takes something this weird just to get a simple list of regular files from sftp that you can use for further processing...
Update:
For filenames containing whitespace, I can't find anything better than cut -c57- to extract just the filename from the ls -l output. But it's not reliable, since different systems may put the name field at a different position in the ls output, and even the same system will shift the columns when file-owner names are long!
I've found this can be avoided by using ls -n, which replaces names with numeric IDs, but the approach is still not "nice"...
I was thinking of trying awk to extract the filename from $0, starting at the first character of $9 and running to the end of the line.
Does anyone know how, in awk, to get the position of a field, say $9, within the line $0?
I haven't found any reliable way to do that. index($0,$9) produces a wrong result when the filename happens to match the username or some number in an earlier column. Summing the lengths of $1 through $8 is also wrong, since the field delimiters are repeated in runs.
I've also noticed that using ls -n * prints / after directory names, which is then easy to filter. However, using any glob mask with ls prints a message to stderr when nothing matches (Can't ls: "/curdir/*" not found), which is annoying.
So, does anyone have ideas on how to get a list of regular files on sftp, handling filenames with whitespace and maybe other special characters?