
I can list file names, but a lot of unwanted output comes along as well.

> echo "ls *.txt" | sftp user@host:/dir
Connected to 1.1.1.1.
Changing to: /dir
sftp> ls *.txt
1.txt
2.txt
3.txt

Is there a proven/reliable way to list files and only files? I'd like to avoid using head-like filters if possible.

4 Answers

Use the -q option to tell sftp to be quiet, thereby suppressing most of the output you don't care about:

echo "ls *.txt" | sftp -q user@host.example.com:/path

You will still see the lines for the interactive prompt to which you are echoing, e.g. sftp> ls *.txt, but those can be filtered out with grep -v:

echo "ls *.txt" | sftp -q user@host.example.com:/path | grep -v '^sftp>'

As an aside, it's probably better practice to use a batch file, and pass it with the -b parameter to sftp rather than echoing a pipeline into it.
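A minimal sketch of that batch-file approach (host, path, and filenames are placeholders; the sftp invocation itself is left commented out since it needs a live server):

```shell
# Write the commands to run, one per line, into a batch file:
printf 'ls *.txt\nbye\n' > /tmp/sftp-batch

# Then run it non-interactively against the server, e.g.:
#   sftp -q -b /tmp/sftp-batch user@host.example.com:/path
```

With -b, sftp aborts on the first failing command (unless the command is prefixed with -), which makes scripted transfers far easier to reason about than an echoed pipeline.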

If all you really want to do is get a list of files, this might actually be better served with ssh than with sftp (which, after all, is the secure file transfer program):

ssh user@host.example.com ls -1 /path
DopeGhoti

Mount the remote directory tree through SSHFS. SSHFS is a remote filesystem that uses the SFTP protocol to access remote files. If the server allows SFTP access, you can use SSHFS (from the server's point of view, it's the same thing). On the client side, you need to be authorized to use FUSE, which is the case on most modern unices.

Once you've mounted the filesystem, you can use all the usual commands without having to care that the files are actually remote.

mkdir host-dir
sshfs user@host:/dir host-dir
echo host-dir/*.txt
…
fusermount -u host-dir
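Once mounted, ordinary local tools handle the filtering. For example, find can restrict the listing to regular files; a local stand-in directory is used here so the sketch is self-contained, and note that -maxdepth is a GNU/BSD extension rather than POSIX:

```shell
# Stand-in for the mounted directory, with a subdirectory and a
# filename containing a space:
mkdir -p host-dir/sub
touch host-dir/a.txt 'host-dir/b c.txt' host-dir/sub/d.txt

# List only regular files directly under the mount point,
# excluding subdirectories and their contents:
find host-dir -maxdepth 1 -type f -name '*.txt'
```

This sidesteps the whole problem of parsing sftp's echoed output: directories, symlinks, and whitespace in names are all handled by find itself.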
sftp -q user@host:/dir <<<"ls *.txt" | grep -v '^sftp>'

But that will only work if sftp does not prompt for a password, because piping the output through grep interferes with the password prompt.

Often it is simpler to just use tail:

sftp -q user@host:/dir <<<"ls *.txt" | tail -n+2 

The answers above assume that the directory contains only files. If there are subdirectories, however, they also end up in the listing, so you have to filter them out as well. One can use ls -l and drop the lines starting with d:

 sftp -q user@host:/dir <<<"ls -l *.txt" | grep -v -E "^sftp|^d"

but that will still leave symlinks, so perhaps:

sftp -q user@host:/dir <<<"ls -l *.txt" | grep -v "^sftp"| grep -- ^- 

where

  • grep -v -E "^sftp|^d" filters out the echoed sftp command and lines starting with d (directories); ^l would similarly match symlinks, and so on.
  • grep -- ^- keeps only regular files (lines starting with -).
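To see what these filters do, here is the second pipeline applied to some sample lines as sftp's ls -l might print them (the listing lines are hypothetical):

```shell
# Feed sample output through the filters: drop the echoed prompt line,
# then keep only regular files (lines starting with "-"):
printf '%s\n' \
  'sftp> ls -l *.txt' \
  'drwxr-xr-x    2 user group 4096 Jan 16 09:18 sub.txt' \
  'lrwxrwxrwx    1 user group    5 Jan 16 09:18 link.txt' \
  '-rw-r--r--    1 user group 1024 Jan 16 09:18 1.txt' |
grep -v "^sftp" | grep -- ^-
# -> -rw-r--r--    1 user group 1024 Jan 16 09:18 1.txt
```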

However, ls -l prints more than just the name, so the extra columns also have to be cut off, using cut or awk directly:

sftp -q user@host:/dir <<<"ls -l *.txt" | grep -v "^sftp"| grep -- ^- | tr -s ' '|cut -d' ' -f9
sftp -q user@host:/dir <<<"ls -l *.txt" | grep -v "^sftp"| grep -- ^- | awk -F' ' '{print $9}'
sftp -q user@host:/dir <<<"ls -l *.txt" | grep -v "^sftp"| grep -- ^- | cut -c57-
  • tr -s ' ' squeezes repeated blanks so the fields can be split consistently.
  • cut -d' ' -f9 keeps only field #9 (usually the filename, but check on your system).
  • awk -F' ' '{print $9}' likewise outputs only the 9th field.
  • cut -c57- simply removes characters 1-56 and leaves the filename on my system. This fragile fixed-column cutting is the only option I found when filenames contain whitespace...

So it takes something this convoluted just to get a plain list of the files on an sftp server for further processing.

Update: for filenames containing whitespace, I can't find anything better than cut -c57- to extract just the names from ls -l. It is not reliable, though: different systems may put the name field at a different position in the ls output, and even on the same system the columns shift when file-owner names are long. That particular problem can be avoided with ls -n, which replaces names with numeric IDs, but the approach is still not "nice".

I was thinking of using awk to extract the filename from $0, starting at the position of $9 and running to the end of the line. Does anyone know how to get the position of a field, say $9, within $0 in awk?

I haven't found a reliable way to do that: index($0, $9) gives the wrong result when the filename happens to match the username or a number in an earlier column, and summing the lengths of $1..$8 is also wrong, since the field delimiters occur in runs of varying length.
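One workaround sidesteps the field-position problem entirely: instead of locating $9, strip the first eight fields (each with its trailing run of separators) off the front of the line with repeated anchored sub() calls, which leaves the filename intact, embedded spaces and all. A sketch on a sample ls -l line; the eight-field prefix is an assumption, since some servers format the listing differently:

```shell
# Sample long-listing line with a space in the filename:
line='-rw-r--r--    1 1000     1000         1024 Jan 16 09:18 my file.txt'

# Remove the leading 8 fields one at a time; whatever remains is the
# name, whitespace included:
printf '%s\n' "$line" |
awk '{ for (i = 1; i <= 8; i++) sub(/^[^[:space:]]+[[:space:]]+/, ""); print }'
# -> my file.txt
```

Because each sub() consumes exactly one non-blank field plus the blanks after it, repeated separators no longer matter, unlike the length-summing approach.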

I've also noticed that ls -n * prints a / after directory names, which is then easy to filter. However, using any glob with ls prints a message to stderr when nothing matches (Can't ls: "/curdir/*" not found), which is annoying.

So, does anyone have an idea how to get a list of the regular files on an sftp server, handling filenames with whitespace and possibly other special characters?

Fedor
  • Using ls *.txt should be enough (considering it is not usual to name a directory with an extension). Also, a directory is a file; I think the user wants regular files and maybe symbolic links with the .txt extension (as opposed to named pipes, socket files, etc.). So if you only want to list regular files, you can use find: find . -type f -name "*.txt" – Edgar Magallon Jan 16 '23 at 09:18
  • @EdgarMagallon - is there a find command in the SFTP console? That would make it trivial. – Fedor Feb 06 '23 at 18:59
  • My bad, I thought sftp could use that command. So the way to use find is by logging in directly with ssh, but it would be nice if sftp offered a minimal version of find. – Edgar Magallon Feb 06 '23 at 19:38
  • I can't find anything better than cut -c57- to extract filenames with blanks from ls -l, but it is not reliable, since different systems may use a different position, and even the same system shifts the data for long file-owner names. This can be avoided with ls -n, which replaces names with numeric IDs, but still... – Fedor Mar 22 '23 at 12:11