I want my shell script to visit all subdirectories in a main directory. Do something in directories, sent output to a spool file and move on to next directory. Consider Main Dir = /tmp Sub Dir = A B C D (Four sub directories)
- OK, please show us your script so far. Which part of it is giving you trouble? – terdon Feb 27 '15 at 11:18
5 Answers
Use a for loop:
for d in $(find /path/to/dir -maxdepth 1 -type d)
do
    # Do something; the directory is accessible with $d:
    echo "$d"
done > output_file
It lists the directory /path/to/dir itself plus its immediate subdirectories (add -mindepth 1 if you want to skip the starting directory). Note that the simple example above will fail if the directory names contain whitespace or special characters. A safer approach is:
find /tmp -maxdepth 1 -type d -print0 |
while IFS= read -rd '' dir; do echo "$dir"; done
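Applied to the layout in the question (Main Dir = /tmp, Sub Dirs = A B C D), a minimal sketch might be (the ls -l standing in for "do something" and the spool file name /tmp/spool.out are assumptions):

find /tmp -mindepth 1 -maxdepth 1 -type d -print0 |
while IFS= read -rd '' dir; do
    # Do something inside each of /tmp/A, /tmp/B, /tmp/C, /tmp/D;
    # here we simply list its contents.
    ls -l "$dir"
done > /tmp/spool.out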
Or in plain bash:
for d in /path/to/dir/*; do
    if [ -d "$d" ]; then
        echo "$d"
    fi
done
(note that, unlike the find versions, this one also considers symlinks to directories and excludes hidden ones)
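As a sketch of how that glob loop could feed the question's spool file (again, the ls -l and the spool path are assumptions):

for d in /tmp/*/; do
    # The trailing slash makes the glob match only directories
    ( cd "$d" && ls -l )
done > /tmp/spool.out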
- At least point out the limitations and risks associated with processing the output of find like that. – Stéphane Chazelas Feb 27 '15 at 11:15
- Hi... I tried to run the below for loop: for d in $(find /backup/ASHISH -maxdepth 1 -type d) do ls -l | awk '{ print $9 }' | grep CC*_`date +"%m%d20%Y"` | xargs echo echo $d – Ashish Feb 27 '15 at 12:10
- Hi... I tried the below for loop: for d in $(find /backup/ASHISH -maxdepth 1 -type d) do ls -l | awk '{ print $9 }' | grep CC*_`date +"%m%d20%Y"` | xargs echo echo $d. The expected outcome is ls -ltr from all subdirectories; the above loop is not working. – Ashish Feb 27 '15 at 13:17
I am a complete bash newbie, but a UN*X veteran. Although doubtless this can be done in Bash shell scripting, in the old days we used find <start-dir> [-maxdepth <levels>] -exec <command> {} \; to achieve this. You could do a man find and play around, perhaps until someone tells you how to do it in bash!
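For instance, applied to the question's layout (the ls -l "something" and the spool file are assumptions), that classic invocation might look like:

find /tmp -mindepth 1 -maxdepth 1 -type d -exec ls -l {} \; > /tmp/spool.out

Because find hands each path directly to the command as an argument, this sidesteps the word-splitting pitfalls discussed in the comments above.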

- I am very flattered that my "outline" answer here has received an up-vote. However, why has @chaos' answer below received a down-vote? (As a newcomer to this board, I cannot post this comment against his answer, only against my own.) His second suggestion is correct for a shell script solution, and avoids the overhead of running an external find command. – JonBrave Feb 27 '15 at 11:19
- His second is indeed correct. His first will fail if the directory names contain whitespace or special characters (backslashes, for example). See the edit I made to his answer for the safe version. – terdon Feb 27 '15 at 11:28
Looks like you want the filenames under each of the subdirectories. The ls -l | awk approach is not robust enough: what if those filenames contain whitespace and/or newlines? The find command below works even with find implementations that do not support -maxdepth:
# For each immediate subdirectory of ., run a second find inside it
find . ! -name . -type d -prune -exec sh -c '
    cd "$1" && \
    find . ! -name . -prune -type f
' {} {} \;
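Adapted to the question's layout, a sketch might be (the inner ls -l and the spool file are, again, assumptions):

cd /tmp &&
find . ! -name . -type d -prune -exec sh -c '
    # $1 is one subdirectory (./A, ./B, ...); do the real work there
    cd "$1" && ls -l
' sh {} \; > /tmp/spool.out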
It's also possible using ls, grep, and tr:
for dir in $(ls -1FA | grep / | tr -d /); do echo $dir/something; done
ls -1FA | grep / | tr -d / | while IFS= read -r TD; do echo $TD/something; done
du/sed can also be used as a selector if your ls lacks the above options:
du --max-depth=1 | sed -e 's/^.*\.\///' | grep -v '\.$'
It may be important to note that these examples return hidden directories and exclude the parent and current directories.
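For completeness, a sketch wiring the while-read variant above to the question's spool file (the ls -l is an assumption, and the quoting caveats raised in the comments below still apply to the selector itself):

ls -1FA | grep / | tr -d / | while IFS= read -r TD; do
    ( cd "$TD" && ls -l )
done > /tmp/spool.out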

- (1) ls writes one file per line (which is what the -1 option specifies) by default when the standard output is a pipe, so it's superfluous in your answers. (2) Parsing the output of ls is a bad idea; see this and this. Your first answer will fail if directories have spaces (or newlines) in their names, and all will fail if they have newlines in their names. (3) You should always quote shell variables (e.g., "$dir") unless you have a good reason not to, and you're sure you know what you're doing. – Scott - Слава Україні Nov 03 '17 at 18:09
- Good points, Scott. Leaving the answer as is; for many systems without egregiously-named dirs, I think it may still be useful for one-offs. – JGurtz Nov 14 '17 at 23:59