
I have a script that loops through all zip files in a large directory and individually unzips them. I am using the -q option in order to keep the console clean and display other monitoring info.

I would like to keep this same behavior but would like to add the output of the -v option to a single log file of the whole operation. What would be the best way of accomplishing this? Btw, I'm still a bit green when it comes to scripting so any input would be appreciated.

Unzip portion of script:

for FILES in $(ls -1Sr|grep a_media*)
 do unzip -q $FILES -d $DESTINATION
done

2 Answers


You can redirect stdout and stderr for the whole loop:

for FILES in $(ls -1Sr|grep a_media*)
 do unzip -q "$FILES" -d "$DESTINATION"
done > log 2>&1

With at least zsh and recent versions of bash you can replace > log 2>&1 with >& log.

Your ls processing is suspicious, see Why *not* parse `ls`?...
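For instance, a rough sketch using a glob instead of parsing ls (the a_media* pattern and the log name are placeholders taken from your question; note that a glob sorts names alphabetically, not by size):

for f in a_media*
 do unzip -q "$f" -d "$DESTINATION"
done > log 2>&1     # or, in bash/zsh: done >& log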

Stephen Kitt

First, I suggest you enclose the $FILES and $DESTINATION variables in the unzip command in double quotes: paths containing spaces will otherwise be split into several words and break the unzip command. Also note that your script will break on filenames containing newlines, since ls writes the raw filenames (newlines included) to the pipe; use bash's globbing feature instead (for FILES in /path/to/archives/* [...]).
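To see why the quotes matter, here's a rough sketch with a made-up filename:

FILES="a media file.zip"
unzip -q $FILES -d "$DESTINATION"     # unquoted: word splitting passes three separate arguments to unzip
unzip -q "$FILES" -d "$DESTINATION"   # quoted: unzip receives the single filename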

Second, when you don't run unzip with the -q option (which suppresses the normal output), the output is written to stdout (standard output), which is a file typically linked to the terminal screen. Shells such as bash can redirect the output of a command from stdout to another file: in bash this is done with the > and >> operators. > creates the file if it doesn't exist, or truncates it if it does, and then writes the output to it; >> creates the file if it doesn't exist and appends the output to it.
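As a quick illustration of the difference (the file name is arbitrary):

echo first  > log     # log now contains only "first"
echo second > log     # > truncates: log now contains only "second"
echo third  >> log    # >> appends: log now contains "second" and "third"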

So in this case, since you're processing multiple archives, you don't want to truncate the log file each time a new file is processed; rather, you want to append the new output to the already existing log. Including my corrections:

for FILES in /path/to/archives/*
do
    unzip -v "$FILES" -d "$DESTINATION" >> /path/to/logfile
done
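Note that, as in the other answer, you could also redirect the whole loop once instead of appending inside it; that way the log file is opened only once and > truncates it only when the loop starts (something along these lines, reusing the paths above):

for FILES in /path/to/archives/*
do
    unzip -v "$FILES" -d "$DESTINATION"
done > /path/to/logfile 2>&1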
kos
  • Thanks for all the info. I was taking a calculated risk by parsing ls because in a previous iteration of the script I was sorting the files from smallest to biggest before unzipping, in a directory where the files always have short, reliable names (I have a different script that populates this dir). I have since abandoned that idea, so /path/to/archives/* would work just perfectly. And I thought that >> would work but it seemed like I was missing something. Thanks for clearing up the stdout concept for me. – dnbpanda Jun 19 '15 at 15:48