I have a folder of .command files (regular task scripts) on Mac OS X which I would like to execute simultaneously.
At the moment I iterate through the files in the directory and execute them one by one, using something like:
#!/bin/bash
# Log file that collects the output of every launched task
LAUNCHLOG=~/Desktop/My\ Automation/Resources/Logs/_AutoLaunchAgent.txt
mkdir -p ~/Desktop/My\ Automation/Resources/Logs/
mkfile -n 0k "$LAUNCHLOG"
chmod 0777 "$LAUNCHLOG"
# List everything under Temp/ and run each entry in turn
FILES=$(find -f ~/Desktop/My\ Automation/Resources/Temp/)
while read -r line; do
    "$line" >>"$LAUNCHLOG"
done <<< "$FILES"
sleep 10
The above works, but it is quite slow. For speed, I would like to execute all the commands (every item in my directory) at once.
The commands are all independent of each other and do not need to communicate; when I launch them manually in parallel, everything works correctly and I get a big speed boost.
What is the best way to achieve this? I tried wrapping the body of my while loop in parentheses so that each iteration would run in its own subshell, but the files were still executed one at a time.
Use & to place each task in the background. You'll probably want to log each command in a separate log file though. – Stephen Kitt Jun 25 '15 at 11:29
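The comment's suggestion can be sketched as follows: append & to each command so it runs as a background job, give each task its own log file, and use the wait builtin to block until every job has finished. This is a minimal, self-contained sketch that creates two dummy .command files in a temp directory instead of using the question's ~/Desktop/My Automation paths (which are assumptions specific to the asker's machine); substitute your own Temp/ and Logs/ directories.

```shell
#!/bin/bash
# Self-contained demo: throwaway directories stand in for the real
# Temp/ (tasks) and Logs/ folders from the question.
TASKDIR=$(mktemp -d)
LOGDIR=$(mktemp -d)

# Two dummy "tasks" standing in for the .command files.
printf '#!/bin/sh\necho one\n' > "$TASKDIR/a.command"
printf '#!/bin/sh\necho two\n' > "$TASKDIR/b.command"
chmod +x "$TASKDIR"/*.command

while IFS= read -r f; do
    # '&' launches each task as a background job so they run
    # concurrently; each job writes to its own log file.
    "$f" > "$LOGDIR/$(basename "$f").log" 2>&1 &
done < <(find "$TASKDIR" -type f -name '*.command')

wait   # block until every background job has finished
```

Note the loop reads from process substitution (`< <(find …)`) rather than a pipe, so the background jobs are children of the main shell and `wait` can see them; piping `find` into `while` would run the loop in a subshell on some shells.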