I want to write my Bash functions each in a separate file, for easier version control, and source the whole lot of them in my .bashrc.
Is there a more robust way than e.g.:
. ~/.bash_functions/*.sh
It's simply a matter of surrounding it all with appropriate error checks:
fn_dir=~/.bash_functions
if [ -d "$fn_dir" ]; then
    for file in "$fn_dir"/*; do
        [ -r "$file" ] && . "$file"
    done
fi
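A minimal end-to-end check of the guarded loop above, using a throwaway directory instead of the real ~/.bash_functions (the file name and function are illustrative):

```shell
# Create a throwaway function directory with one sample file
# (mktemp avoids touching the real ~/.bash_functions).
fn_dir=$(mktemp -d)
cat > "$fn_dir/greet.sh" <<'EOF'
greet() { printf 'hello %s\n' "$1"; }
EOF

# The same guarded loop as above.
if [ -d "$fn_dir" ]; then
    for file in "$fn_dir"/*; do
        [ -r "$file" ] && . "$file"
    done
fi

out=$(greet world)
echo "$out"            # prints: hello world
rm -rf "$fn_dir"
```

If the directory does not exist or a file is unreadable, the loop simply skips it instead of erroring out, which is the point of the checks.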
My original version of that line was
if [ -r "$file" ] ; then source "$file" ; fi
On Unix systems, [ is an alias (often a hardlink) to /bin/test, a program that evaluates its arguments and returns zero or nonzero, which you can then react to in a script using either if or embedded && logic. It's a style choice, not a functional one: the one who edited my script evidently feels it's better to be terse than verbose. You can argue over which is clearer, but both are correct.
– Warren Young
Jun 29 '23 at 16:43
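To illustrate the comment above: test and [ accept the same expressions and produce the same exit status, so the if form and the && form are interchangeable (strings and numbers here are arbitrary examples):

```shell
# test and [ evaluate the same expressions; [ merely requires a
# closing ] as its final argument.
test -n "some string" && echo "test says non-empty"
[ -n "some string" ] && echo "[ says non-empty"

# Both report the same exit status, usable with if or with &&.
if test 3 -gt 2; then echo "if style"; fi
[ 3 -gt 2 ] && echo "&& style"
```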
As for sourcing multiple files at once, it can be done with a process substitution of their concatenated output, like:
source <(cat ~/.bash_functions/*.sh)
As for the robust part, you might want an error check around whatever you're sourcing. Note that the failure handling must be grouped in braces, otherwise the exit runs unconditionally:
source <(cat ~/.bash_functions/*.sh) || { echo "ERROR: failed while sourcing ($?)"; exit 1; }
Here is another snippet where you can also validate the sourced files themselves, by capturing anything they print to stderr. Note that the command substitution runs the sourcing in a subshell, so this only detects errors; the functions still have to be sourced separately to land in your current shell:
sourced_errors=$(source <(cat ~/.bash_functions/*.sh) 2>&1 > /dev/null)
if [ -n "$sourced_errors" ]; then
    echo "ERROR: sourcing reported: $sourced_errors"
fi
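A sketch of that stderr-capture check pointed at a deliberately broken file, using a throwaway directory in place of ~/.bash_functions (the bad command name is made up so that sourcing it writes to stderr):

```shell
# Point the stderr-capture check at a deliberately broken file
# (throwaway directory; the real one would be ~/.bash_functions).
fn_dir=$(mktemp -d)
echo 'nonexistent_command_xyz' > "$fn_dir/broken.sh"

# Command substitution keeps only stderr; || true guards set -e shells.
errors=$(source <(cat "$fn_dir"/*.sh) 2>&1 > /dev/null) || true

if [ -n "$errors" ]; then
    echo "ERROR while sourcing: $errors"
fi
rm -rf "$fn_dir"
```

Here the "command not found" message lands in errors, so the check fires; with clean files, errors stays empty.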
It would be even better, if possible, to add error validation inside whatever you're sourcing, with custom error codes so you can track more precisely where it failed, such as:
err=0
...
# your shell script
...
some-command-i-wanna-check
if [ "$?" -ne 0 ]; then
    echo "ERROR: my failed description" >&2
    err=101
    return $err    # return, not exit: exit would terminate the shell that sourced this file
fi
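To show why return matters in a sourced file, here is a sketch that writes a hypothetical snippet failing with a custom code 101 (the failing ls path is made up); the caller survives and can read the code from $?:

```shell
# Write a hypothetical sourced file that returns a custom error code;
# `return` (not `exit`) means a failure cannot close the calling shell.
snippet=$(mktemp)
cat > "$snippet" <<'EOF'
err=0
ls /nonexistent/path >/dev/null 2>&1   # the command we want to check
if [ "$?" -ne 0 ]; then
    echo "ERROR: listing failed" >&2
    err=101
    return "$err"
fi
EOF

status=0
. "$snippet" || status=$?
echo "caller still running, source returned $status"
rm -f "$snippet"
```

Had the snippet used exit instead, the dot-sourcing shell itself would have terminated with code 101 before reaching the echo.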