
I have accumulated a fair amount of shell scripts that help me configure and administer my computers, and I'd like to put some order into what is becoming a mess.

In many of them I define utility functions like the following two:

error() {
  # print each argument on its own line to stderr, underlined, blinking, in red
  printf '\e[4;5;31m%-s\e[m\n' "$@" >&2
}
warning() {
  # same, but in yellow
  printf '\e[4;5;33m%-s\e[m\n' "$@" >&2
}

What I'd like to do is put these and other common functions each in their own .sh file, maybe organizing them in a directory hierarchy, so that other bash scripts can "include" the ones they need before doing anything else.
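
Just to give an idea, the layout I have in mind is roughly this (the names are only an example):

/home/enrico/path/to/
├── myscript.sh
└── commons/
    ├── error.sh      # defines only error()
    └── warning.sh    # defines only warning()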

This question is related, but I don't want those "common" functions to be available in the shell; I just want them to be available to the scripts that choose to use them, which is why I said those scripts should "include" these functions.

What are the recommended ways of "including" such common functions?

At the moment, I'm experimenting with something like this:

#!/bin/bash
# this is myscript.sh, residing in /home/enrico/path/to/
# error.sh only contains the error() function defined above, so I have to fall back to echo if I can't find it
source "$(dirname "$0")/commons/error.sh" || { echo "error.sh not found" >&2; exit 1; }
# error() is available now
source "$(dirname "$0")/commons/warning.sh" || { error "warning.sh not found"; exit 1; }
# warning() is available too
# ...
command_that_can_fail || error "failed because of bla and bla"
condition_to_be_concerned_about || warning "hey, there's something you should worry about"

I source those two scripts prepending "$(dirname "$0")/" to their paths relative to myscript.sh, so that I can run myscript.sh from any directory, e.g. as /home/enrico/path/to/myscript.sh. Is this technique just fine? Are there any drawbacks?
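
To be concrete, the idea is that both of these invocations work without moving anything around (paths here are just an example):

cd /tmp
/home/enrico/path/to/myscript.sh   # absolute path, from anywhere
cd /home/enrico/path/to
./myscript.sh                      # relative path, from the script's own directory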

One drawback I can think of is that I cannot use shellcheck's source directives with that line.
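
For reference, this is the kind of directive I'm talking about; as far as I understand, the path in the directive has to be written out literally for shellcheck to follow it:

# shellcheck source=commons/error.sh
source "$(dirname "$0")/commons/error.sh" || { echo "error.sh not found" >&2; exit 1; }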
