
How do I create n variables in a shell script without explicitly assigning each one? What I mean is a loop that creates var1, var2, var3, ..., varx, where x is a variable I set earlier, something like:

read x

for ((a=0;a<x;++a)); do
 variable$a=${RANDOM}
done

(Let's ignore the possibility that x might be a string for now.) But obviously, this doesn't work. How would one do this?

What I actually want is for each argument I passed on the command line when I executed the script to become its own variable ARG1, ARG2, ..., ARGn, with ${1}, ${2}, ..., ${n} as its value, so there will be only as many of these variables as arguments were passed.
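
For illustration, here is roughly the end result I am after (only a sketch of the desired behaviour, not working code):

# Hypothetical invocation:
#     ./myscript.sh foo bar baz
# should leave the script with these variables set:
ARG1=foo    # the value of ${1}
ARG2=bar    # the value of ${2}
ARG3=baz    # the value of ${3}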

slm
iamAguest

4 Answers


It looks like you want to use an array:

read x

for (( a=0; a<x; ++a)); do
   variable[a]=$RANDOM
done

printf 'First value is %s\n' "${variable[0]}"

printf 'All values (one by one): %s\n' "${variable[@]}"
printf 'All values (as one string): %s\n' "${variable[*]}"
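
If the index is needed as well, the array can be walked by index (a small addition to the answer; in bash, "${!variable[@]}" expands to the list of indices that have been assigned):

for i in "${!variable[@]}"; do
    printf 'variable[%d] is %s\n' "$i" "${variable[$i]}"
done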

For the second part of your question:

arg=( "$@" )

printf 'First command line argument: %s\n' "${arg[0]}"

Note also that you can easily loop over all command line arguments (or whatever happens to be in $@) without storing them anywhere special:

for arg do
    printf 'Got command line argument: %s\n' "$arg"
done
Kusalananda

Try this:

read x

for ((a=0;a<x;++a)); do
 declare -i variable$a=${RANDOM}
done

The declare command permits assigning a value to a variable in the same statement that sets its attributes.
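
A minimal sketch of how the same declare approach could cover the second part of the question (the ARG1 ... ARGn names come from the question; the ${!name} indirect expansion used to read a value back is standard bash and not part of the original answer):

#!/bin/bash
# Create ARG1, ARG2, ..., ARGn from the positional parameters.
i=1
for value in "$@"; do
    declare "ARG$i=$value"
    (( i++ ))
done

# ${!name} expands to the value of the variable whose name is stored in name.
name=ARG2
printf 'Second argument: %s\n' "${!name}"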

Siva

You need to eval the assignment (though using an array would be better).

#!/bin/bash -vx

read x
for ((a=0;a<x;++a)); do
   eval variable$a=${RANDOM}
done

From man bash

eval [arg ...]
    The args are read and concatenated together into a single command. 
    This command is then read and executed by the shell, and its exit 
    status is returned as the value of eval.
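
Extending the same eval approach to the ARG1 ... ARGn part of the question needs careful quoting; here is a hedged sketch (the printf %q step is an addition, not part of the original answer, used so that eval does not re-expand or word-split the value):

#!/bin/bash
i=1
for value in "$@"; do
    # printf %q produces a shell-quoted copy of the value, so the string
    # eval sees is a plain assignment with no further expansions.
    eval "ARG$i=$(printf '%q' "$value")"
    (( i++ ))
done

printf 'First argument: %s\n' "$ARG1"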
andcoz
  • In this case, $a must be a number, so it can't contain any extra expansions (which eval would process), so you're sort of safe. But really, if you're running in Bash, why would you ever use eval instead of an array in a simple case like this? The mess it potentially produces is just too big. – ilkkachu Aug 22 '18 at 09:23
  • @ilkkachu I agree that it is a weak solution. But, for example, I used this to initialize some configuration variables, reading a json file (from a remote service) without knowing the needed variable names (to prepare the environment to a legacy software). I remember I needed a lot of checks to avoid word splitting. – andcoz Aug 22 '18 at 09:30
  • Yeah, that would be different. Though as another answer reminds, even that could be done with typeset (or declare) instead of eval, e.g. name=foo; value=bar; typeset "$name=$value"; echo $foo prints bar. – ilkkachu Aug 22 '18 at 09:36
  • @ilkkachu, except that declare/typeset also have the side effect of reducing the scope of the variable. – Stéphane Chazelas Aug 22 '18 at 11:23
  • Note that there is a command injection vulnerability here, the same as in all the other answers, because $x is evaluated as an arithmetic expression in for ((...)), not because of eval. Try for instance echo 'a[$(uname>&2)0]' | bash -c 'read x; for ((i = 0; i < x; i++)); do : ;done'. By the way declare/typeset (even read in bash) can introduce ACEs the same as eval (not here, as the content of $a is controlled). They do also evaluate code. – Stéphane Chazelas Aug 22 '18 at 11:34
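
Regarding the arithmetic-evaluation injection pointed out in the last comment above, one possible mitigation (just a sketch, not taken from any of the answers) is to check that x is a plain non-negative integer before it reaches an arithmetic context:

read -r x

# Reject anything that is not a plain non-negative integer, so x cannot
# smuggle an expression like 'a[$(uname>&2)0]' into (( ... )).
case $x in
    ''|*[!0-9]*)
        echo 'x must be a non-negative integer' >&2
        exit 1
        ;;
esac

for (( a = 0; a < x; ++a )); do
    variable[a]=$RANDOM
done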

How about this (a recent shell is needed):

for ((a=0;a<x;++a)); do
    read variable$a <<<${RANDOM}
done
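
To check the result, the generated names can be listed (a small add-on, not part of the original answer; compgen -v lists variable names with a given prefix in bash):

compgen -v variable     # lists variable0, variable1, ... created above
echo "$variable0"       # value of the first generated variable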
RudiC