
I have a bash script that accepts input from the command line, but the input contains spaces and the script is not reading the word after the space. The script is something like this:

#!/bin/bash
var1=$1
var2=$2
echo $var1
echo $var2

Suppose I save this file as test.sh. Now my input is something like this -

./test.sh hi check1,hello world

and the output is -

hi
check1,hello

but I need the output to be

hi
check1,hello world

PS: I cannot provide the inputs in double quotes, so I need some other solution where I can read the words with spaces.

  • So how would the shell know that it should be hi and check1,hello world and not hi check1,hello and world? How can a machine know that this space is separating the arguments but that space is not? You must provide the input quoted so the shell can know, it cannot guess. – terdon Jan 07 '22 at 10:41
  • It's a bit like asking for a C function like void fn(char* a, char* b) to get "\"yy\", \"zz\"" in b when invoked as fn("xx", "yy", "zz") – Stéphane Chazelas Jan 07 '22 at 10:47
  • @terdon I know a machine cannot know on its own, that is why I am asking: is there any programming solution for this? – Sanjay Bhatia Jan 07 '22 at 11:00
  • "I cannot provide the inputs in double quotes so I need some other solution where I can read the word with spaces" – Why? See XY problem. Maybe the real solution is to figure out how to provide input in double quotes anyway. – Kamil Maciorowski Jan 07 '22 at 11:12
  • Not all shells treat double quotes as quoting operators. Most shells support '...' as a strong quoting operator which in general is preferable for passing literal text (which may contain characters special in the syntax of the shell) as one argument to a command. – Stéphane Chazelas Jan 07 '22 at 11:22
  • I was going to say that even if you can do this the way @choroba showed, it doesn't mean you should. It may be confusing to users, as most commands rely on getting the argument strings correctly set up when they're started, and the user has to use the shell's functionality to do that. I.e. quoting. E.g. you can't use find -name foo bar for find -name 'foo bar'. But then, things like echo, eval and ssh do join multiple arguments into one string. – ilkkachu Jan 07 '22 at 12:50
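
A minimal sketch of the quoting the comments above suggest, assuming the original test.sh; the quotes make the shell pass the spaced text as a single argument:

./test.sh hi 'check1,hello world'

which prints

hi
check1,hello world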

2 Answers


That's not possible. The word splitting of the arguments happens before the script is run, so when it starts, the arguments have already been split into words. Read about "word-splitting" in man bash to learn more about the details.
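
A quick way to see this is a small diagnostic sketch (not the script from the question, just an illustration):

#!/bin/bash
# print how many arguments the script actually received, and each one on its own line
printf 'got %d arguments:\n' "$#"
printf '<%s>\n' "$@"

Called with hi check1,hello world it reports 3 arguments (hi, check1,hello and world); the splitting has already happened by the time the script runs.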

If you know there will be 2 arguments and the first one will never contain spaces, you can work around it somewhat with

#! /bin/bash
first=$1
shift
rest="$*"

printf '<%s>\n' "$first" "$rest"

But it will still shrink multiple spaces into one.
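
For example (a sketch assuming the script above is saved as test.sh), the invocation from the question

./test.sh hi check1,hello world

prints

<hi>
<check1,hello world>

and ./test.sh hi check1,hello   world prints exactly the same thing, because the extra spaces are gone before the script joins the remaining arguments back together.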

choroba

As @choroba correctly stated, you can't do that, at least not out of the box.

The reason is: the shell will, before executing anything, parse the command line. This is a process which takes place in several well-defined steps. One of these steps is "field splitting", where the shell splits the input line into various pieces. Consider:

command arg1 arg2

Somehow the shell has to determine that "command" is the command, "arg1" is the first argument and "arg2" is the second argument, no? There has to be a reason why "arg1" is the first argument and not "arg1 arg2", or why "command" is the command and not "command arg1", etc.

Now, having said this, there are two things which influence how this splitting is done: the (shell) variables IFS and OFS. IFS (the "internal field separator") is a list of characters which - if unquoted or unescaped - will separate fields. In the above example, "arg1" is separated from "arg2" and "command" by a blank - which is part of the IFS.

You can set the IFS yourself to any characters (or even an empty string), but by default (as stated in the POSIX documents) it is set to "blank", "tab" and "linefeed". For more information see Understanding IFS.
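
A small sketch of IFS-driven splitting as it applies inside a script (here to an unquoted variable expansion; the variable name and value are just examples):

#!/bin/bash
# minimal sketch: an unquoted expansion is split on the characters in IFS
data="check1,hello world"
IFS=','
printf '<%s>\n' $data   # deliberately unquoted, so word splitting applies
# prints:
# <check1>
# <hello world>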

After this rather lengthy general introduction, what does that mean for your problem?

  1. You can influence how the field-splitting is done by manipulating the IFS, but you would need to do that before your script starts, e.g.:

     % SAVIFS="$IFS" ; export IFS=""
     % /path/to/your/script arg1 arg2 ...
     % IFS="$SAVIFS"
  2. You can do the field splitting inside your script according to your own rules. For this you capture all the arguments into one string and split it up yourself:

     #! /bin/bash
     chArgs="$*"
     arg1=""
     arg1=""
     [...]
     shopt -s lastpipe
     # split the contents of $chArgs here, as an example:
     echo "$chArgs" | IFS=' ' read arg1 arg2
     printf "%s\n%s\n" "$arg1" "$arg2"
    

    OR, just to show how it works, this time splitting on a comma:

    echo "$chArgs" | IFS=',' read arg1 arg2 printf "%s\n%s\n" "$arg1" "$arg2"

Notice, however, that you always need to observe proper quoting when dealing with strings - especially strings that could contain characters from the IFS. In your example:

#!/bin/bash
var1=$1
var2=$2
echo $var1
echo $var2

This proper quoting is missing, and I suppose this is an additional reason why it didn't work as you expected. Try:

#!/bin/bash
var1="$1"
var2="$2"
echo "$var1"
echo "$var2"
bakunin
  • That answer confuses everything. $IFS is not involved here other than in the echo $var1 / echo $var2 where the OP forgot the quotes around the variables. There's no OFS in shells (only in awk). Most shells ignore the IFS environment variable on start up so it's pointless to export it. Here it's the shell syntax tokenising rules that separates out arguments for the script. The Bourne shell used to also do IFS-splitting on top of that, but no modern shell does it any longer. – Stéphane Chazelas Jan 07 '22 at 11:56
  • demonstrating that bash ignores IFS in the environment: env IFS='hello' bash -c 'printf "%q\n" "$IFS"' outputs the default value $' \t\n' – glenn jackman Jan 07 '22 at 17:16