
Currently I have a script.sh file with the following content:

#!/bin/bash
wget -q http://exemple.com/page1.php;
wget -q http://exemple.com/page2.php;
wget -q http://exemple.com/page3.php;

I want to execute the commands one by one, each starting when the previous one finishes. Am I doing it the right way? I've never worked with Linux before, and my searches didn't turn up a solution.

  • terdon's answer below is excellent, but you can drop the semicolons in the version you've shown above as well: commands in a script are executed in order, each on its own line. The semicolon is redundant unless you put several commands on one line. – mikebabcock Feb 12 '15 at 17:57
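
For example, the script from the question behaves identically with the semicolons removed:

#!/bin/bash
# Each command runs only after the previous one has finished;
# no trailing semicolons are needed when each command is on its own line.
wget -q http://exemple.com/page1.php
wget -q http://exemple.com/page2.php
wget -q http://exemple.com/page3.php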

2 Answers


Yes, you're doing it the right way. A shell script runs each command sequentially, waiting for one to finish before starting the next. You can either join commands with ; or put them on separate lines:

command1; command2

or

command1
command2

There is no need for ; if the commands are on separate lines. You can also choose to run the second command only if the first exited successfully. To do so, join them with &&:

command1 && command2

or

command1 &&
command2
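
Applied to the script from the question, for example, each download runs only if the previous one succeeded:

#!/bin/bash
# A trailing && continues the command onto the next line and makes
# each wget conditional on the previous one exiting with status 0.
wget -q http://exemple.com/page1.php &&
wget -q http://exemple.com/page2.php &&
wget -q http://exemple.com/page3.php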

For more information on the various control operators available to you (;, &&, ||, &), see the Lists section of the bash manual (man bash).
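
For instance, || is the counterpart of &&: the second command runs only if the first fails. A minimal sketch, with a hypothetical error message:

# Print a warning (to stderr) only if the download fails.
wget -q http://exemple.com/page1.php || echo "page1 download failed" >&2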

– terdon

In a repetitive case like this, I recommend using a for loop.

for P in {1..3} ; do wget -q http://exemple.com/page${P}.php ; done

This is a one-line version suitable for the command line, but it can also be used in a script. The braces around the variable name mark where the name ends when the variable is embedded in a longer word; here $P.php would also work, since . cannot be part of a variable name, but ${P} is the safer habit.

The loop not only sends the requests in order but is also easier to tweak and reuse, with less risk of typos.
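
In a script, the same loop is often clearer spread across several lines:

#!/bin/bash
# Fetch page1.php through page3.php in order, one request at a time.
for P in {1..3}; do
    wget -q "http://exemple.com/page${P}.php"
done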

– Rache
  • Adding to the info in the top-voted answer: to make sure each request only happens after the previous one succeeded, you can make the loop stop on a failure, e.g. with || break before done (a bare && right before done is a syntax error). – Timo Feb 27 '24 at 09:09
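
A minimal sketch of that variant: || break exits the loop as soon as one download fails, so later pages are only requested after the earlier ones succeeded.

for P in {1..3}; do wget -q "http://exemple.com/page${P}.php" || break; done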