I have a list of servers and Bash scripts that need to be executed on those servers. For example, I have a script that schedules a cronjob (with the command that the user has specified) on a server. This is just an example; I also have scripts for removing a cronjob, storing SSH keys, configuring supervisor to start a daemon, etc.

I'm programmatically SSHing into the server and want to execute those scripts on the remote server, but "in the background". When I say "in the background", I mean that my code should finish and not worry about how long the actual script is running. This is because my code is written in a blocking language (PHP) and some scripts could potentially run for a few minutes. I want to send a script to the server through SSH, run it and not worry about the rest -- I curl my server after a script has finished running (sort of like a webhook).

To conserve local filesystem storage, each script, after it has been filled in with user content (for the cronjob that could be the command and the cron expression), is kept as a string and not stored in a file on my local server. Example: I have a script echo "{{name}}" and the user provides the "name" variable, which gets programmatically interpolated into the script. I don't want to store each script modification that the user has provided on my filesystem.

The way I've been doing this now is executing this command from my server:

ssh -T user@host /bin/bash <<EOFX
{{script}}
EOFX

This is wrapped in a PHP command to execute shell commands and {{script}} gets replaced with this:

#!/bin/bash

cat << 'EOFY' > ./script.sh
{{executionScript}}
EOFY

. ./script.sh > ./script.out

curl --data "status=$(echo $?)" myserver.com

In this case, {{executionScript}} gets replaced with the modified version of the final script that needs to be run on the server (for example echo "John Doe").
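The quoted delimiter ('EOFY') is what makes this templating safe to deliver: with the delimiter quoted, the shell copies the heredoc body verbatim, without expanding variables or command substitutions in the user-supplied content. A minimal local sketch of the mechanism (no ssh involved; the filenames are the same placeholders the question uses):

```shell
# The quoted delimiter 'EOFY' disables variable and command expansion
# inside the heredoc body, so the user content lands in script.sh verbatim.
cat << 'EOFY' > ./script.sh
echo "John Doe"
EOFY

# Run the generated script and report its exit status, as the wrapper does.
. ./script.sh > ./script.out
echo "status=$?"
```

An unquoted delimiter (<< EOFY) would instead let the writing shell expand $variables and $(commands) embedded in the user's script before it is saved.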

This has been working great so far, but this SSH command waits for the script to finish running. If I put a sleep 4 before the curl, my code (the execution of the SSH command) will wait for 4 more seconds.

Is there a way to run the SSH command that needs to run a certain script in the background?

I've given you a high-level overview of what I'm trying to achieve in the hope that somebody could potentially give a better solution to my problem (executing a set of user-modified scripts on a server). I know that this is a Unix/Linux board and my primary issue is with the SSH command that waits for the script to finish.

  • You are basically trying to re-invent Ansible (or a subset thereof). Check it out. – xenoid Apr 20 '19 at 22:05
  • I know of these services. That's not what I asked. I'm building this project as a part of own learning experience. – crnkovic Apr 20 '19 at 22:08
  • Why are you sourcing (. ./script.sh) the script instead of executing it (./script.sh)? – terdon Apr 20 '19 at 22:09
  • I want to curl after a script has finished, therefore I would need to run it in a current shell, if I want to curl afterwards, right? I'm not a linux guy, not much experience :-) – crnkovic Apr 20 '19 at 22:11
  • No, that shouldn't make any difference. The curl would wait for the script to finish either way. This isn't a problem, it's just odd and makes it slightly more complicated than it needs to be. – terdon Apr 20 '19 at 22:12
  • try to look in the direction "Keep SSH Sessions running after disconnection" - https://unix.stackexchange.com/questions/479/keep-ssh-sessions-running-after-disconnection – MolbOrg Apr 21 '19 at 01:51

1 Answer


When using the command shell, prefixing a command with nohup prevents the command from being killed (by the SIGHUP signal) when you log out or exit the shell.

nohup mycommand &

or

ssh -n -f user@host "sh -c 'cd /whereever; nohup ./whatever > /dev/null 2>&1 &'"

Here -n redirects ssh's stdin from /dev/null and -f puts ssh in the background before the command runs; the nohup'd command should keep running on the remote host even after you disconnect.
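Applied to the wrapper script from the question, the same idea looks roughly like this (a sketch, not a drop-in: myserver.com and the echo "John Doe" payload are the question's own placeholders, and in practice you may want a timeout on the curl):

```shell
#!/bin/bash

# Write the user-modified script verbatim (quoted delimiter prevents expansion).
cat << 'EOFY' > ./script.sh
echo "John Doe"
EOFY

# Background the script plus its completion callback with nohup, so the SSH
# session can return immediately; curl fires only after the script finishes.
nohup sh -c 'sh ./script.sh > ./script.out 2>&1; curl --data "status=$?" myserver.com' \
  > /dev/null 2>&1 &
```

With this in place, the outer ssh -T user@host /bin/bash <<EOFX ... invocation returns as soon as the wrapper has forked the background job, while the script and the webhook callback keep running on the remote host.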