
Is it possible to write a bash script that

  1. is started from machine A and logs in to a different machine B via ssh (both machines A and B being Linux machines),
  2. copies some files to machine B,
  3. runs a given Python script on machine B,
  4. transfers the results back to machine A,
  5. logs off from machine B.

Is this technically doable?


3 Answers


Of course it is doable:

scp file user@host:              # copy the file to machine B
ssh user@host path_to_script     # run the script on machine B
scp user@host:file_to_copy ./    # copy the result back to machine A

and that's it...

But there is one problem: you will be asked for a password three times. To avoid that, you can generate ssh keys and authorize the key on the remote host.

To generate an ssh key pair, run ssh-keygen -t rsa, answer the questions, and append the public key to the ~/.ssh/authorized_keys file on the remote host (machine B). The private key stays in ~/.ssh/id_rsa on the local machine (A).
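
Assuming OpenSSH's ssh-copy-id helper is available on machine A, the whole key setup is, roughly (user and host are placeholders):

ssh-keygen -t rsa                # generate the key pair, accepting the defaults
ssh-copy-id user@host            # append the public key to ~/.ssh/authorized_keys on B
ssh user@host true               # should now succeed without a password prompt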

  • If public keys aren't an option, you can do something crude to minimize password prompts, like cat file | ssh user@host 'cat > /destination/of/file; /path/to/script &>/dev/null; cat results' > /destination/of/results – phemmer Mar 11 '12 at 23:04
  • If you do want to use the password, you could always use OpenSSH's connection pooling by defining ControlMaster=yes and ControlPath=/path/to/socketfile, and then start one ssh connection with -f to run a background ssh. Tell all subsequent SSH connections to use the same socket file. – jsbillings Mar 11 '12 at 23:06
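
A minimal sketch of the connection pooling the last comment describes, using inline -o options (the socket path is a placeholder):

ssh -o ControlMaster=yes -o ControlPath=~/.ssh/cm-%r@%h:%p -fN user@host   # one password prompt, master runs in background
ssh -o ControlPath=~/.ssh/cm-%r@%h:%p user@host hostname                   # reuses the master, no prompt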

It is possible to do everything in a single ssh connection/session:

ssh user@host "cat > remote_dst; command; cat remote_src" < local_src > local_dst

This:

  1. Copies local_src to remote_dst,
  2. Executes command,
  3. Copies remote_src to local_dst.

But if command writes to stdout, its output will also end up in local_dst. And if command reads from stdin, it will receive an EOF.
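
If command does write to stdout, one workaround (a sketch; command.log is a hypothetical file name) is to redirect its output on the remote side so that only remote_src reaches local_dst:

ssh user@host "cat > remote_dst; command > command.log 2>&1; cat remote_src" < local_src > local_dst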


While you can do this inside a single ssh session, it's a bit tricky to combine copying files with running commands.

The easiest way to tackle this task is to run separate SSH sessions for the three operations:

rsync -a inputs/ machineB:inputs/
ssh machineB 'some command -i inputs -o outputs'
rsync -a machineB:outputs/ outputs/

This requires authenticating to machineB three times. The recommended way to avoid authenticating multiple times is to use the connection sharing facility in modern versions of OpenSSH: start a master connection to B once and for all, and let SSH automatically piggyback onto that master connection. Add ControlMaster auto and a ControlPath line to your ~/.ssh/config, then start a master connection in the background, then perform your tasks.
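
For example, a minimal ~/.ssh/config entry for this setup might look like this (the socket path is one common choice, not mandated):

Host machineB
    ControlMaster auto
    ControlPath ~/.ssh/master-%r@%h:%p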

ssh -fN machineB                         # start a master connection in the background
# Subsequent connections will be slaves to the existing master connection
rsync -a inputs/ machineB:inputs/
ssh machineB 'some command -i inputs -o outputs'
rsync -a machineB:outputs/ outputs/
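
When you are done, you can explicitly stop the background master (ssh -O sends control commands to a running master connection):

ssh -O exit machineB                     # tell the master connection to exit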

Rather than use scp or rsync to copy files, it may be easier to mount the remote filesystem under SSHFS. This will take care of setting up a master connection, by the way (assuming you've set up your ~/.ssh/config as indicated above).

mkdir /net/machineB
sshfs machineB: /net/machineB
cp -Rp inputs /net/machineB/
ssh machineB 'some command -i inputs -o outputs'
cp -Rp /net/machineB/outputs .
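
When the work is done, the SSHFS mount can be released with fusermount -u, the standard FUSE unmount helper on Linux:

fusermount -u /net/machineB              # unmount the remote filesystem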