I have an estimated 44 GB of data on my web server that I want to transfer to another server in as little time as possible. I am using PuTTY to connect to the server. Is there any way to achieve this? I don't know which commands to use, but some blogs said to use rsync or scp to transfer these files. Your help is greatly appreciated. I've tried scp from local to server, but what I need is server to server.
2 Answers
Both scp and rsync support compression and decompression at endpoints.
You can pass -C to scp or -z to rsync.
Transfer speeds might be a bit faster with scp, because scp does not compare files between the source and the destination the way rsync does.
I would suggest you check the man pages, though, for any other options you might want to use.
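For example, run from the source server (the destination hostname and the paths below are placeholders for your own):
scp -C -r /var/www user@destination:/var/www
rsync -az /var/www/ user@destination:/var/www/
With rsync, -a implies -r and preserves permissions and timestamps, which is usually what you want for a web root.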
If you don't want decompression to happen at the remote, but just want to compress and transfer the compressed files on the fly, you might want to check this question. You can use another compression tool if you like.
Rsync uses zlib, which implements DEFLATE compression. You might want to check this answer for more information on how it relates to zip and their differences.
The graph in the first answer of this question might also give you a better idea of the relative performance.
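As a rough sketch of the on-the-fly approach, assuming you have SSH access to the destination (the hostname and target path are placeholders), you could stream a compressed tar archive over ssh and leave it compressed on the remote:
tar -czf - /var/www | ssh user@destination 'cat > /backup/www.tar.gz'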

- Can I use both -r and -C in pscp? I am also transferring files inside directories, and I want to transfer them in a less time-consuming way. – Peter Eris Aug 11 '21 at 02:18
- Yes, you can. Each file will be compressed while being transferred, and it will happen recursively. It will probably take some time because you're also doing compression, but less data goes over the wire, so the transfer will be faster. – Erlis D. Aug 11 '21 at 08:33
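As a minimal illustration of that comment (the hostname and both paths are placeholders), run from a Windows machine that has PuTTY's command-line tools installed:
pscp -r -C user@server:/var/www/ C:\www-backup\
Note that pscp always copies to or from the machine it runs on, so for a true server-to-server copy, scp or rsync started on one of the servers is the better fit.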
If you want to avoid the overhead of higher-layer protocols such as SSH encryption, you can use netcat:
On the destination server side:
nc -l -p XXXX > archive.tar.xz
On the uploading server side:
tar -cJf - /var/www | nc destination_server XXXX
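If you'd rather unpack the data as it arrives instead of saving the archive, a variant of the receiving command is (XXXX is still your chosen port):
nc -l -p XXXX | tar -xJf - -C /var/www
Be aware that the -l -p combination is for traditional netcat; some variants, such as the OpenBSD one, expect nc -l XXXX instead.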
Keep in mind that with this method, the traffic is sent in the clear (unencrypted).

- gzip? – Panki Aug 10 '21 at 08:39