34

I tried it with SCP, but it says "Negative file size".

>scp matlab.iso xxx@xxx:/matlab.iso
matlab.iso: Negative file size

I also tried SFTP; it worked fine until 2 GB of the file had transferred, then stopped:

sftp> put matlab.iso
Uploading matlab.iso to /home/x/matlab.iso
matlab.iso                                           -298% 2021MB -16651.-8KB/s   00:5d
o_upload: offset < 0

Any idea what could be wrong? Don't SCP and SFTP support files larger than 2 GB? If not, how can I transfer bigger files over SSH?

The destination file system is ext4. The Linux distribution is CentOS 6.5. The filesystem currently has (accessible) large files on it (up to 100 GB).

Jeff Schaller
eimrek
    Looks like a variable overrun of size. But AFAIK scp/sftp has no size limit. What is the destination file system? Does it support LARGEFILES? – Milind Dumbare Mar 16 '15 at 17:09
  • @Milind Edited the question. – eimrek Mar 16 '15 at 17:17
  • Since you specifically mention ssh, I assume you require the authentication and/or encryption features? – Ulrich Schwarz Mar 16 '15 at 17:23
  • @UlrichSchwarz preferably, yes – eimrek Mar 16 '15 at 17:44
  • Are you using a 32-bit or 64-bit application/OS? – mdpc Mar 16 '15 at 18:12
  • Don't do this. The encryption is too costly. Encrypt the file locally (if absolutely necessary) then use netcat or a torrent connection to transfer the file. An ssh pipeline for this is a waste. – mikeserv Mar 16 '15 at 18:20
  • @mdpc both systems are 64 bit – eimrek Mar 16 '15 at 18:30
  • @mikeserv I'm transferring the file on a local network with infiniband, where I usually get speeds of ~30 MB/s. Encryption is not necessary, but it doesn't really matter to me if the transfer will take 5 min or 1 min. – eimrek Mar 16 '15 at 18:30
  • Well, it is still costing you cpu cycles. If it is a local network, then why not netcat? If you do use netcat make sure it is the BSD netcat - GNU netcat sucks hard. – mikeserv Mar 16 '15 at 18:33
  • 1
    What about the applications sftp and scp? You can find this out using the file command against their binaries. – mdpc Mar 16 '15 at 18:54
  • @mdpc hmm, the scp and sftp are 32 bit... Could this be the problem? – eimrek Mar 16 '15 at 19:12
  • 1
    @shepherd - yes. – mdpc Mar 16 '15 at 19:12
  • @mdpc ok, thanks, I guess the matter is settled, then. – eimrek Mar 16 '15 at 19:25
  • I used to have this same type of problem with cpio in generating tape backups. It was a 32-bit application running in the 64-bit space. – mdpc Mar 16 '15 at 19:27
  • 2
    32-bit applications can access large files if they're compiled with -D_LARGEFILE_SOURCE -D_LARGEFILE64_SOURCE -D_FILE_OFFSET_BITS=64. But if you're running a 64-bit 6.5 system, it'd probably be easier to have the admins install openssh-5.3p1-94.el6_6.1.x86_64 and openssh-server-5.3p1-94.el6_6.1.x86_64 from the standard repos. – Mark Plotnick Mar 16 '15 at 21:02
  • 1
    lol at software using signed integers for file size – Lightness Races in Orbit Mar 17 '15 at 17:04
  • @MarkPlotnick: Wow. openssh 5.3. That's from October 2009. Please get something newer; the latest version is 6.7. – Martin Schröder Mar 17 '15 at 23:29
  • 1
    @MartinSchröder Within a release, Red Hat keeps most applications at the same revision level, to promote stability. They do fix important and critical bugs, while minimizing incompatible changes to functionality. They do this for 10 years from the date of release. You do make a good point, and if the admins wish, they can download the source code for a newer version of a package and build it. It may be more secure and will likely handle large files. – Mark Plotnick Mar 18 '15 at 00:04
  • It looks like CentOS is broken in that they didn't build the 32 bit ssh client with large file support as Mark Plotnick mentioned. You should file a bug report with CentOS. – psusi Mar 24 '15 at 01:27
  • @psusi Actually the SCP version on the CentOS destination system is 64 bit and works fine. The source system, from where I wanted to copy the file, is Windows where I used the MSYS tools (scp, sftp) tools that came with mingw (32-bit). – eimrek Mar 24 '15 at 16:38
  • Ahh, then it's the msys tools that were badly built. – psusi Mar 24 '15 at 22:46

3 Answers

36

Rsync is very well suited for transferring large files over ssh because it can resume transfers that were interrupted for whatever reason. Since it uses hash functions to detect which file blocks already match on both sides, the resume feature is quite robust.
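For example, a minimal sketch using the placeholder host and path from the question; -P combines --partial and --progress, so an interrupted upload keeps its partial file and shows progress:

rsync -P matlab.iso xxx@xxx:/matlab.iso

If the transfer is interrupted, running the same command again reuses the data that has already arrived instead of starting over.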

It is kind of surprising that your sftp/scp versions do not seem to support large files - even with 32-bit binaries, LFS (large file) support should be pretty standard nowadays.

maxschlepzig
  • 4
    Given that a large part of the file is already transferred, rsync is a good idea now. Use the -P option to both get progress indication and instruct the receiver to keep an incomplete file in case the transfer is interrupted again. – Simon Richter Mar 17 '15 at 00:02
35

I'm not sure about the file size limits of SCP and SFTP, but you might try working around the problem with split:

split -b 1G matlab.iso

This will create 1 GiB files which, by default, are named xaa, xab, xac, .... You can then use scp to transfer the pieces:

scp xa* xxx@xxx:

Then, on the remote system, recreate the original file with cat:

cat xa* > matlab.iso

Of course, the penalties for this workaround are the time taken in the split and cat operations, as well as the extra disk space needed on the local and remote systems.
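To verify that the reassembled file matches the original, you can compare checksums on both ends (a small sketch; md5sum works just as well as sha256sum):

sha256sum matlab.iso        # run on the local system
sha256sum matlab.iso        # run on the remote system after the cat step

If the two hashes agree, the pieces were reassembled correctly.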

  • 1
    Good idea. I already transferred the file with a USB drive, but this would probably have been more convenient. Not as convenient as getting scp and sftp to work correctly, though. – eimrek Mar 16 '15 at 17:46
  • to note, I had to send a 4GB+ file to ArcaOS. Had to use this option. – ewokx Dec 21 '22 at 00:35
12

The original problem (based on reading all the comments on the OP's question) was that the scp executable in use on the 64-bit system was a 32-bit application. A 32-bit application that isn't compiled with "large-file support" ends up with a signed 32-bit file offset, which is limited to 2^31 bytes =~ 2 GiB; beyond that the offset wraps negative, which matches the "Negative file size" and "offset < 0" errors above.

You can tell whether scp is 32-bit by using the file command:

file `which scp`

On most modern systems it will be 64-bit, in which case this offset limit does not apply:

$ file `which scp`
/usr/bin/scp: ELF 64-bit LSB  shared object, x86-64 ...

A 32-bit application can still support "large files", but it has to be compiled with large-file support, which apparently was not the case here.
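If you do end up rebuilding such a 32-bit tool from source, a minimal sketch of enabling large-file support on a glibc-based system (the configure step is just a typical autoconf-style example) is:

getconf LFS_CFLAGS                          # typically prints -D_FILE_OFFSET_BITS=64
CFLAGS="$(getconf LFS_CFLAGS)" ./configure  # pass the flags when building
make

With _FILE_OFFSET_BITS=64, file offsets are 64-bit even in a 32-bit binary.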

The recommended solution is perhaps to use a full standard 64-bit distribution where apps are compiled as 64-bit by default.

arielf