
When I enter unzip ../foundation-latest.zip, it outputs this:

warning [../foundation-latest.zip]: 248 extra bytes at beginning or within zipfile (attempting to process anyway)

The file is 138 KB. It unzips correctly, but why am I getting this warning?

drs
  • One possible cause is that, at some step of its journey onto your system, the file was transferred with ftp in ASCII mode rather than BINARY mode and some bytes were added. If you used ftp at any stage, run the transfer again, using the 'bin' command before any 'put' or 'get'; see the example session after these comments. – Mark Plotnick Feb 19 '14 at 10:51
  • It could have a malicious payload at the beginning. It's a hostile internet. Take caution with which unzip utility you use to open a zip like that. – jbrahy May 05 '15 at 18:17
  • There's a lot of conjecture in the current answers, because there are many possible causes. It would help to have a link to or copy of the file in question. – duozmo Aug 05 '15 at 16:31
  • Regarding the possibly malicious extra payload: at that size, you could upload the file to https://www.virustotal.com/ to have it checked, provided there is no personal information in there. However, I would not worry too much about viruses on Linux; it matters only if you copy the original file elsewhere. (You could always re-pack the file if you think the data is complete.) – Ned64 Aug 29 '15 at 08:26
  • Just to confirm this is a problem: I tried creating a backup of my iTunes filespace with both zip and with ditto. The unzip provided (by 10.11) failed with both of these, as well as with 7za. macOS unzip just doesn't like (large?) zip files. – Otheus May 20 '19 at 12:07
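
For reference, here is a minimal ftp session applying the binary-mode advice from the first comment (the host name is a placeholder). The bin command sets the transfer type to binary (image) mode so the file is copied byte for byte:

$ ftp example.com
ftp> bin
200 Type set to I.
ftp> get foundation-latest.zip
ftp> quit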

7 Answers


My issue was that I was trying to use unzip on Mac OS X, which cannot handle archives zipped with PKZIP.

I was able to brew install p7zip and then extract the archive with the command 7za x some_file.zip.

I originally found the solution in this article: need-pk-compat-v4-5-can-do-v2-1
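
For anyone following along, the full sequence on macOS looks roughly like this (assuming Homebrew is installed; the archive name is a placeholder):

$ brew install p7zip
$ 7za x some_file.zip

7za's x command extracts with full directory paths preserved, unlike e, which flattens everything into the current directory.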

ZachB
duyker

I found this thread, which covers a similar problem. The bug report is titled: unzip fails on 5.4GB ZIP with "extra bytes at beginning or within zipfile". One of the suggested fixes was to run this command on the .zip file.

$ zip -FFv foo.zip --out fixed.zip

Example Run

$ zip -FFv foo.zip --out fixed.zip
Fix archive (-FF) - salvage what can
 Found end record (EOCDR) - says expect single disk archive
Scanning for entries...
 Local ( 1      0): copying: d1/f1   (651734 bytes)
 Local ( 1 651817): copying: d1/d2/  (0 bytes)
 Local ( 1 651905): copying: d1/d2/f3   (80 bytes)
 Local ( 1 652083): copying: d1/f23   (891 bytes)
 Local ( 1 653021): copying: d1/f27   (8764 bytes)
 Local ( 1 661837): copying: d1/f24   (14818 bytes)
 Local ( 1 676709): copying: d1/f25   (17295 bytes)
...
 Cen   ( 1 5488799949): updating: d1/f13
 Cen   ( 1 5488800052): updating: d1/f14
Zip64 EOCDR found ( 1 5488800155)...
Zip64 EOCDL found ( 1 5488800211)...
EOCDR found ( 1 5488800231)...
$ echo $?
0

zip's -FF switch

Excerpt from the zip man page:

       -FF
       --fixfix
              Fix the zip archive. The -F option can be used if some 
              portions of the archive are missing, but requires a reasonably 
              intact central directory.   The  input  archive is scanned as 
              usual, but zip will ignore some problems.  The resulting 
              archive should be valid, but any inconsistent entries will be 
              left out.

              When doubled as in -FF, the archive is scanned from the 
              beginning and zip scans  for  special  signatures  to  
              identify  the  limits between the archive members. The single 
              -F is more reliable if the archive is not too much damaged, so 
              try this option first.

              If  the archive is too damaged or the end has been truncated, 
              you must use -FF.  This is a change from zip 2.32, where the 
              -F option is able to read a truncated archive.  The -F option 
              now more reliably fixes archives with minor damage and the -FF 
              option is  needed to fix archives where -F might have been 
              sufficient before.
              ...
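
After a -FF repair it is worth testing the salvaged archive before discarding the original. unzip's -t option checks every entry against its stored CRC:

$ unzip -t fixed.zip
...
No errors detected in compressed data of fixed.zip.
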
slm
  • Windows 10 can generate large (bigger than 4GB?) ZIP files that the Info-ZIP unzip command cannot read properly. (Looks like Debian bug #661956 was never resolved.) As this answer suggests, you can possibly use -FF to untangle the file into a file that unzip can read, or you can give up and use the 7-Zip 7z program, which can read the file directly. – Ian D. Allen Apr 05 '22 at 19:42
  • It looks like the find_ecrec() function in the unzip codebase is at fault. It's computing the expected offset of the central directory using the 0xffffffffu from the eocd, when it should be using the offset from the eocd64. I'm writing a tool that generates large zip files. What fixed this for me is that the eocd record needs to come after the eocd64 record. That way unzip won't go screwy. – Justine Tunney Nov 19 '23 at 00:48

Just had this warning, too. In my case it was caused by downloading the file with 'curl -i', which caused the HTTP headers to appear at the start of the zip file. Silly me. This will surely not be the cause/solution in all cases, but maybe it helps someone...
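
If you suspect the same thing, the stray headers are easy to spot, since they are plain text sitting in front of the binary data; a file damaged this way begins with something like:

$ head -n 3 foundation-latest.zip
HTTP/1.1 200 OK
Content-Type: application/zip

Re-downloading with plain curl -o, which writes only the response body, avoids the problem (the URL is a placeholder):

$ curl -o foundation-latest.zip https://example.com/foundation-latest.zip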

dknaus

I have seen this type of error before when the zip archive was transferred via a web service that was having trouble. Upon direct examination of the zip file, I found that an error message from the web service had been prepended to the zip data.

You might try to examine the zip file as text and see if anything interesting shows up at the front.
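
Given the 248 extra bytes reported in the question, a quick way to do that is to dump just the leading bytes in hex and ASCII; a clean zip begins with the signature PK (hex 50 4b) at offset 0:

$ head -c 248 foundation-latest.zip | xxd

If the dump shows readable text (an error page, HTTP headers, and so on) rather than the PK signature, that text is the foreign prefix.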


It could be a self-extracting archive (a Windows .exe), or it may have been padded for some reason.
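
One quick check for the self-extracting case is the file utility, which identifies a format from its leading bytes (the archive name is a placeholder). A plain archive is reported as something like 'Zip archive data', while a self-extracting one shows up as a Windows executable:

$ file some_archive.zip
some_archive.zip: Zip archive data, at least v2.0 to extract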

Ricky
  • What do you mean by "padded"? – rainwater11 Feb 19 '14 at 02:06
  • Extra bytes (usually null (zero)) added to make the file a specific length. This used to be an artifact of the file transfer block size (e.g. xmodem), but in the modern world that doesn't happen. It could also be a crypto signature. (I don't have the file, so I don't know what those 248 bytes are.) – Ricky Feb 19 '14 at 02:43

I had the same issue on Linux with a .zip file larger than 4 GB, compounded with an "only DEFLATED entries can have EXT descriptor" error.

The command 7z x resolved all my issues.

Be careful, though: 7z x will extract all files with paths rooted in the current directory. The -o option lets you specify an output directory instead.
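
For example (the names are placeholders; note that 7z expects the directory glued directly to the -o switch, with no space between them):

$ 7z x big_archive.zip -oextracted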

  • Windows 10 can generate large (bigger than 4GB?) ZIP files that the Info-ZIP unzip command cannot read properly. (Looks like Debian bug #661956 was never resolved.) As an earlier answer suggests, you can possibly use zip -FF to untangle the file into a file that unzip can read, or as this posting says you can give up and use the 7-Zip 7z program which can read the file directly (though it might complain Headers Error). – Ian D. Allen Apr 05 '22 at 19:43

I also had the same issue. I observed it when I copied files from Windows to a Unix server without using bin mode. The best way to resolve the issue was to transfer the files in bin (binary) mode.

Amit
  • (1) This information was already presented in a comment. That's OK, but … (2) the comment has more detailed information than this answer. (3) You should improve this answer by describing what you're talking about. Please do not respond in comments; [edit] your answer to make it clearer and more complete. – Scott - Слава Україні Aug 10 '17 at 05:59