754

Is there a way to zip all files in a given directory with the zip command? I've heard of using *.*, but I want it to work for extensionless files, too.

tkbx
  • 10,847
  • 4
    Have you tried navigating one-level up from your desired directory and doing zip myarch.zip mydir/*? – Joseph R. Nov 28 '12 at 16:46
  • 18
    or better zip -r myarch.zip mydir/* – Adam Mar 16 '16 at 17:52
  • 46
    or better zip -r myarch.zip mydir – ctrl-alt-delor Jan 17 '17 at 16:42
  • 1
    *.* means any file with a dot. In CP/M and DOS all files had a dot, and it made you type it (you could not type just *). Therefore people came to see *.* as meaning all files. Eventually Microsoft added long filenames that could have zero or more dots. To find a file that has a dot on Windows you have to type *.*.*. – ctrl-alt-delor Jan 17 '17 at 16:51

6 Answers

1065

You can just use *; there is no need for *.*. File extensions are not special on Unix. * matches zero or more characters—including a dot. So it matches foo.png, because that's zero or more characters (seven, to be exact).

Note that * by default doesn't match files beginning with a dot (neither does *.*). This is often what you want. If not, in bash, if you shopt -s dotglob it will (but will still exclude . and ..). Other shells have different ways (or none at all) of including dotfiles.
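As a small illustration of the bash behaviour described above (the scratch directory /tmp/globdemo is made up for the demo):

```shell
# Sketch (bash): how dotglob changes what * expands to.
# /tmp/globdemo is a throwaway directory for the demo.
mkdir -p /tmp/globdemo && cd /tmp/globdemo
touch visible.txt .hidden
echo "default: $(echo *)"   # * skips .hidden
shopt -s dotglob            # bash-specific option
echo "dotglob: $(echo *)"   # now .hidden is included; . and .. never are
```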

Alternatively, zip also has a -r (recursive) option to do entire directory trees at once (and not have to worry about the dotfile problem):

zip -r myfiles.zip mydir

where mydir is the directory containing your files. Note that the produced zip will contain the directory structure as well as the files. As peterph points out in his comment, this is usually seen as a good thing: extracting the zip will neatly store all the extracted files in one subdirectory.

You can also tell zip to not store the paths with the -j/--junk-paths option.
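A quick sketch of the difference between the two (the directory and file names here are made up for the demo):

```shell
# Sketch: -r stores paths, -j (--junk-paths) flattens them.
mkdir -p /tmp/zipdemo/mydir && cd /tmp/zipdemo
touch mydir/a.txt mydir/b.txt
rm -f with-paths.zip flat.zip       # keep the demo repeatable
zip -r with-paths.zip mydir         # stored as mydir/a.txt, mydir/b.txt
zip -j flat.zip mydir/*             # stored as a.txt, b.txt
unzip -l flat.zip                   # listing shows the junked paths
```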

The zip command comes with documentation telling you about all of its (many) options; type man zip to see that documentation. This isn't unique to zip; you can get documentation for most commands this way.

derobert
  • 109,670
  • 15
    You might want to add, that it is considered a good practice to contain everything in the archive in a top-level directory - so that one doesn't pollute his/her current directory on extraction. – peterph Nov 28 '12 at 16:54
  • @peterph done. Though this is less a convention in zip files than in e.g., tarfiles, I'm afraid. – derobert Nov 28 '12 at 16:59
  • unfortunately yes. Probably due to the windows heritage of drag'n'drop to the desktop and linux heritage of working with source codes. – peterph Nov 28 '12 at 17:03
  • 4
    Keep in mind that * shell-globbing doesn't include dotfiles (ie. filenames beginning with .). This is another advantage to zipping the whole directory by name. – mrb Nov 28 '12 at 17:29
  • 1
    But using -r includes the directory itself, which breaks what I'm doing. Wouldn't * include . and ..? – tkbx Nov 28 '12 at 17:30
  • @tkbx No, * doesn't include anything starting with . by default. Even if you have dotglob enabled (so it includes things starting with .), it doesn't include . or ... – derobert Nov 28 '12 at 17:32
  • @mrb Well, it does if you run shopt -s dotglob – derobert Nov 28 '12 at 17:33
  • @tkbx Also, there is the --junk-paths option. I've added that to my answer. – derobert Nov 28 '12 at 17:37
  • @derobert If you are using bash, that's true and a good solution; but it's also neither a standard sh option nor the default setting on bash, so it's worth being aware of. (Likewise, with zsh, zip foo.zip * .* automatically excludes . and .., which is convenient; but zsh doesn't have shopt) – mrb Nov 28 '12 at 17:47
  • @mrb I've added in a paragraph about this to the answer—I think it explains the issue (and of course, its not an issue if you just use zip -jr) – derobert Nov 28 '12 at 17:52
  • I normally use * .??* to quickly include most entries that start with a . (namely those with at least 3 characters in it, so . and .. won't match). This will work in most cases. – Ned64 Sep 11 '15 at 21:52
  • @Ned64 .[^.]* would probably work, just off the top of my head without any real thought. Or of course zip's recursive option, which works without any thought. – derobert Sep 11 '15 at 22:00
  • * will also fail if the directory contains many files. Recursive won't have that issue though.... – Gert van den Berg Jul 27 '16 at 11:11
  • With possible exception of files starting with . here is a mapping of what * does in Everyone's Unix and Microsoft's Windows. Unix:Windows, *:*, *:*.*, *.*:*.*.*, *.*.*:*.*.*.*. (The windows algorithm is 1: if in ends with .* then strip it off. 2: put the rest into the Unix algorithm) (I blame cp/m) – ctrl-alt-delor Jan 17 '17 at 16:48
  • if zip is not installed feel free to use tar.gz: tar -zcvf myfiles.tar.gz mydir/* – Eugene Kaurov Aug 27 '18 at 11:13
  • Why doesn't zip support capital R for recursive? Most Linux programs support -R to avoid ambiguity. – Aaron Franke Sep 12 '19 at 14:37
  • I was trying to figure out why it kept adding the directory within the zip even after adding dir/* in the path, turns out I need to cd into dir first, can't select dir files in the zip command itself – Rod911 Mar 29 '23 at 15:06
33

In my case I wanted to zip each file into its own archive, so I did the following (in zsh):

$ for file in *; do zip ${file%.*}.zip $file; done
  • 3
    There's no mkv here? Also nothing here is particularly zsh-specific. You'll want to properly quote any variable containing a file name, so zip "${file%.*}.zip" "$file" with the double quotes around both variables. – tripleee Jan 30 '17 at 11:08
  • 1
    @tripleee Firstly, thanks for pointing out my erroneous reference to mkv. Secondly, quoting arguments is unnecessary in zsh, unlike in bash. That's why I specified that this was a command for zsh. – Resigned June 2023 Jan 30 '17 at 16:20
  • 1
    Replacing the last semi-colon with an ampersand might speed it up significantly (If the number of files in the directory is reasonable...). Otherwise find . -type f -maxdepth 1 -print0|xargs -r0 -n1 -P64 -I{} bash -c 'f="{}"; zip "${f%.*}.zip" "$f"' (with -P adjusted depending on your CPU threads...) (Many GNU dependencies...) – Gert van den Berg May 18 '18 at 17:30
  • 2
    to zip each file into its own archive, do gzip * – Marco Marsala Jan 11 '21 at 22:40
9

Another way is to use find with its -exec action. This might pass paths with a leading "./", but zip strips that prefix before storing each entry (confirmed in my test), so the archive still extracts correctly:

find . -type f -exec zip zipfile.zip {} +

(The + can be replaced with \; if your version of find does not support + as the -exec terminator, but that runs zip once per file and is much slower.)

By default this includes files in all subdirectories. With GNU find, adding -maxdepth 1 prevents that.
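For example, with a hypothetical scratch layout (GNU find assumed for -maxdepth):

```shell
# Sketch: zip only top-level regular files, skipping subdirectories.
mkdir -p /tmp/finddemo/sub
cd /tmp/finddemo
touch top.txt sub/nested.txt
rm -f zipfile.zip                   # keep the demo repeatable
find . -maxdepth 1 -type f -exec zip zipfile.zip {} +
# zip strips the leading "./", so the entry is stored as "top.txt";
# sub/nested.txt is never reached because of -maxdepth 1.
```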

3

Another (slow) method to do this (which adds one file to the zip at a time):

for f in * .[^.]*; do
    [ -f "$f" ] || continue # Skip directories and non-existent files (".[^.]*" stays literal if there are no dotfiles); use -e instead if you also want directories
    zip zipfile.zip "$f" # If directories are included, you probably want to add -r
done

This has the dotfile issue of * (a workaround is included above) and starts zip once for each file, adding them to the archive one at a time. Because the filenames are never passed in a single argument list, it copes with a very large number of files.

It would be slower than most of the other methods, but is relatively simple.

  • 1
    I would say this is less simple than the accepted answer, and slower, which prompts the question: "Why would anyone do this?". If you can answer that question, I recommend you put that context in your answer, otherwise I think it is a bad answer to an old question that has a good answer already. – Centimane May 17 '18 at 16:21
  • @Centimane: I note the limitations. I feel this has educational value. (If not skipping directories, it is quite simple). If you want a much faster answer using a (standard) external tool instead, my other answer covers that. (with the dotfiles handling removed (which affects correctness without their absence mentioned in the question), I feel it is quite elegant): for f in *; do zip zip.zip "$f"; done – Gert van den Berg May 18 '18 at 17:09
  • 1
    Note that the accepted answer doesn't use an external command and would be faster. In what scenario would this answer be useful? – Centimane May 18 '18 at 19:00
  • 2
    @Centimane With tar when there are more files than what bash can pass as parameters. (find + xargs are better, for loops are easier...). It is a (unique) answer to the question. It is certainly not the optimal answer. (Non optimal answers can still be useful for similar problems, if someone has a slightly different situation - e.g. wanting tar file any directories in it) – Gert van den Berg May 18 '18 at 22:18
0

Not .zip files, but gzip * is a brief command that compresses each file in a directory into its own .gz and deletes the original. Really handy in many cases, as many other tools can work with .gz files directly.

user@computer:~/test$ touch test1.txt
user@computer:~/test$ touch test2.txt
user@computer:~/test$ touch test3.txt
user@computer:~/test$ ls
test1.txt  test2.txt  test3.txt
user@computer:~/test$ gzip *
user@computer:~/test$ ls
test1.txt.gz  test2.txt.gz  test3.txt.gz
0

If you want to avoid using globs you can do

cd mydir
zip -r ../my.zip .

It's not pretty, but it gets the job done :)
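If you don't want your shell to stay in mydir afterwards, run the cd in a subshell; a small sketch (the /tmp/subdemo layout is made up for the demo):

```shell
# Sketch: the parentheses start a subshell, so the cd does not
# affect the calling shell's working directory.
mkdir -p /tmp/subdemo/mydir && cd /tmp/subdemo
touch mydir/file.txt
rm -f my.zip                        # keep the demo repeatable
(cd mydir && zip -r ../my.zip .)
pwd                                 # still /tmp/subdemo
```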