9

When I import pictures from my camera in Shotwell, it also imports the video clips. This is somewhat annoying, as I would like to store my videos in another folder. I've tried to write a bash command to do this, but have not had success.

I need a command that meets the following requirements:

  • Locate all files in a directory structure that do not have an extension of .jpg, .png, .gif, or .xcf (case insensitive).
  • Move all of these files into a target directory, regardless of whether the file names or directory paths contain spaces or special characters.

Any help would be appreciated!

EDIT: I'm using the default shell in Ubuntu, meaning that some commands are aliased, etc.

EDIT 2: I've attempted this myself (not the copy part, just the listing of files part). I turned on extglob and ran the following command:

$ ls -R /path | awk '
  /:$/&&f{s=$0;f=0}
  /:$/&&!f{sub(/:$/,"");s=$0;f=1;next}
  NF&&f{ print s"/"$0 }'

This lists everything. I tried adding grep to the end of it, but I haven't the foggiest idea how to get it to not match a pattern I give it. The extglob switch didn't help much with grep, even though it does help with other commands.

Adam
  • 193
  • You'll have better luck attempting this yourself and asking specific questions about what's going wrong. – drs Aug 04 '14 at 02:54

3 Answers

16

You can use find to locate all files in a directory tree that match (or don't match) some particular tests, and then do something with them. For this particular problem, you could use:

find -type f ! \( -iname '*.png' -o -iname '*.gif' -o -iname '*.jpg' -o -iname '*.xcf' \) -exec echo mv {} /new/path \;

This limits the search to regular files (-type f), and then to files whose names do not (!) end in .png in any casing (-iname '*.png'), or (-o) .gif, and so on; all of the extension tests are grouped into a single condition between \( ... \). For each matching file it runs a command (-exec) that moves the file, whose name is substituted for the {}, into the directory /new/path. The \; tells find that the command is over. The leading echo makes this a dry run that only prints the mv commands it would run; drop the echo once the output looks right.

The name substitution happens inside the program-execution code, so spaces and other special characters don't matter.
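
As the comments point out, -exec can also be terminated with + to pass files to mv in batches; with GNU mv you then need -t so the destination comes before the file list. A minimal sketch of that variant, with /new/path again standing in for your real target directory:

# batched variant: mv receives many file names per invocation instead of one at a time
find . -type f ! \( -iname '*.png' -o -iname '*.gif' -o -iname '*.jpg' -o -iname '*.xcf' \) -exec mv -t /new/path -- {} +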


If you want to do this just inside Bash, you can use Bash's extended pattern-matching features. These require the extglob shell option to be enabled (shopt -s extglob), along with globstar so that ** recurses. In this case, use:

mv **/!(*.[gG][iI][fF]|*.[pP][nN][gG]|*.[xX][cC][fF]|*.[jJ][pP][gG]) /new/path

This matches all files in subdirectories (**) whose names do not match *.gif, *.png, etc. in any combination of character cases, and moves them into the new path. The expansion is performed by the shell, so once again spaces and special characters don't matter.

The above assumes all files are in subdirectories. If not, you can repeat the part after **/ to include the current directory too.
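
If you want to check what that pattern actually matches before moving anything, you can expand it with printf first. A small sketch, assuming the two shell options above are already enabled:

# prints each name the glob expands to, one per line; nothing is moved
printf '%s\n' **/!(*.[jJ][pP][gG]|*.[pP][nN][gG]|*.[gG][iI][fF]|*.[xX][cC][fF])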

There are similar features in zsh and other shells, but you've indicated you're using Bash.


(A further note: parsing ls is never a good idea - just don't try it.)

Michael Homer
  • 76,565
  • 1
    This only works with GNU find, not POSIX find. – cuonglm Aug 04 '14 at 03:15
  • I think you'd be better off with -exec mv {} +. Also - you're wrong about ls. It does the opposite of what find does: find finds, ls lists. It parses itself, by the way. And even if what I say is not true - you sure don't make any attempt to back up your blanket statement. And please don't link to that wooledge wiki thing - it is a terrible source of information - and not just on that page. For my part, here's a fine example of how easily ls is parsed. And by the way - it would work the same there if they were newlines, not spaces. – mikeserv Aug 04 '14 at 08:54
  • Did you write that on the wrong answer? – Michael Homer Aug 04 '14 at 08:56
  • It could use +, but it really doesn't matter and it's much less clear. I thought it might be the wrong answer since I didn't link to wooledge here (although I have in other answers you've undoubtedly read), so "that page" seems like a dangling reference. – Michael Homer Aug 04 '14 at 09:04
  • I actually haven't noticed that you have - though a lot of people do. It really is an atrocious source of information. Have you seen the arrays page? Ugh. It's a mishmash of fairytales and hearsay. You're better than that, man. reproducible data is not the sort of thing you acquire there. In any case, you're probably right about the + - it will not likely be executed more than the one time anyway. – mikeserv Aug 04 '14 at 09:10
  • Compare this to the wooledge wiki's ls page. – mikeserv Aug 04 '14 at 09:15
  • 3
    There was one the other day about quoting variable expansions (they're for it, by the way); I agree that it's often a bit enthusiastic. I also agree that the linked answer shows how easily ls is parsed, by the way, but I think we differ on just how easily that is — that was a lot of clever work. So "never do it" is still good general advice to anyone who needs advice – Michael Homer Aug 04 '14 at 09:22
  • That is a very good caveat. Still, I don't think while read inum na should be considered difficult or clever - it's shell 101. I think we'll have to agree to disagree about the value of wooledge and co's enthusiasm, though. Thanks anyway for the repartee. – mikeserv Aug 04 '14 at 09:48
  • To be clear, I'm not saying it has much value - it's at best of intermittent value, and not a source for anything real. I wouldn't use it other than for things on the line of "is quoting variables considered best practice" (which is an assertion I'm making otherwise than from any particular source, really). – Michael Homer Aug 04 '14 at 09:57
  • For myself, I'd prefer it if some others were a little more curious about why an expansion should or should not be quoted. A perpetually quoted expansion, for instance, is never expanded and is therefore useless. Or, at best, its expansion is perpetually weakly limited, and it therefore requires more work than should be necessary to draw out its value. It confuses me that so many recommend splitting fields with awk's FS= and utterly ignore the shell's own very powerful $IFS. C'est la vie, I suppose. And yes, the wooledge wiki is, at best, intermittent. – mikeserv Aug 04 '14 at 10:12
  • @mikeserv "And please don't link to that wooledge wiki thing - it is a terrible source of information - and not just on that page." - If there was a downvote button on comments, I'd surely downvote that. ls is definitely not meant to be parsed, and attempting to parse its output will break on obscure file names. Please don't make me reiterate. Unless you link to a reputable source claiming the opposite, I'm not going to believe you. – Alexia Luna Oct 06 '14 at 19:03
  • Very nice. On the !(...) syntax: I don't know about mv, but with cp, one might want to use the --parents option. – phil294 Oct 15 '17 at 17:47
  • We don't need the echo word, or else it just echoes the commands instead of actually moving the files. – Akshay Lokur Nov 13 '17 at 08:19
2

You're using Ubuntu, so you will have GNU find; try:

find . -maxdepth 1 -type f ! -iregex '.*\.\(jpg\|png\|gif\|xcf\)$' -exec mv -t /path/to/newdir -- {} +
  • -type f restricts the matches to regular files, so the search directory itself never matches; -iregex matches the whole path against a regular expression, case-insensitively, and ! negates that test.

  • -exec command {} + runs the command on batches of matching files, much like piping find -print0 into xargs -0. Many files are moved with one mv invocation, so the command is run far fewer times than once per matched file; GNU mv's -t option is what lets the target directory come before the file list.

  • -maxdepth 1 limits find to the current directory only; if you want it to recurse, remove that test (see the sketch below).
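
For reference, a hedged sketch of the recursive variant (the destination directory is just a placeholder):

# -maxdepth 1 removed, so the whole tree below . is searched; -type f keeps it to regular files
find . -type f ! -iregex '.*\.\(jpg\|png\|gif\|xcf\)$' -exec mv -t /path/to/newdir -- {} +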

cuonglm
  • 153,898
0

There's no need to overcomplicate this! Why not just:

shopt -s nocaseglob                      # so *.jpg also matches *.JPG etc. (the case-insensitive requirement)
mkdir ../temp
mv -- *.{jpg,png,gif,xcf} ../temp/       # stash the image files out of the way
mv -- * /desired/target/directory/       # move everything that is left
mv -- ../temp/* .                        # bring the images back
rmdir ../temp

This won't work for a whole directory tree of course, just one flat directory of files.
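
If you want to preview what that catch-all step would grab before running it for real, you can prefix it with echo, in the same spirit as the dry run in the find answer above:

# prints the mv command with the expanded file list; nothing is actually moved
echo mv -- * /desired/target/directory/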