20

I have many text files in a directory, and I want to remove the last line of every file in the directory.

How can I do it?

don_crissti
  • 82,805
  • 6
    What have you tried? https://unix.stackexchange.com/help/how-to-ask: "Sharing your research helps everyone. Tell us what you found and why it didn’t meet your needs. This demonstrates that you’ve taken the time to try to help yourself, it saves us from reiterating obvious answers, and above all, it helps you get a more specific and relevant answer!" – phemmer Jan 31 '17 at 13:23
  • Why does the thought of somebody accidentally applying that to /etc have a very special quality of cringe induction :) – rackandboneman Feb 01 '17 at 11:27

6 Answers

19

You can use this nice one-liner if you have GNU sed:

 sed -i '$ d' ./*

It will remove the last line of each non-hidden file in the current directory. The -i switch tells GNU sed to edit the files in place, and '$ d' tells sed to delete the last line ($ addresses the last line, d deletes it).
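A quick sanity check, using a hypothetical scratch directory and file name just to illustrate the effect:

```shell
# Scratch directory so the ./* glob only matches our test file:
rm -rf /tmp/sed-demo && mkdir -p /tmp/sed-demo
cd /tmp/sed-demo
printf 'line 1\nline 2\nline 3\n' > a.txt
sed -i '$ d' ./*      # delete the last line of every non-hidden file here
cat a.txt             # now prints only "line 1" and "line 2"
```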

StefanR
  • 1,392
  • 3
    This will throw up errors (and not do anything) if the folder contains anything other than a regular file, e.g. another folder... – N.I. Jan 31 '17 at 14:27
  • 1
    @StefanR You're using -i which is a GNUism, so this is moot, but I would lose my old beard standing if I failed to point out that some older versions of sed do not allow you to put any whitespace between the $ and the d (or, in general, between the pattern and the command). – zwol Jan 31 '17 at 15:02
  • 1
@zwol As I wrote, this will result in an error, not a warning, and sed will give up once it reaches that file (at least with the version of sed I have). The next files won't be processed. Throwing away the error messages would be a terrible idea since you wouldn't even know it had happened! With zsh you could use *(.) to glob for regular files, I don't know about other shells. – N.I. Jan 31 '17 at 15:05
  • @NajibIdrissi Hmm, you're right. That surprises me; I would have expected it to complain about the directory but then go on to the next file on the command line. In fact, I think I'm going to report that as a bug. – zwol Jan 31 '17 at 15:07
  • @don_crissti I have GNU sed v4.3 too... I don't know what to tell you, I just tested again. https://gist.github.com/nidrissi/66fad6be334234f5dbb41c539d84d61e – N.I. Jan 31 '17 at 15:18
  • @zwol, what implementation/version of sed doesn't support sed '$ d'? AFAICT the very first implementation from the late 70s in Unix V7 did support it. GNU sed 1.18 at least (from 1993) does support it. – Stéphane Chazelas Jan 31 '17 at 17:41
  • @don_crissti But note what it says above, under exit code 2: "One or more of the input file specified on the command line could not be opened (e.g. if a file is not found, or read permission is denied). Processing continued with other files." It seems to me that an EISDIR open failure should be in that category. Anyway I've asked the maintainers about it: https://lists.gnu.org/archive/html/bug-sed/2017-01/msg00077.html – zwol Jan 31 '17 at 17:52
  • @StéphaneChazelas SunOS 4 is where most of my mental notes of the form "old systems can't do X" came from originally, but 20 years later I cannot be sure that this was actually a problem. – zwol Jan 31 '17 at 17:53
11

The other answers all have problems if the directory contains something other than a regular file, or if a file name contains spaces or newlines. Here's something that works regardless:

find "$dir" -type f -exec sed -i '$d' '{}' '+'
  • find "$dir": find the files in the directory $dir;
  • -type f: only regular files;
  • -exec ... '{}' '+': execute the following command on the files found, with '+' telling find to pass as many file names as possible to each invocation (a bit more efficient than running the command once per file, thanks to @zwol);
  • sed -i: edit the files in place;
  • '$d': delete (d) the last ($) line.

If you don't want to descend into subdirectories, then you can add the argument -maxdepth 1 to find.
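A sketch of both points, in a hypothetical scratch directory with a space in one file name:

```shell
rm -rf /tmp/find-demo && mkdir -p '/tmp/find-demo/sub dir'
printf 'a\nb\n' > '/tmp/find-demo/file with spaces.txt'
printf 'x\ny\n' > '/tmp/find-demo/sub dir/nested.txt'
# -maxdepth 1 keeps find from descending into "sub dir"; find passes the
# awkward file name to sed safely, no quoting gymnastics needed:
find /tmp/find-demo -maxdepth 1 -type f -exec sed -i '$d' '{}' '+'
cat '/tmp/find-demo/file with spaces.txt'   # last line removed: prints "a"
cat '/tmp/find-demo/sub dir/nested.txt'     # untouched: prints "x" and "y"
```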

N.I.
  • 230
9

Using GNU sed -i '$d' means reading the full file and making a copy of it without the last line, while it would be a lot more efficient to just truncate the file in place (at least for big files).

With GNU truncate, you can do:

for file in ./*; do
  [ -f "$file" ] &&
    length=$(tail -n 1 "$file" | wc -c) &&
    [ "$length" -gt 0 ] &&
    truncate -s "-$length" "$file"
done

If the files are relatively small, that would probably be less efficient though as it runs several commands per file.

Note that if a file contains extra bytes after the last newline character (in other words, if it has a non-delimited last line), then depending on the tail implementation, tail -n 1 will return either only those extra bytes (as GNU tail does), or the last (properly delimited) line together with those extra bytes.
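The byte arithmetic in the loop can be seen on a single hypothetical example file (this assumes GNU truncate and a tail that prints the last line including its newline):

```shell
printf 'keep\ndrop me\n' > /tmp/trunc-demo.txt
length=$(tail -n 1 /tmp/trunc-demo.txt | wc -c)   # "drop me\n" is 8 bytes
truncate -s "-$length" /tmp/trunc-demo.txt        # chop those bytes off the end
cat /tmp/trunc-demo.txt                           # prints only "keep"
```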

  • do you need a |wc -c in the tail call? (or a ${#length}) – Jeff Schaller Jan 31 '17 at 19:57
@JeffSchaller. Oops. wc -c was intended indeed. ${#length} would not work as it counts characters, not bytes, and $(...) would remove the trailing newline character, so ${#...} would be off by one even if all the characters were single-byte. – Stéphane Chazelas Jan 31 '17 at 20:09
6

A more portable approach:

for f in ./*
do
test -f "$f" && ed -s "$f" <<\IN
d
w
q
IN
done

I don't think this needs any explanation... except maybe that in this case d is the same as $d since ed by default selects the last line.
This will not search recursively and will not process hidden files (aka dotfiles).
If you want to edit those too, see How to match * with hidden files inside a directory
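What the here-document does to a single file can be checked in isolation (hypothetical example file; d deletes ed's current line, which starts out as the last one, then w writes and q quits):

```shell
printf 'one\ntwo\nthree\n' > /tmp/ed-demo.txt
ed -s /tmp/ed-demo.txt <<\IN
d
w
q
IN
cat /tmp/ed-demo.txt    # prints "one" and "two"
```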

don_crissti
  • 82,805
6

If you have access to vim, you can use:

for file in ./*
do
  if [ -f "${file}" ]
  then
    vim -c '$d' -c "wq" "${file}"
  fi
done
R Sahu
  • 232
3

POSIX-compliant one-liner for all files recursively starting in current directory, including dot-files:

find . -type f -exec sh -c 'for f; do printf "\$d\nx\n" | ex "$f"; done' sh {} +

For .txt files only, non-recursively:

find . -path '*/*/*' -prune -o -type f -name '*.txt' -exec sh -c 'for f; do printf "\$d\nx\n" | ex "$f"; done' sh {} +
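To see which files the second command would select before letting ex touch them, you can swap the -exec action for -print (hypothetical scratch layout; paths like ./sub/deep.txt match '*/*/*' and are pruned, so only the top-level file survives):

```shell
rm -rf /tmp/prune-demo && mkdir -p /tmp/prune-demo/sub
printf 'a\nb\n' > /tmp/prune-demo/top.txt
printf 'x\ny\n' > /tmp/prune-demo/sub/deep.txt
cd /tmp/prune-demo
find . -path '*/*/*' -prune -o -type f -name '*.txt' -print   # only ./top.txt
```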

Wildcard
  • 36,499