Good evening,
I would like to filter a file's contents with some piped commands and then write the result back to the same file. I know I can't do that the way I wrote it; hold on …
This is the piece of bash script I have:
grep '^[a-zA-Z.:]' "$filepath" \
| sed -r '/^(rm|cd)/d' \
| uniq -u \
> "$filepath"
So I thought I might succeed by using process substitution instead, and wrote:
grep '^[a-zA-Z.:]' < <(cat "$filepath") | …
This did not solve anything either. I expected process substitution to "save" my input file's content somewhere, like in a temporary file. It seems I haven't understood process substitution either.
I have read threads about "in-place" editing, but those articles highlighted special options of particular binaries, like sed -i or sort -o,
whereas I need a general solution (I mean one that suits any pipeline of commands).
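For reference, a rough sketch of what those tool-specific options look like (assuming GNU sed for -i and a sort that supports -o, with the same $filepath as above); each tool handles the rewrite of its own input file internally:

sed -r -i '/^(rm|cd)/d' "$filepath"   # GNU sed rewrites the file in place
sort -o "$filepath" "$filepath"       # sort may safely write back to its own input file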
So first, why can't the standard pipe approach do this? What is happening underneath? :/
And how should I solve my issue? Could someone please explain what this is all about?
Thank you.
… mv the tmpfile over the original file. This works with any pipeline of commands rather than just the handful (like GNU sed -i, perl -i, sort -o, etc.) that have support for in-place editing. Write-to-tmpfile-and-rename is what those commands do internally, anyway. – cas Feb 17 '16 at 01:14

… mktemp(1), you might want to use sponge(1) from the moreutils package. – kba Feb 17 '16 at 01:22

sed --in-place might also be something to look into, but look before you leap, if you will. – DopeGhoti Feb 17 '16 at 01:45

… mktemp (I prefer doing things myself and avoiding abstractions as much as possible). Still, it is not working for the moment, but I must be mistaken about the file descriptors. To whom it may interest, I will post my final script once it is finished and (apparently) consistent ^^. – Stphane Feb 17 '16 at 09:29
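A minimal sketch of the write-to-tmpfile-and-rename approach cas describes above, using mktemp; the filter is the pipeline from the question, and error handling is kept to a bare minimum:

tmpfile=$(mktemp) || exit 1        # create a temporary file to hold the filtered output
grep '^[a-zA-Z.:]' "$filepath" \
| sed -r '/^(rm|cd)/d' \
| uniq -u \
> "$tmpfile" \
&& mv "$tmpfile" "$filepath"       # rename over the original once the pipeline succeeds

Note that mv replaces the original file with a new one, so hard links to it are broken and the resulting permissions are those of the mktemp file unless you restore them yourself.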
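And the sponge(1) route kba mentions, assuming the moreutils package is installed: sponge soaks up all of its standard input before opening the output file, which is what makes it safe to name the input file as the destination.

grep '^[a-zA-Z.:]' "$filepath" \
| sed -r '/^(rm|cd)/d' \
| uniq -u \
| sponge "$filepath"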