Given a directory containing:

- note 1.txt, last modified yesterday
- note 2.txt, last modified the day before yesterday
- note 3.txt, last modified today

what is the best way to fetch the array `note 3 note 1 note 2`?
To define "best," I'm more concerned with robustness (in the context of Zsh on macOS) than with efficiency or portability.
The intended use case is a directory of hundreds or thousands of plain-text files, but (at the risk of muddling the question) this is a specific case of a more general question I have: what are best practices for performing string manipulations on filepaths printed by commands like `ls`, `find`, and `mdfind`?
I've been using a macro which invokes this command to achieve the above:

```shell
ls -t | sed -e 's/\.[^.]*$//'
```
It's never failed, but:

- Greg's Wiki strongly recommends against parsing the output of `ls` (ParsingLs; Practices, under "5. Don't Ever Do These").
- Is invoking `sed` inefficient where parameter expansion would do?
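On the second point, for what it's worth, the suffix strip alone needs no external process; a minimal sketch (sample files mirroring those in the question, POSIX parameter expansion):

```shell
# Minimal sketch: strip the extension with parameter expansion alone
# (no sed); the sample files mirror those in the question.
cd "$(mktemp -d)"
touch 'note 1.txt' 'note 2.txt' 'note 3.txt'
for f in *.txt; do
  printf '%s\n' "${f%.*}"   # "%.*" removes the shortest trailing ".<ext>"
done
```

This loses the `-t` sorting, of course — glob expansion is lexicographic — which is part of why the question remains.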
Using `find` (safely delimiting filepaths with `NUL` characters rather than newlines) and parameter expansion to extract the basenames, this produces an unsorted list:

```shell
find . -type f -print0 | while IFS= read -d '' -r l ; do print -r -- "${${l%.*}##*/}" ; done
```
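If the names are wanted in an array rather than streamed, the same loop can feed one; a sketch (assumes bash or zsh, for arrays and process substitution):

```shell
# Sketch: collect find's NUL-delimited output into an array, stripping
# the directory and the extension from each path (bash or zsh assumed).
cd "$(mktemp -d)"
mkdir sub
touch 'note 1.txt' 'sub/note 2.txt'
names=()
while IFS= read -r -d '' l; do
  l=${l%.*}             # drop the extension
  names+=("${l##*/}")   # drop the leading directories
done < <(find . -type f -print0)
printf '%s\n' "${names[@]}"
```

Process substitution keeps the array in the current shell; with a plain pipe, bash would build it in a subshell and lose it, though zsh runs the last pipeline stage in the current shell, so the pipe form also works there.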
But sorting by modification date would seem to require invoking `stat` and `sort`, because macOS's `find` lacks the `-printf` action (a GNU extension) which might otherwise serve well.
Finally, using Zsh's glob qualifiers:

```shell
for f in *(om) ; do print "${f%.*}" ; done
```
Though not portable, this last method seems the most robust and efficient to me. Is this correct, and is there any reason I shouldn't use a modified version of the `find` command above when I'm actually performing a search rather than simply listing the files in a directory?
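For completeness, the loop in that last approach can collapse into the glob itself, since Zsh allows history-style modifiers after the qualifiers; a sketch (the `zsh -fc` wrapper is only so it can be pasted from any shell):

```shell
# Sketch (zsh only): ':t' takes the tail (basename) and ':r' removes
# the extension, applied left to right after the 'om' sort.
zsh -fc 'print -rl -- *(om:t:r)'
```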