What is the least expensive way to find the oldest file in a directory, including all the directories underneath it? Assume the directory is backed by a SAN and is under heavy load.
There is concern that "ls" could take locks and degrade the system under heavy load.
Edit: find performs very well in a simple test case: finding the oldest file among 400 GB of files on an SSD took 1/20 of a second. But that was on a MacBook Pro laptop under no load, so it's a bit of an apples-to-oranges comparison.
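For reference, here is a minimal sketch of the kind of find pipeline used in that test, assuming GNU find (which supports -printf); /path/to/dir is a placeholder:

    # Print each regular file's mtime (seconds since the epoch) and its path,
    # then sort numerically so the smallest (oldest) timestamp comes first.
    find /path/to/dir -type f -printf '%T@ %p\n' | sort -n | head -n 1

Note that any approach still has to stat every file once to read its mtime; the pipeline above just avoids spawning an extra process per file.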
And as an aside, what is the best way to find out how such commands are implemented (their underlying algorithms)?
ls doesn't scan the file contents. It reads the directories and stats the files, which is necessary to find the oldest files anyway. But ls won't really help you, because going from any ls output to finding the oldest files would be very difficult. – Gilles 'SO- stop being evil' Jul 17 '13 at 23:55
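To illustrate the comment, here is a hedged sketch of doing that stat work explicitly, for systems whose find lacks -printf (e.g. the BSD find and stat shipped with macOS); again /path/to/dir is a placeholder:

    # BSD/macOS stat: %m is the modification time in epoch seconds, %N is the name.
    # -exec ... {} + batches many files per stat invocation to keep overhead low.
    find /path/to/dir -type f -exec stat -f '%m %N' {} + | sort -n | head -n 1

Either way, the cost is dominated by one stat per file, which is exactly the work the comment says is unavoidable.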