A directory of mine is filled with a lot of files. I want to find out what kinds of files they are and which kinds are so numerous.
Here is what happens when I try some commands:
ls -l | wc -l
1514340
ls | head -n 4
2004112700001.htm
2004112700002.htm
2004112700003.htm
2004112700004.htm
ls *.xml | head -n 4
20041127.xml
20041225.xml
20050101.xml
20050108.xml
ls -l *.htm | wc -l
bash: /bin/ls: Liste d'arguments trop longue
# (French locale; in English: "Argument list too long")
0
# Any other ls command with *.htm or *.* fails too.
I understand that wc -l has to wait for the output of ls -l *.htm to be entirely done before it starts analyzing it, and because that output is too big, it fails. Is that really what is happening? What is the right way to make the ls command work in this case in conjunction with wc -l? Is there a way to ask the wc command to start asynchronously, before the output is entirely complete?
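
For reference, a minimal sketch of both counts (these exact commands are my suggestion, not from the thread; they assume GNU or BSD find and sed). In a pipeline both commands start at once, and wc reads while ls writes, so wc never waits for the whole output. The real trick is to keep the shell from building a huge argument list: with the pattern quoted, find receives *.htm unexpanded and matches the names itself.

find . -maxdepth 1 -name '*.htm' | wc -l    # count .htm entries; no giant argument list is built
ls | sed -n 's/.*\.//p' | sort | uniq -c | sort -rn | head    # tally files by extension, most numerous first
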
It's not wc that's failing because the output is too big, or the pipe that's overflowing. ls is not even starting, because *.htm expands into too many arguments for it. – muru Jun 04 '20 at 06:29

But *.htm can't match anything more than the .htm files; there is no .html file, for example. – Marc Le Bihan Jun 04 '20 at 07:06

*.htm expands to 2004112700001.htm 2004112700002.htm 2004112700003.htm 2004112700004.htm ..., then ls is run with all those filenames as arguments, which exceeds the argument length limit. Whether or not you have a .html file makes no difference. Please see the dupe. – muru Jun 04 '20 at 07:08
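
To see the mechanism in numbers (a sketch of mine, not from the thread; getconf ARG_MAX reports the kernel's limit on the combined size of arguments and environment):

getconf ARG_MAX    # e.g. 2097152 bytes on many Linux systems
set -- *.htm; echo "$#"    # set is a shell builtin: the glob expands in-process, no exec happens, so the limit never applies; prints the number of matches
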
Isn't *.htm the arg[0] that a C program like ls takes and resolves as a file filter with the classical findFirst, findNext functions? How would ls manage to expand *.htm into a list of files? By running an ls itself? – Marc Le Bihan Jun 04 '20 at 07:17

ls doesn't expand anything. The shell does. See, e.g., https://unix.stackexchange.com/q/17938/70524 – muru Jun 04 '20 at 07:28
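
A small demonstration of that last point, in a throwaway directory:

mkdir demo && cd demo
touch a.htm b.htm
echo *.htm      # the shell expands the pattern before echo runs: prints "a.htm b.htm"
echo '*.htm'    # quoted, so no expansion: echo prints the literal "*.htm"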