350

I have a directory with a large number of files. I don't see an ls switch to provide the count. Is there some command-line magic to get a count of files?

Blake
  • 3,687

21 Answers

446

Using a broad definition of "file"

ls | wc -l

(note that it doesn't count hidden files and assumes that file names don't contain newline characters).

To include hidden files (except . and ..) and avoid problems with newline characters, the canonical way is:

find . ! -name . -prune -print | grep -c /

Or recursively:

find .//. ! -name . -print | grep -c //
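A quick sanity check of the difference (the scratch directory and the file names below are illustrative, not part of the original answer): the newline-containing name inflates a naive line count but is counted once by the grep -c / approach.

```shell
# One ordinary file and one file whose name contains a newline;
# compare a naive line count with the canonical find|grep -c count.
tmp=$(mktemp -d)
cd "$tmp"
touch 'plain.txt' 'bad
name.txt'
naive=$(ls | wc -l)                                   # 3 lines: the odd name spans two
robust=$(find . ! -name . -prune -print | grep -c /)  # 2: one line with "/" per file
echo "naive=$naive robust=$robust"
cd - >/dev/null
```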
James
  • 4,674
  • 42
    wc is a "word count" program. The -l switch causes it to count lines. In this case, it's counting the lines in the output from ls. This is always the way I was taught to get a file count for a given directory, too. – Sandy Aug 24 '10 at 06:07
  • 38
    please add a note that ls behaves like ls -1 if the output is a pipe. – Lesmana Aug 24 '10 at 16:47
  • 7
    that doesn't get everything in a directory – you've missed dot files, and collected a couple of extra lines, too. An empty directory will still return 1 line. And if you call ls -la, you will get three lines in an empty directory. You want ls -lA | wc -l to skip the . and .. entries. You'll still be off by one, however. –  Aug 25 '10 at 15:14
  • 1
    An empty directory returns 0 for me – James Sep 24 '13 at 02:16
  • 1
    If you have a file whose name contains a newline, this approach will incorrectly count it twice. – godlygeek Mar 03 '15 at 22:20
  • 3
    A corrected approach, that would not double count files with newlines in the name, would be this: ls -q | wc -l - though note that hidden files will still not be counted by this approach, and that directories will be counted. – godlygeek Mar 03 '15 at 22:30
  • How does this work? I need to know how to modify it slightly / combine recursion with including hidden files. Even a link would be nice. – Brian Peterson Jan 16 '20 at 23:30
  • 1
    Wow I can't believe that something so easily done in Windows with "dir" can be so cumbersome with Linux :| – DARKGuy May 17 '20 at 02:26
  • 1
    "ls -1 | wc -l" counts files. The dash-one (-1) flag for ls is an easy way to get a single file per line. – user62612 Jul 15 '20 at 18:18
52

For a narrow definition of file:

 find . -maxdepth 1 -type f | wc -l
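A sketch of what the narrow definition buys you (the layout below is made up for illustration): regular files count, while the directory and anything nested inside it do not.

```shell
# Two regular files and one subdirectory: -maxdepth 1 keeps find from
# descending, and -type f drops the directory entry from the output.
tmp=$(mktemp -d)
touch "$tmp/a" "$tmp/b"
mkdir "$tmp/sub"
touch "$tmp/sub/nested"    # below -maxdepth 1, so ignored
files=$(find "$tmp" -maxdepth 1 -type f | wc -l)
echo "$files"              # 2
```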
slava
  • 173
35

I have found du --inodes useful; it requires GNU coreutils 8.22 or later. It should be substantially faster than alternative approaches using find and wc.

On Ubuntu 17.10, the following works:

du --inodes      # all files and subdirectories
du --inodes -s   # summary
du --inodes -d 2 # depth 2 at most

Combine with | sort -nr to sort descending by number of containing inodes.
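A small check of the behaviour, assuming GNU coreutils 8.22+ (file names here are illustrative): the directory's own inode is included in the total, so N files report N+1.

```shell
# Three files in a fresh directory report four inodes, because the
# directory itself occupies one (GNU du --inodes assumed).
tmp=$(mktemp -d)
touch "$tmp/one" "$tmp/two" "$tmp/three"
inodes=$(du --inodes -s "$tmp" | cut -f1)
echo "$inodes"    # 4
```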

krlmlr
  • 957
  • 1
    no --inodes on FreeBSD :( – CervEd Apr 25 '21 at 10:08
  • Thanks for sharing! I searched for "count" in the du man page, as in "I want to count the files", but it's not documented with that word. Any answer using wc -l will be wrong when any name contains a newline character. – Iain Samuel McLean Elder Mar 28 '23 at 09:03
  • 1
    du --inodes includes the directory itself (.) in the count, so subtract 1. For example, I have a dir with only jpg files; du --inodes gives me 200 while ll *.jpg | wc -l gives 199. – WesternGun Dec 26 '23 at 18:35
22
ls -1 | wc -l

...

$ ls --help | grep -- '  -1'
    -1                         list one file per line

...

$ wc --help | grep -- '  -l'
    -l, --lines            print the newline counts

PS: Note ls -<number-one> | wc -<letter-l>

nicomen
  • 321
16

Probably the most complete answer using ls/wc pair is

ls -Aq | wc -l

if you want to count dot files, and

ls -q | wc -l

otherwise.

  • -A counts dot files, but omits . and ..
  • -q makes ls replace nongraphic characters, notably newlines, with ?, so the output has exactly one line per file

To get one file per line from ls in a terminal (i.e. without piping it into wc), the -1 option has to be added.

(behaviour of ls tested with coreutils 8.23)
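A quick demonstration of the two counts (the file names below are made up): -q collapses a newline-containing name onto one line, and -A pulls in the dotfile.

```shell
# One hidden file and one file with a newline in its name.
tmp=$(mktemp -d)
cd "$tmp"
touch '.hidden' 'new
line'
plain=$(ls -q | wc -l)    # 1: the odd name is shown as "new?line"
all=$(ls -Aq | wc -l)     # 2: -A adds .hidden
echo "plain=$plain all=$all"
cd - >/dev/null
```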

Frax
  • 261
  • 2
    As you said, -1 is not needed. As to "it handles newlines in filenames sensibly with console output", this is because of the -q switch (that you should use instead of -b because it's portable) which "Forces each instance of non-printable filename characters and <tab> characters to be written as the ( '?' ) character. Implementations may provide this option by default if the output is to a terminal device." So e.g. ls -Aq | wc -l to count all files/dirs or ls -qp | grep -c / to count only non-hidden dirs etc... – don_crissti May 13 '15 at 11:08
  • Thanks for your input. Changed -b to -q. – Frax May 14 '15 at 15:05
  • Currently includes directories in its file count. To be most complete we need an easy way to omit those when needed. – vhs May 12 '20 at 07:48
  • @JoshHabdas It says "probably". ;) I think the way to omit directories would be to use don_crissti's suggestion with a slight twist: ls -qp | grep -vc /. Actually, you can use ls -q | grep -vc / to count all (non-hidden) files, and adding -p makes it match only regular files. – Frax May 12 '20 at 11:55
9

If you know the current directory contains at least one non-hidden file:

set -- *; echo "$#"

This is obviously generalizable to any glob.

In a script, this has the sometimes unfortunate side effect of overwriting the positional parameters. You can work around that by using a subshell or with a function (Bourne/POSIX version) like:

count_words () {
  eval 'shift; '"$1"'=$#'
}
count_words number_of_files *
echo "There are $number_of_files non-dot files in the current directory"

An alternative solution is $(ls -d -- * | wc -l). If the glob is *, the command can be shortened to $(ls | wc -l). Parsing the output of ls always makes me uneasy, but here it should work as long as your file names don't contain newlines, or your ls escapes them. And $(ls -d -- * 2>/dev/null | wc -l) has the advantage of handling the case of a non-matching glob gracefully (i.e., it returns 0 in that case, whereas the set * method requires fiddly testing if the glob might be empty).

If file names may contain newline characters, an alternative is to use $(ls -d ./* | grep -c /).

Any of those solutions that rely on passing the expansion of a glob to ls may fail with an "argument list too long" error if there are a lot of matching files.
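Running the trick inside a command substitution keeps it from clobbering the caller's positional parameters (the directory contents below are assumed for illustration):

```shell
# Count three files without touching the current shell's "$@";
# the cd and set -- both happen in the substitution's subshell.
tmp=$(mktemp -d)
touch "$tmp/a" "$tmp/b" "$tmp/c"
count=$( cd "$tmp" && set -- * && echo "$#" )
echo "$count"    # 3
```

Remember the caveat above: in an empty directory the unexpanded * would make this report 1, not 0, unless the shell has a nullglob-like option enabled.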

dessert
  • 1,687
  • 1
    Do you really want to create 13,923 positional parameters? And you should make your local variable local or eliminate it: eval $1=$# or just use echo $# and do number_of_files=$(count_words *). – Dennis Williamson Aug 24 '10 at 01:16
  • 1
    @Dennis: part of the point was to avoid forking. I guess that's not a 21st century concern. Ok, I admit I don't care about non-POSIX shells any more, so I could have avoided the temporary variable. – Gilles 'SO- stop being evil' Aug 24 '10 at 07:14
  • Why did you subtract one from $# (you hadn't done that prior to the edit)? – Dennis Williamson Aug 24 '10 at 22:12
  • @Dennis: I'm still avoiding a fork (well, it does make a difference on machines with a slow CPU such as routers) and passing a variable name as $1. So what I want to count is the number of parameters that aren't the first parameter. (I can't use shift because I need to keep the variable name around.) (Umm, now if you'd asked about the first line...) – Gilles 'SO- stop being evil' Aug 24 '10 at 22:42
  • @Dennis: come to think of it, I can use shift if I time it right. – Gilles 'SO- stop being evil' Aug 24 '10 at 22:50
  • facepalm - I overlooked the fact that you were passing a variable reference and needed to deduct for that. /renews Weekly Reader subscription – Dennis Williamson Aug 25 '10 at 22:45
9

With the GNU implementation of find:

find -maxdepth 1 -type f -printf . | wc -c
  • -maxdepth 1 makes it non-recursive; find is recursive by default
  • -type f includes regular files only
  • -printf . is a cute touch. It prints a dot (a single-byte character in every locale) for each file instead of its name, so it can handle any filename and also saves data; we just have to count the dots :). Note however that -printf is a GNU-only extension.
  • | wc -c counts bytes and reports the total as a decimal integer (possibly preceded and/or followed by whitespace depending on the wc implementation; not with GNU wc)
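A scratch check of the dot-counting idea (names below are illustrative, and GNU find is assumed because of -printf): two regular files, one with a newline in its name, and a subdirectory that must not be counted.

```shell
# Only the regular files produce dots; the subdirectory is excluded
# by -type f, and the odd name contributes exactly one dot.
tmp=$(mktemp -d)
cd "$tmp"
touch 'ordinary' 'with
newline'
mkdir subdir
n=$(find -maxdepth 1 -type f -printf . | wc -c)
echo "$n"    # 2
cd - >/dev/null
```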
aude
  • 292
5

When using the ls/wc pair, adding -U makes it much faster (it does not sort the entries):

ls -AqU | wc -l
Jbnair
  • 51
3

After installing the tree command, just type:

tree

If you want hidden files too:

tree -a

If you are using Debian / Mint / Ubuntu Linux, type the following command to install the tree command:

sudo apt-get install tree

The option -L specifies the maximum display depth of the directory tree. The tree command not only counts the number of files, but also the number of directories, down to as many levels of the tree as you like.

Elijah Lynn
  • 1,045
lev
  • 589
  • 2
    When I type tree, I get a sort of tree output to the screen of the directory I am in but I cannot see where the number of files is shown. – charlesdarwin Apr 25 '18 at 12:30
2

No pipe, no string copy, no fork, just a plain bash one-liner:

$ fcount() { local f i=0; for f in *; do let i++; done; echo $i; }; fcount
HalosGhost
  • 4,790
DaboD
  • 21
2

With some shells, you can do that without relying on external utilities:

fish

count *

Or to include hidden files:

count * .*

zsh

Define a function, here called count to mimic fish's builtin:

count() print $#

Then call the function with:

count *(N)

Or use an anonymous function:

(){print $#} *(N)

To include hidden (Dot) files:

(){print $#} *(ND)

You could also add an oN glob qualifier to disable sorting, or add ., / or @... to count only regular files, directories or symlinks...

Replace * with **/* to also count files in sub-directories recursively.

ksh93

count() print "$#"
count ~(N)*

To include hidden files:

count ~(N){.,.[!.],..?}*

Or:

(FIGNORE=.:..; count ~(N)*)

bash

count() { echo "$#"; }
(shopt -s nullglob; shopt -u failglob; count *)

To include hidden files:

(shopt -s nullglob dotglob; shopt -u failglob; count *)
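A sketch of the bash variant above, run via bash -c so it works regardless of your current shell (the file names are illustrative): dotglob controls whether the hidden file matches the glob.

```shell
# One visible and one hidden file; count them with and without dotglob.
tmp=$(mktemp -d)
touch "$tmp/visible" "$tmp/.hidden"
visible=$(bash -c 'cd "$1" || exit; shopt -s nullglob; shopt -u failglob; set -- *; echo "$#"' _ "$tmp")
all=$(bash -c 'cd "$1" || exit; shopt -s nullglob dotglob; shopt -u failglob; set -- *; echo "$#"' _ "$tmp")
echo "visible=$visible all=$all"
```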
1

Here's another technique along the lines of the one Gilles posted:

word_count () { local c=("$@"); echo "${#c[@]}"; }
file_count=$(word_count *)

which creates an array with 13,923 elements (if that's how many files there are).

  • What's the point of that c array? word_count() { echo "$#"; } would be enough. The point of @Gilles solution is to store the count in a returned variable to avoid having to use command substitution (which involves a fork and pipe in shells other than ksh93). – Stéphane Chazelas Mar 01 '16 at 16:27
1

On Linux, to make the command very robust and handle files that might have newlines in their name, use this:

find -maxdepth 1 -type f -print0 | tr -cd '\0' | wc -c
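A quick check (scratch directory and names assumed): every name ends with a NUL, tr deletes everything else, and wc -c counts the NULs, one per file no matter what the name contains.

```shell
# Two regular files, one with an embedded newline; the NUL terminators
# survive any file name, so the byte count equals the file count.
tmp=$(mktemp -d)
touch "$tmp/a" "$tmp/b
c"
n=$(find "$tmp" -maxdepth 1 -type f -print0 | tr -cd '\0' | wc -c)
echo "$n"    # 2
```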

This saves us from the ordeal of parsing ls output.



0
find . -maxdepth 1 -type f | wc -l

This counts only the files in the current directory.

MelBurslan
  • 6,966
srpatch
  • 109
  • find . -type f will find files in the current directory, and also, recursively, in sub-directories. – dhag Jun 16 '16 at 20:23
0

Improving on some of the answers given before, but this time doing it explicitly:

$ tree -L 1 | tail -n 1 | cut -d " " -f 3

It's worth noticing the use of well-loved commands like tail and cut. Also, note that tree is not available by default. The command above first gathers information about the directory at level 1, then tail -n 1 grabs the summary line where our goal is, and cut takes its third field.

For instance, running it in /:

/ $ tree -L 1
.
├── 1
├── bin -> usr/bin
├── boot
├── dev
├── etc
├── home
├── lib -> usr/lib
├── lib64 -> usr/lib64
├── lost+found
├── media
├── mnt
├── opt
├── proc
├── root
├── run
├── sbin -> usr/sbin
├── srv
├── sys
├── tmp
├── usr
└── var

20 directories, 1 file
/ $ tree -L 1 | tail -n 1
20 directories, 1 file
/ $ tree -L 1 | tail -n 1 | cut -d " " -f 3
1

Then, what about asking for the number of directories? (It's the first field of the same line.)

0

If you have rights to install packages, there is a very simple tool to do this (and more). It is called ncdu and it can be installed using apt or yum. A basic usage of ncdu would be:

ncdu /path/to/dir

This will display an ncurses-based screen which you can navigate using cursor keys. At the bottom, initially you will see the total number of files in that directory and subdirectories. Using the up/down arrow keys and ENTER, you can quickly navigate to any directory and get stats on usage.

A slightly more advanced use is ncdu -x /path/to/dir, which will count only those files and directories which are on the same filesystem as the directory being scanned.

A bonus of ncdu is that it gives a progress bar while scanning. You can also redirect the output to a file for later use.

In the man page, there is an interesting section on how hard links are handled across various versions of ncdu.

HTH.

0

I use this one; a few examples:

ls -Ap    directory01/directory02  | grep -v /$  | wc -l
ls -Ap    directory01/directory02/exampl*  | grep -v /$  | wc -l
ls -Ap    /home/directory01/directory02  | grep -v /$  | wc -l

It works like this:

  • -p with ls adds / at the end of the directory names.
  • -A with ls lists all the files and directories, including hidden files but excluding . and .. directories.
  • grep -v /$ only shows the lines that do not match (-v option) lines that end with /. (directories)
  • wc -l counts the number of lines.

Or I use a mix of these, for example:

ls -Ap    directory01/directory02  | grep -v /$  | wc -l ; \
ls -Ap    directory01/directory02/exampl*  | grep -v /$  | wc -l ; \
ls -Ap    /home/directory01/directory02  | grep -v /$  | wc -l

So, for example, given this tree:

$ tree
.
├── directory01
│   ├── directory02
│   ├── directory03
│       ├── Screenshot from 2022-04-19 15-12-55.png
│       └── Screenshot from 2022-04-19 16-05-05.png
│       └── directory04

I will get a count of the plain files:

$ ls -Ap    directory01/directory03  | grep -v /$  | wc -l 
2

Both of these will be counted:

│       ├── Screenshot from 2022-04-19 15-12-55.png
│       └── Screenshot from 2022-04-19 16-05-05.png

but not this one (not a plain file):

directory04
  • (1) Please don’t pipe grep into wc -l unless you have some reason for not using grep -c.  (2) If you use a wildcard (glob), you should use the -d option of ls.  (3) What in the world is that second block of commands?  (4) Directories are files.  The question doesn’t say that it’s asking for the number of *plain files.*  If you want to provide that, you should say that that’s what you are doing. … (Cont’d) – G-Man Says 'Reinstate Monica' May 10 '22 at 17:22
  • (Cont’d) …  (5) This is little more than a rehash of comments by don_crissti and Frax, and a few answers.   It’s (arguably) OK to take somebody else’s comment and post it as an answer, but only if you give credit to the people who posted it first. – G-Man Says 'Reinstate Monica' May 10 '22 at 17:22
    Sure, thanks for the comment. Ad 1: if anything else would give me exactly what I need, I would use something different, but so far only this command gives me what I want. Ad 4: OK, OK, edited to say plain files. Ad 5: it is already written about in TWO places; do you want it in the very first sentence, or where? – user14927127 May 19 '22 at 08:10
0

To build on James' answer, you can write something like this to achieve a breakdown of all the direct subdirectories.

tree -aFid -L 1 . | while read f; do if [[ -d "${f}" ]]; then echo -en "${f}: "; find "${f}"//. ! -name . -print | grep -c //; fi; done

Output:

.: 884
documents: 46
photos: 300
videos: 138

I've just managed to quickly find the one directory that contained missing files after a file transfer that way.

-2

Try this; I hope this answer will help you:

echo $((`ls -l | wc -l` -1 ))
Tiger
  • 666
-2

You can check with:

ls -l | grep -v ^l | wc -l
techraf
  • 5,941
-3

Use the tree command, just:

tree
chaos
  • 48,171