Fast Linux file count for a large number of files

Charles · Sep 15, 2009 · Viewed 99.2k times

I'm trying to figure out the best way to find the number of files in a particular directory when there are a very large number of files (more than 100,000).

When there are that many files, running ls | wc -l takes quite a long time to execute. I believe this is because it returns the names of all the files. I want to use as little disk I/O as possible.

I have experimented with some shell and Perl scripts to no avail. How can I do it?
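For concreteness, the command I'm timing looks roughly like this (/path/to/dir stands in for the actual directory):

time ls /path/to/dir | wc -l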

Answer

mark4o · Sep 15, 2009

By default, ls sorts the names, which can take a while when there are a lot of them. It also produces no output until all of the names have been read and sorted. Use the ls -f option to turn off sorting:

ls -f | wc -l

Note that this will also enable -a, so ., .., and other files starting with . will be counted.
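If you need to count only the entries other than . and .., one alternative (a sketch, assuming GNU find and with /path/to/dir standing in for your directory) is to have find list the entries without sorting:

find /path/to/dir -mindepth 1 -maxdepth 1 | wc -l

Add -type f if you only want regular files. Note that both this and ls -f | wc -l count newlines, so filenames containing newline characters will be counted more than once.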