Using
ls -1 -f
is about 10 times faster and it is easy to do (I tested with 1 million files, but my original problem had 6 800 000 000 files).
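If you want the total as a single number rather than the raw listing, a minimal sketch (not part of the original timing test) is to pipe the unsorted listing into wc; note that -f also includes . , .. and hidden entries:

    # Count directory entries quickly: -f skips sorting, so nothing is buffered for sorting.
    # The count includes . and .. and dotfiles.
    ls -1 -f | wc -l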
But in my case I needed to check whether a specific directory contained more than 10 000 files. If it did, I was no longer interested in exactly how many files there were; I just quit the program so that it runs faster and does not try to read the rest one by one. If there are fewer than 10 000 files, it prints the exact count. The speed of my program is quite similar to ls -1 -f if you specify a parameter value larger than the number of files.
You can use my program find_if_more.pl in the current directory by typing:
find_if_more.pl 999999999
If you are only interested in whether there are more than n files, the script will finish faster than ls -1 -f when the directory contains a very large number of files.
    #!/usr/bin/perl
    use strict;
    use warnings;

    my ($maxcount) = @ARGV;
    my $dir = '.';
    my $filecount = 0;

    # Refuse to run without a threshold argument.
    die "Need maxcount\n" if not defined $maxcount;

    # Read directory entries one at a time and stop as soon as the limit
    # is exceeded, instead of listing everything first.
    opendir(my $dh, $dir) or die $!;
    while (defined(my $file = readdir($dh))) {
        $filecount++;
        last if $filecount > $maxcount;
    }
    print "$filecount\n";
    closedir($dh);
    exit 0;
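For completeness, here is a hedged usage sketch, assuming the script above is saved as find_if_more.pl and made executable; the 10 000 threshold matches the example described earlier:

    chmod +x find_if_more.pl
    ./find_if_more.pl 10000
    # Prints 10001 as soon as the limit is exceeded (readdir also counts . and ..),
    # otherwise prints the exact number of entries found.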
`ls` on a million files actually fails with an out-of-memory error. The only way around it is `find` or `ls -1f`.
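As a sketch of the find alternative mentioned above (assuming a find that supports -maxdepth, e.g. GNU or BSD find), you can count entries without building the whole list in memory:

    # Streams one name per line; the count includes the directory itself (.).
    # Add -type f to count only regular files.
    find . -maxdepth 1 | wc -l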