Back in 2011 we had a peculiar problem.
We noticed that the processing tool for one of our clients was running slowly, handling only ~20,000 files per hour. Since we needed to process upwards of 3 million files a month, possibly within a 3-day span, this was a problem for us: at that rate, 3 million files would take roughly 150 hours, more than double the 72-hour window.
We also noticed that this was not a CPU bottleneck; CPU utilization hovered around 10 percent. The tool wrote files to a folder named by the area short code (e.g. DEL for Delhi). A folder could eventually hold anywhere between 5,000 and 200,000 files once the processing job was over.
We also observed that if, while files were being written to a folder, we moved the previously written files (once the count went past ~10,000) into a subfolder, the job sped up considerably.
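For context, that ad-hoc workaround was essentially the following (a rough sketch, not the actual script we used; the folder names, the "done" subfolder, and the 10,000 threshold are just illustrative):

```python
import os
import shutil

def roll_over(area_dir, threshold=10000, archive_name="done"):
    """Move files already written to area_dir into a subfolder once the
    directory grows past `threshold` entries, so new writes land in a
    near-empty directory again."""
    files = [name for name in os.listdir(area_dir)
             if os.path.isfile(os.path.join(area_dir, name))]
    if len(files) > threshold:
        archive = os.path.join(area_dir, archive_name)
        os.makedirs(archive, exist_ok=True)
        for name in files:
            shutil.move(os.path.join(area_dir, name), archive)
```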
Eventually the issue was solved when our tool vendor modified the tool to write files into subfolders named by zipcode inside the area-code folder (DEL/110012 and so on). Each of these folders now held at most roughly 2,000 files.
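I don't have the vendor's code, but the layout change amounts to something like this (a minimal sketch; `output_path`, the base directory, and the file name are made up for illustration):

```python
import os

def output_path(base_dir, area_code, zipcode, filename):
    """Build a path like <base>/DEL/110012/<file> so that no single
    directory ends up holding more than a couple of thousand files."""
    target = os.path.join(base_dir, area_code, zipcode)
    os.makedirs(target, exist_ok=True)
    return os.path.join(target, filename)

# e.g. output_path("/data/out", "DEL", "110012", "record_000123.xml")
```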
Edit 1: The OS used was HP-UX. Edit 2: File sizes averaged 25-50 KB.
So why does it take longer to add new files to a directory that has a large number of files in it?