I have a program that produces a large number of small files (say, 10,000). After they are created, another script accesses them and processes them one by one.
Questions: is it OK to keep that many files in a single directory, and will it affect performance? I run Debian with an ext4 file system.
10k files inside a single folder is not a problem on ext4. It should have the dir_index option enabled by default, which indexes directory contents using a B-tree-like structure to prevent performance degradation as the directory grows.
To sum up, unless you create millions of files or use ext2/ext3, you shouldn't have to worry about system or filesystem performance issues.
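As a rough sketch of the workload described in the question (paths and filenames here are made up), the following creates 10,000 small files in one directory and then processes them one by one. On ext4 with dir_index this stays fast at this file count:

```shell
# Create a scratch directory for the experiment.
workdir=$(mktemp -d)

# Producer: one small file per item.
for i in $(seq 1 10000); do
    printf 'payload %s\n' "$i" > "$workdir/chunk_$i.txt"
done

# Consumer: stream the names from find rather than expanding a glob,
# so the file count can grow without hitting command-line limits.
count=0
find "$workdir" -type f > "$workdir.list"
while IFS= read -r f; do
    IFS= read -r line < "$f"   # stand-in for real per-file processing
    count=$((count + 1))
done < "$workdir.list"

echo "$count"    # 10000

# Clean up.
rm -r "$workdir" "$workdir.list"
```

Note that the consumer reads filenames from `find` output instead of using `"$workdir"/*`, which sidesteps the argument-list limit mentioned below entirely.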
That being said, shell tools and commands don't like to be called with a large number of files as arguments (`rm *`, for example) and may fail with an error like "Argument list too long". Look at this answer for what happens then.
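To illustrate (the directory name is made up): with tens of thousands of files, `rm "$dir"/*` can fail with "Argument list too long" because the expanded glob exceeds the kernel's limit (ARG_MAX) on the size of a single command line. Two common workarounds:

```shell
# Set up a directory with many files to delete.
dir=$(mktemp -d)
for i in $(seq 1 5000); do : > "$dir/file_$i"; done

# Option 1: let find delete the files itself -- no argument list is
# ever built, so the file count doesn't matter.
find "$dir" -type f -delete

# Option 2: batch the names through xargs, which splits them into
# several rm invocations that each fit under the limit:
#   find "$dir" -type f -print0 | xargs -0 rm --

rmdir "$dir"    # succeeds only because the directory is now empty
```

Option 1 is simplest for deletion; option 2 generalizes to any command that should be run on each file.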