Practical limit to how many items can be in a "folder"?

randycat99

I have encountered a folder on a Windows Server box where stuff has been accumulating for as far back as 5 yrs. It's about 200 GB comprising 200,000+ items. The problem is that at this point it has ceased to be manageable. You can't really open the folder in a window to inspect the contents (it takes about 24 hrs to populate and never truly finishes). We would like to start cleaning out old stuff, but even basic file operations are severely handicapped in a folder of this nature. For example, deleting 2 files can take as much as a few minutes before the UI refreshes to reflect that those 2 files are gone. That time investment doesn't bode well if the projection is that we need to dump 100 GB of the stuff. I even tried to write a VBScript to take a crack at it, but I think it went belly up just trying to catalog the folder into a FileSystemObject variable.

So my question is: what do you think is causing the unresponsiveness? Is it just the sheer number of items in a folder? Or should the number not matter, with the real problem being that the cumulative impact of cataloging such a large amount of data in a dubiously fragmented state is simply too much for the file system to handle?

Anybody have any experience with this kind of scenario? Is there a "power-user" way to approach it more effectively?
 
In this case I'd avoid using any form of GUI to manage it. Open a plain old command prompt and start moving things into various subfolders. A good start would be alphabetically into subfolders A, B, C ... etc.

mkdir A
move A*.* A
mkdir B
move B*.* B
...
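If typing 26 pairs of commands gets old, a small batch file can do the same thing in a loop. Just a sketch, untested, and I've used sorted_A, sorted_B, ... as the folder names so the new folders can't themselves match the A*.* wildcard (the doubled %% is for a .bat file; use a single % at the prompt):

@echo off
rem Sketch: make one bucket per leading letter and sweep matching files into it
for %%L in (A B C D E F G H I J K L M N O P Q R S T U V W X Y Z) do (
    if not exist sorted_%%L mkdir sorted_%%L
    move %%L*.* sorted_%%L >nul
)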
 
Lol, that must be painful. And here I was annoyed by the "size" and "duration" columns slowing down my folder of about 100 movies.

You can dump a listing of your directory:
dir /o c:\path\directory > c:\stuff\foo.txt
(the /s option does a recursive dir; try the other formatting options too, /b gives bare file names only, and you could feed that data into a .vbs or .bat script)
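For example, a bare listing can be fed straight back into a loop. Rough sketch only; foo.txt and the echo line are placeholders for whatever you actually want to do per file:

rem Sketch: capture bare file names, then act on each line of the listing
dir /b c:\path\directory > c:\stuff\foo.txt
for /f "usebackq delims=" %%F in ("c:\stuff\foo.txt") do (
    echo would process: %%F
)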
You could also try an alternate file manager; there are a number of free Norton Commander clones and other programs out there.
 
Interesting strategy, moving things into alphabetical folders. I will pass the suggestion along. I don't know if it will be applicable, though, if these files begin with a number code (which is a distinct possibility). I guess we could try moving them into 10 different number folders, but who knows if that will be enough to break 200,000 files down into manageable chunks... I suppose it will be easy enough to test with folder "1". ;)
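I suppose the same kind of loop suggested above would work for digits, if the codes really are at the front of the names; something like this (totally untested on my end):

rem Sketch: one bucket per leading digit
for %%D in (0 1 2 3 4 5 6 7 8 9) do (
    if not exist sorted_%%D mkdir sorted_%%D
    move %%D*.* sorted_%%D >nul
)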

As for swapping the HDD into a Linux machine, it is an interesting solution, but I don't know how practical it is in my particular context (my bad for not being more detailed). It's a remote machine, so not exactly under our control. Additionally, it is in a live production environment, so it would probably need quite a bit more approval before we could even consider trying that. I guess if all else fails it would not be a bad strategy, of course.

If there is a command-prompt syntax to move files according to their "last modified" value, that would definitely be the "super-combo" solution! Does anybody know enough about DOS to say whether that exists?
 
DOS? I doubt you'll be using actual DOS, but if you mean that little command prompt that looks and acts like DOS, then yeah, look into using that. DIR /? will show you the sorting options.
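DIR /O:D will at least list things oldest-first so you can see what you're dealing with. For actually moving by date there's forfiles, which is built into Server 2003 and later (older boxes got it from the Resource Kit, with slightly different switches). A rough sketch with placeholder paths, moving everything last modified more than a year ago:

rem Sketch: shift files untouched for 365+ days into a holding folder
rem (paths are placeholders; try it on a test copy first)
mkdir c:\old_stuff
forfiles /P c:\bigfolder /D -365 /C "cmd /c if @isdir==FALSE move @path c:\old_stuff"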
 
200,000, wow, that's something. I have one folder with 20,000+ and it's always annoying when Windows needs to reload that folder. It can take up to 20 seconds of disk thrashing.
 
How about WinRAR? There is a command-line version and it accepts wildcards, so you could split the folder up into several RAR archives. Also on the WinRAR download page is FAR, a file manager; I think it supports rarring over FTP (read the docs, I could be wrong).
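Something along these lines with the command-line rar.exe (just a sketch with made-up archive names and masks; note that the "m" command deletes the originals after adding them, so try it on copies first):

rem "a" adds files and keeps the originals, "m" adds and then deletes them
rar a old_docs.rar *.doc
rar m old_logs.rar *.log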
 
Another option would be to create subfolders based on the file type. Depending on the content, that could make more sense, although I suppose there's a good chance that plenty of the files are of the same type.

mkdir JPG
move *.jpg JPG
mkdir ZIP
move *.zip ZIP
...
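A loop can handle that per extension too, so you don't have to know every type in advance. Rough batch sketch (run from inside the folder; assumes no '!' characters in the file names, since delayed expansion chokes on those):

@echo off
setlocal enabledelayedexpansion
rem Sketch: one subfolder per extension, named without the leading dot
for %%F in (*.*) do (
    set ext=%%~xF
    if not "!ext!"=="" (
        set ext=!ext:~1!
        if not exist "!ext!" mkdir "!ext!"
        move "%%F" "!ext!" >nul
    )
)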
 
As to the fish slapping, the state of the stuff I have seen left in place before I was tasked with evaluating the situation would shock you! :eek: Never in my wildest dreams did I imagine that a Windows machine could still be functional in such a state, yet it actually keeps working. It's really quite amazing. You would never want to let it get that far gone, but it is amazing nonetheless that it retains viability...
 
Wow, that's some pr0n stash you stumbled upon.
 
It's on par with the IE5 cache I've seen on an old Win98 PC with 64 MB of RAM and dial-up access. IE had a bug where the cache would grow indefinitely; there were something like 80,000 files and extreme fragmentation. Under pure DOS it would perform like a 4.77 MHz XT! The cache took forever to delete.
I banged my head on that beast with useless firmware upgrades. My bro finally fixed the performance by letting it defrag for two days.
 
I can't find the reference right now, but although NTFS itself has no practical limit on the number of files in a directory, there is a performance drop-off (possibly a limitation in how Windows Explorer examines the directory?) once you get into the low tens of thousands of files in one folder (somewhere between 10,000 and 20,000).
 