randycat99
Veteran
I have encountered a folder on a Windows Server box where stuff has been accumulating for as far back as 5 yrs. It's about 200 GB comprised of 200,000+ items. The problem is that it has ceased to be manageable at this point. You can't really open the folder in a window to inspect the contents (it takes about 24 hrs to populate and never truly reaches the end of its cataloging process). We would like to start cleaning out old stuff, but even basic file operations are severely handicapped on a folder of this nature. For example, deleting 2 files can take as much as a few minutes before the UI refreshes to reflect that those 2 files are gone. That time investment doesn't bode well if the projection is that we need to dump 100 GB of the stuff. I even tried to write a VBScript to take a crack at it, but I think it went belly up just trying to catalog the folder into a FileSystemObject variable.
So my question is: what do you think is causing the unresponsiveness? Is it just the sheer number of items in the folder? Or should the number itself not matter, with the real culprit being a large amount of data in a badly fragmented state, where the cumulative cost of cataloging something that size is simply too much for the file system to handle?
Anybody have any experience with this kind of scenario? Is there a "power-user" way to approach it more effectively?
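One "power-user" angle on the problem described above is to avoid ever building a full listing of the folder (which is what Explorer, and a FileSystemObject walk, both try to do) and instead stream directory entries one at a time, acting on each as it arrives. Here's a minimal Python sketch of that idea; the function name and the age threshold are my own for illustration, not anything from the original post:

```python
import os
import time

def purge_old_files(root, max_age_days, dry_run=True):
    """Delete files under `root` older than `max_age_days`.

    os.scandir() yields entries lazily, so memory use stays flat
    even for 200,000+ items -- no giant catalog is built up front.
    """
    cutoff = time.time() - max_age_days * 86400
    deleted = 0
    with os.scandir(root) as entries:
        for entry in entries:
            # follow_symlinks=False avoids chasing links; entry.stat()
            # reuses data the OS already returned during enumeration.
            if entry.is_file(follow_symlinks=False) and \
               entry.stat(follow_symlinks=False).st_mtime < cutoff:
                if not dry_run:
                    os.remove(entry.path)
                deleted += 1
    return deleted
```

Run it with `dry_run=True` first to get a count before committing to anything. The same streaming principle applies at the command line: tools like `robocopy` enumerate and act per-entry rather than cataloging the whole folder the way Explorer does, which is why they cope far better with directories this size.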