Practical limit to how many items can be in a "folder"?

Discussion in 'PC Hardware, Software and Displays' started by randycat99, Mar 15, 2009.

  1. randycat99

    Veteran

    Joined:
    Jul 24, 2002
    Messages:
    1,772
    Likes Received:
    12
    Location:
    turn around...
    I have encountered a folder of stuff on a Windows Server box where stuff has been accumulating for as far back as 5 yrs. It's about 200 GB comprising 200,000+ items. The problem is that it has ceased to be manageable at this point. You can't really open the folder in a window to inspect the contents (it takes about 24 hrs to populate and never truly reaches the end of its search process). We would like to start cleaning out old stuff, but even basic file operations are severely handicapped on a folder of this nature. For example, deleting 2 files can take as much as a few minutes before the UI refreshes to reflect that you deleted those 2 files. It [time investment] doesn't bode well if the projection is we need to dump 100 GB of the stuff. I even tried to make a VB script to take a crack at it, but I think it went belly up just trying to catalog the folder into a FileSystemObject variable.

    So my question is: what do you think is causing the unresponsiveness? Is it just the sheer number of items in the folder? Or should the number not matter, and instead it's the sheer amount of data in a dubiously fragmented state, where the cumulative impact of cataloging something that size is just too much for the file system to handle?

    Anybody have any experience with this kind of scenario? Is there a "power-user" way to approach it more effectively?
     
  2. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,821
    Likes Received:
    3,000
    maybe connect the hdd to a linux box, it may be more responsive
     
  3. Humus

    Humus Crazy coder
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,217
    Likes Received:
    77
    Location:
    Stockholm, Sweden
    In this case I'd avoid using any form of GUI to manage it. Open a plain old command prompt and start moving things into various subfolders. A good start would be alphabetically, into subfolders A, B, C, etc.:

    mkdir A
    move A*.* A
    mkdir B
    move B*.* B
    ...
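    If scripting it turns out to be more practical than typing the moves by hand, the same idea can be sketched in Python (a hypothetical cross-platform stand-in for the cmd commands above; `bucket_by_first_char` is my own name, not anything standard):

    ```python
    import os
    import shutil

    def bucket_by_first_char(folder):
        """Move each file in `folder` into a subfolder named after the
        first character of its name (A, B, ... or a digit), creating
        each subfolder on demand."""
        for name in os.listdir(folder):  # snapshot of names taken once
            path = os.path.join(folder, name)
            if not os.path.isfile(path):
                continue  # skip the subfolders we create as we go
            bucket = os.path.join(folder, name[0].upper())
            os.makedirs(bucket, exist_ok=True)
            shutil.move(path, os.path.join(bucket, name))
    ```

    The point of doing it outside the GUI is the same either way: no thumbnailing, no column sorting, no window refresh after every operation.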
     
  4. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
    http://technet.microsoft.com/en-us/library/cc781134.aspx

     
  5. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    lol, that must be painful. And I was annoyed by the "size" and "duration" columns slowing down my folder of about 100 movies.

    you can do a listing of your directory:
    dir /o c:\path\directory > c:\stuff\foo.txt
    (the /s option does a recursive dir; try the other formatting options too, one gives file names only, and you could use that data in a .vbs or .bat script)
    also try an alternate file manager, there's a number of free Norton Commander clones and other software.
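    For what it's worth, the "dump the names to a text file, then script against that file" step can also be sketched in Python (a hypothetical helper, assuming you can run a script on the box; `os.scandir` streams the directory rather than loading all 200,000 names at once):

    ```python
    import os

    def write_listing(folder, out_path):
        """Write one filename per line to out_path, roughly like
        `dir /b folder > out.txt`, for later use by another script."""
        with os.scandir(folder) as entries, open(out_path, "w") as out:
            for entry in entries:
                out.write(entry.name + "\n")
    ```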
     
    #5 Blazkowicz, Mar 15, 2009
    Last edited by a moderator: Mar 15, 2009
  6. randycat99

    Veteran

    Joined:
    Jul 24, 2002
    Messages:
    1,772
    Likes Received:
    12
    Location:
    turn around...
    Interesting strategy about moving to alphabetical folders. I will pass on the suggestion. I don't know if it will be applicable, though, if these files begin with a number code (which is a distinct possibility). I guess we could try moving to 10 different number folders, but who knows if that will be enough to break down 200,000 files into manageable chunks... I suppose it will be easy enough to test with folder "1". ;)

    As for swapping the hdd to a linux machine, it is an interesting solution, but I don't know how practical for my particular context (my bad, for not being more detailed). It's a remote machine, so not exactly under our control. Additionally, it is in a live production environment, so it probably will need quite a bit more approval before we could even consider trying that. I guess if all else fails, it would not be a bad strategy, of course.

    If there is a command prompt syntax to move files according to the "last modified" value, that would definitely be the "super-combo" solution! Anybody know enough about DOS to say if that exists?
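    (For the record, on the cmd side something like forfiles with a /D cutoff, or robocopy with /MOV and /MINAGE, can move by age; check the built-in help before trusting my memory of the switches. If a script is acceptable instead, a hypothetical Python sketch of "move everything older than N days into an archive subfolder" might look like this; `move_older_than` is my own name, not a standard utility:)

    ```python
    import os
    import shutil
    import time

    def move_older_than(folder, dest, days):
        """Move files whose last-modified time is more than `days`
        days ago out of `folder` into `dest`."""
        cutoff = time.time() - days * 86400
        os.makedirs(dest, exist_ok=True)
        for name in os.listdir(folder):
            path = os.path.join(folder, name)
            # skip subfolders (including dest itself, if it's inside folder)
            if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
                shutil.move(path, os.path.join(dest, name))
    ```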
     
  7. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    DOS? I doubt you'll use actual DOS. But if you mean that little command prompt that looks and acts like DOS, then yeah, inquire into using that. Run DIR /? to see the sorting options.
     
  8. randycat99

    Veteran

    Joined:
    Jul 24, 2002
    Messages:
    1,772
    Likes Received:
    12
    Location:
    turn around...
    Yeah, that's what I meant.
     
  9. Bludd

    Bludd Experiencing A Significant Gravitas Shortfall
    Veteran

    Joined:
    Oct 26, 2003
    Messages:
    3,367
    Likes Received:
    941
    Location:
    Funny, It Worked Last Time...
    The /od (order by date) and /tw (use last-written time) switches may be useful.
     
  10. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,821
    Likes Received:
    3,000
    what you need is a large fish

    to slap the dude who let it get like that in the first place
     
  11. V3

    V3
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    3,304
    Likes Received:
    5
    200,000, wow, that's something. I have one folder with 20,000+ files and it's always annoying when Windows needs to reload that folder. It can take up to 20 seconds of disk thrashing.
     
  12. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
    +1 :)
     
  13. Sxotty

    Legend Veteran

    Joined:
    Dec 11, 2002
    Messages:
    5,082
    Likes Received:
    443
    Location:
    PA USA
    Did you read richard's suggestion?
     
  14. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,821
    Likes Received:
    3,000
    mines better ;)
     
  15. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,821
    Likes Received:
    3,000
    how about WinRAR?
    there is a command-line version and it accepts wildcards
    so you could split the folder into several WinRAR archives
    also on the WinRAR download page is FAR, a file manager; I think it supports rarring over ftp (read the docs, I could be wrong)
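    The same "pack a wildcard's worth of files into an archive, then clear them out of the folder" idea can be sketched with Python's standard zipfile module (a hypothetical stand-in for the rar command line; `archive_matching` is my own name):

    ```python
    import os
    import zipfile

    def archive_matching(folder, prefix, archive_path):
        """Pack files whose names start with `prefix` into a zip,
        then delete the originals so the folder shrinks as you go."""
        with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
            for name in os.listdir(folder):
                path = os.path.join(folder, name)
                if os.path.isfile(path) and name.startswith(prefix):
                    zf.write(path, arcname=name)
                    os.remove(path)
    ```

    Deleting only after the file is safely in the archive keeps it reversible, which matters on a live production box.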
     
  16. Humus

    Humus Crazy coder
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,217
    Likes Received:
    77
    Location:
    Stockholm, Sweden
    Another option would be to create subfolders based on the file type. Depending on the content, it could make more sense. Although I suppose there's a good chance that plenty of the files are of the same type.

    mkdir JPG
    move *.jpg JPG
    mkdir ZIP
    move *.zip ZIP
    ...
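    Scripted, that by-extension split might look like this (again a hypothetical Python sketch of the cmd commands above; `bucket_by_extension` is my own name):

    ```python
    import os
    import shutil

    def bucket_by_extension(folder):
        """Move each file into a subfolder named after its extension
        (JPG, ZIP, ...); files without an extension go to MISC."""
        for name in os.listdir(folder):
            path = os.path.join(folder, name)
            if not os.path.isfile(path):
                continue  # skip the subfolders we create as we go
            ext = os.path.splitext(name)[1].lstrip(".").upper() or "MISC"
            bucket = os.path.join(folder, ext)
            os.makedirs(bucket, exist_ok=True)
            shutil.move(path, os.path.join(bucket, name))
    ```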
     
  17. randycat99

    Veteran

    Joined:
    Jul 24, 2002
    Messages:
    1,772
    Likes Received:
    12
    Location:
    turn around...
    As to the fish slapping, the state of stuff I have seen left in place before I was tasked to evaluate the situation would shock you! :eek: Never in my wildest dreams did I imagine a Windows machine could still be functional in such a state, but it actually can still work. It's really quite amazing. You would never want to let it get that far gone, but it is amazing nonetheless that it can retain viability...
     
  18. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    15,423
    Likes Received:
    13,891
    Location:
    Cleveland
    Wow, that's some pr0n stash you stumbled upon.
     
  19. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    it's on par with the IE5 cache I've seen on an old win98 PC with 64MB ram and dial-up access. IE had a bug where the cache would grow indefinitely; there were like 80,000 files and extreme fragmentation. Under pure DOS it would perform like a 4.77MHz XT! The cache took forever to delete.
    I banged my head on that beast with useless firmware upgrades. My bro solved the performance problem by letting it defrag for two days.
     
  20. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    3,022
    Likes Received:
    547
    I can't find the reference right now, but despite NTFS itself having no practical limit on the number of files in a directory, there is a performance drop-off (possibly a limitation in how Windows Explorer examines the directory?) once you get into the low tens of thousands of files in one folder (somewhere between 10,000 and 20,000).
     