XP defragger

All defraggers will need some free space to defrag optimally, though I imagine it varies from product to product. To work efficiently 15% free space is a good rule of thumb.
 
_xxx_ said:
Well you have to store the temporary data somewhere, I suppose? ;)

Memory?
With 1GB of RAM I wouldn't mind if defragging can store data in memory.

The only problem is what happens if something occurs and the data in memory is lost.
 
Diplo said:
All defraggers will need some free space to defrag optimally, though I imagine it varies from product to product. To work efficiently 15% free space is a good rule of thumb.
"Optimally" as in final result, or defragging time? I would doubt it in the first case, but a lot of seeks would be saved when moving big chunks of files around, although Windows doesn't seem to do well in these cases (I could be wrong, it's not like we can see what's going on like when we had 20MB HDDs :)).

As a sidenote: do any of you guys know of a defragger that can put the swap file at the beginning of the disk, as Norton's Speedisk used to do? I lost my faith in Norton Utilities when it stopped doing this, and Sysinternals' defragger just, well, defrags it... :D
 
t0y said:
"Optimally" as in final result, or defragging time?
Defragging time. If you want to do such things as consolidate the MFT then you really do need the space to move it in one contiguous chunk. Also, the less free space you have on a disk the more likely that space is to be fragmented. So whilst it may be possible to defragment a disk with a small amount of free space it's generally not very efficient to let it get into that state.
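The "contiguous chunk" point is easy to picture with a toy sketch. NTFS tracks free clusters in a volume bitmap; the snippet below (a made-up illustration, not the actual defrag API) scans such a bitmap for the first run of free clusters long enough to hold something like a consolidated MFT. On a fragmented, nearly-full disk no run may exist even when the total free space would suffice:

```python
def find_contiguous_run(bitmap, length):
    """Return the index of the first run of `length` free clusters
    (True = free) in a volume bitmap, or None if no run is long enough."""
    run_start = None
    run_len = 0
    for i, free in enumerate(bitmap):
        if free:
            if run_start is None:
                run_start = i
            run_len += 1
            if run_len == length:
                return run_start
        else:
            run_start = None
            run_len = 0
    return None

# Same amount of free space in both, but only one has a usable run.
fragmented = [True, False, True, True, False, True, False, True]
roomy      = [False, True, True, True, True, True, False, False]

print(find_contiguous_run(fragmented, 4))  # None: free space is scattered
print(find_contiguous_run(roomy, 4))       # 1: run starts at cluster 1
```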
 
K.I.L.E.R said:
Memory?
With 1GB of RAM I wouldn't mind if defragging can store data in memory.
Timeframe context problem: when XP came out, 1GB was not exactly considered a "normal" amount of RAM... they probably just never saw it coming down the road or felt the need for it.
 
K.I.L.E.R said:
Memory?
With 1GB of RAM I wouldn't mind if defragging can store data in memory.
The only problem is what happens if something occurs and the data in memory is lost.


Right, so chunks of data have to be duplicated on the drive before they can be moved to their new location. This way power loss can happen at any time and no data should be lost (you obviously have to write tracking data as well for recovery).
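The copy-then-commit idea can be sketched in miniature. This is a deliberately simplified illustration (the dict stands in for the disk, and all names are made up); on Windows the real mechanism is the defragmentation API's FSCTL_MOVE_FILE, which the filesystem performs safely on the caller's behalf. The point is the ordering: journal first, duplicate second, release the old copy last, so a crash at any step leaves either the old data or the new data plus enough tracking info to recover:

```python
import json
import os

def safe_move_chunk(volume, src, dst, journal_path):
    """Toy copy-then-commit move. `volume` is a dict standing in for
    the disk (cluster -> bytes). A journal record is flushed before
    any data moves, so a crash at any point leaves at least one
    valid copy of the chunk on 'disk'."""
    # 1. Journal the pending move before touching any data.
    with open(journal_path, "w") as j:
        json.dump({"src": src, "dst": dst}, j)
        j.flush()
        os.fsync(j.fileno())
    # 2. Duplicate the chunk at the new location.
    volume[dst] = volume[src]
    # 3. Commit: release the old copy, then clear the journal.
    del volume[src]
    os.remove(journal_path)

disk = {"cluster-10": b"file data"}
safe_move_chunk(disk, "cluster-10", "cluster-90", "move.journal")
print(disk)  # {'cluster-90': b'file data'}
```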

But you do have a point about the 15% limit. I think this is just something Microsoft put in based on a single configuration, perhaps a 40GB drive or so. The silly thing with 15% is that as hard drives grow, data sizes do not. So, for example, on a 1TB HDD 15% represents 150GB. That's quite a bit of space to work with, damn the percentages. However, this is something Microsoft includes as added value in their product and is not meant to be the perfect solution (although one would wonder why not). One can only imagine the lawsuits if the built-in Windows defragmenter were 'complete' and therefore made 3rd party products obsolete.
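The scaling complaint is just percentage arithmetic; a quick sketch (drive sizes are illustrative) shows how the fixed reserve balloons with capacity:

```python
# A fixed 15% reserve grows with the drive, even though the files
# being shuffled around do not.
for size_gb in (40, 250, 1000):
    reserve_gb = size_gb * 0.15
    print(f"{size_gb} GB drive -> 15% reserve = {reserve_gb:g} GB")
```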

It is a very 'stupid' limitation, but you can run the defragmenter with less than 15% free space; it's just not recommended. And those who really care would do well to buy a real 'optimizer' that not only joins fragmented files, but orders them on the physical disk for the fastest possible access. On a sad note, it seems XP is rather fond of fragmenting itself, so I am not sure how valuable such a tool would be.
 
The limit comes from NTFS structures (like the MFT).
Rule of thumb: you need 15-20% free space in order to defrag an NTFS file system.
If you were making the OS defragmentation tool, would you target the documentation at the 0.1% of people who have 1TB HDDs and should know what they have to do, or at the other 99.9%?

A few fine tools can be downloaded from Sysinternals, like PageDefrag and Contig.
 
K.I.L.E.R said:
Memory?
With 1GB of RAM I wouldn't mind if defragging can store data in memory.

The only problem is what happens if something occurs and the data in memory is lost.

Yes, and they also have to make sure that every machine will be able to run it, even with 128MB of RAM. It could be made dynamically manageable as well, but programmers are a lazy bunch, as you surely know :)
 