_xxx_ said:Well you have to store the temporary data somewhere, I suppose?
"Optimally" as in final result, or defragging time? I would doubt it the first case, but a lot of seeks would be saved when moving big chunks of files around, although windows doesn't seem to do well in these cases (I can be wrong, it's not like we can see what's going on like when we had 20Mb HDD's ).Diplo said:All defraggers will need some free space to defrag optimally, though I imagine it varies from product to product. To work efficiently 15% free space is a good rule of thumb.
t0y said:"Optimally" as in final result, or defragging time?
Defragging time. If you want to do such things as consolidate the MFT then you really do need the space to move it in one contiguous chunk. Also, the less free space you have on a disk, the more likely that space is to be fragmented. So whilst it may be possible to defragment a disk with a small amount of free space, it's generally not very efficient to let it get into that state.
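FWIW, this is roughly what the NT defrag interface looks like under the hood, as far as I understand it - the defragger never touches the file data itself, it just asks the filesystem to relocate clusters. Rough sketch only (the path and target cluster number are made up, error handling is skipped, and real code would scan the volume bitmap for a free run instead of hard-coding one), but it shows why you need contiguous free space to land things in:

/* Rough sketch of the XP-era defrag interface, as far as I understand
   it. A real defragger would first use FSCTL_GET_VOLUME_BITMAP to find
   a genuinely free run rather than guessing at an LCN. */
#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

int main(void)
{
    /* The file whose clusters we want to relocate. */
    HANDLE file = CreateFileA("C:\\some\\fragmented.dat", GENERIC_READ,
                              FILE_SHARE_READ | FILE_SHARE_WRITE,
                              NULL, OPEN_EXISTING, 0, NULL);
    /* FSCTL_MOVE_FILE is issued against the volume handle. */
    HANDLE vol = CreateFileA("\\\\.\\C:", GENERIC_READ | GENERIC_WRITE,
                             FILE_SHARE_READ | FILE_SHARE_WRITE,
                             NULL, OPEN_EXISTING, 0, NULL);
    if (file == INVALID_HANDLE_VALUE || vol == INVALID_HANDLE_VALUE)
        return 1;

    MOVE_FILE_DATA move;
    move.FileHandle = file;
    move.StartingVcn.QuadPart = 0;       /* from the file's first cluster */
    move.StartingLcn.QuadPart = 123456;  /* hypothetical free run on disk */
    move.ClusterCount = 64;

    /* The filesystem performs the actual move, so the defragger never
       holds the data only in RAM. The catch: all 64 clusters starting
       at StartingLcn must be free, i.e. one contiguous run - which is
       why consolidating something big like the MFT needs one big free
       chunk, and why the ~15% free space rule of thumb exists. */
    DWORD bytes;
    if (!DeviceIoControl(vol, FSCTL_MOVE_FILE, &move, sizeof(move),
                         NULL, 0, &bytes, NULL))
        printf("move failed: %lu\n", GetLastError());

    CloseHandle(file);
    CloseHandle(vol);
    return 0;
}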
K.I.L.E.R said:Memory?
With 1GB of RAM I wouldn't mind if defragging can store data in memory.
Timeframe context problem; when XP came out, 1GB was not exactly considered a "normal" amount of RAM... they probably just never saw it coming down the road, or felt the need for it.
K.I.L.E.R said:Memory?
With 1GB of RAM I wouldn't mind if defragging can store data in memory.
The only problem is what happens if something goes wrong and the data in memory is lost.
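That's exactly why defraggers don't stage file data in RAM: the rule is that a valid copy has to exist on disk at every instant. Toy C sketch of the ordering (made-up file names, not a real defragger, and real code would call FlushFileBuffers rather than trusting the OS write cache):

#include <stdio.h>

/* Toy illustration of crash-safe move ordering: copy to the new spot
   and flush it out of RAM *before* releasing the old spot, so a crash
   at any point still leaves one good copy on disk. */
int main(void)
{
    FILE *src = fopen("old_location.bin", "rb");
    FILE *dst = fopen("new_location.bin", "wb");
    if (!src || !dst) return 1;

    char buf[64 * 1024];
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, src)) > 0)
        fwrite(buf, 1, n, dst);

    /* Push the new copy out of the C runtime's buffers; real code
       would also flush the OS cache to the platter. */
    fflush(dst);
    fclose(dst);
    fclose(src);

    /* Only now is the old copy dropped. Crash before this line: the
       original survives. Crash after: the new copy survives. */
    remove("old_location.bin");
    return 0;
}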
chavvdarrr said:from sysinternals a few fine tools can be downloaded - like pagedefrag and contig
Also get the nice GUI from excessive-software.
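If anyone hasn't used contig before: as I remember the switches (check contig /? to be sure), contig -a -s *.* just analyzes and reports fragmentation for everything under the current directory, and contig somefile.dat defragments that one file in place. pagedefrag is the one to use for the pagefile and registry hives, since those are locked while Windows is running - it does its work at boot time instead.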