Windows 7

Could that be a RAID driver problem? My P4 machine gets about 5 MB/sec without the VIA RAID driver installed; with it installed I get about 60 MB/sec, even though I don't use RAID (single SATA disk).

Nope, it's just a single IDE disk, connected to the onboard Intel controller. It's a Pentium II with a 440BX chipset. As I say, it performs great in Windows... FreeBSD has support for the chipset and everything seems to be detected properly... it just doesn't perform.
I've always had poor disk performance in FreeBSD and Linux compared to Windows anyway, regardless of which PC I used.
 
It's not that I have problems compiling... it's just that builds take longer and longer over time as the drive gets more fragmented. This is on XP 32-bit, by the way... it could well be that MS has updated the filesystem between XP 32-bit and Windows 7. I haven't yet done any work on my W7 machine... so maybe that problem won't be around anymore... which would totally rock.

Well, I've developed on XP32 for over 6 years, at home, on a single installation. Didn't notice degraded performance.
Still using XP32 at work, and I've never noticed anything there either. I defragged occasionally at home (once every 3-4 months at most), and haven't ever defragged the disk at work.

Perhaps the problem is that you have everything on the same partition?
I keep my source code on a separate partition, so it isn't affected by things like file downloads, temp folders, installing/uninstalling applications and whatnot.
 
One of the quirks of NTFS is that you don't want to fill a volume too much. If a volume gets too full, say around 95% of its space used, important system areas can become severely fragmented, and then you're stuck: most older defragment tools can't even defrag areas like the MFT (Master File Table).
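As a toy illustration (this is not NTFS's actual allocator, just a sketch of the general effect): on a nearly-full volume the only free space left is scattered across the disk, so any allocator has to split a new file across it.

```python
# Toy model: allocate a 10-block file on an empty volume vs. a ~95%-full
# one where only scattered blocks remain free, and count the fragments.

def allocate(free_blocks, size):
    """First-fit allocator: hand out the lowest-numbered free blocks."""
    taken = sorted(free_blocks)[:size]
    for b in taken:
        free_blocks.discard(b)
    return taken

def fragments(blocks):
    """Count contiguous runs in an allocated block list."""
    runs = 1
    for prev, cur in zip(blocks, blocks[1:]):
        if cur != prev + 1:
            runs += 1
    return runs

# Mostly-empty volume: blocks 0..999 all free -> one contiguous run.
empty = set(range(1000))
print(fragments(allocate(empty, 10)))        # 1 fragment

# Nearly-full volume: only every 20th block is still free.
nearly_full = set(range(0, 1000, 20))
print(fragments(allocate(nearly_full, 10)))  # 10 fragments
```

The same squeeze hits the MFT: with no contiguous free space left to extend into, it ends up scattered just like everything else.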
 
Hey Scali,

Well, when I say CS grad I meant CS undergrad, hehe, minor correction there. Kinda weird, but I never really enjoyed the subject, yet here I am in the software development field. I hate theory; I'd rather learn by doing, but ah well.

Anyway, I have done the test before, and fragmentation was the problem. For example, my build was taking close to 3 minutes for a project. Do the defrag (offline defrag first, then online) and it drops to as quick as 1 min 8 sec, which is the average Linux build time.

I see what you are saying but it is hard to believe that Linux (Ubuntu's latest distro) is masking the problem of fragmentation through some smart algorithm or something like that.

Did some reading and I found this very nice example. http://geekblog.oneandoneis2.org/index.php/2006/08/17/why_doesn_t_linux_need_defragmenting

Seems to me that if you are a bit smart, you can go ahead and prevent fragmentation in a lot of scenarios! Wonder why MS hasn't done that yet.
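The linked article's core idea can be sketched in a few lines: pack files back-to-back and a growing file has nowhere to extend, so it must jump elsewhere; scatter files with gaps between them and most files can grow in place. This is a toy model of that idea, not either filesystem's real allocator:

```python
# Compare two layouts: file A packed right against file B, vs. file A
# with free space after it. Then grow A and count its fragments.

def count_fragments(blocks):
    blocks = sorted(blocks)
    return 1 + sum(1 for a, b in zip(blocks, blocks[1:]) if b != a + 1)

def grow(file_blocks, used, amount, disk_size):
    """Append `amount` blocks, preferring blocks right after the file's end."""
    for _ in range(amount):
        want = max(file_blocks) + 1
        if want in used or want >= disk_size:
            # No room next door: take the first free block anywhere.
            want = next(b for b in range(disk_size) if b not in used)
        file_blocks.append(want)
        used.add(want)

DISK = 100

# Packed layout: file A at blocks 0-9, file B immediately after at 10-19.
a_packed = list(range(10))
used = set(range(20))
grow(a_packed, used, 5, DISK)
print(count_fragments(a_packed))  # 2 -- A had to jump past B

# Scattered layout: file A at 0-9, file B far away at 50-59.
a_scat = list(range(10))
used = set(range(10)) | set(range(50, 60))
grow(a_scat, used, 5, DISK)
print(count_fragments(a_scat))    # 1 -- A grew in place
```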

You can certainly get fragmentation if you are frequently pulling down many changes, force syncing, merging (ugh), or generating and nuking lots of sandboxes. Not to mention all the intermediate files generated by the compiler. As Scali said, this is just a fact of life and a slightly different allocation strategy isn't going to do anything to help it.

For me, builds were so bad (45 min+) that we went out and bought SSDs. Build times now? 9 minutes! And solutions open in seconds instead of almost a minute.

The lesson? Forget what filesystem you are using and buy a good SSD.
 
Yeah, I do have just one partition on my work machine. That would explain it. I'll probably create some partitions when I next reinstall the OS on my machine, which will probably be when the RC runs out of time.

I am probably going to get some SSDs as well :) Seems like the biggest thing holding my computer back now is just the hard drives, hehe.
 
6.1 != 7

Has anyone noticed that "Windows 7" is actually Windows 6.1?

Code:
C:\>ver

Microsoft Windows [Version 6.1.7600]

I'm kinda disappointed; I was hoping for something more than a refinement of Vista (6.0). Damn you M$ marketing droids! <shakes fist in air>
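For what it's worth, the kernel version is easy to pull apart programmatically. A small sketch that parses the string `ver` prints (using the sample output above):

```python
# Parse the Windows version string into (major, minor, build).
import re

ver_output = "Microsoft Windows [Version 6.1.7600]"  # sample from `ver` above

major, minor, build = map(
    int, re.search(r"(\d+)\.(\d+)\.(\d+)", ver_output).groups()
)
print(major, minor, build)  # 6 1 7600 -- "Windows 7" reports kernel 6.1
```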
 
Has anyone noticed that "Windows 7" is actually Windows 6.1?

Yep, it's 6.1, and I'm surprised anyone expected anything else, TBH. It has been clear since the beginning, at least to me, especially since this "MinWin" kernel is in fact another name for the NT 6.x kernel. People thought it was something completely new when it was demoed as a webserver with a 40 MB total OS memory footprint, but that build was just stripped to the absolute minimum, unlike the Vista/Server/7 builds.
 
It's not entirely BS, though. The tweaks performed under the hood are significant enough to warrant the version "7" marketing label. The performance of Win 7 is beyond what a mere "6.1" over "6.0" indicates.
 
Uhm, so I was looking at the differences between W7 versions, and Wikipedia and various posts on techy forums (I can't find any official quotes) say Home Premium 64-bit has a max of 16GB RAM, while the more expensive versions support 192GB. Are they right? :???:

If so, that's absolutely barmy! :oops:

Looking at current pricing, if I were going to build a Core i7 system now I'd quite possibly go for 12GB, with either 6*2GB sticks or 3*4GB sticks.

I expect a fair chunk of people to be over 16GB before the next Windows version.
What if it's as long a wait as XP -> Vista?! Lots of people had only 128MB (or even less) when XP came out.

Sure, I can buy the more expensive version, but aside from this artificial RAM cap, I can't really see anything in the other versions that I'd actually want/need.
I'd probably be OK with 64GB or even 32GB, but a 16GB limit is really lowballing it to force the upgrade.
 
The performance of Win 7 is beyond what a mere "6.1" over "6.0" indicates.
Proof, please.
So far I've seen zero comparisons where objective measurements show a significant performance difference.
 
Proof, please.
So far I've seen zero comparisons where objective measurements show a significant performance difference.

I definitely felt it a bit peppier on the same hardware, and apps seem to start up a bit quicker. It is definitely a lot better than my XP x64 setup was, and I had that super-tweaked too, with nLite and all sorts of useful registry tweaks. From a usage point of view, I think the performance gap between the Windows 7 RC and Vista SP2 is probably non-existent... here's a decent read, though: http://www.techradar.com/news/software/operating-systems/windows-deathmatch-xp-vs-vista-vs-7-615167
 
So far I've seen zero comparisons where objective measurements show a significant performance difference.

Would be nice if there were objective test suites in the public domain to benchmark general responsiveness of a GUI-based OS!

Time-to-boot, time-to-load-blah-app, time-to-copy-a-file, that sort of jazz is nice, but it's not the whole story. What I want to know is: how badly does my desktop response degrade if one of my apps decides to freeze up? (Acrobat plug-in for Firefox, I'm looking directly at you, Tower-o-Sauron style!) When I open a folder on a hard drive that's spun down, does the entire desktop freeze while the drive spins up? When I open the Add/Remove Programs panel, do I get the Spinning Torch of Seeking for 15 inexplicable seconds? That sort of thing.
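There's no standard public suite for that kind of responsiveness testing as far as I know, but the simpler "time-to-load-blah-app" half is easy to script. A rough sketch (the command and repeat count here are arbitrary placeholders, not part of any real benchmark):

```python
# Crude launch-latency probe: cold-start a process several times and
# report the median wall-clock time from launch to exit.
import statistics
import subprocess
import sys
import time

def launch_time(cmd, runs=5):
    """Median seconds to start `cmd` and have it run to completion."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True, capture_output=True)
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

# Example: time a trivial Python process start-up.
print(f"{launch_time([sys.executable, '-c', 'pass']):.3f}s")
```

Measuring the freeze-while-a-drive-spins-up cases is much harder, since those stalls happen inside the shell rather than in a process you control.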
 