Windows 7

TBH if the OS isn't using all of your RAM then it doesn't have a good enough pre-fetcher :) Honestly I question the utility of any "how much RAM stuff uses" measurements, as a decent virtual memory and paging implementation makes any single measurement pretty useless in isolation. Working set is what matters, and even then it's more about the ability to efficiently page that in and out for various applications as necessary.

So what kind of hardware are you seeing a big difference in speed on? Is it mainly integrated graphics or similar? With >=2GB of RAM and a decent video card I rarely see any slowdown on Vista, so is it just that lower end hardware runs 7 better now?

Are there any other changes that make it much better? Still no WinFS or versioned file system yet, right? :( Anything above and beyond Windows Defender, UAC, etc. etc. which have absolutely no use to me? (I realize they're very important in general, but I'm selfish :)).

And boo at there being a 32-bit version still... even Vista should have been 64-bit only IMHO :)
 
If only Microsoft could make an OS that didn't have to meet the needs of a stupid market. Sigh...
 
Can you compare the memory usage of both Vista & 7 with a vanilla install?
If 7 is like Vista but lighter on CPU & RAM, it might very well be the best OS from MS.
(Doesn't necessarily mean it's any better than good though ;) )


From my current comparison of 6801 and Vista SP1, ~200MB (conservative) shaved off during idle with a 2GB setup.

Might be x86 vs x64, but that's the closest I can get.
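If anyone else wants to grab comparable idle numbers on both installs, a rough sketch like this should do (untested here, and GlobalMemoryStatusEx only gives the coarse system-wide counters, not per-process working sets):

Code:
#include <windows.h>
#include <cstdio>

int main() {
    MEMORYSTATUSEX ms = {};          // zero everything, then set the size field
    ms.dwLength = sizeof(ms);
    if (!GlobalMemoryStatusEx(&ms)) {
        std::printf("GlobalMemoryStatusEx failed: %lu\n", GetLastError());
        return 1;
    }
    // System-wide counters only; run it right after an idle boot on each OS.
    std::printf("Memory load:     %lu%%\n", ms.dwMemoryLoad);
    std::printf("Total physical:  %llu MB\n", ms.ullTotalPhys / (1024 * 1024));
    std::printf("Avail physical:  %llu MB\n", ms.ullAvailPhys / (1024 * 1024));
    std::printf("Total page file: %llu MB\n", ms.ullTotalPageFile / (1024 * 1024));
    std::printf("Avail page file: %llu MB\n", ms.ullAvailPageFile / (1024 * 1024));
    return 0;
}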


As for the UI, it's ace. Aside from some bugs with exe detection I'd say it's rather solid.
 
More options is better as long as basic users aren't required to fiddle with them. I complain about firewalls having too few options while other people say they are too complex (lol)!
 
TBH if the OS isn't using all of your RAM then it doesn't have a good enough pre-fetcher :) Honestly I question the utility of any "how much RAM stuff uses" measurements, as a decent virtual memory and paging implementation makes any single measurement pretty useless in isolation. Working set is what matters, and even then it's more about the ability to efficiently page that in and out for various applications as necessary.
QFT! I never understood why people got so hung up on RAM usage.

I bet Windows 7 is just counting RAM usage differently from Vista.
 
QFT! I never understood why people got so hung up on RAM usage.

I bet Windows 7 is just counting RAM usage differently from Vista.

Not when it's the Windows developers themselves saying they've made a conscious effort to drop down the memory footprint in order to improve performance. Unless they're lying of course.
 
Not when it's the Windows developers themselves saying they've made a conscious effort to drop down the memory footprint in order to improve performance. Unless they're lying of course.
Well that's good and such, but "footprint" here is really nebulous. I would hope that they are talking about working set as well, which is hard to measure and can be non-deterministic overall (although it will tend to converge to some distribution).

But again, memory with nothing in it is wasted memory. Might as well load up frequently used stuff in advance. That was actually one of the coolest innovations in Vista which no other OS to my knowledge has yet matched. The pre-fetching/usage tracking/pattern matching stuff really does make a difference in my experience, and that's very neat.

Cool if they've reduced working sets though as that helps cache performance. As far as general RAM usage goes though, I still seriously question the utility of any of these naive measurements (especially ones from Task Manager), and besides, RAM is super-super-cheap nowadays :)
 
I hope they're adding stuff for SSDs. I know Vista had some improvements over XP for SSD devices. With prices continuing to drop, I wouldn't be surprised if many enthusiasts jump to SSDs for their Windows install once Windows 7 ships.
 
As far as general RAM usage goes though, I still seriously question the utility of any of these naive measurements (especially ones from Task Manager), and besides, RAM is super-super-cheap nowadays :)
And why would NtQuerySystemInformation(SystemProcessInformation, ...) used by taskmgr provide inaccurate information about working sets?
 
And why would NtQuerySystemInformation(SystemProcessInformation, ...) used by taskmgr provide inaccurate information about working sets?
I'm not saying it is! Rather that the information that you get back isn't particularly meaningful in the way that people think that it is. Like I said, the OS should always be trying to fill all available memory rather than wasting parts of it. And as far as working sets go, the only thing that matters is that the OS working set plays nicely with applications. So, for instance, a full screen exclusive game should have the OS put most of the game's virtual memory resources into physical memory (and reduce its own working set as much as possible) while the game is in the foreground. What the working set readings are when nothing is running is pretty meaningless though and DOESN'T predict any of the following:

1) How well an application will run that requires all available memory.
2) How well the operating system or application will run on a system with less or more physical memory.

So honestly, what are we trying to learn here? If you want to know how well the OS deals with a game that has some huge working set, run the game on different OSes and different physical memory configurations. I don't think looking at the performance counters - no matter how accurate - is telling people what they think it is.

To put it another way, I'm not questioning the measurements, but rather the implicit conclusions that people may be drawing from them. :)
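For what it's worth, if you do want to poke at the per-process counters yourself without going through NtQuerySystemInformation, the documented PSAPI route gives you the same working set and page fault numbers Task Manager shows. A rough, untested sketch (pass a PID, link with psapi.lib):

Code:
#include <windows.h>
#include <psapi.h>      // GetProcessMemoryInfo; link with psapi.lib
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    // Pass a PID on the command line, or default to this process.
    DWORD pid = (argc > 1) ? (DWORD)std::atoi(argv[1]) : GetCurrentProcessId();
    HANDLE h = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ, FALSE, pid);
    if (!h) {
        std::printf("OpenProcess(%lu) failed: %lu\n", pid, GetLastError());
        return 1;
    }

    PROCESS_MEMORY_COUNTERS pmc = {};
    pmc.cb = sizeof(pmc);
    if (GetProcessMemoryInfo(h, &pmc, sizeof(pmc))) {
        std::printf("Working set:      %lu KB\n", (unsigned long)(pmc.WorkingSetSize / 1024));
        std::printf("Peak working set: %lu KB\n", (unsigned long)(pmc.PeakWorkingSetSize / 1024));
        std::printf("Pagefile usage:   %lu KB\n", (unsigned long)(pmc.PagefileUsage / 1024));
        std::printf("Page faults:      %lu (soft + hard)\n", pmc.PageFaultCount);
    }
    CloseHandle(h);
    return 0;
}

The numbers themselves are accurate; it's still a snapshot, which is the whole point above.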
 
But again, memory with nothing in it is wasted memory. Might as well load up frequently used stuff in advance. That was actually one of the coolest innovations in Vista which no other OS to my knowledge has yet matched. The pre-fetching/usage tracking/pattern matching stuff really does make a difference in my experience, and that's very neat.

Since it feels like I've posted this countless times I'm going to keep it short. The whole "empty memory is wasted memory" argument is as flawed as the "lowest memory usage EVA" one. There isn't a method that's better in every situation. Memory filled with redundant info is wasted memory too. If you only need 1gb of RAM to do a job and you end up using 2, that's not "maximising your cache", that's being lazy. To cache stuff into memory you need a LOT of IO (today's 4gb, tomorrow's 16gb), which increases boot times, doesn't allow the drive to spin down, etc. My usage models change radically: this week I might use nothing but Word, next week it's all Visual Studio and the week after it's all games which I'll then uninstall and never use again. Prefetching and caching applications is only good if your usage model is pretty constant. Blah, blah, blah.

But all the argumentation in the world doesn't negate the fact that my nLite-ed "150Mb CD" version of XP32 with all sorts of crap turned off - 11 processes after everything including VS2005, Office 12, etc. are installed - prefetching turned off, etc. etc. runs faster than my Vista 64 with all the sophisticated precaching going on AND an extra gb of ram for good measure. Also liking the anecdotal Win7 evidence.
 
"OS should always be trying to fill all available memory rather than wasting parts of it"
Not true. OS should keep as much data in memory as necessary to minimize paging. Not more, not less. If paged pool is larger than available physical memory, the least frequently used data should be paged out. If possible, OS should also keep some memory unused in the anticipation of a new process being launched or the existing, active process acquiring more resources (not being forced to page out to make room for new data improves UX). If you were able to anticipate what the user is going to do next with high (near 100%) accuracy, filling the entire memory would make perfect sense. But that's not the case and I believe that filling the entire memory would be a mistake.

"And as far as working sets go, the only thing that matters is that the OS working set plays nicely with applications."
I'm not sure I understand what you're trying to say. If you're trying to say that the important thing is to properly prioritize high-load applications, then I obviously agree. But I disagree that working sets are no indication of how the OS will perform when you're running a memory-hog. First: knowing how much memory background services are consuming is an indication (not 100% reliable but a reasonable one) of how much physical memory will be available to your foreground app. Second: knowing how working sets in Win7 compare to working sets in Vista can give you some idea of how the load of background tasks changed.

I'm not saying this is the only or even the best benchmark but it's something worth looking at. There are at least two more important things to consider: the amount of IO generated by background threads and whether or not there are some improvements in "idle state" detection. Some background tasks (e.g. search indexer) may be easily paged out completely while you use a memory and/or CPU intensive application. But to ensure that, you need some reliable way to determine when it is ok to resurrect those services. Assuming that memory consumption and responsiveness were the top priorities in Win7, I'd be surprised to see no improvement in those areas.
 
If you only need 1gb of RAM to do a job and you end up using 2, that's not "maximising your cache", that's being lazy.
Oh obviously; my point is that if your working set is < your physical memory, there should be *no* paging, which is vacuously true. So granting that the OS has stuff in it that while you're using it is going to take up a chunk of memory, I'd like that memory to be resident when there's nothing else that needs it. If the OS is now speculatively not pulling in stuff that I may well need (that can be instantly paged out if it is never touched), and when I do need it I have to wait for it to get paged in, that's a step backwards.

And while your usage model may change, it's almost certainly remarkably predictable when you look at memory accesses on a page table level. That's why "boot time optimization" is so important, and there's no good reason why that sort of thing can't empirically/heuristically apply to other applications, which is what the Vista pre-fetcher is taking advantage of.
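(As an aside, an individual app that really cares can force its own hot data to stay resident instead of hoping the pre-fetcher guesses right. A rough sketch, with arbitrary sizes and no claim that this is what the OS does internally; note VirtualLock is limited by the process's minimum working set, hence the first call:)

Code:
#include <windows.h>
#include <cstdio>

int main() {
    // VirtualLock can only pin pages up to (roughly) the minimum working set,
    // so raise it first. 32/64 MB are arbitrary numbers for this sketch.
    if (!SetProcessWorkingSetSize(GetCurrentProcess(),
                                  32 * 1024 * 1024, 64 * 1024 * 1024)) {
        std::printf("SetProcessWorkingSetSize failed: %lu\n", GetLastError());
    }

    const SIZE_T kBytes = 16 * 1024 * 1024;
    void* p = VirtualAlloc(NULL, kBytes, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
    if (!p) return 1;

    if (VirtualLock(p, kBytes)) {
        // These 16 MB are now resident and won't be paged out while locked,
        // so touching them later never waits on the disk.
        std::printf("Buffer pinned resident.\n");
        VirtualUnlock(p, kBytes);
    } else {
        std::printf("VirtualLock failed: %lu\n", GetLastError());
    }
    VirtualFree(p, 0, MEM_RELEASE);
    return 0;
}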

That aside, as far as desktop stuff goes, there just aren't performance issues at all on modern hardware. Any Core 2 with 2GB RAM (or even 4... it's <$50 now!) and a $10 graphics card shouldn't be seeing any discernible slowdown in typical user applications. I don't see how you'd be seeing any differences with your custom build unless you're running on fairly low-end hardware.

Also liking the anecdotal Win7 evidence.
Cool! And actually that's precisely the type of data that - while anecdotal - is a lot more useful than Task Manager numbers after boot.

If possible, OS should also keep some memory unused in the anticipation of a new process being launched or the existing, active process acquiring more resources (not being forced to page out to make room for new data improves UX).
Ah, but clean (not dirty) physical memory pages are "free" to page out, so excepting trade-offs with pre-fetching IO and so forth, it's always better to have "potentially something useful" there rather than "guaranteed nothing useful".

First: knowing how much memory background services are consuming is an indication (not 100% reliable but a reasonable one) of how much physical memory will be available to your foreground app.
I disagree with this. The OS needs to allocate lots of virtual address space for various things - and even uses a good chunk of that while it's compositing the desktop, for instance - but if a full screen game is running it rarely needs to touch any of this memory and it can be safely paged out. Even for a non-game/non-exclusive mode application there is a much smaller working set of the OS that should need to be touched than if the user is actively interacting with OS-level services.

(Note that there was a very similar situation when the .NET-based control panels launched with the ATI drivers and people freaked out about the numbers in the task manager, while in reality all of that memory was efficiently paged out when required.)

Second: knowing how working sets in Win7 compare to working sets in Vista can give you some idea of how the load of background tasks changed.
Maybe, but nothing says that in Win7 the processes that are presumably using less memory after boot aren't just going to go ahead and allocate/start using an arbitrarily large amount of memory when their services are called upon. For instance one could theoretically "save" a chunk of memory by delay-loading most of the OS services, but it's not actually solving any problems and is indeed slowing things down in the long run.

Again I'm not sure much can be gleaned from looking at a snapshot of virtual/physical memory resources after boot. As an application/service writer, I could make that look arbitrarily different with no relation to system responsiveness (either idle or under load).
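To make that concrete: something as dumb as the sketch below makes a process's Task Manager working set collapse to almost nothing right after it touched 100 MB, without freeing a single byte of committed memory. Illustrative only (untested, sizes arbitrary; link with psapi.lib for EmptyWorkingSet):

Code:
#include <windows.h>
#include <psapi.h>      // EmptyWorkingSet; link with psapi.lib
#include <cstdio>

int main() {
    // Commit and touch ~100 MB so it all lands in this process's working set.
    const SIZE_T kBytes = 100 * 1024 * 1024;
    char* p = (char*)VirtualAlloc(NULL, kBytes, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
    if (!p) return 1;
    for (SIZE_T i = 0; i < kBytes; i += 4096) p[i] = 1;

    // Trim it all back out. The memory is still committed; it just gets
    // soft-faulted back in (or read from the pagefile) the next time it's
    // touched. Task Manager's working set column drops to almost nothing.
    EmptyWorkingSet(GetCurrentProcess());

    std::printf("Working set trimmed - compare Task Manager before/after.\n");
    Sleep(60 * 1000);
    VirtualFree(p, 0, MEM_RELEASE);
    return 0;
}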

There are at least two more important things to consider: the amount of IO generated by background threads and whether or not there are some improvements in "idle state" detection.
I too would love to see info on that sort of thing in Windows 7, but I'd like to see the theory first. There are just too many variables to consider in a modern OS/virtual memory implementation for a few performance counters to be very meaningful IMHO.
 
Doesn't Process Explorer give you info on IO from processes?
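It does, and the same counters are available to anyone via GetProcessIoCounters. Rough sketch below (untested; the counts are cumulative since the process started, so you'd diff two samples to get a rate):

Code:
#include <windows.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    DWORD pid = (argc > 1) ? (DWORD)std::atoi(argv[1]) : GetCurrentProcessId();
    HANDLE h = OpenProcess(PROCESS_QUERY_INFORMATION, FALSE, pid);
    if (!h) {
        std::printf("OpenProcess(%lu) failed: %lu\n", pid, GetLastError());
        return 1;
    }

    IO_COUNTERS io = {};
    if (GetProcessIoCounters(h, &io)) {
        // Cumulative totals since the process started.
        std::printf("Reads:  %llu ops, %llu bytes\n", io.ReadOperationCount, io.ReadTransferCount);
        std::printf("Writes: %llu ops, %llu bytes\n", io.WriteOperationCount, io.WriteTransferCount);
        std::printf("Other:  %llu ops, %llu bytes\n", io.OtherOperationCount, io.OtherTransferCount);
    }
    CloseHandle(h);
    return 0;
}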
 
"OS should always be trying to fill all available memory rather than wasting parts of it"
Not true. OS should keep as much data in memory as necessary to minimize paging. Not more, not less. If paged pool is larger than available physical memory, the least frequently used data should be paged out. If possible, OS should also keep some memory unused in the anticipation of a new process being launched or the existing, active process acquiring more resources (not being forced to page out to make room for new data improves UX). If you were able to anticipate what the user is going to do next with high (near 100%) accuracy, filling the entire memory would make perfect sense. But that's not the case and I believe that filling the entire memory would be a mistake.
Keeping memory free for some process you might start isn't relevant at all; cached items in memory can be flushed far faster than your HDD can even respond to the read request needed to start that process.
 
The Windows 7 Beta is now available for MSDN/TechNet subscribers. Going through the release notes now, it's going to take a while to get the actual beta as MS's servers are getting hammered.
 
Yeah, all versions were apparently leaked today too.

Rumor has it that an open beta will start today as well, and that they will announce it at the CES keynote.
 