I can say "strawman" for all your statements too. Where will that get us?
We can break GPU performance into several pieces. On silicon, memory bandwidth and texture-filtering throughput are fixed function, so little can be done about them. Shader performance is pretty much the only variable over which you have major control. The other part is the driver.
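To make that split concrete, here's a toy model of my own (a sketch with made-up numbers, not anything from ATI/AMD): fixed-function work and shader work overlap on the chip, so the slowest stage sets the frame time, and driver overhead is paid on top.

```python
# Toy model of the decomposition above; my own sketch with made-up
# numbers, not vendor data. Fixed-function stages (bandwidth,
# filtering) and shader work overlap on the chip, so the slowest
# stage sets the frame time; driver overhead is added on top.
def frame_time_ms(bandwidth_ms, filtering_ms, shader_ms, driver_ms):
    return max(bandwidth_ms, filtering_ms, shader_ms) + driver_ms

# A hypothetical shader-bound frame: only the shader stage (compiler)
# and the driver are worth optimizing here.
print(frame_time_ms(bandwidth_ms=8.0, filtering_ms=6.0,
                    shader_ms=12.0, driver_ms=2.0))  # 14.0
```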
So you're saying these two are totally unrelated? As I said, it's the software.
In existing GPUs, at reasonably high resolutions, we have seen performance scale almost linearly with the number of hardware pipes/ALUs/etc. That is only possible if the driver is a small part of the performance equation, but let's assume, worst case, that the GPU has to sit idle 20% of the time, waiting for the driver.
Even with an infinitely fast driver, you'd still only gain 25% in performance.
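For the record, the arithmetic behind that 25% figure (just the numbers above, nothing more):

```python
# If the GPU idles 20% of the time waiting on the driver, an
# infinitely fast driver shrinks total time to 80% of what it was:
# speedup = 1 / (1 - 0.20) = 1.25x, i.e. a 25% gain at best.
idle_fraction = 0.20
speedup = 1 / (1 - idle_fraction)
print(f"{(speedup - 1) * 100:.0f}% maximum gain")  # 25% maximum gain
```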
So your magical 30-40% increase simply has to come from the compiler. And let's not forget, you were not talking about a specific shader here or there, but about an across-the-board performance increase. Basically, you're talking about a compiler that, up until a couple of weeks ago, completely and totally sucked without anybody realizing it. ATI has had very good shader compilers in the past. Are you suggesting that their entire, very competent compiler team resigned and was replaced by a bunch of fumbling idiots?
I said software, and that includes the compiler and/or the drivers. We don't know for sure how Vista, DX10, and AMD's new hardware tech all interact. No one knows! And as I said, I believe there is some kind of software issue keeping the performance from being as high as they intended. Possibly in some apps or games the performance is just not there (as in, way too poor).
But could it be that Vista drivers suddenly demand many more CPU cycles than XP drivers do?
It's possible that DX9 doesn't fit very well in the Vista driver model, but wasn't everybody raving about the high quality of ATI's Vista drivers? Aren't they supposed to be unified anyway, so that only a small part is hardware specific? I have yet to see complaints about an overall 40% performance loss from people switching to Vista. And if ATI can make efficient Vista drivers for DX9, they should be even more efficient for DX10, unless all the praise for the much higher efficiency of DX10 was just one big lie. Unlikely, don't you think?
I'm talking about AMD's new hardware as it relates to the software; there could be some big glitches in some apps and games (as I've said for the umpteenth time).
Strawman argument. Their compiler had to be efficient enough two years ago to start validating expected performance, and being 40% off the theoretical peak rate is not 'enough' in my book.
During those two years, the compiler can be gradually improved to fix corner cases, a process that will continue, as we have seen in previous generations.
Again, you assume it's the compiler. I'm talking about an unexpected glitch in the performance as it relates to the OS, DX10, apps, and games. They can't have some games work well while others totally suck; that would be very bad for reviews. So they delayed until they could make sure everything performs up to snuff.
Exactly my point. In the past, we've never seen across-the-board 30-40% performance jumps. They were always gradual.
I said 30-40% performance that they lost, not some magical increase. And I think it's software related!!!
How about O Christmas tree?
Why not get him a strawwoman while you're at it?
Summarizing my arguments above: that automatically implies horrible compiler performance and staggering incompetence. Yes, I suppose it's possible.
After all your straws and misunderstandings, all you can do is validate my original statement? Unbelievable. I will repeat it one more time for the cheap seats: it's a possibility that it's a software issue!
Are we done for now? Good.