I think that was a completely different situation.
I, and I imagine a few others, would think it was the same situation. End of discussion really.
Chalnoth said: And SM3 is not an intermediate step between generations. It's the basis for the current generation of graphics programming interface, and will be essentially the standard for the next 1.5-2 years.
Chalnoth said: Well, I'm still holding out hope that nVidia will be able to significantly improve shader performance very shortly, and thus overcome the performance gap.
That sounds way too much like the old "just wait until the real drivers are out for nVidia's card and you'll really see what it can do!" magic driver line, Chal.
Tahir said: End of discussion really.
It wasn't that so far, and for all I ever learned about humans, it will never be. At the end of the day, every single argument is utterly pointless, but that hasn't stopped people from leading debates on whatever topic. This just happens to be one that's been around too many times and shows no sign of ever coming to an end, is all.
anaqer said: It wasn't that so far, and for all I ever learned about humans, it will never be. At the end of the day, every single argument is utterly pointless, but that hasn't stopped people from leading debates on whatever topic. This just happens to be one that's been around too many times and shows no sign of ever coming to an end, is all.
But you baited me and I am in the discussion again. Therefore you win.
Tahir said: But you baited me and I am in the discussion again. Therefore you win.
Hot damn. Risking that I end up looking as if "winning" was of any importance to me with my post... this one just made me actually laugh out loud with happiness. Hell if I know why; never would have thought such a lousy day could possibly end with a smile.
Chalnoth said: I think that was a completely different situation. ATI has released a new architecture without (many) new features. This is really the first time this has happened since 3dfx and the Voodoo3.
For every generation until the R300, nVidia was the first adopter of a new feature set. They were also the performance leader (starting with the GF 256). Now ATI is trying to get ahead by skimping on features, by rehashing an architecture that is nearly two years old.
And SM3 is not an intermediate step between generations. It's the basis for the current generation of graphics programming interface, and will be essentially the standard for the next 1.5-2 years.
ANova said: The GF4 was not a revolutionary architecture by any means. Like many have already said, it increased performance over the GF3 Ti line, and very little else.
Right. And when was it released? About one year after the GF3 line. It's been nearly two years since ATI released the Radeon 9700 Pro.
Why would ATI support SM3 when SM2 still has plenty of headroom for development?
This has always been the case at the juncture of a new architecture. Additionally, it's once SM2 really takes off that we'll start to see noticeable differences between SM2 and SM3 (as it's easier to extend to SM3 once SM2 is fully supported).
Do you think the three or so games that are currently out have put an end to SM2's lifespan?
I don't think I ever claimed anything like that. But high-end processors should be ahead of the technology curve, not riding it. You can't make games without hardware. I think it'd just be wrong for ATI to ride on nVidia's coattails, keeping the highest-performing products by not improving technology, and then not add SM3 until nVidia's low-end SM3 hardware is relatively common and games start to make noticeable use of SM3.
digitalwanderer said: Uhm, can I find that just a tad hypocritical sounding, considering that nVidia is just now giving SM 2.0 any form of realistic support this generation? :?
Chalnoth said: No, digi, I don't. They had better than SM2 support; it was just a poor implementation. ATI decided not to support SM3. nVidia failed to support SM2 at high speed. I think there's a huge difference. At least nVidia has been working to fix that mistake.
radar1200gs said: It would have been interesting to see NV30 as originally intended, on a Low-K process and clocked around 700 MHz.
SM2.0 is just a waypoint on the DX9 map leading to SM3.0. In fact I believe it was initially quite unimportant until nVidia walked out of DX9 discussions and were subsequently unable to get fully to SM3.0 with NV30.
radar1200gs said: It would have been interesting to see NV30 as originally intended, on a Low-K process and clocked around 700 MHz.
Low-k doesn't give that big of a performance boost, first of all, and have we all forgotten about the God-awful 128-bit memory bus on NV30?
The Baron said: Low-k doesn't give that big of a performance boost, first of all, and have we all forgotten about the God-awful 128-bit memory bus on NV30?
Who's doing Low-K for the performance boost? You do it to lower signal interference at high frequency and enable faster clock speeds (edit: through a lowering of the total heat budget in NV30's case).
radar1200gs said: Who's doing Low-K for the performance boost? You do it to (...) enable faster clock speeds.
uhhhh.. correct me if I'm wrong, but aren't those two generally tied together?
Sage said: radar1200gs said: Who's doing Low-K for the performance boost? You do it to (...) enable faster clock speeds.
There are other aspects of Low-K that can help boost performance; I suggest you read some of the links I and others have posted on the forum concerning Low-K.
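For what it's worth, the two really are tied together at the interconnect level: a low-k dielectric lowers wire capacitance, which reduces RC delay (allowing a higher clock) and also cuts dynamic switching power. A first-order sketch of the generic textbook scaling relations (not NV30-specific numbers):

```latex
% Wire capacitance scales with the dielectric constant k of the
% insulator between interconnects:
C_{\text{wire}} \propto k
% Interconnect delay is set by the RC product, so the achievable
% clock frequency rises as k falls:
t_{\text{delay}} \propto R_{\text{wire}}\, C_{\text{wire}}
\quad\Rightarrow\quad
f_{\text{max}} \propto \frac{1}{t_{\text{delay}}} \propto \frac{1}{k}
% Dynamic switching power also falls with capacitance at a given
% voltage and clock (\alpha is the switching activity factor):
P_{\text{dyn}} = \alpha\, C\, V^{2} f
```

So, to first order, lower k buys either more clock headroom at the same power, or less heat at the same clock, which lines up with the "total heat budget" point made above.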