Acert93 said:
Identifying it as marketing is important IMO to get a proper read on what he is saying.
Acknowledged; especially knowing where the original poster you replied to comes from.
Done right is relative.
As an NV40 owner, my concern would be that G70/G71 make only nominal steps in improving SM3.0 performance. This is interesting (read: typical PR machine) because NV bragged about their SM3.0 capabilities in 2004; now they are taking the "we are smaller" angle--when in fact that benefits them more than consumers, seeing as G70/G71 are in the same consumer cost bracket as ATI's larger chips.
That's still within the usual marketing wash from either side. I took a different perspective: once we see the first D3D10 GPUs, wouldn't you then rather think that THOSE are SM3.0 done right? And that doesn't go for only one IHV.
NV could have invested more die space in G70/G71 for better dynamic branching and vertex texturing performance. So while ATI may be late, NV can be said to be incomplete. So what is better: Done better (usable) or hitting a check box?
Yes, they could have, but since IHVs make their design decisions based on available resources and sales-synergy tactics, it is my opinion that:
a) there wasn't enough time nor resources for bigger changes in G7x.
b) higher chip complexity wouldn't give them the advantage they may currently enjoy in the high-end notebook GPU market.
They can't win everywhere; there has to be a hierarchy, and sacrifices are unavoidable--unless we were talking about far higher transistor budgets per manufacturing process, far longer product/refresh cycles, and a single target market, such as the PC mainstream exclusively.
For a consumer like myself getting features that are usable is more important. Check boxes are irrelevant.
Careful: new features are, upon introduction, mostly of developer interest. Assume you had a very efficient D3D10 GPU today (which isn't unfeasible anymore)--what exactly would you as a consumer do with it today? Run happy-go-merry techdemos?
All perspective. But I think NV has a long enough history--shrewd as it is from a sales, market-penetration, and OEM-contract position--of ticking the check box now and hitting the performance targets in the following generation.
Your point would have been valid if ATI had released R520 in spring 2004.
As for SM3.0 in general, it is not leaving us any time soon. Both next-gen consoles are SM3.0, and that will strongly influence what we see on the PC side for years, especially in cross-platform titles. We just saw our first SM2.0-only game (Oblivion) well over 3 years after DX9 shipped (Fall 2002). And if history is any indicator, NV's first DX10 part will provide excellent SM3.0 performance, but will probably be insufficient for DX10 SM4.0-only/heavy tasks. This is conjecture, but it seems to fit Kirk's comments and past trends. There is no future-proof GPU, but SM3.0 should not be ignored unless you plan to upgrade in the next year IMO.
IHVs will most likely tell you that "done right" comes with +1, and even more so with +2, unless I've totally misunderstood that updated scheme. Sweet irony: it's always been like that, at least IMHLO.