PanzaMan said:...do you want to bet that the NV34 is a GeForce3 in disguise?
MuFu said:Surely it's a good thing for game development that the mainstream sector will be flooded with cards that officially "support" DX9, despite being unable to execute even moderately complex shaders at any kind of useable rate.
My guess....whichever one is cheaper.
Since we're talking about OEMs, my guess would be the one with DX9 stickers all over it. At least if it's somewhat close in price and performance.
Joe DeFuria said:Hmmm....did DX8 stickers all over the Radeon 9000 make that the darling of OEMs compared to the GeForce4MX? (Honest question....)
The key is "somewhat close" in price and performance. Down in this segment, a matter of a couple bucks is probably beyond the "somewhat close" barrier.
We also have to consider overall branding: that is, the "Radeon" vs. "GeForce" brand. "GeForce4" was a great brand for nVidia. "GeForceFX......."??
Bjorn said:Maybe not, but Nvidia was the market leader at that time with the GF4, and you don't win over the OEMs with one generation of cards. ("Non 3D hardware freak" friends of mine asked me, 2-3 months after the R9700 was released, what type of GF4 they should buy for their high-end gaming machines. All they knew of was "GeForce", much like "Voodoo" a couple of years back.)
You're right about the price. All in all, it will be a very interesting "battle" though. As for the brand, it still says GeForce.
Joe DeFuria said:Well, now you're mixing the retail consumers with OEMs...we're talking about OEMs. Curiously, which card did you recommend to your friends?
On the other hand, it's also possible for it to go the other way.....the FX line may be the first nail in the coffin for the GeForce brand....and that could be a really bad thing for nVidia....
mr said:I get 872 3DMarks with an "old" GF3 Ti 200 clocked at 230/460 with forced CPU Vertex Shaders on an XP2400+.
MuFu said:NV31 has *some* software-based shader functionality, of that I am 100% certain. It was one of the main stumbling points during R&D.
NV34 is "DX9-compatible" - it quite blatently does not have a full DX9 H/W featureset.
MuFu said:I think it'll be transparent to the developer though, i.e. the entire NV3x line will support all DX9 functionality, just some of it will be dog-slow because of software hacks.
Joe DeFuria said:MuFu said:Surely it's a good thing for game development that the mainstream sector will be flooded with cards that officially "support" DX9, despite being unable to execute even moderately complex shaders at any kind of useable rate.
I hope there's sarcasm in there?
Dave H said:[quoting mr's result above] Interesting. Compared to our best guess for NV34, your GF3 has the following disadvantages:
...
On the other hand, it has the following advantages:
- 4x2 instead of...4x1? 2x2??
...
Actually the GeForce3 is a 4x2; if I recall correctly the original GeForce was as well. Standard 4x2, that is; the FX does have a leg up on that.
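For reference, here is a quick back-of-the-envelope sketch of what a 4x2 layout works out to at the 230/460 clocks quoted above. This is illustrative arithmetic only; real throughput depends on memory bandwidth and everything else.

```cpp
// Theoretical fillrate for a 4-pipeline, 2-TMU-per-pipe part at 230 MHz core
// (the overclocked GF3 Ti 200 mentioned above). Illustrative numbers only.
#include <cstdio>

int main()
{
    const double core_mhz      = 230.0; // quoted overclock
    const int    pipelines     = 4;
    const int    tmus_per_pipe = 2;

    std::printf("Pixel fillrate: %.0f Mpixels/s\n", core_mhz * pipelines);
    std::printf("Texel fillrate: %.0f Mtexels/s\n",
                core_mhz * pipelines * tmus_per_pipe);
    return 0;
}
```

That comes out to roughly 920 Mpixels/s and 1840 Mtexels/s of theoretical fillrate for the overclocked GF3.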
Dave H said:MuFu said:NV31 has *some* software-based shader functionality, of that I am 100% certain. It was one of the main stumbling points during R&D.
NV34 is "DX9-compatible" - it quite blatantly does not have a full DX9 H/W featureset. MuFu said:I think it'll be transparent to the developer though, i.e. the entire NV3x line will support all DX9 functionality, just some of it will be dog-slow because of software hacks.
I always thought the official (marketing) definition of "DXn-compatible" was just that the part can run the DXn runtime. In many ways this is a silly definition, as it really means only that the IHV is still developing drivers for the part (i.e. a TNT is DX9-compatible under this definition), but it does make a certain degree of sense.
"DXn-compliant", then, means that the part enables substantially all of the DXn featureset. Thus (assuming they offer >=FP24 support), NV31 and NV34 should be DX9-compliant, albeit not in hardware.
Of course figuring out exactly what shader bits are to be executed in hardware and what in software is becoming an intriguing problem. The "obvious" answer is that NV34 does all vertex shading in software and that both do all pixel shading in hardware.
But you're asserting that NV31 has some apparently new form of software-assisted shading, and it seems a possibility that NV34 might not even do full pixel shading in hardware.
Except that it would seem very, very difficult to efficiently do any sort of pixel shading in software; I'm certainly at a loss to explain how that might be done. Of course, Nvidia should certainly be able to find whatever clever solutions to this issue might exist.
As for NV31's lesser software shading...perhaps some mechanism to offload *part* of the vertex shading load to the CPU while still doing *part* on the GPU? That would be an interesting idea, and might conceivably cause problems during R&D.
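For what it's worth, the plain DX9 API already exposes a crude version of that split: a device created with mixed vertex processing can be toggled between hardware and software vertex processing per batch. Below is a minimal sketch using only standard D3D9 calls, nothing NV3x-specific; error handling and most setup are omitted.

```cpp
// A sketch of "part of the vertex work on the CPU, part on the GPU" using
// stock D3D9 mixed vertex processing. 'd3d' and 'hWnd' are assumed to exist
// already; error handling is omitted for brevity.
#include <windows.h>
#include <d3d9.h>

IDirect3DDevice9* CreateMixedDevice(IDirect3D9* d3d, HWND hWnd)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;

    IDirect3DDevice9* dev = NULL;
    // MIXED mode lets the application flip between hardware and software
    // vertex processing on a per-draw basis.
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_MIXED_VERTEXPROCESSING, &pp, &dev);
    return dev;
}

void DrawSplitWorkload(IDirect3DDevice9* dev)
{
    // Geometry whose vertex shaders the hardware can't (or shouldn't) run
    // gets processed on the CPU...
    dev->SetSoftwareVertexProcessing(TRUE);
    // ... DrawPrimitive calls for that batch here ...

    // ...while everything else stays on the GPU's vertex hardware.
    dev->SetSoftwareVertexProcessing(FALSE);
    // ... remaining DrawPrimitive calls ...
}
```

Whether NV31's rumoured scheme is anything more clever than this kind of driver-side split is, of course, pure speculation.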