crappy 3DMark score on NV31/34?

NV31 has *some* software-based shader functionality, of that I am 100% certain. It was one of the main stumbling points during R&D.

NV34 is "DX9-compatible" - it quite blatently does not have a full DX9 H/W featureset.

MuFu.
 
PanzaMan said:
...do you want to bet that the NV34 is a GeForce3 in disguise?

I bet you will turn out to be more right than wrong....

That would also mean we can throw out any assumptions of NV chip "code names" and level of DX support....
 
I think it'll be transparent to the developer though, i.e. the entire NV3x line will support all DX9 functionality, just some of it will be dog-slow because of software hacks. Sure - official compliance demands that certain things be run natively in hardware, but perhaps in nV's eyes they are bending the rules slightly for the better. Surely it's a good thing for game development that the mainstream sector will be flooded with cards that officially "support" DX9, despite being unable to execute even moderately complex shaders at any kind of usable rate.

I doubt the NV34 has much in common with previous generations apart from its memory controller and perhaps TMDS/RAMDAC stages. It is definitely a derivative of the NV30 core.

MuFu.
 
MuFu said:
Surely it's a good thing for game development that the mainstream sector will be flooded with cards that officially "support" DX9, despite being unable to execute even moderately complex shaders at any kind of usable rate.

I hope there's sarcasm in there? ;)

In any case, we could be getting ready to witness an interesting battle at the low end. NV34 might basically be a horrendous performer, though "supporting" DX9.

RV280 doesn't support DX9, but it might be a better performer in general?

If those two characteristics pan out....which part will be the OEM's "darling?"

My guess....whichever one is cheaper. ;)
 
It's hardly transparent to the developer if CPU-assist means shaders run an order of magnitude slower--it basically rules DX9 out.

Everything we've heard about the NV3x so far (with the significant exception of the extended shader capabilities) seems like a disaster for developers: Doom III needs to use NV30 extensions to run well; 3DMark03 needs a hacked driver to run well; Microsoft and NVidia are in a dispute about precision requirements for PS2.0; and now (if this story is to be believed) NVidia's mainstream "CineFX" part will not run shaders very well at all.
 
My guess....whichever one is cheaper. ;)

Since we're talking about OEMs, my guess would be the one with DX9 stickers all over it. At least if it's somewhat close in price and performance.
 
Since we're talking about OEMs, my guess would be the one with DX9 stickers all over it. At least if it's somewhat close in price and performance.

Hmmm....did DX8 stickers all over the Radeon 9000 make that the darling of OEMs compared to the GeForce4MX? (Honest question....)

The key is "somewhat close" in price and performance. Down in this segment, a matter of a couple bucks is probably beyond the "somewhat close" barrier. :)

We also have to consider overall branding: that is, the "Radeon" vs. "GeForce" brand. "GeForce4" was a great brand for nVidia. "GeForceFX......." ??
 
Joe DeFuria said:
Hmmm....did DX8 stickers all over the Radeon 9000 make that the darling of OEMs compared to the GeForce4MX? (Honest question....)

Maybe not, but Nvidia was the market leader at that time with the GF4, and you don't win over the OEMs with one generation of cards. (Friends of mine who aren't 3D hardware freaks asked me what card to buy 2-3 months after the R9700 was released, and they were asking what type of GF4 they should buy for their high-end gaming machines. All they knew of was "GeForce", much like "Voodoo" a couple of years back :)).

The key is "somewhat close" in price and performance. Down in this segment, a matter of a couple bucks is probably beyond the "somewhat close" barrier. :)

We also have to consider overall branding: that is, the "Radeon" vs. "GeForce" brand. "GeForce4" was a great brand for nVidia. "GeForceFX......." ??

You're right about the price. All in all, it will be a very interesting "battle" though. As for the brand, it still says GeForce :)
 
Bjorn said:
Maybe not, but Nvidia was the market leader at that time with the GF4, and you don't win over the OEMs with one generation of cards. (Friends of mine who aren't 3D hardware freaks asked me what card to buy 2-3 months after the R9700 was released, and they were asking what type of GF4 they should buy for their high-end gaming machines. All they knew of was "GeForce", much like "Voodoo" a couple of years back :)).

Well, now you're mixing up retail consumers with OEMs...we're talking about OEMs. Out of curiosity, which card did you recommend to your friends? ;)

As for winning over OEMs with "one generation"....see MSI news.

You're right about the price. All in all, it will be a very interesting "battle" though. As for the brand, it still says GeForce :)

Could very well be correct. In fact, I believe nVidia was planning on using a different brand, but OEMs pressured them into keeping it.

That may turn out to be the smartest thing nVidia did. Without the GeForce brand, the FX may really be in trouble.

On the other hand, it's also possible for it to go the other way.....the FX line may be the first nail in the coffin for the GeForce brand....and that could be a really bad thing for nVidia....
 
Joe DeFuria said:
Well, now you're mixing up retail consumers with OEMs...we're talking about OEMs. Out of curiosity, which card did you recommend to your friends? ;)

I was guessing that the OEMs were aware of how the people they sold computers to were thinking :)

As for my recommendations, let's just say that it wasn't a GF4 :)

On the other hand, it's also possible for it to go the other way.....the FX line may be the first nail in the coffin for the GeForce brand....and that could be a really bad thing for nVidia....

You're right about that. I don't think it's going to happen, but who knows. If the NV31 and 34 are poor performers, then things might be tough for Nvidia for quite some time, unless they can haul out the NV35 really soon, as some rumours are suggesting.

This is of course assuming that the NV35 is at least as "good" as the R350.
 
mr said:
I get 872 3DMarks with an "old" GF3 Ti 200 clocked at 230/460 with forced CPU Vertex Shaders on an XP2400+.

Interesting. Compared to our best guess for NV34, your GF3 has the following disadvantages:
  • Inability to run GT4
  • Inability to use PS 1.4 to single-pass GT2 and GT3 (although unclear if that's actually an advantage for NV3x)
  • 20 MHz slower core clock
  • 2400+ instead of 2700+ CPU

On the other hand, it has the following advantages:
  • 4x2 instead of...4x1? 2x2??
  • 960 MB/s more theoretical bandwidth (quick arithmetic below)
  • Probably more memory/fillrate optimizations in hardware (GF3 has HierZ, Z compression and fast clears; NV34 has ??)

All in all, the 879 score is looking sadly plausible, IMO. Of course these scores could be on drivers that don't "optimize" for 3DMark03.
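
Quick sanity check on that bandwidth item, working backwards from the figure (the NV34 memory clock here is my assumption, not a confirmed spec):

GF3 Ti 200 @ 230/460: 460 MHz effective x 128-bit / 8 = ~7.36 GB/s
NV34, if it's 128-bit DDR at 400 MHz effective: 400 MHz x 16 bytes = 6.4 GB/s
Difference: ~0.96 GB/s, i.e. the ~960 MB/s above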

MuFu said:
NV31 has *some* software-based shader functionality, of that I am 100% certain. It was one of the main stumbling points during R&D.

NV34 is "DX9-compatible" - it quite blatantly does not have a full DX9 H/W featureset.
MuFu said:
I think it'll be transparent to the developer though, i.e. the entire NV3x line will support all DX9 functionality, just some of it will be dog-slow because of software hacks.

I always thought the official (marketing) definition of "DXn-compatible" was just that the part can run the DXn runtime. In many ways this is a silly definition, as it really means only that the IHV is still developing drivers for the part (i.e. a TNT is DX9-compatible under this definition), but it does make a certain degree of sense.

"DXn-compliant", then, means that the part enables substantially all of the DXn featureset. Thus (assuming they offer >=FP24 support), NV31 and NV34 should be DX9-compliant, albeit not in hardware.

Of course figuring out exactly what shader bits are to be executed in hardware and what in software is becoming an intriguing problem. The "obvious" answer is that NV34 does all vertex shading in software and that both do all pixel shading in hardware. But you're asserting that NV31 has some apparently new form of software-assisted shading, and it seems a possibility that NV34 might not even do full pixel shading in hardware.
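
As an aside, here's roughly what that split looks like from the application side today - a minimal sketch against the DX9 runtime (a hypothetical example, error handling omitted), with the obvious catch: if the driver simply reports VS 2.0 in its caps while quietly running the shaders on the CPU, this check can't tell silicon from emulation, which is exactly the "transparent to the developer" scenario:

// Hypothetical sketch: pick software vertex processing through the DX9
// runtime when the HAL does not report VS 2.0 support.
#include <windows.h>
#include <d3d9.h>

IDirect3DDevice9* CreateDeviceWithVSFallback(HWND hWnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // Whatever the driver claims; a driver doing VS on the CPU behind our
    // back would still take the "hardware" branch here.
    DWORD behaviour = (caps.VertexShaderVersion >= D3DVS_VERSION(2, 0))
                    ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                    : D3DCREATE_SOFTWARE_VERTEXPROCESSING; // runtime runs VS on the CPU

    D3DPRESENT_PARAMETERS pp = { 0 };
    pp.Windowed   = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      behaviour, &pp, &device);
    return device;
}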

Except that it would seem very, very difficult to efficiently do any sort of pixel shading in software; I'm certainly at a loss to explain how that might be done. Of course, Nvidia should certainly be able to find whatever clever solutions to this issue might exist.

As for NV31's lesser software shading...perhaps some mechanism to offload *part* of the vertex shading load to the CPU while still doing *part* on the GPU? That would be an interesting idea, and might conceivably cause problems during R&D.

Hmm...
 
MuFu said:
I think it'll be transparent to the developer though, i.e. the entire NV3x line will support all DX9 functionality, just some of it will be dog-slow because of software hacks.

They'd still likely have to develop a dumbed-down path for the NV34 then, because it's so slow, so I don't really see how that will help much.
 
It seems to me the most likely description of the NV34 is PS 2.0-alike fragment shaders without fp16/fp32 capability (i.e., NV30 with the floating-point fragment processing units removed, among other things...though I still remain very curious about my vertex/fragment processing unit sharing tests).
This would be more functional than PS 1.4, but slower than PS 1.4 hardware when acting like it, i.e., as discussed with the NV30 being fast for one texture load (0 phase levels, PS 1.1-alike) but equally slow for each additional dependent texture load, as a tradeoff for "Cinematic shading" (i.e., being faster for shaders too complex for gaming...though it would seem to behoove them to focus differently for the mainstream part).
This is something that Cg could expose, and that DX9 HLSL could technically expose as well, but it seems it won't, because the intermediate LLSLs don't expose the possibility (PS 2.0 includes the expectation of floating point support, AFAIK).

As we've mentioned before, this situation would also be reflected somewhat in the NV31 if it only supports fp16.

IMO, barring any nasty surprises, this doesn't make them bad parts (in contrast to, for example, my opinion of the GF 4 MX in the context of shader adoption), though the NV34 seems likely to suffer from what I'll call "Radeon 9000 disease" :p.


Based on these theories:

Things should still be fine for nvidia in OpenGL, regardless of Cg adoption...on the HLSL front at least. I also have some thoughts floating around about how Microsoft's IP declarations may relate to nvidia's efforts to establish Cg independently.

Nvidia's root problem, IMO, was the apparent arrogance of their stance on Cg...the success of their design (the theoretical NV34, and maybe now NV31, design descriptions above) outside of OpenGL seems to depend on gaining popular support for Cg, and they seem to have assumed that their position as market leader at the time would allow them to mandate its acceptance by themselves. This makes their delay of the NV30 to boost clock speeds seem even more disastrous to me, as it would seem to have hurt their market share position even more than being perceived to have lost the performance leadership would have (though OEMs, game developers, and game publishers who try to influence developer baseline targeting may each have unique perspectives on the issue).

This type of arrogance may also be reflected in their decision to design their hardware without regard to standardization (the ever-popular Glide comparison, except that there are standards now, unlike when Glide was first offered), but maybe it was just engineering ambition, and arrogance only came into the picture in their approach to fitting the resultant design into the marketplace (as they seem to have tried instead to fit the marketplace to their design). At this time, my own opinion of their past behavior, and comments such as those Richard Huddy has made, make me think the former.

Also, their "pride" in (or, less emotionally, their marketing dependence on) "performance leadership" seems to have been a big factor in their problems (though the proportional impact in the marketplace isn't necessarily accurately represented in a forum like this one), though maybe the NV35 and the rumored R400 delay will change the picture later this year.
 
Joe DeFuria said:
MuFu said:
Surely it's a good thing for game development that the mainstream sector will be flooded with cards that officially "support" DX9, despite being unable to execute even moderately complex shaders at any kind of usable rate.

I hope there's sarcasm in there? ;)

More sadness than sarcasm, really. :LOL:

MuFu.
 
Dave H said:
mr said:
Interesting. Compared to our best guess for NV34, your GF3 has the following disadvantages:
...
On the other hand, it has the following advantages:
  • 4x2 instead of...4x1? 2x2??
    ...



Actually, the GeForce3 is a 4x2; if I recall correctly, the original GeForce was as well. Standard 4x2, that is - the FX does have a leg up on that.
 
Dave H said:
MuFu said:
NV31 has *some* software-based shader functionality, of that I am 100% certain. It was one of the main stumbling points during R&D.

NV34 is "DX9-compatible" - it quite blatantly does not have a full DX9 H/W featureset.
MuFu said:
I think it'll be transparent to the developer though, i.e. the entire NV3x line will support all DX9 functionality, just some of it will be dog-slow because of software hacks.

I always thought the official (marketing) definition of "DXn-compatible" was just that the part can run the DXn runtime. In many ways this is a silly definition, as it really means only that the IHV is still developing drivers for the part (i.e. a TNT is DX9-compatible under this definition), but it does make a certain degree of sense.

"DXn-compliant", then, means that the part enables substantially all of the DXn featureset. Thus (assuming they offer >=FP24 support), NV31 and NV34 should be DX9-compliant, albeit not in hardware.

NV34 is referred to as "DX9-compatible" internally. The phrase "low-end POS" seems to crop up quite a bit as well, so it will be interesting to see how they market it! It is capable of 4 ops/clk, so I guess 4 zixel/2 pixel output per clock. With clockspeeds of only 250-300 MHz you can see how it will quite possibly suck hairy donkey balls. No colour compression in NV34, no... one of the cost-saving differences between it and NV31.
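
To put rough numbers on that (taking the 2 pixel/clock figure at face value - speculation, not a confirmed spec):

NV34 @ 250-300 MHz x 2 px/clk = ~500-600 Mpixel/s colour fill (~1.0-1.2 Gzixel/s Z-only)
GF3 Ti 200 @ 175 MHz stock x 4 px/clk = 700 Mpixel/s (920 Mpixel/s at the 230 MHz overclock mentioned earlier)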

Of course figuring out exactly what shader bits are to be executed in hardware and what in software is becoming an intriguing problem. The "obvious" answer is that NV34 does all vertex shading in software and that both do all pixel shading in hardware.

Yeah - that's a requirement of DX9, right? Is it the SiS Xabre that does the same? I can't remember...

But you're asserting that NV31 has some apparently new form of software-assisted shading, and it seems a possibility that NV34 might not even do full pixel shading in hardware.

Except that it would seem very, very difficult to efficiently do any sort of pixel shading in software; I'm certainly at a loss to explain how that might be done. Of course, Nvidia should certainly be able to find whatever clever solutions to this issue might exist.

As for NV31's lesser software shading...perhaps some mechanism to offload *part* of the vertex shading load to the CPU while still doing *part* on the GPU? That would be an interesting idea, and might conceivably cause problems during R&D.

Yes. I believe the problems mainly involved getting the speed of "patched" functionality (i.e. things stripped from NV30) up when running it on CPU time. Dave B was able to help me out regarding this a while ago; I wasn't sure what they meant by "fastex" shader hacks. Pretty obvious now.

I am quite hesitant to talk about it though, because it will inevitably seem as if I made a big fuss about something totally unapparent to the consumer! I'm not sure this will ever be proved/disproved, although I suspect we'll see some suspiciously crap shader performance out of NV31, given its hardware specs.

The other big problem was the fact that many of the functional blocks inherited from NV30 were buggy. I wonder if we'll see the hardware fog bug rear its ugly head in NV31/NV34. :?

MuFu.
 
MuFu is off base...

My developer friends tell me that NV31 & 34 are full DX9. No more 'MX' nonsense - only pipelines & performance features removed from NV30.

I guess we will find out on Thursday when they launch it...
 
That doesn't differ from what MuFu's been saying: "full" DX9, but partially in software, so as to render it useless because of unacceptably slow "performance."
 