Kyle's new thread @[H]

AzBat said:
So what specifically caused the rift? I suspect it was the dispute over the interpretation of Microsoft's DX9 precision specs. NVIDIA probably wanted there to be shaders that used all three precisions (FX12, FP16, and FP32). When Microsoft made the minimum FP24, FM and other beta members pretty much concurred. This is what I believe led to NVIDIA's insistence that games would optimize for their hardware and that it wasn't fair for them to be handicapped in the benchmark. I'm not actually saying I agree with NVIDIA, but I think it does help explain why NVIDIA had such a problem with the benchmark.

Nvidia had a problem because they were no longer getting good scores on the upcoming 3DMark2003. NV30 was late and in crisis, and had been totally overtaken by the R300. If they couldn't own the benchmark, they set out to destroy it. If they were getting good scores out of the box, Nvidia would not care one jot what the DX9 spec says.
 
Bouncing Zabaglione Bros. said:
Nvidia had a problem because they were no longer getting good scores on the upcoming 3DMark2003. NV30 was late and in crisis, and had been totally overtaken by the R300. If they couldn't own the benchmark, they set out to destroy it. If they were getting good scores out of the box, Nvidia would not care one jot what the DX9 spec says.

The lack of good scores was only the result of not being able to use different precisions. Had Microsoft provided different precisions in the DX9 spec, then NVIDIA would have had a good argument to include them in the benchmark as well. At least then NVIDIA could show it had better scores when using lower precisions. They could then suggest that, in the near term, lower precision was all that was needed for the games of the day. So whether NV30 was late or early, or whether R300 was better than they expected, is beside the point. If lower precisions in 3DMark had shown the NV30 in a better light at the time NV30 was released, then I doubt we would be in this predicament where NVIDIA quit the beta program and then cheated, err, optimized, for 3DMark. Something tells me NVIDIA is probably not too happy with Microsoft with regards to the DX9 spec either. It will be interesting to see who gets favoritism in DX10 when it's released.
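To put rough numbers on the precision gap in question, here's a small sketch of my own (assuming FP16 = s10e5, FP24 = s16e7, FP32 = s23e8; FX12 is a 12-bit fixed-point format and isn't modeled here) that rounds a value to each format's mantissa width and shows how the error shrinks:

```python
import math

def quantize_mantissa(x: float, mantissa_bits: int) -> float:
    """Round x to a float with the given number of explicit mantissa bits.

    Rough model: it ignores exponent range and denormals, so it illustrates
    mantissa precision only, not full format behavior.
    """
    if x == 0.0 or not math.isfinite(x):
        return x
    m, e = math.frexp(x)              # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2 ** (mantissa_bits + 1)  # +1 for the implicit leading bit
    return math.ldexp(round(m * scale) / scale, e)

# Assumed explicit mantissa widths: FP16 = 10, FP24 = 16, FP32 = 23
x = 1.0 / 3.0
for name, bits in [("FP16", 10), ("FP24", 16), ("FP32", 23)]:
    q = quantize_mantissa(x, bits)
    print(f"{name}: {q:.9f} (abs error {abs(q - x):.1e})")
```

Each step from FP16 to FP24 to FP32 buys several more decimal digits, which is why a benchmark locked to FP24-or-better leaves NV30's faster FX12 and FP16 paths sitting idle.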

Tommy McClain
 
AzBat said:
Something tells me NVIDIA is probably not too happy with Microsoft with regards to the DX9 spec either. It will be interesting to see who gets favoritism in DX10 when it's released.

Tommy McClain

Microsoft is not too happy with Nvidia's chipset pricing for the Xbox. As far as DX10 is concerned, that may very well depend on who is able to manage the Xbox 2 graphics core. I think Microsoft is very disappointed in the deal they made with Nvidia on the Xbox, and this may have something to do with what is going on with the DX9 spec. Further, FP24 seems to be a nice compromise in terms of performance and quality, not to mention that the R300 core is no slouch.
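As a back-of-the-envelope check on that compromise (my own sketch, same assumed formats as above), compare each format's relative step size with the ~1/256 step of an 8-bit-per-channel framebuffer:

```python
# Assumed explicit mantissa widths: FP16 = 10, FP24 = 16, FP32 = 23.
formats = {"FP16": 10, "FP24": 16, "FP32": 23}
target = 2.0 ** -8  # one step of an 8-bit-per-channel render target

for name, mbits in formats.items():
    eps = 2.0 ** -(mbits + 1)  # unit roundoff with round-to-nearest
    print(f"{name}: eps = {eps:.1e}, headroom over 8-bit output = {target / eps:.0f}x")
```

By this rough measure FP16 has only about 8x headroom before rounding error reaches the size of one output step, so error can accumulate into visible banding over long shader chains, while FP24 gives roughly 512x headroom at lower cost than FP32.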
 
AzBat said:
The lack of good scores was only the result of not being able to use different precisions. Had Microsoft provided different precisions in the DX9 spec, then NVIDIA would have had a good argument to include them in the benchmark as well.

But they didn't. The spec is a minimum of 24-bit precision, not 16. Nvidia is quite welcome to use 32-bit, but it's just not fast enough. Nvidia is below the spec and now complaining about it. Last time I checked, everyone is supposed to make their hardware work to the spec, not the other way around. Nvidia trying to change the spec to match their hardware is just more FUD to try and make a substandard part look better.

If R300 had been an average part like the 8500, Nvidia would still be top of the benchmark scores, and they would not care what the spec was as long as they were owning the benchmark scores. Why? Because marketing is everything to Nvidia, and good 3DMark scores have been a cornerstone of their marketing for a long time, and (even if they won't admit it publicly) still are.
 
Bouncing Zabaglione Bros. said:
AzBat said:
The lack of good scores was only the result of not being able to use different precisions. Had Microsoft provided different precisions in the DX9 spec, then NVIDIA would have had a good argument to include them in the benchmark as well.

But they didn't. The spec is a minimum of 24-bit precision, not 16. Nvidia is quite welcome to use 32-bit, but it's just not fast enough. Nvidia is below the spec and now complaining about it. Last time I checked, everyone is supposed to make their hardware work to the spec, not the other way around. Nvidia trying to change the spec to match their hardware is just more FUD to try and make a substandard part look better.

Ummm, I'm not arguing with you. Remember, I was only talking about my _suspicion_ of what caused the rift between NVIDIA and FM. I also went into detail on what I _believed_ would happen if different precisions had been allowed in DX9 and 3DMark. I wasn't getting into any of the stuff you're talking about, as it's already been discussed ten times over.

Tommy McClain
 