Egg on Nvidia's face: 6800U caught cheating?

Status
Not open for further replies.
This topic was just posted by gandalf___ one minute ago, and now it has disappeared and been replaced with this clever one! :D

Do you know if the "trilinear optimizations" were somehow enabled in the driver control panel?

What card was used to make the reference/rasterizer image?

Finally, can anyone from Futuremark comment on this article, and comment on why the 60.72 Forceware drivers were approved?

To be quite honest, this seems more like FUD, because at least at the moment, there is no good reason for NVDA to cheat with the NV40. The NV40's performance lead over the current generation of high end cards has already been shown through numerous tests and benchmarks.
 
yeah, can i just ask where did my post go?

also, this is gonna be interesting if this is true. all i have to say is i feel bad that NV still feels the need to do this


edit: no, i didn't touch my post. i exited out of the thread and it was gone
 
On the surface, it seems like pretty minor stuff.

Though it would be good to hear a technical explanation (from Futuremark), as to why there are some differences.
 
I must be blind, or something.

None of the three images (refrast, Nvidia, ATI) looks quite like the others. And none of them looks particularly worse than the rest.

Edit: I only looked at the first shot. The other two show the difference in LOD bias.
 
Can someone offer a better explanation of the green square in the Max Payne 2 shot than the one DH gave? Looking through the square, the colored mipmaps are the same as in the refrast/ATI shot. DH says that texture detail is being lowered... but which texture detail? The walls and everything look the same color as they should.
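For context on what those colored mipmaps are: tools like this replace each mip level of a texture with a distinct solid tint, so the tint visible on a surface reveals which level the driver actually sampled. A minimal sketch of building such a tinted mip chain (the specific colors here are made up, not necessarily the ones DH's tool uses):

```python
import numpy as np

# Hypothetical tint per mip level (RGB); real tools pick similarly distinct colors
MIP_TINTS = [
    (255, 0, 0),    # level 0: red
    (0, 255, 0),    # level 1: green
    (0, 0, 255),    # level 2: blue
    (255, 255, 0),  # level 3: yellow
]

def tinted_mip_chain(base_size):
    """Build a mip chain where each level is a solid, distinct tint.

    Applied in-game, the tint visible on a surface shows which mip
    level the driver actually selected for that region."""
    chain = []
    size = base_size
    level = 0
    while size >= 1 and level < len(MIP_TINTS):
        mip = np.full((size, size, 3), MIP_TINTS[level], dtype=np.uint8)
        chain.append(mip)
        size //= 2
        level += 1
    return chain

chain = tinted_mip_chain(8)
print([m.shape for m in chain])  # [(8, 8, 3), (4, 4, 3), (2, 2, 3), (1, 1, 3)]
```

If a driver shifts LOD bias or skips trilinear blending, the colored bands on walls and floors move or get hard edges, which is what these comparison shots are looking for.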
 
Joe DeFuria said:
On the surface, it seems like pretty minor stuff.

Though it would be good to hear a technical explanation (from Futuremark), as to why there are some differences.

Approved drivers are supposed to render like the reference rasterizer.
 
we need some expert explanations... because on its own, this might just be caused by the two different cards.

RainZ
 
Doomtrooper said:
Joe DeFuria said:
On the surface, it seems like pretty minor stuff.

Though it would be good to hear a technical explanation (from Futuremark), as to why there are some differences.

Approved drivers are supposed to render like the reference rasterizer.

Ati's cards don't exactly match the reference rasterizer either.
 
Joe DeFuria said:
Doomtrooper said:
Joe DeFuria said:
On the surface, it seems like pretty minor stuff.

Though it would be good to hear a technical explanation (from Futuremark), as to why there are some differences.

Approved drivers are supposed to render like the reference rasterizer.

Ati's cards don't exactly match the reference rasterizer either.

well, Nvidia's pics look like they are substituting lower quality textures to gain performance. At least that's what it looks like to me.
 
On any game test that requires FP precision, ATI's cards won't match, since the reference is done at FP32, something the NV40 is supposed to be doing too. The plane is almost exact in the ATI vs. ref comparison, but not in the Nvidia shot.
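Since ATI's parts of that era ran pixel shaders at FP24 while the reference rasterizer (and NV40) used FP32, small numeric mismatches against the reference are expected even without any cheating. As a rough illustration of how the same arithmetic lands on different values at different precisions (FP16 vs. FP32 here, since FP24 is not an IEEE type NumPy exposes):

```python
import numpy as np

# Not actual shader code: the same division carried out at half (FP16)
# and single (FP32) precision rounds to different representable values.
third_fp32 = np.float32(1.0) / np.float32(3.0)
third_fp16 = np.float16(1.0) / np.float16(3.0)

print(float(third_fp32))  # ~0.33333334
print(float(third_fp16))  # ~0.33325195
```

Propagated through texture coordinates or vertex positions, rounding gaps like this are enough to shift geometry by a pixel or so versus the reference image, which is why "doesn't exactly match refrast" alone proves nothing.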
 
One thing to note on the Max Payne pics: the reference pic doesn't show any lens flares from the hall lights above. The ATI pic shows the lens flares without any mipmap tinting, and the Nvidia one shows one lens flare, while the second light behind it shows up with the mip-colored texture. From the looks of things, it seems like the tool he is using doesn't work properly on either Nvidia or ATI cards, only on the reference rasterizer.
 
Overall it appears that NV is being very slightly more aggressive in their approach; however, all of the images seem to contain the same number of layers. It is hard to imagine this being a big performance advantage, though I really couldn't say. I did like the blending of the NV cards in the Max Payne shots.
 
mozmo said:
One thing to note on the Max Payne pics: the reference pic doesn't show any lens flares from the hall lights above. The ATI pic shows the lens flares without any mipmap tinting, and the Nvidia one shows one lens flare, while the second light behind it shows up with the mip-colored texture. From the looks of things, it seems like the tool he is using doesn't work properly on either Nvidia or ATI cards, only on the reference rasterizer.

It does seem like it could be an anomalous interaction between the cards/drivers and the tool.

It would have been helpful if they provided screenshots without the mip-map coloring, and with and without having the tool "layer" present. I'm all for finding cheats and corner cutting...but this just looks a bit sloppily done.
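Screenshot comparisons like this could also be made less eyeball-dependent by diffing the images numerically. A minimal sketch of one way to do it, using synthetic arrays in place of the real refrast and card screenshots (the tolerance value is an arbitrary choice for illustration, not anything DH used):

```python
import numpy as np

def diff_fraction(img_a, img_b, tol=8):
    """Fraction of pixels whose largest per-channel difference exceeds tol.

    Small tolerances absorb legitimate precision/rounding differences;
    large clusters of failing pixels suggest something more substantive."""
    delta = np.abs(img_a.astype(np.int16) - img_b.astype(np.int16))
    return float((delta.max(axis=-1) > tol).mean())

# Synthetic stand-ins for a refrast shot and a card's shot (4x4 RGB)
ref = np.zeros((4, 4, 3), dtype=np.uint8)
card = ref.copy()
card[0, 0] = (20, 0, 0)  # one pixel differs noticeably

print(diff_fraction(ref, card))  # 0.0625 (1 of 16 pixels)
```

A report built on numbers like this, alongside plain screenshots with and without the mipmap coloring, would be much harder to dismiss as sloppy.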
 
..but this just looks a bit sloppily done


i agree with ya there .... it seems like it was a little rushed ..... i think they should have waited to get an official explanation on the situation .... this smells funny


retsam
 
retsam said:
..but this just looks a bit sloppily done


i agree with ya there .... it seems like it was a little rushed ..... i think they should have waited to get an official explanation on the situation .... this smells funny


retsam

here is Nvidia's official explanation:

"well, they are beta drivers so there is bound to be problems. It'll be fixed by the final release"

:)
 
galperi1 said:
Joe DeFuria said:
Doomtrooper said:
Joe DeFuria said:
On the surface, it seems like pretty minor stuff.

Though it would be good to hear a technical explanation (from Futuremark), as to why there are some differences.

Approved drivers are supposed to render the reference rasterizer.

Ati's cards don't exactly match the reference rasterizer either.

well, Nvidia's pics look like they are substituting lower quality textures to gain performance. At least that's what it looks like to me.

With all this witch-hunting in that department these days, you could stop and ask yourself: does it matter?

If any chipmaker decides to sacrifice IQ that CAN'T be seen for performance, does it matter to the end user?
Where is the line for IQ drawn: at what you can FIND by deep digging, or at what you SEE onscreen when you play?

Not calling any shots on this case, just speaking in general...
Optimizing is a good thing; when you sacrifice IQ while doing so, it's instead a cheat.
But are you sacrificing IQ when the end user can't see the difference, or is it enough that you can find a difference with these mipmap tools and blowups of screenshots?
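One way to put a rough number on "can the end user see it" is PSNR between a card's output and the reference shot; higher means closer. It is only a crude proxy for perceived quality, and not something Futuremark or DH used, but as a sketch with synthetic data:

```python
import numpy as np

def psnr(a, b):
    """Peak signal-to-noise ratio in dB for 8-bit images; higher = closer.

    Identical images give infinity; differences well above ~40 dB are
    generally hard to spot by eye, though this is only a rule of thumb."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(255.0 ** 2 / mse)

# Synthetic "reference" shot and a copy that is off by at most 1 per channel
ref = np.random.default_rng(0).integers(0, 256, (64, 64, 3), dtype=np.uint8)
near = np.clip(ref.astype(np.int16) + 1, 0, 255).astype(np.uint8)

print(psnr(ref, ref))   # inf
print(psnr(ref, near))  # ~48 dB: a difference a tool finds but an eye won't
```

A metric like this at least separates "detectable with a diff tool" from "visible while playing", which is exactly the line this post is asking about.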
 