Egg on Nvidia's face: 6800U caught cheating?

2senile said:
Evildeus said:
Just to make some more drama :devilish: It seems that ATI is directing websites to this article (for what purpose :rolleyes: )
"In an email received a few minutes ago, an official ATI representative points us to an article published on the DriverHeaven site suggesting that NVIDIA is cheating in its drivers."
http://www.clubic.com/n/n12418.html 8)

I'd be extremely interested in seeing the content of the e-mail, but even without seeing it my initial reaction is surprise & some disappointment. As far as I can tell, nobody has come up with an explanation for why the "errors" are occurring that everybody accepts.

It just shouldn't happen! :(

Take a look at zeckensack's post; it pretty much explains why it happens.
 
Well, if that email is real, to me it means that ATI knew about the cheat or bug and just waited for someone else to notice it before sending confirmation :)

Unless they told DH where to find it, to pay nVidia back for Quake?
 
tEd said:
2senile said:
Evildeus said:
Just to make some more drama :devilish: It seems that ATI is directing websites to this article (for what purpose :rolleyes: )
"In an email received a few minutes ago, an official ATI representative points us to an article published on the DriverHeaven site suggesting that NVIDIA is cheating in its drivers."
http://www.clubic.com/n/n12418.html 8)

I'd be extremely interested in seeing the content of the e-mail, but even without seeing it my initial reaction is surprise & some disappointment. As far as I can tell, nobody has come up with an explanation for why the "errors" are occurring that everybody accepts.

It just shouldn't happen! :(

Take a look at zeckensack's post; it pretty much explains why it happens.

I mean, purely on principle, ATi should not be sending e-mails to sites & involving themselves in this. It just seems so petty.

I have no problem at all with independent sites digging up some dirt tho. ;) :LOL:
 
2senile said:
I mean, purely on principle, ATi should not be sending e-mails to sites & involving themselves in this. It just seems so petty.

I have no problem at all with independent sites digging up some dirt tho. ;) :LOL:

It's called leveling the playing field. Who do you think blew the whistle on the Quake III debacle?

Purely on principle, a graphics company (or any other company, for that matter) should not be intentionally trying to mislead its public.

And did "ATI" send it, or just someone who happened to work for ATI? There's a big difference. I don't see any press releases indicating "nVidia caught with their pants down - again."

Inquiring minds want to know. And the more people involved, the quicker FM or nVidia will respond to the allegations.
 
Well, you can level the playing field from the top or from the bottom; I'll let you decide which way this particular mail goes.
 
The difference is ATI were definitely cheating over the whole Quake 3 debacle, but no one has definitely shown that nVidia have here. In fact, Futuremark themselves have approved the drivers. All we have is one website producing some interesting, but hardly damning, evidence that a reference board using beta drivers has rendered some frames that are slightly different from the reference ones.

These results haven't been verified, cross-checked or confirmed officially in any way. Just because NV have used 'optimisations' in the past doesn't automatically make them guilty now. If ATI are encouraging these rumours then that does seem a little petty and underhand.
 
Diplo said:
The difference is ATI were definitely cheating over the whole Quake 3 debacle, but no one has definitely shown that nVidia have here.

Sigh....no, ATI were NOT definitely cheating with regard to the Quake3 debacle.
 
If I remember correctly they fixed that without loss of performance, and FM HAD TO BE TOLD about the clip plane issue.

We'll see what the deal is when they fix whatever it is that is wrong, and hopefully QUICKLY if it puts the 6800U in a worse performance light....
Otherwise it would be selectively allowing false #'s to be put up to keep enthusiasm high.

Like I said I am NOT damning NV on this, we'll see when it's fixed if there is a more than marginal difference in performance.

This is JMO.
 
If ATi is aiding in the distribution of these unconfirmed results, without explanations from either party, just to sell video cards, then it tells me that the X800Pro and possibly the X800XT are in trouble as far as competition is concerned.

If they knew about it and it's only a matter of time before everyone knows, what do they have to lose? They will only gain, just as they did when everything came out about the NV30.

At this point, red flags have been brought up for both companies in my eyes, guess we shall just wait and see. :(
 
zeckensack said:
This is all crap IMO. LOD determination has changed on NV40 vs previous NVIDIA chips. Everything up to and including NV38, all ATI chips and the refrast were in perfect agreement about how this needs to be done for isotropic filtering. It's now slightly different, that's all.

Imagery:
NV40
NV38
R360

Source of all three
Ignore the brilinear stuff on NV38 and ignore the lower precision on R360. Just look at the shape. It's obvious that there are differences. If you do the colored mipmap thing with bilinear filtering, slight differences in mipmap selection will be more pronounced, obviously.
I personally don't give a damn. This is no cheat.
My thoughts exactly when I saw this thread.
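
For anyone who wants to see what zeckensack means in concrete terms, here is a minimal sketch of the textbook isotropic LOD calculation (my own illustration with made-up function names, not NVIDIA's or ATI's actual hardware path). The LOD is just log2 of the larger texture-space footprint axis; a tiny difference in how that value is computed or rounded shifts the mipmap transition lines, which is exactly what a colored-mipmap test with bilinear filtering exaggerates.

Code:
import math

def isotropic_lod(dudx, dvdx, dudy, dvdy, tex_w, tex_h, lod_bias=0.0):
    """Textbook isotropic mipmap LOD selection (illustration only).

    (dudx, dvdx) and (dudy, dvdy) are the changes in normalized texture
    coordinates per screen pixel in x and y. They are scaled to texel
    space, the longer footprint axis is taken, and LOD = log2(length).
    """
    dx = math.hypot(dudx * tex_w, dvdx * tex_h)
    dy = math.hypot(dudy * tex_w, dvdy * tex_h)
    rho = max(dx, dy)                      # footprint size in texels
    return math.log2(max(rho, 1e-12)) + lod_bias

def bilinear_mip_level(lod, num_levels):
    # With plain bilinear filtering the integer mip level is a rounding
    # of the LOD, so a small bias or rounding difference moves the
    # visible mip boundary on screen.
    return min(max(int(lod + 0.5), 0), num_levels - 1)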
 
Nothing against DH, Zardon, or Veridan3, but I just ain't seeing the big discrepancy they're talking about.

I see the differences, but I'm not sure I understand them. Also, the ATi image seems to vary from the refrast image too...which doesn't make a whole lot of sense to me.

If neither the ATi or nVidia image is the same as the refrast, why is it that nVidia's is the bad one? :|

Like I say, I guess I just don't understand this one yet. :(

BTW-I was surprised and happy to see DH do this article, it seems they ain't all snuggly-wuggly with nVidia this round already. :D
 
Reverend said:
Ho-hum.

Oh hey, on a much more important note : Anyone knows when the hell we can buy a NV40 card? Coz Albatron sure as hell told me it's not gonna be a mass production product, at least not until end-May.
Yes, I do!

JULY!!!!
 
digitalwanderer said:
Reverend said:
Ho-hum.

Oh hey, on a much more important note : Anyone knows when the hell we can buy a NV40 card? Coz Albatron sure as hell told me it's not gonna be a mass production product, at least not until end-May.
Yes, I do!

JULY!!!!

Oh...that's when the first DX 9.1 Cards appear? ;)
 
My thoughts on this stupid topic..

Well, first of all, why would NVidia cheat with a product that has not even been released yet?

Also, if the results are really true, wouldn't all the other reviews that have already covered the 6800 also mention quality issues?

The drivers are not even beta yet, so there may be some problems..

Think about where this came from: DriverHeaven. Didn't they have a run-in with NVidia over their drivers? Maybe they have bad feelings about it.

My opinion is that this is all FUD FUD FUD, designed to give the NVidia 6800 bad press. It may be a sign that the X800 is not as powerful as some hope, and that some people want to slow down interest in the 6800 because they don't want ATI to lose its market share.

I think one goal of such articles is to spread fear and to spread around rumors of cheating in drivers.

If I were NVidia and the information they provide turned out to be all false, I would take legal action..
 
One more big rumor.. a July release for the NV40.. that is probably the biggest rumor of all. My guess is that sometime next month I will have an NV40 in my 3.2GHz P4.

By the way, DX9.1 is non-existent; it's DX 9.0c... And that does not mean DX9.0c must be released before the 6800 is released. ATI released their DX9 card before DX9 was officially released.
 
bloodbob said:
2. AA can be done by simply blurring; it removes the aliasing. Now, it's a crap method, but it works.
No. Blurring is not an AA method.

Aliasing is a high-frequency signal that is masquerading as a low-frequency signal. Blurring (i.e. post-rendering low-pass filtering) an image containing aliasing errors is not really going to help because...
  • the aliasing errors will have been mapped to lower frequencies and so won't be filtered out effectively, and
  • the real image data also contains valid frequencies that will be incorrectly removed by the filter (i.e. the image will be degraded).
In practice you have to start with a higher sample rate image, lowpass filter that and then subsample.
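
To make that concrete, here is a tiny numpy sketch (a generic illustration with made-up function names, not any particular chip's AA hardware): supersampling averages several samples per output pixel before the image is stored, i.e. it low-pass filters at the higher sample rate and then subsamples, whereas blurring the already-rendered image only smears both the aliasing and the legitimate detail.

Code:
import numpy as np

def supersample_resolve_2x(samples):
    # samples: (2*H, 2*W, C) image rendered at twice the target resolution.
    # Each output pixel is the mean of its 2x2 block of samples, i.e. a
    # crude low-pass filter applied *before* subsampling.
    h, w = samples.shape[0], samples.shape[1]
    return samples.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

def post_blur_3x3(image):
    # image: (H, W, C) image already rendered at the target resolution.
    # A 3x3 box blur removes genuine high-frequency detail, but the
    # aliasing was folded into low frequencies at sampling time and
    # cannot be undone here.
    padded = np.pad(image.astype(np.float64), ((1, 1), (1, 1), (0, 0)), mode="edge")
    out = np.zeros(image.shape, dtype=np.float64)
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out / 9.0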
 
muzz said:
If I remember correctly they fixed that without loss of performance, and FM HAD TO BE TOLD about the clip plane issue.

We'll see what the deal is when they fix whatever it is that is wrong, and hopefully QUICKLY if it puts the 6800U in a worse performance light....
Otherwise it would be selectively allowing false #'s to be put up to keep enthusiasm high.

Like I said I am NOT damning NV on this, we'll see when it's fixed if there is a more than marginal difference in performance.

This is JMO.

Please do some homework and do yourself the favor of reading the archives. Your post is totally wrong.
 
PatrickL said:
muzz said:
If I remember correctly they fixed that without loss of performance, and FM HAD TO BE TOLD about the clip plane issue.

We'll see what the deal is when they fix whatever it is that is wrong, and hopefully QUICKLY if it puts the 6800U in a worse performance light....
Otherwise it would be selectively allowing false #'s to be put up to keep enthusiasm high.

Like I said I am NOT damning NV on this, we'll see when it's fixed if there is a more than marginal difference in performance.

This is JMO.

Please do some homework and do yourself the favor of reading the archives. Your post is totally wrong.

If it is wrong, then my apologies, but that's how I remembered it.
 
Max Payne 2 raises a different question altogether. It's not synthetic and it's really a user experience issue. When you're playing away at 100+ fps I have to be honest and say that it's not a noticeable change in IQ over the Radeon. You'd be hard pushed to say which image is optimised with the mipmap square.

Any time that this is in the conclusion, I'm sorry, but I just shrug and go meh.... what a waste of time, effort and thought.

I am glad people are all on the prowl for cheats and such heh... but I think it is getting a bit ridiculous, they all look close enough to me that I don't care.
 