Nvidia Editor's Day report

Most NVIDIA decision makers are simply misinformed, but many are just incompetent.
Please refer to ULE for more information.

Oh wait, I haven't published it yet :oops: :p ;)


Uttar
 
Umm, there was more in that GameSpot article...

GameSpot said:
The panel then discussed Nvidia's comprehensive internal QA policy on optimizations, which states that the company refuses to optimize its drivers for specific benchmarks that emphasize features not found in real games, which is, as the representatives suggested, the reason why the most recent cards haven't shown universally high performance in recent benchmarks. The company also reiterated its commitment to image fidelity--rather than opt not to draw certain parts of a scene, GeForce FX cards draw every last part and effect. As an example, the panel showed two screenshots of an explosion from an overdraw benchmark, in which the GeForce card drew the entire explosion as a bright white flare, while the ATI Radeon card didn't draw every layer of the explosion (the upper-right corner had a slight reddish tinge).

Anybody care to find out what causes this?

Tommy McClain
 
AzBat said:
Umm, there was more in that GameSpot article...

[GameSpot quote snipped]

Anybody care to find out what causes this?

Tommy McClain
It would help if they had actually mentioned the name of the benchmark anywhere in the article, but that was conveniently overlooked. :(
 
AzBat said:
Umm, there was more in that GameSpot article...

[GameSpot quote snipped]

Anybody care to find out what causes this?

Tommy McClain

It's very hard to find the cause when you've never seen the effect...
 
digitalwanderer said:
AzBat said:
[quoted post snipped]
It would help if they had actually mentioned the name of the benchmark anywhere in the article, but that was conveniently overlooked. :(

Figured that much of the quote would be enough to figure it out. I didn't think there were very many overdraw benchmarks out there.

Tommy McClain
 
It sounds like the overdraw part of the Aquamark3 benchmark, where the ship explodes and the screen goes almost completely white at the end. It has massive overdraw and brings systems to a crawl.
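
For a rough sense of why that scene hurts: every blended particle layer covering a pixel costs another pass of blend fill rate over it. A back-of-envelope sketch in Python, where the resolution and layer count are purely illustrative assumptions, not actual Aquamark3 figures:

# Back-of-envelope fill-rate cost of an alpha-blended explosion.
# All numbers are illustrative assumptions, not Aquamark3 data.
width, height = 1024, 768   # render resolution
layers = 20                 # blended particle layers covering the screen
fps_target = 60             # desired frame rate

pixels_per_frame = width * height * layers
fill_rate_needed = pixels_per_frame * fps_target  # pixels per second

print(f"{pixels_per_frame / 1e6:.1f} Mpixels blended per frame")
print(f"{fill_rate_needed / 1e9:.2f} Gpixels/s of blend fill rate needed")

Even with these modest assumptions, that is a sizable chunk of a 2003-era card's real-world blend throughput, which is why frame rates collapse during the explosion.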
 
Someone mentioned a Gearbox/Halo reference in relation to "ATI not drawing all the pixels" from the conference.
 
Oh, and wrt this article, I'm surprised that none of you have picked up on the fact that apparently the performance differences everyone has seen so far are nothing short of rumours.
 
[GameSpot quote snipped]

Did they ever fix their fog issue?
 
DaveBaumann said:
Oh, and wrt this article, I'm surprised that none of you have picked up on the fact that apparently the performance differences everyone has seen so far are nothing short of rumours.

I picked up on it. Figured it was par for the course and needed no mentioning.

Tommy McClain
 
DaveBaumann said:
Oh, and wrt this article, I'm surprised that none of you have picked up on the fact that apparently the performance differences everyone has seen so far are nothing short of rumours.

:LOL:

Yeah, I think NVIDIA believes me, you, and Hellbinder are really the only three guys claiming the NV35 has inferior performance. And of course, we should ONLY trust industry legends like "Ex-Kyle(TM)" and "Biased Tom(TM)", right? ;)


Uttar
 
Let's wait another day or two to see if all their claims have grounds. I think we will find out whether their hardware is really competent or not once we see the benchmark and IQ results of their wonder driver.

Btw, about the driver release frequency... NVIDIA uses this as a chance for a PR show. Think about it: they could have released three new drivers by now if they had opted to release one each month like ATI. However, the performance improvement in each set would have been marginal over the previous one (i.e. 5-10%). Instead, they release a single driver with a huge improvement (i.e. 20-40% on shader performance) every three months. The impact of this choice is much better than the impact of the other, PR-wise of course.
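
The two choices end up closer than they look once the monthly gains compound; a quick check with the illustrative percentages above:

# Cumulative speedup over three months under the two release cadences.
# The 5-10% and 20-40% figures are the illustrative numbers from above.
monthly_low, monthly_high = 1.05, 1.10       # per-driver gain, monthly releases
quarterly_low, quarterly_high = 1.20, 1.40   # one big quarterly driver

print(f"three monthly drivers: {monthly_low**3:.2f}x to {monthly_high**3:.2f}x")
print(f"one quarterly driver:  {quarterly_low:.2f}x to {quarterly_high:.2f}x")

Roughly 1.16-1.33x either way, but "40% faster" makes the better headline.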

I do not know how an NVIDIA user would feel if a new driver just came out and a bug in his/her game was not corrected, meaning another six-month wait... I wish them good luck...

And a last note: is the Editor's Day basically for the hardware review site editors and the CEOs of the publisher companies, or did developers also show up that day?
 
Why depend on humor forwarded repeatedly to your Inbox when you can get it straight from the press releases! ;)

On driver releases, they COULD go straight to one driver per year, you know... It would just require a patch every 1-2 months. ;)
 
Just in from [H]

"It is apparent that Jensen has not been talking to the folks that are heavily involved in retailing their mid and high end products here in North America, because a lot of those guys are still trying to stop the financial bleeding from NVIDIA's near full-exit from the high end market over the last year.

As for us not attending NVIDIA's Editor Day, reading this coverage just further impacts that we have made the right decision to take the time to spend with NVIDIA's hardware instead of listening to them tell us how great it is and how great it is going to be.

The message is clear. One nDriver a year and ATI makes no impact. I think NVIDIA is the one smoking the hallucinogens this week. Who knew NVIDIA would be doing standup?"


I'm speechless.
 
Hehe, nice. I was just about to post Kyle's follow-up. For reference, his quoted comments there followed quick mentions of nVidia's "one-driver per year" statement, and (as one can infer) Huang's comment stating ATi had caused no marketshare loss.

This has been one wacky year. ^_^
 
gokickrocks said:
[GameSpot quote snipped]

Did they ever fix their fog issue?

They are displaying every last pixel, man! The fog just gets in the way of the other pixels they need to display, obviously.
 
AzBat said:
Umm, there was more in that GameSpot article...

[GameSpot quote snipped]

Anybody care to find out what causes this?

Tommy McClain
Did it occur to anyone that maybe the "reddish tinge" is the correct result? No one showed images from the RefRast, right?
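
Checking that is mechanical once you have a reference capture: render the same frame on the DirectX reference rasterizer and diff it per pixel against each card's screenshot. A minimal sketch using Python and the PIL/Pillow imaging library, with hypothetical filenames:

# Per-pixel diff of a card's screenshot against a RefRast capture.
# Filenames are hypothetical; any two same-sized RGB screenshots work.
from PIL import Image, ImageChops

ref = Image.open("refrast_explosion.png").convert("RGB")
card = Image.open("radeon_explosion.png").convert("RGB")

diff = ImageChops.difference(ref, card)   # absolute per-channel difference
bbox = diff.getbbox()                     # None if the images are identical

if bbox is None:
    print("Pixel-identical to the reference rasterizer.")
else:
    # Worst per-channel error inside the differing region:
    # getextrema() returns ((minR, maxR), (minG, maxG), (minB, maxB)).
    print(f"Differs inside {bbox}; channel extrema: {diff.crop(bbox).getextrema()}")

If the RefRast frame shows the reddish tinge too, then the Radeon output is the correct one and the white flare is the outlier.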
 