AzBat said:
Umm, there was more in that GameSpot article...

GameSpot said:
The panel then discussed Nvidia's comprehensive internal QA policy on optimizations, which states that the company refuses to optimize its drivers for specific benchmarks that emphasize features not found in real games; this, the representatives suggested, is why the most recent cards haven't shown universally high performance in recent benchmarks. The company also reiterated its commitment to image fidelity: rather than opt not to draw certain parts of a scene, GeForce FX cards draw every last part and effect. As an example, the panel showed two screenshots of an explosion from an overdraw benchmark, in which the GeForce card drew the entire explosion as a bright white flare, while the ATI Radeon card didn't draw every layer of the explosion (the upper-right corner had a slight reddish tinge).

Anybody care to find out what causes this?

Tommy McClain
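For what it's worth, here's a minimal sketch of one plausible explanation (all layer counts and colors below are made up for illustration): if the explosion is a stack of additive-blended translucent quads, drawing every layer clamps all three channels at 1.0 and you get a pure white flare, while dropping a few layers leaves the residual fire color showing through as a reddish tinge.

/* Hypothetical sketch: why additive overdraw can saturate to white.
   Each translucent explosion layer adds its color to the framebuffer
   (classic additive blending: dst = dst + src, clamped at 1.0). With
   enough layers every channel clamps and the result is pure white;
   skip a few layers and some red remains. Values are illustrative. */
#include <stdio.h>

static float clamp01(float x) { return x > 1.0f ? 1.0f : x; }

int main(void)
{
    /* One fire-colored layer: strong red, weaker green and blue. */
    const float layer[3] = { 0.30f, 0.15f, 0.13f };
    float full[3] = { 0 }, skipped[3] = { 0 };

    for (int i = 0; i < 8; ++i)      /* draw all 8 layers */
        for (int c = 0; c < 3; ++c)
            full[c] = clamp01(full[c] + layer[c]);

    for (int i = 0; i < 5; ++i)      /* skip the last 3 layers */
        for (int c = 0; c < 3; ++c)
            skipped[c] = clamp01(skipped[c] + layer[c]);

    printf("all layers  : R=%.2f G=%.2f B=%.2f (white flare)\n",
           full[0], full[1], full[2]);
    printf("fewer layers: R=%.2f G=%.2f B=%.2f (reddish tinge)\n",
           skipped[0], skipped[1], skipped[2]);
    return 0;
}

If that's what the benchmark is doing, skipping layers would show up exactly as the article describes, whether the skipping is a driver shortcut or a legitimate difference in how the two cards handle the blend.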
digitalwanderer said:
It would help if they had actually mentioned the name of the benchmark anywhere in the article, but that was conveniently overlooked.
Heathen said:
AzBat said: Anybody care to find out what causes this?

You're assuming it's true.
DaveBaumann said:
Oh, and wrt to this article I'm surprised that none of you have picked up on the fact that apparently the performance differences everyone has seen so far are nothing short of rumours.
"It is apparent that Jensen has not been talking to the folks that are heavily involved in retailing their mid and high end products here in North America, because a lot of those guys are still trying to stop the financial bleeding from NVIDIA's near full-exit from the high end market over the last year.
As for us not attending NVIDIA's Editor Day, reading this coverage just further confirms that we have made the right decision to take the time to spend with NVIDIA's hardware instead of listening to them tell us how great it is and how great it is going to be.
The message is clear. One nDriver a year and ATI makes no impact. I think NVIDIA is the one smoking the hallucinogens this week. Who knew NVIDIA would be doing standup?"
gokickrocks said:
did they ever fix their fog issue?
Did it occur to anyone that maybe the "reddish tinge" is the correct result? No one showed images from the RefRast, right?
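That comparison would be straightforward to run: dump same-sized screenshots from each card and from the Direct3D reference rasterizer and measure how far apart they are. A minimal sketch, with tiny hard-coded 2x2 "images" standing in for real RGB8 captures and a made-up function name:

/* Score a card's output against a RefRast capture as mean absolute
   per-channel error over two raw RGB8 buffers of equal size. */
#include <stdio.h>
#include <stdlib.h>

double image_diff(const unsigned char *card, const unsigned char *ref,
                  size_t w, size_t h)
{
    double total = 0.0;
    for (size_t i = 0; i < w * h * 3; ++i)
        total += abs((int)card[i] - (int)ref[i]);
    return total / (double)(w * h * 3);
}

int main(void)
{
    /* The "RefRast" pixel 0 keeps a reddish tinge; the "card" washed
       it out to pure white, so the diff comes back nonzero. */
    unsigned char ref[12]  = { 255, 240, 230,  255, 255, 255,
                               255, 255, 255,  255, 255, 255 };
    unsigned char card[12] = { 255, 255, 255,  255, 255, 255,
                               255, 255, 255,  255, 255, 255 };
    printf("mean abs error vs RefRast: %.2f\n", image_diff(card, ref, 2, 2));
    return 0;
}

Whichever card lands closer to the reference image is the one rendering the scene "correctly" in Microsoft's terms, which would be the only neutral way to settle the white-flare-versus-reddish-tinge question.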