Another Richard Huddy Interview

John Carmack of id Software, the creators of Doom3, also noted that NVIDIA’s driver seems fast, but he also noticed that if he made small changes to his code then performance would fall dramatically. What this implies is that NVIDIA are performing “shader substitution” again. We saw them spend most of last year doing this for benchmarks, and the consequence looks like faster benchmark numbers, but it’s important to understand that the NVIDIA driver isn’t running the same code as the ATI driver. The ATI driver does what it’s asked, and does it fast, and the NVIDIA driver does something that’s quite similar to what it’s asked to do – but which is only chosen because it’s faster.

That makes this benchmark a rather unfair and unrealistic comparison...
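For anyone wondering what “shader substitution” means in practice, here is a minimal, purely hypothetical sketch in C++: the driver fingerprints the shader source the application submits and, when the fingerprint matches a known program, compiles a hand-tuned replacement instead. None of this is NVIDIA’s actual code, and every name in it is made up; it just illustrates why even a tiny source change would miss the lookup and drop performance back to the generic path, which is exactly the behaviour Carmack describes.

#include <cstdint>
#include <iostream>
#include <string>
#include <unordered_map>

// Toy 64-bit FNV-1a hash used as the shader "fingerprint".
static uint64_t fingerprint(const std::string& src) {
    uint64_t h = 14695981039346656037ULL;
    for (unsigned char c : src) {
        h ^= c;
        h *= 1099511628211ULL;
    }
    return h;
}

// Hypothetical table of hand-tuned replacements, keyed by the fingerprint of
// the shader source the driver expects a particular game to submit.
static const std::unordered_map<uint64_t, std::string>& substitutionTable() {
    static const std::unordered_map<uint64_t, std::string> table = {
        { fingerprint("original interaction shader source"),
          "hand-tuned replacement shader source" },
    };
    return table;
}

// What a driver's compile path might do: if the submitted source is
// recognised, use the substitute; otherwise compile exactly what was given.
std::string selectShaderSource(const std::string& submitted) {
    const auto& table = substitutionTable();
    const auto it = table.find(fingerprint(submitted));
    return (it != table.end()) ? it->second : submitted;
}

int main() {
    // Matches the table: the "driver" silently swaps in its own shader.
    std::cout << selectShaderSource("original interaction shader source") << "\n";
    // One small edit and the fingerprint no longer matches: generic path.
    std::cout << selectShaderSource("original interaction shader source // edited") << "\n";
}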


:LOL: :LOL:
 
They need to learn that NVIDIA is never going to stop.

The sad thing is, once they learn, all of the companies will be doing it.
 
Well, these words from your guru could at least lay to rest those pesky rumours that JC is favoring Nvidia to the detriment of ATi. Surely he would've just used the shader that Nvidia supposedly has to automatically substitute then, right? :?
 
Firstly the benchmark numbers which have been released used ATI drivers which have not been heavily tuned for Doom3. In contrast, NVIDIA’s drivers contain all sorts of optimisation for Doom3, including one that’s likely to get a lot of attention in the future.

Interesting. Wonder what that could be. Though we've yet to see any IQ problems. That doesn't mean there couldn't be questionable optimizations in there anyway, though.
 
In contrast, NVIDIA’s drivers contain all sorts of optimisation for Doom3, including one that’s likely to get a lot of attention in the future.

I wonder what he means by that.
 
euan said:
He's British; we have an adverse reaction to the slightest bit of sunlight.
Ah, thanks. The "cave dweller meets the daystar" thingy, eh?

Anyone who's bilingual care to take a swing at translating? Babelfish doesn't do it justice.
 
So is ATI's new marketing up to calling out questionable NV optimizations and proving it?

Now that would be interesting.
 
digitalwanderer said:
Anyone who's bilingual care to take a swing at translating? Babelfish doesn't do it justice.

I see the same questions and answers in English below the other language.
 
Firstly the benchmark numbers which have been released used ATI drivers which have not been heavily tuned for Doom3. In contrast, NVIDIA’s drivers contain all sorts of optimisation for Doom3, including one that’s likely to get a lot of attention in the future.
EEEEEeeenteresting.... :|
 
digitalwanderer said:
RickCain said:
digitalwanderer said:
Anyone who's bilingual care to take a swing at translating? Babelfish doesn't do it justice.

I see the same questions and answers in English below the other language.
Doh! Nevermind... :oops:

Don't feel bad, Digi -- I missed it the first time I looked too and am glad you asked. . .
 
The R420 (that’s the chip in the X800 series) can do anything that SM3.0 hardware can do, and in a few rare cases it’s a little more efficient to program these using SM3.0 techniques. But the cost is massive – the chip needs to be roughly one third larger. That’s impossible to justify in any sane business plan.

Ummm, that last sentence is a bit of an oversell, isn't it? The cost/benefit won't be any better next gen, will it?
 
digitalwanderer said:
Firstly the benchmark numbers which have been released used ATI drivers which have not been heavily tuned for Doom3. In contrast, NVIDIA’s drivers contain all sorts of optimisation for Doom3, including one that’s likely to get a lot of attention in the future.
EEEEEeeenteresting.... :|

Followed immediately by accusations that ATI fed the tip to the website that breaks the story. . .
 