Interview with ATI's Raja Koduri

Raja Koduri: Regarding Half Life 2: The answer is more obvious. Our hardware is several times faster running Direct X 9 shaders. This is no secret and does not take rocket science to figure this out. Write a reasonably long Direct X 9 shader and run a simple fillrate test using this shader on different GPUs and the answer will be obvious. I think people read too much into the marketing arrangement and ignore the simple basic facts.
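Raja's suggested experiment can be sketched as a back-of-envelope throughput model. The clocks and pipeline counts below match the Radeon 9700 Pro (R300) and GeForce FX 5800 Ultra (NV30); the cycles-per-pixel figures and the `shader_fillrate` helper are purely illustrative assumptions, not measured numbers:

```python
# Back-of-envelope model of a shader fillrate test: pixel throughput is
# roughly clock * pipelines / cycles-per-pixel for a given shader.
# Clocks and pipe counts match the 9700 Pro (R300) and FX 5800 Ultra
# (NV30); the cycles-per-pixel figures are illustrative assumptions only.

def shader_fillrate(clock_mhz, pipelines, cycles_per_pixel):
    """Rough pixel throughput in Mpixels/s for one shader."""
    return clock_mhz * pipelines / cycles_per_pixel

# Hypothetical "reasonably long" 40-instruction PS 2.0 shader: assume
# R300 retires ~1 instruction per pipe per cycle, while NV30 needs
# roughly 3x the cycles for full-precision DX9 code.
r300 = shader_fillrate(clock_mhz=325, pipelines=8, cycles_per_pixel=40)
nv30 = shader_fillrate(clock_mhz=500, pipelines=4, cycles_per_pixel=120)

print(round(r300 / nv30, 1))  # -> 3.9 under these assumed costs
```

With any plausible per-instruction costs in that range, the "several times faster" claim for long PS 2.0 shaders falls straight out of the arithmetic, which is presumably his point.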

Editorial remark: We agree that the shader performance of ATI's Direct X 9 chips is better, but that's no reason to force owners of Nvidia boards to use Direct X 8 in hardware if the board supports Direct X 9. The fact is that you can switch on Direct X 9 support for these Nvidia boards very easily: right-click the Half Life 2 icon on the desktop and choose "Properties". In the "Target" field, add "-dxlevel 90" after "hl2.exe". If you start the game via this shortcut, the graphics board now works with hardware Direct X level 9.
No bias there.... :rolleyes:
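For reference, the shortcut edit described above boils down to a single Source engine launch flag (other -dxlevel values, e.g. 81 for the DX8.1 path, work the same way):

```
hl2.exe -dxlevel 90
```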

Raja Koduri said:
There are titles in production which target R300 class hardware as minimum spec and my expectation is that these are titles that will look stunning.
Ah, any clue which titles he's referring to there? :|

It wasn't a bad interview; why the fear, tEd?
 
digitalwanderer said:
Editorial remark: We agree that the shader performance of ATI's Direct X 9 chips is better, but that's no reason to force owners of Nvidia boards to use Direct X 8 in hardware if the board supports Direct X 9. The fact is that you can switch on Direct X 9 support for these Nvidia boards very easily: right-click the Half Life 2 icon on the desktop and choose "Properties". In the "Target" field, add "-dxlevel 90" after "hl2.exe". If you start the game via this shortcut, the graphics board now works with hardware Direct X level 9.
No bias there.... :rolleyes:
A really low point of the interview indeed. You'd get the impression there's little reason why HL2 uses DX8 on the FX cards; they could at least mention that you get only a third of the performance if you use DX9...
 
M'gawd. They let an engineer do an interview without a political officer... err, PR person to "help" him?

Gotta, gotta, gotta love that he's using an AGP X800 XT. Heh. Maybe that's why there are so few around --after they take care of their own staff. :LOL:

See him step up on Doom III? You go, boy! Punt those conspiracy theories into next week! "If it is true that Doom3 is more optimized for Nvidia hardware there are only two ways to look at it in my opinion. Nvidia technical team did a better job with Id or the ATI technical team did a bad job." (and a similar mirror-image answer re ATI & HL2).
 
Yeap, I do like interviews where the PR filters are bypassed (or turned down low enough that most of the PR stuff gets washed out).
 
Wee bit confused by that article. The FX does indeed incorporate a DirectX 8.1 class shader path. And the NV4x uses the same DirectX 9.0 shader path afaik. Is the interview just ignoring the NV4x when discussing this point or what? :?
 
ChrisRay said:
Wee bit confused by that article. The FX does indeed incorporate a DirectX 8.1 class shader path. And the NV4x uses the same DirectX 9.0 shader path afaik. Is the interview just ignoring the NV4x when discussing this point or what? :?

hehe maybe he is sharpening his skills up for a PR position?
 
Raja Koduri: Regarding Half Life 2: The answer is more obvious. Our hardware is several times faster running Direct X 9 shaders.
Several times? Wow. And how many times faster is their hardware when running SM3 shaders (DX9, yeah...)?
 
DegustatoR said:
Raja Koduri: Regarding Half Life 2: The answer is more obvious. Our hardware is several times faster running Direct X 9 shaders.
Several times? Wow. And how many times faster is their hardware when running SM3 shaders (DX9, yeah...)?

Well, in Far Cry it's faster doing PS 2.0 shaders than the NV40 is at PS 3.0 shaders.


zing ! ;)
 
DegustatoR said:
Raja Koduri: Regarding Half Life 2: The answer is more obvious. Our hardware is several times faster running Direct X 9 shaders.
Several times? Wow.
I'm not sure what issue you have with that statement. The problem is not the answer; the question was a bit misleading, since it really related only to GeForce FX cards, not GeForce 6 cards, without mentioning this. And you can't deny that DirectX 9 shaders do, in fact, run several times faster on R300-class hardware than on GeForce FX cards.
 
mczak said:
DegustatoR said:
Raja Koduri: Regarding Half Life 2: The answer is more obvious. Our hardware is several times faster running Direct X 9 shaders.
Several times? Wow.
I'm not sure what issue you have with that statement. The problem is not the answer; the question was a bit misleading, since it really related only to GeForce FX cards, not GeForce 6 cards, without mentioning this. And you can't deny that DirectX 9 shaders do, in fact, run several times faster on R300-class hardware than on GeForce FX cards.


Well, how fast are 3.0 shaders on R300-class hardware? Or will you deny that 3.0 shaders are part of DX9?

This ATI guy should have spoken of 2.0 shaders, not of DX9 shaders.
 
nobody said:
Well, how fast are 3.0 shaders on R300-class hardware? Or will you deny that 3.0 shaders are part of DX9?

He was talking about HL2's DX9 shaders. I don't see any 3.0 shaders in HL2, do you?

Ok I've fed the troll, sorry for that guys.
 
mczak said:
I'm not sure what issue you have with that statement. The problem is not the answer; the question was a bit misleading, since it really related only to GeForce FX cards, not GeForce 6 cards, without mentioning this. And you can't deny that DirectX 9 shaders do, in fact, run several times faster on R300-class hardware than on GeForce FX cards.
Well, I'm not so sure about this "several times faster" actually, if we compare R350 with NV35, but my question was more about these "DX9 shaders" which they run "several times faster". SM3 is a DX9 shader model, and they don't run it at all ATM.

I know it's PR, but these things keep putting me in "WTF mode"; can't fight it...
 
DegustatoR said:
Where's HL2 mentioned in this sentence?

OK, I won't go on with this; this is my last post about it, so think what you prefer. Anyway, the complete sentence was:

Question said:
...Half Life 2 is using the hardware Direct X level 8 for Nvidia cards which actually supports Direct X 9?
Obviously PC-WELT is talking about the GeForce FX here, and not the GeForce 6.

Raja Koduri said:
Regarding Half Life 2: The answer is more obvious. Our hardware is several times faster running Direct X 9 shaders.

He meant what should be clear by now: NV30 sucked at DX9 shaders (pardon moi, 2.0 shaders) while R300 was pretty good with them.
So if I ask you about a particular scenario, you usually answer about that particular scenario. But if you think he meant that any DX9 shader ever written (yeah, probably even 3.0 shaders running in software, as nobody said) runs faster on ATI cards, go ahead with it; you are free to think whatever you want.
 
DegustatoR said:
mczak said:
I'm not sure what issue you have with that statement. The problem is not the answer; the question was a bit misleading, since it really related only to GeForce FX cards, not GeForce 6 cards, without mentioning this. And you can't deny that DirectX 9 shaders do, in fact, run several times faster on R300-class hardware than on GeForce FX cards.
Well, I'm not so sure about this "several times faster" actually, if we compare R350 with NV35, but my question was more about these "DX9 shaders" which they run "several times faster". SM3 is a DX9 shader model, and they don't run it at all ATM.

I know it's PR, but these things are steadily setting me in the "WTF mode", can't fight it...

You're right, the R300 isn't several times faster than the NV35 when running Nvidia-optimized drivers, which do not correctly (or in some cases at all) render scenes and objects. However, if you force a GeForce FX to run a pure DX9 path without any optimizations (i.e. equal image quality and rendering), then you do find a performance difference of several multiples.

I don't understand why you are bringing SM3 into this, as it has nothing to do with the R300 or the NV30. SM3 is basically just a revised variation of SM2 that provides a few minor additions/upgrades.

well, how fast are 3.0 shaders on R300-class hardware? or will you deny that 3.0 shaders are part of DX9?

this ATI guy should have spoken of 2.0 shaders but not of DX9 shaders.

You're confused about what DirectX is and how it works. The R300 does not support SM3; it cannot run shaders designed around SM3 specs.
 
Althornin said:
OMG, its called context, TROLL.
READ.

Well, I hope you have found your true vocation in life, because you've just forever disqualified yourself from being a plaintiff's attorney or a reporter [with a vanishingly small number of honorable exceptions]. :LOL:
 