Interview with ATI's Raja Koduri

ChrisRay said:
Is the interview just ignoring the Nv4x when discussing this point or what? :?
Apparently so. Adding to that strange oversight, the interviewer then explains how to enable DX9 mode for the (easily inferred) NV3x cards. Why would you want to do that when performance drops by half?

Other than that time capsule oddity, the interview was pretty straightforward.
 
DegustatoR said:
NocturnDragon said:
He was talking about HL2 DX9 shaders; I don't see any 3.0 shaders in HL2, do you?

OK, I've fed the troll; sorry for that, guys.

Our hardware is several times faster running Direct X 9 shaders.
Where's HL2 mentioned in this sentence?


HL2 is not mentioned anywhere in Raja's answer, thus his answer is incorrect.

"Regarding Half Life 2: The answer is more obvious. Our hardware is several times faster running Direct X 9 shaders. This is no secret and does not take rocket science to figure this out. Write a reasonably long Direct X 9 shader and run a simple fillrate test using this shader on different GPUs and the answer will be obvious. I think people read too much into the marketing arrangement and ignore the simple basic facts."

A correct answer would have been:

Our hardware is several times faster running HL2 Direct X 9 shaders.
or
Our hardware is several times faster running PS2.0 shaders.

Same for:
Write a reasonably long Direct X 9 shader and run a simple fillrate test using this shader on different GPUs and the answer will be obvious

Just replace "DirectX 9" with "PS 2.0" and everything is OK.
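
For what it's worth, the "simple fillrate test" Raja describes just amounts to timing many full-screen passes of a long PS 2.0 shader and dividing the pixels drawn by the elapsed time. A minimal sketch of that arithmetic, with made-up placeholder numbers rather than measured data:

# Sketch of the arithmetic behind a "simple fillrate test":
# time many full-screen passes of a long PS 2.0 shader, then
# divide the pixels shaded by the elapsed time. The resolution,
# pass count, and timing here are hypothetical placeholders.
WIDTH, HEIGHT = 1024, 768      # render-target resolution
PASSES = 500                   # full-screen passes of the test shader
elapsed_seconds = 4.2          # placeholder timing from the benchmark run

pixels_shaded = WIDTH * HEIGHT * PASSES
mpixels_per_second = pixels_shaded / elapsed_seconds / 1e6
print(f"Effective shaded fillrate: {mpixels_per_second:.1f} Mpixel/s")

The slower a card runs long PS 2.0 code, the lower that figure comes out, which is the whole point of the comparison.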
 
Can people kindly stop the pedantic and petty tit-for-tat arguing in this thread? It's not progressing the discussion, and the only thing it is moving forward is the increasing probability of this thread being locked or several users being ejected.
 
Not to mention it is nice when you get an interview with someone who will be candid; if the foot ends up in the mouth at one point, big deal.

It is better than PR speak when they don't tell you anything.
 
nobody said:
mczak said:
I'm not sure what issues you have with that statement. The problem is not the answer; the question was a bit misleading, since it really only related to the GeForce FX, not the GeForce 6 cards, without mentioning this. And you can't deny that DirectX 9 shaders, in fact, DO run several times faster on R300-class hardware than on GeForceFX cards.
Well, how fast are 3.0 shaders on R300-class hardware? Or will you deny that 3.0 shaders are part of DX9?

This ATI guy should have spoken of 2.0 shaders, not of DX9 shaders.
How fast do SM3 shaders run on FX cards? What's the sound of one hand clapping? If a tree falls in the woods and no one's around, does it make a sound?
nobody said:
HL2 is not mentioned anywhere in Raja's answer, thus his answer is incorrect.

[...]

A correct answer would have been:

Our hardware is several times faster running HL2 Direct X 9 shaders.
or
Our hardware is several times faster running PS2.0 shaders.

Same for:
Write a reasonably long Direct X 9 shader and run a simple fillrate test using this shader on different GPUs and the answer will be obvious

Just replace "DirectX 9" with "PS 2.0" and everything is OK.
I don't mean to annoy Neeyik, but it bothers me that you can't acknowledge the obvious, nobody, which is that:

1. Raja was referring to the FX line vis-a-vis ATI's offerings at that time.
2. DX9 meant PS2.0 in those days. The terms were interchangeable, as SM3 wasn't supported by either IHV (or even exposed in DX, IIRC) at that time.
3. Similarly, you could've swapped the contextual HL2 reference with any of the handful of "DX9" titles or benchmarks available in the FX's prime, and Raja's statements still would've held with the hardware competing in the market at that time.

His answer is correct. Your perception of the context is incorrect.

We all know nV is both speed- and feature-set-competitive with ATi this gen (to say the least), but that's not what this interview is about. There are plenty of other, more relevant threads in which to lord nV's SM3 and FP16 blending over ATi.
 
Pete, I think the big concern here is that while "we" may know that, I don't think the average reader will. Without knowing the differences between the FX series and the GeForce 6 series, it would be incredibly easy to misinterpret this interview, as it's not very specific and there's no clarification for those who haven't followed the FX/GeForce 6 trends in shader performance.
 
ChrisRay said:
Pete, I think the big concern here is that while "we" may know that, I don't think the average reader will. Without knowing the differences between the FX series and the GeForce 6 series, it would be incredibly easy to misinterpret this interview, as it's not very specific and there's no clarification for those who haven't followed the FX/GeForce 6 trends in shader performance.

Would you agree that it's about 70/30 the interviewer's fault, for the inartful way he asked the question, and even for the relevance of the question in the first place, given where we are today? Maybe the guy has an FX as his primary card and can't let go of grinding his own issues. That "editorial note" certainly gave the impression of 'can't let go'.

I think the engineer correctly got the context he was being asked about and answered appropriately to that context, though I don't disagree with your point above. I also don't think he was trying to mislead, and I would agree that it would be unfortunate if some people were misled.
 
geo said:
ChrisRay said:
Pete, I think the big concern here is that while "we" may know that, I don't think the average reader will. Without knowing the differences between the FX series and the GeForce 6 series, it would be incredibly easy to misinterpret this interview, as it's not very specific and there's no clarification for those who haven't followed the FX/GeForce 6 trends in shader performance.

Would you agree that it's about 70/30 the interviewer's fault, for the inartful way he asked the question, and even for the relevance of the question in the first place, given where we are today? Maybe the guy has an FX as his primary card and can't let go of grinding his own issues. That "editorial note" certainly gave the impression of 'can't let go'.

I think the engineer correctly got the context he was being asked about and answered appropriately to that context, though I don't disagree with your point above. I also don't think he was trying to mislead, and I would agree that it would be unfortunate if some people were misled.

I could and would most certainly agree. The interview itself confused me, not just the answers.

Chris
 
Of course, if I wanted to get in the wayback machine, we could all just agree that it is Microsoft's fault for not releasing DX9.1 ;)

Rule's Rule: "When hunting for common ground on PC issues, Blame Microsoft First."
 
It's possible, however unlikely, that a general reader would be reading an interview with an ATi engineer. ;) But the PCWelt followup that noted HL2 forced GeForce FX cards into DX8 mode is a pretty big clue as to what he's talking about, and if you're reading an online interview, it's easy enough to find benchmarks showing that Raja's not talking about the GF6/X800 rivalry.

This would all be moot if PCWelt didn't act so cluelessly in their response. C'est la vie.
 
Well, I disagree with you there, Pete. I don't agree that the average GPU buyer is not reading websites and gathering data. You wouldn't believe how many people I have come across who will read articles like this and then draw false conclusions. I agree the article and question are vague enough that they could mislead someone a great deal, whether purposely or just through the lack of a detailed interview. Information like this is damaging to people trying to make informed decisions about their next hardware.

How much damage this article did was probably minimal. But it still doesn't excuse misinformation based on incomplete data.
 
Good lord - it's just an interview with an engineer; get over it. We've seen far worse interviews before and we'll see far worse later on as well.
 
Althornin said:
OMG, it's called context, TROLL.
READ.
Exactly. And the context was wrong, because the sentence should have been phrased like this: "Our R300 hardware is several times faster than the GFFX running PS2.0 shaders." The way it was said, it is FALSE.

As for calling everybody around "trolls", I'll leave that on your conscience.
 
DaveBaumann said:
Good lord - it's just an interview with an engineer; get over it. We've seen far worse interviews before and we'll see far worse later on as well.

And if we all didn't obsess over 3D-related issues, what would you do with all that spare time you currently devote to B3D?
 
digitalwanderer said:
geo said:
Of course, if I wanted to get in the wayback machine, we could all just agree that it is Microsoft's fault for not releasing DX9.1
Uhm, no. :rolleyes:

C'mon digi, if MS had released 9.0c as 9.1, then nVidia fans would have been disappointed (yet again) that a promised "patch" didn't fix the FX's slowness in DX9 shaders. ;)

DISCLAIMER: The above reference to "DX9 shaders" is relative to the time period and parts in question (i.e. SM 2.0 and GeForce FX). :rolleyes:
 
Mordenkainen said:
C'mon digi, if MS had released 9.0c as 9.1, then nVidia fans would have been disappointed (yet again) that a promised "patch" didn't fix the FX's slowness in DX9 shaders. ;)

DISCLAIMER: The above reference to "DX9 shaders" is relative to the time period and parts in question (i.e. SM 2.0 and GeForce FX). :rolleyes:
I know, I know... but I just had to mention it because I was getting tired of all the disinformation and history rewrites some people just seem to keep pushing. :rolleyes:
 
digitalwanderer said:
geo said:
Of course, if I wanted to get in the wayback machine, we could all just agree that it is Microsoft's fault for not releasing DX9.1
Uhm, no. :rolleyes:

Oh, Digi, I threw that out there for the specific purpose of seeing if you'd pop out of the woodwork and strike at the bait. :LOL:
 
geo said:
Oh, Digi, I threw that out there for the specific purpose of seeing if you'd pop out of the woodwork and strike at the bait. :LOL:
Wow, did you know we have a word for that very type of thing, Geo?

Yup, we call it "trolling"....and we call people who do it "trolls". Wear the badge with pride.
 
digitalwanderer said:
geo said:
Oh, Digi, I threw that out there for the specific purpose of seeing if you'd pop out of the woodwork and strike at the bait. :LOL:
Wow, did you know we have a word for that very type of thing, Geo?

Yup, we call it "trolling"....and we call people who do it "trolls". Wear the badge with pride.

:rolleyes: :rolleyes: :rolleyes:

Coming from the man the word was invented for, no less?
The guy who flooded what, 5-6 or more forums, with his anti-NV crap right after his ATI "buddy" gave him a new 9700 a couple of years ago? Or the guy who has been banned from just about every major hardware site's forums?
LOL
 