Another Richard Huddy Interview

No, Huddy did NOT quote Carmack; that was the whole point I made earlier.

Bouncing Zabaglione Bros. said:
Is Carmack an ATI spokesman? Does Carmack work for ATI? This isn't about what Huddy said, this is about what *Carmack* said. Huddy quotes Carmack, exactly as all the other websites do. Can't you read the attribution, or are you so intent on deliberately misrepresenting what was said and who said it because you rabidly favour Nvidia?

All the Nvidia apologists are twisting Carmack's very clearly written words into something they do not say. If you speak English, there is no other way to read Carmack's words than how I have explained them.

Sxotty said:
Richard Huddy said:
John Carmack of ID software, the creators of Doom3, also noted that NVIDIA’s driver seems fast, but he also noticed that if he made small changes to his code then performance would fall dramatically. What this implies is that NVIDIA are performing “shader substitution” again.

Now, what did Carmack actually say?
On the other hand, the Nvidia drivers have been tuned for Doom's primary light/surface interaction fragment program, and innocuous code changes can "fall off the fast path" and cause significant performance impacts, especially on NV30 class cards.

That is classic PR: selective quoting that takes things out of context to make your position seem valid. Huddy makes it seem the issue was not specific to the NV3x but applied equally to all NV cards, which is clearly not what Carmack said.

Look, I understand that it is clearly an issue on all NV cards, but Huddy did not quote Carmack, and that was my point. If he had actually quoted him that would be fine, but he kept only the part he wanted and left out the qualifier, the "especially on NV30 class cards" part, which is what makes it especially significant there.

And to hear most ATI fans talk, it sounded like Carmack was an Nvidia spokesman in their minds; pretty humorous that they are now using him to tout ATI and bash Nvidia...
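For anyone wondering what "falling off the fast path" means mechanically, here is a minimal, purely hypothetical sketch (not NVIDIA's actual driver code; the shader text, hashes, and names are made up) of a driver that recognises a known game shader by exact match and swaps in a hand-tuned replacement. Any innocuous edit to the source misses the lookup and drops you back to the generic compiler, which is exactly the behaviour Carmack describes:

```python
import hashlib

# Hypothetical shipped shader source; stands in for Doom 3's
# primary light/surface interaction fragment program.
INTERACTION_PROGRAM = "!!ARBfp1.0\nTEMP light;\nDP3 light, t0, t0;\nEND\n"

def shader_hash(src: str) -> str:
    # Exact byte-for-byte fingerprint of the shader source.
    return hashlib.md5(src.encode()).hexdigest()

# Hypothetical substitution table: known shader hash -> hand-tuned replacement.
FAST_PATHS = {
    shader_hash(INTERACTION_PROGRAM): "hand_tuned_interaction_program",
}

def compile_shader(src: str) -> str:
    # Exact-match substitution: any difference at all misses the table.
    return FAST_PATHS.get(shader_hash(src), "generic_compiled_program")

# The shipped shader hits the fast path...
assert compile_shader(INTERACTION_PROGRAM) == "hand_tuned_interaction_program"

# ...but an "innocuous code change" (here, just a comment line) falls off it.
modified = "# harmless tweak\n" + INTERACTION_PROGRAM
assert compile_shader(modified) == "generic_compiled_program"
```

A real driver could match more loosely than a straight hash, but the principle is the same: the tuning is keyed to one specific program, so it is brittle against exactly the kind of small source changes a developer (or mod author) would make.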
 
DaveBaumann said:
However, I wonder why anyone would consider that they wouldn't do it anyway, since they have a publicly stated policy that openly says they will optimise shaders. <shrug>
The more important question is whether there is any discernible difference in IQ. If not, does it matter a damn whether they optimise shaders or not? I suspect that if any optimisations were done (and there is as yet no hard proof either way) they were probably aimed at the NV30 class cards; however, if Nvidia found they improved performance on all cards with no discernible drop in IQ, then I see no problem. ATI, instead of complaining, should be doing the same! They did a good job optimising AF, so why not shaders too? Give the end users better value!
 
Diplo said:
DaveBaumann said:
However, I wonder why anyone would consider that they wouldn't do it anyway, since they have a publicly stated policy that openly says they will optimise shaders. <shrug>
The more important question is whether there is any discernible difference in IQ. If not, does it matter a damn whether they optimise shaders or not? I suspect that if any optimisations were done (and there is as yet no hard proof either way) they were probably aimed at the NV30 class cards; however, if Nvidia found they improved performance on all cards with no discernible drop in IQ, then I see no problem. ATI, instead of complaining, should be doing the same! They did a good job optimising AF, so why not shaders too? Give the end users better value!

It's getting hard to weed out the facts from the BS...if you listen to ATi you think there's shader-replacement going on...and ATi has managed to convince everyone that shader-replacement is evil...

How can Richard Huddy take Carmack's comment so far out of context like that just to take a cheap shot at nVidia when in the same damned paragraph Carmack talked about how much he hated ATi's AF optimizations...

This interview is nothing but BS PR damage control...the world would be better if we stuck to impartial websites instead of listening to marketing people who couldn't tell the truth if their lives depended on it...
 
If Huddy is taking Carmack's comments out of context, how else can this quote be explained:

John Carmack @ HardOCP said:
the Nvidia drivers have been tuned for Doom's primary light/surface interaction fragment program, and innocuous code changes can "fall off the fast path" and cause significant performance impacts, especially on NV30 class cards.

Also, FYI, Carmack was referring to the trilinear optimizations when he commented on the colored-mipmaps issue. It also looks like Huddy isn't the only one taking Carmack's comments extremely out of context, because Carmack never said anything to the effect of disdaining that particular optimization.
 
^eMpTy^ said:
Diplo said:
DaveBaumann said:
However, I wonder why anyone would consider that they wouldn't do it anyway, since they have a publicly stated policy that openly says they will optimise shaders. <shrug>
The more important question is whether there is any discernible difference in IQ. If not, does it matter a damn whether they optimise shaders or not? I suspect that if any optimisations were done (and there is as yet no hard proof either way) they were probably aimed at the NV30 class cards; however, if Nvidia found they improved performance on all cards with no discernible drop in IQ, then I see no problem. ATI, instead of complaining, should be doing the same! They did a good job optimising AF, so why not shaders too? Give the end users better value!

It's getting hard to weed out the facts from the BS...if you listen to ATi you think there's shader-replacement going on...and ATi has managed to convince everyone that shader-replacement is evil...
In this case it's funny rather than evil, if you realize that nV cards were "designed for Doom" :LOL: :LOL:
 
^eMpTy^ said:
...and ATi has managed to convince everyone that shader-replacement is evil...
Sorry, but that's wholly nVidia's fault. Maybe if they stuck to shader replacements that provided benefits to their customers, rather than their board builders....

But that's another discussion entirely.
 
gordon said:
Also, FYI, Carmack was referring to the trilinear optimizations when he commented on the colored-mipmaps issue. It also looks like Huddy isn't the only one taking Carmack's comments extremely out of context, because Carmack never said anything to the effect of disdaining that particular optimization.


He used the word "hate."

"I hate the idea of drivers analyzing texture data and changing parameters, but it doesn't visibly impact the quality of the game unless you know exactly what to look for on a specific texture."

Granted he did say that ATI's optimizations do not affect IQ unless you know what to look for, but he hates it all the same. Since he did mention IQ in his comments about ATI, I would expect he would say something if Nvidia's optimizations were also causing IQ issues.

Personally I don’t really care as long as I see the game the way the developer intended for me to see it. If someone changes some math in the background that gives me the same IQ with an increase in performance I am all for it.

-Nathan
 
Diplo said:
DaveBaumann said:
However, I wonder why anyone would consider that they wouldn't do it anyway, since they have a publicly stated policy that openly says they will optimise shaders. <shrug>
The more important question is whether there is any discernible difference in IQ. If not, does it matter a damn whether they optimise shaders or not? I suspect that if any optimisations were done (and there is as yet no hard proof either way) they were probably aimed at the NV30 class cards; however, if Nvidia found they improved performance on all cards with no discernible drop in IQ, then I see no problem. ATI, instead of complaining, should be doing the same! They did a good job optimising AF, so why not shaders too? Give the end users better value!

It was talked to death. The whole problem with shader replacement, when the game is used as a benchmark, is that the benchmark becomes an indication of the card's performance only on that game, and really only on that game version. When you have to use shader replacement to achieve the best speed, you kill any forward-looking value outside that very specific case.
 
The "shader replacement" problem with regard to D3 and nVidia will rear its ugly head when mods begin to appear....... As Carmack said, performance will plummet.......

But I think this whole line of reasoning is missing the main point, which is not what Huddy had to say regarding Carmack's comments, but the "NVIDIA’s drivers contain all sorts of optimisation for Doom3, including one that’s likely to get a lot of attention in the future" comment......
 
martrox said:
But I think this whole line of reasoning is missing the main point, which is not what Huddy had to say regarding Carmack's comments, but the "NVIDIA’s drivers contain all sorts of optimisation for Doom3, including one that’s likely to get a lot of attention in the future" comment......

It'll be interesting to see what they're talking about. But I wouldn't take Mr. Huddy's words as gospel until they actually show some evidence of IQ problems to back them up.
 
Bjorn said:
But I wouldn't take Mr. Huddy's words as gospel until they actually show some evidence of IQ problems to back them up.

I can't disagree with that, Bjorn, except why would Mr. Huddy say something like this if there wasn't something to it? Given nVidia's past "optimization" history, while I won't take Mr. Huddy's comment as gospel, I do believe he knows something about this. He's never been caught lying; can you say the same about nVidia?
 
martrox said:
I do believe he knows something about this. He's never been caught lying; can you say the same about nVidia?

Dunno if Mr. Huddy has been caught lying, but his latest comments, in this interview for example, don't seem to be anything more than a big pile of marketing crap/FUD. I certainly don't trust him any more than I trust Dan Vivoli or any other PR guy.
 
nmyeti said:
He used the word "hate."

Granted he did say that ATI's optimizations do not affect IQ unless you know what to look for, but he hates it all the same. Since he did mention IQ in his comments about ATI, I would expect he would say something if Nvidia's optimizations were also causing IQ issues.

Personally I don’t really care as long as I see the game the way the developer intended for me to see it. If someone changes some math in the background that gives me the same IQ with an increase in performance I am all for it.

-Nathan

This has been a chronic issue for years, and almost all vendors have been guilty of it at one time or another. I hate the idea of drivers analyzing texture data and changing parameters, but it doesn't visibly impact the quality of the game unless you know exactly what to look for on a specific texture.

That doesn't look like he is condemning ATI's implementation specifically, but rather the basic method used to achieve it.
 
Bjorn said:
Dunno if Mr. Huddy has been caught lying, but his latest comments, in this interview for example, don't seem to be anything more than a big pile of marketing crap/FUD. I certainly don't trust him any more than I trust Dan Vivoli or any other PR guy.

Would you please point out a single example to back your claim of disinformation? How can you say his comments are FUD when you just said you don't know if he has ever been caught lying?
 
kyleb said:
How can you say his comments are FUD when you just said you don't know if he has ever been caught lying?

You don't have to lie to spread FUD. And marketing crap is usually not lies, just one version of the truth :)
 
nmyeti said:
He used the word "hate."

"I hate the idea of drivers analyzing texture data and changing parameters, but it doesn't visibly impact the quality of the game unless you know exactly what to look for on a specific texture."

Granted he did say that ATI's optimizations do not affect IQ unless you know what to look for, but he hates it all the same. Since he did mention IQ in his comments about ATI, I would expect he would say something if Nvidia's optimizations were also causing IQ issues.

So does this mean JC "loves" the idea of "driver programmers analyzing texture data and changing driver parameters," then...? Is his beef with automatically switched optimization versus manually switched optimization? Must be, since he never mentioned nVidia's trilinear optimizations at the same time he saw fit to mention ATi's, which I found really strange...;)

Personally I don’t really care as long as I see the game the way the developer intended for me to see it. If someone changes some math in the background that gives me the same IQ with an increase in performance I am all for it.

Apparently, though, if ATi does that JC "hates" it, but if nVidia does it he can't find the words to express any emotion about it one way or the other...;) (At least in this instance.) I'm sure he includes nVidia in his "...almost all vendors have been guilty of it at one time or another" sentiment, but he just can't bring himself to directly say "nVidia does it, too" (whereas for ATi the words had no trouble forming in his mind.)

A problem for Freud, certainly, and beyond the scope of this post...;) Oedipus complex, maybe?
 
I already gave a case of selective quoting to show how Huddy was being misleading.

Furthermore, why don't you just wait till the mods are out and see if NV40 cards' performance plummets (in comparison to the R4xx cards, of course) as someone suggested it would. Guess what: it ain't going to, I promise you that. The NV30 is another matter, as we all know.
 