Another Richard Huddy Interview

So we've come to the conclusion that marketing departments like to show their own products in the best light whilst casting aspersions on the merits of their competitors'? Who'd have thought that :oops:
 
jimmyjames123 said:
Carmack also said:

but it should be noted that some of the ATI [X800] cards did show a performance drop when colored mip levels were enabled, implying some fudging of the texture filtering. This has been a chronic issue for years, and almost all vendors have been guilty of it at one time or another. I hate the idea of drivers analyzing texture data and changing parameters, but it doesn't visibly impact the quality of the game unless you know exactly what to look for on a specific texture.
And how is this relevant to the discussion at hand? Do you play the game with colored mips enabled?

-FUDie
 
Besides, that comment from Carmack should hardly be surprising to anyone, as ATI had been stating they were doing exactly that for quite some time.

Diplo said:
So we've come to the conclusion that marketing departments like to show their own products in the best light whilst casting aspersions on the merits of their competitors'? Who'd have thought that :oops:

Sure, and Vivoli does it with FUD while Huddy responds with bitterness; are you honestly trying to claim that FUD is the more noble approach and that bitterness is unwarranted in light of the FUD?
 
jimmyjames123 said:
Sure, we can argue semantics all day, but it does seem obvious that Richard [Fuddy] likes to spin things in favor of ATI and against NV.

There are no semantic issues here. It's perfectly clear to anyone that speaks English and isn't trolling. Carmack (is he an ATI marketeer too?) says that Nvidia drivers suffer if anything changes. He singles out NV30 as being particularly bad. Just because NV30 is singled out, the Nvidia apologists are pretending this means the NV40 is fine. If it uses Nvidia drivers, it has the drop-off problem - that's what he says.
 
Why are you arguing, BZB? We know you favor ATI, but it is pointless. An ATI PR spokesman is accused of favoring ATI and you are jumping to his defense; well, WTF else is an ATI spokesperson supposed to do but show ATI in a good light? That doesn't mean he is evil, it means he works for ATI. If he was saying "Well, the NV4x is just a better design than ours" then he would be doing a bad job and get his butt fired. The same thing applies in reverse: an NV PR guy that said "Well, the NV4x is just not competitive with the R4xx" would get fired too.
 
Sxotty said:
Why are you arguing, BZB? We know you favor ATI, but it is pointless. An ATI PR spokesman is accused of favoring ATI and you are jumping to his defense; well, WTF else is an ATI spokesperson supposed to do but show ATI in a good light? That doesn't mean he is evil, it means he works for ATI. If he was saying "Well, the NV4x is just a better design than ours" then he would be doing a bad job and get his butt fired. The same thing applies in reverse: an NV PR guy that said "Well, the NV4x is just not competitive with the R4xx" would get fired too.

Is Carmack an ATI spokesman? Does Carmack work for ATI? This isn't about what Huddy said, this is about what *Carmack* said. Huddy quotes Carmack, exactly as all the other websites do. Can't you read the attribution, or are you so intent on deliberately misrepresenting what was said and who said it because you rabidly favour Nvidia?

All the Nvidia apologists are twisting Carmack's very clearly written words into something they do not say. If you speak English, there is no other way to read Carmack's words than how I have explained.
 
Bouncing Zabaglione Bros. said:
There are no semantic issues here. It's perfectly clear to anyone that speaks English and isn't trolling. Carmack (is he an ATI marketeer too?) says that Nvidia drivers suffer if anything changes. He singles out NV30 as being particularly bad. Just because NV30 is singled out, the Nvidia apologists are pretending this means the NV40 is fine. If it uses Nvidia drivers, it has the drop-off problem - that's what he says.
He doesn't say anything about NV40.
 
Who are you talking about, Carmack or Huddy? Not that it matters, as neither made their comments directly about the NV40, but rather about Nvidia in general. :p
 
Humus said:
http://www2.hardocp.com/article.html?art=NjQy
I think that people need to look at this entire quote and break it down into its two different parts so that they understand its full meaning.

ATI related
The benchmarking was conducted on-site, and the hardware vendors did not have access to the demo before hand, so we are confident that there is no egregious cheating going on, but it should be noted that some of the ATI cards did show a performance drop when colored mip levels were enabled, implying some fudging of the texture filtering. This has been a chronic issue for years, and almost all vendors have been guilty of it at one time or another. I hate the idea of drivers analyzing texture data and changing parameters, but it doesn't visibly impact the quality of the game unless you know exactly what to look for on a specific texture.


Nvidia related

On the other hand, the Nvidia drivers have been tuned for Doom's primary light/surface interaction fragment program, and innocuous code changes can "fall off the fast path" and cause significant performance impacts, especially on NV30 class cards.

Basically both companies are "tuning" their drivers to some extent. I am too much of a noobie to know for sure what JC is referring to when he says "tuned for Doom's primary light/surface interaction fragment program." Most of you are suggesting shader replacements, but I don't know enough to properly interpret that statement.

Let me ask this: if id is in bed with Nvidia enough that Carmack has structured his engine in such a way that it runs poorly on ATI's cards, then why would Nvidia need to replace shaders in their driver when they could have just helped id write them in the first place?

-Nathan
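An aside for readers who haven't seen the colored-mips test Carmack refers to: it simply replaces each mip level of a texture with a distinct solid colour, so any driver-side change to how the mip levels are selected or blended shows up both visually and in the frame rate. Below is a minimal sketch in plain C/OpenGL of how such a texture might be built; the sizes, colours and helper function are illustrative assumptions on my part, not code from Doom 3 or any driver.

/* Hypothetical helper: fill each mip level of a 256x256 RGB texture with a
 * distinct solid colour.  With honest trilinear filtering the coloured bands
 * blend smoothly; if a driver quietly degrades the filtering for speed, the
 * transitions (and the benchmark numbers) change when this is enabled. */
#include <GL/gl.h>

void uploadColoredMips(void)
{
    static const unsigned char colors[9][3] = {
        {255,   0,   0}, {  0, 255,   0}, {  0,   0, 255},
        {255, 255,   0}, {  0, 255, 255}, {255,   0, 255},
        {255, 255, 255}, {128, 128, 128}, {  0,   0,   0}
    };
    static unsigned char pixels[256 * 256 * 3];

    /* a 256x256 texture has 9 mip levels: 256, 128, ..., 1 */
    for (int level = 0, size = 256; level < 9; ++level, size /= 2) {
        for (int i = 0; i < size * size; ++i) {
            pixels[i * 3 + 0] = colors[level][0];
            pixels[i * 3 + 1] = colors[level][1];
            pixels[i * 3 + 2] = colors[level][2];
        }
        glTexImage2D(GL_TEXTURE_2D, level, GL_RGB, size, size, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, pixels);
    }
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
}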
 
nmyeti said:
Basically both companies are "tuning" their drivers to some extent. I am too much of a noobie to know for sure what JC is referring to when he says "tuned for Doom's primary light/surface interaction fragment program." Most of you are suggesting shader replacements, but I don't know enough to properly interpret that statement.

"Fragment program"=shader
 
nmyeti said:
Basically both companies are "tuning" their drivers to some extent.

I don't think you understand that the "tuning" commented on for the ATI side is simply a function of their filtering methods, which ATI has openly explained; it is not Doom 3-specific by any means but rather something that happens in all applications.

nmyeti said:
Let me ask this: if id is in bed with Nvidia enough that Carmack has structured his engine in such a way that it runs poorly on ATI's cards...

Did someone here actually make such an absurd claim as that?
 
kyleb said:
I don't think you understand that the "tuning" commented on for the ATI side is simply a function of their filtering methods, which ATI has openly explained; it is not Doom 3-specific by any means but rather something that happens in all applications.

Yes, I understand that this is an optimization for all situations and that it cannot be turned off. However, in this specific situation it improves D3 performance, and that is the topic at hand.

kyleb said:
Did someone here actually make such an absurd claim as that?
Sorry, that may have been something that I brought into this thread from outside. The notes from the ATI investor conference call seem to indicate that Doom 3 was written to run well on NVIDIA's hardware at the cost of ATI's performance.
 
Tim said:
Xmas said:
He doesn't say anything about NV40.

No need, he said "nVidia drivers", no exceptions.
No, he said "NVidia drivers". Not "all NVidia drivers" (which would certainly be wrong for older drivers), nor "NVidia drivers on all cards".

"Especially on NV30 class cards" does not necessarily imply "and to a lesser degree on NV40 cards". For all we know he could have compared to "NV35 class cards", i.e. CineFX 2, which behave very different from CineFX 1 cards when using shaders.
 
kyleb said:
are you honestly trying to claim that FUD is the more noble approach and that bitterness is unwarranted in light of the FUD?
*sigh* No, I was making a light-hearted comment. Never mind, carry on arguing, I'm sure it's important to you...
 
I wasn't arguing anything; I was asking a question because I was confused by your statement, as it seems to obscure the truth of the situation, and truth is something that is important to me.
 
nmyeti said:
Yes, I understand that this is an optimization for all situations and that it cannot be turned off. However, in this specific situation it improves D3 performance, and that is the topic at hand.

There is nothing specific about the situation; it improves performance all around.
 
Xmas said:
"Especially on NV30 class cards" does not necessarily imply "and to a lesser degree on NV40 cards". For all we know he could have compared to "NV35 class cards", i.e. CineFX 2, which behave very different from CineFX 1 cards when using shaders.

In a literal reading, it does.

However, I wonder why anyone would think they wouldn't do it anyway, since they have a publicly stated policy that openly says they will optimise shaders. <shrug>
 