Another Richard Huddy Interview

digitalwanderer said:
Firstly the benchmark numbers which have been released used ATI drivers which have not been heavily tuned for Doom3. In contrast, NVIDIA's drivers contain all sorts of optimisation for Doom3, including one that's likely to get a lot of attention in the future.
EEEEEeeenteresting.... :|
Well, until they support that allegation with some old-school facts, this puts ATi firmly in the 'bad loser' & 'FUD spreader' categories.
 
Bjorn said:
Firstly the benchmark numbers which have been released used ATI drivers which have not been heavily tuned for Doom3. In contrast, NVIDIA’s drivers contain all sorts of optimisation for Doom3, including one that’s likely to get a lot of attention in the future.

Interesting. Wonder what that could be. Though we've yet to see any IQ problems. That doesn't mean there couldn't be questionable optimizations in there, though.

Yep, that caught my eye too... :oops: :?:
 
geo said:
Ummm, that last sentence is a bit of an oversell, isn't it? The cost/benefit won't be any better next gen, will it?

The ratio improves. Even if the benefit doesn't move (on the software side, assuming no SM3 titles, or only ones that show no benefits over SM2.0; on the hardware side, no significant changes to the instruction-per-cycle / branching abilities), the cost is reduced, since smaller processes mean more can be packed in at similar die sizes. This is exactly what Dave Orton said when we interviewed him.
 
DaveBaumann said:
geo said:
Ummm, that last sentence is a bit of an oversell, isn't it? The cost/benefit won't be any better next gen, will it?

The ratio improves. Even if the benefit doesn't move (on the software side, assuming no SM3 titles, or only ones that show no benefits over SM2.0; on the hardware side, no significant changes to the instruction-per-cycle / branching abilities), the cost is reduced, since smaller processes mean more can be packed in at similar die sizes. This is exactly what Dave Orton said when we interviewed him.

Well, if he'd pointed at absolute die size then yes, I see your (and Orton's) point. Unfortunately, he didn't. Chalk it up to inelegant phraseology, I guess.
 
digitalwanderer said:
Is it just me or does Rich look awfully pink in that picture? :?

It's the massive quantities of Rogaine he's using. I keep telling him it won't work, but just last week he had 12 gallons pumped on under high pressure and then baked into a crust under blistering radium & UV lamps, followed by a relaxing scalp mudding with river silt from Idaho (supposedly rich in Rogaine-stimulating nutrients.) So far, at least from these pictures, no new growth--just pink. There's just nothing that can be done for it.
 
incurable said:
Well, until they support that allegation with some old-school facts, this puts ATi firmly in the 'bad loser' & 'FUD spreader' categories.

Yea, I guess Carmack, too, since he's the one who brought up the subject in the first place...;)
 
WaltC said:
incurable said:
Well, until they support that allegation with some old-school facts, this puts ATi firmly in the 'bad loser' & 'FUD spreader' categories.

Yea, I guess Carmack, too, since he's the one who brought up the subject in the first place...;)

Someone needs to write new shaders for Doom 3 so we can find out.
 
WaltC said:
incurable said:
Well, until they support that allegation with some old-school facts, this puts ATi firmly in the 'bad loser' & 'FUD spreader' categories.

Yea, I guess Carmack, too, since he's the one who brought up the subject in the first place...;)

I wouldn't automatically call shader replacement "questionable optimizations". At least not as long as the IQ is the same, and we've yet to see any evidence that it's not. And I think Carmack would be the first one to complain if the result he got wasn't what he wanted.

But let's see what "optimizations" ATI is talking about before making any judgements.
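As a side note, the sort of replacement being debated often amounts to trading a texture lookup for arithmetic, or the reverse. Here is a toy sketch of why such a swap can leave IQ untouched; the exponent and table size below are illustrative assumptions, not Doom 3's actual shader internals:

```python
# Toy sketch of a "shader replacement" that preserves image quality:
# a specular power computed via a small lookup table (as a shader might
# sample a 1D texture) versus direct pow() arithmetic. The exponent and
# table size are illustrative assumptions, not Doom 3's actual values.

SPECULAR_EXPONENT = 16.0
TABLE_SIZE = 256

# Bake pow(x, N) into a table, like rendering it into a 1D texture.
lut = [(i / (TABLE_SIZE - 1)) ** SPECULAR_EXPONENT for i in range(TABLE_SIZE)]

def specular_lut(n_dot_h: float) -> float:
    """Table lookup with linear interpolation (like the texture filtering
    hardware would apply to a 1D texture)."""
    x = max(0.0, min(1.0, n_dot_h))
    f = x * (TABLE_SIZE - 1)
    i = int(f)
    j = min(i + 1, TABLE_SIZE - 1)
    t = f - i
    return lut[i] * (1.0 - t) + lut[j] * t

def specular_direct(n_dot_h: float) -> float:
    """Direct arithmetic, as a replacement shader might compute it."""
    return max(0.0, min(1.0, n_dot_h)) ** SPECULAR_EXPONENT

# Compare the two paths across the whole input range.
worst = max(abs(specular_lut(x / 1000.0) - specular_direct(x / 1000.0))
            for x in range(1001))
print(f"worst-case difference: {worst:.6f} (one 8-bit step is {1 / 255:.6f})")
```

With a filtered 256-entry table, the worst-case difference between the two paths stays below one 8-bit framebuffer step, so both would round to the same pixels, which is exactly the "same IQ" condition being argued about here.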
 
Bjorn said:
At least not as long as the IQ is the same, and we've yet to see any evidence that it's not. And I think Carmack would be the first one to complain if the result he got wasn't what he wanted.

The only visual quality problem that I saw was ATI's app AF which seems to be broken in Doom 3.
 
LeGreg said:
Bjorn said:
At least not as long as the IQ is the same, and we've yet to see any evidence that it's not. And I think Carmack would be the first one to complain if the result he got wasn't what he wanted.
The only visual quality problem that I saw was ATI's app AF which seems to be broken in Doom 3.
And you'd certainly point out any problems you saw on NVIDIA's side, right? :LOL:

-FUDie
 
DaveBaumann said:
geo said:
Ummm, that last sentence is a bit of an oversell, isn't it? The cost/benefit won't be any better next gen, will it?

The ratio improves. Even if the benefit doesn't move (on the software side, assuming no SM3 titles, or only ones that show no benefits over SM2.0; on the hardware side, no significant changes to the instruction-per-cycle / branching abilities), the cost is reduced, since smaller processes mean more can be packed in at similar die sizes. This is exactly what Dave Orton said when we interviewed him.
But that still stays true. If I can do the same thing in SM2.0 with 2/3 of the die size, then even on a smaller process I still win. I don't see what the process changes, because the cost is still 33% more.
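For what it's worth, the two positions aren't contradictory, and some back-of-the-envelope arithmetic shows why. All figures below are invented for the example (with the SM2.0 part assumed to be 2/3 the area, the SM3.0 part carries a 50% premium, which is a 33% saving read the other way); they are not real GPU die sizes:

```python
# Illustrative die-area arithmetic for the SM2.0-vs-SM3.0 cost argument.
# All figures are invented assumptions, not real GPU die sizes.

def die_area(base_mm2: float, extra_fraction: float, shrink: float) -> float:
    """Die area with an optional feature-cost fraction, after a linear
    process shrink (area scales with the square of the linear shrink)."""
    return base_mm2 * (1.0 + extra_fraction) * shrink ** 2

SM2_BASE = 200.0   # assumed SM2.0-only die on the older process, in mm^2
SM3_EXTRA = 0.5    # assume SM3.0 adds 50% area, i.e. the SM2 die is 2/3 the size

for shrink in (1.0, 0.85):   # old process vs. a ~0.85x linear shrink
    sm2 = die_area(SM2_BASE, 0.0, shrink)
    sm3 = die_area(SM2_BASE, SM3_EXTRA, shrink)
    print(f"shrink {shrink:0.2f}: SM2 {sm2:6.1f} mm^2, SM3 {sm3:6.1f} mm^2, "
          f"absolute gap {sm3 - sm2:5.1f} mm^2, relative gap {sm3 / sm2 - 1:.0%}")
```

The relative gap stays at 50% at every node (the point made in the reply above), while the absolute mm^2 cost of the extra features shrinks with the process, so a fixed die-size budget accommodates them more easily each generation (Orton's point).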
 
FUDie said:
And you'd certainly point out any problems you saw on NVIDIA's side, right? :LOL:

Actually I don't have AA or anisotropy enabled on my 9700 Pro, but I enjoyed the game anyway ;). The problem was revealed to me by the "Humus hack" thread. (The question was: "Why don't people just enable the aniso in the app rather than in the control panel?" and the answer is that it seemed broken when enabled from the app.)
 
More FUD from Nvidia

Or wait, it's FUD from ATI. Sorry.

Of course they are going to put down products from their competitors, what else would they do?

When you have a graphics architecture that's two years old, and you release crap drivers that fix 30 things and break 30 more, and you lack features of an API that is already out and available for every video card in existence (read: DirectX 9.0c), and you have terrible performance in one of the hottest games of the year, of course you are going to say the other guy sucks. That's a no-brainer.

Until they can prove what they say, they might as well say they have seen Elvis alive at a shopping mall in NYC in 2004.

ATI is smelling a lot like Nvidia did two years ago when Nvidia had a crap product. I guess this really does prove that ethics and industry don't mix.

I think they should focus on making DOOM 3 playable at decent speeds, focus on driver quality instead of just adding sloppy features that do nothing (like shader tricks), and focus on getting a decent next-generation card with DX 9.0c features out the door.
 
Re: More FUD from Nvidia

Proforma said:
Or wait, it's FUD from ATI. Sorry.

Of course they are going to put down products from their competitors, what else would they do?

When you have a graphics architecture that's two years old, and you release crap drivers that fix 30 things and break 30 more, and you lack features of an API that is already out and available for every video card in existence (read: DirectX 9.0c), and you have terrible performance in one of the hottest games of the year, of course you are going to say the other guy sucks. That's a no-brainer.

Until they can prove what they say, they might as well say they have seen Elvis alive at a shopping mall in NYC in 2004.

ATI is smelling a lot like Nvidia did two years ago when Nvidia had a crap product. I guess this really does prove that ethics and industry don't mix.

I think they should focus on making DOOM 3 playable at decent speeds, focus on driver quality instead of just adding sloppy features that do nothing (like shader tricks), and focus on getting a decent next-generation card with DX 9.0c features out the door.
Uhm, yeah...that's what they should do...
 
Re: More FUD from Nvidia

Proforma said:
Or wait, it's FUD from ATI. Sorry.

Of course they are going to put down products from their competitors, what else would they do?

Bouncing Zabaglione Bros. said:
Sigma said:
Crap!
Only propaganda and no news...

Is it me, or did he speak more about NVIDIA than ATI?! :oops:

Well that's no surprise given that almost every question asked him to comment with regards to what Nvidia are doing.
 
Bjorn said:
I wouldn't automatically call shader replacement "questionable optimizations". At least not as long as the IQ is the same, and we've yet to see any evidence that it's not. And I think Carmack would be the first one to complain if the result he got wasn't what he wanted.

But let's see what "optimizations" ATI is talking about before making any judgements.

I was really talking about the optimizations Carmack was complaining about when he noted that tiny little changes would knock both nV4x and nV3x "off the fast track" in the game.

What I want to know is why, when Carmack brings it up, it isn't an issue; but when ATi references what Carmack said, it "puts ATi firmly in the 'bad loser' & 'FUD spreader' categories."

Heh...;)
 
I want to know more

WaltC said:
Bjorn said:
I wouldn't automatically call shader replacement "questionable optimizations". At least not as long as the IQ is the same, and we've yet to see any evidence that it's not. And I think Carmack would be the first one to complain if the result he got wasn't what he wanted.

But let's see what "optimizations" ATI is talking about before making any judgements.

I was really talking about the optimizations Carmack was complaining about when he noted that tiny little changes would knock both nV4x and nV3x "off the fast track" in the game.

What I want to know is why, when Carmack brings it up, it isn't an issue; but when ATi references what Carmack said, it "puts ATi firmly in the 'bad loser' & 'FUD spreader' categories."

Heh...;)

What I really want to know even more is: did he really say that about the NV4x series, or has ATI's marketing department been smoking some of what a lot of people here smoke?

Saying somebody said something and the truth are two different things.
I wouldn't doubt it about the NV30 series, but I am not talking about that.

I need some solid proof, and if you have that proof, please bring it on; otherwise it's FUD.
 
I only remember seeing NV3x specifically mentioned, although I wouldn't doubt it has some effect on NV4x; I just don't think it'd be nearly as detrimental as it would be for NV3x.

Anyway, I think what some people don't like about the response to that question is that it just seems like they're making excuses for what could be considered relatively poor performance, given how well X800 cards perform in other games, and basically saying "the difference would be a lot smaller if they didn't take shortcuts".

Granted, I don't have a very good memory, but over the 1 1/2 years or so that ATI had a definite advantage, when asked about nvidia, it seems that they "took the high road" and focused their answers on their own hardware/drivers.


Take anything mentioning nvidia out of his answer about what ATI plans to do about Doom 3 performance and you have an answer that I think even Ruined would have a hard time finding fault with (well, the chance may be very tiny, but it's still there ;) ). And to me and my limited (selective?) memory, that would sound more like the ATI of the past year or two.

Firstly the benchmark numbers which have been released used ATI drivers which have not been heavily tuned for Doom3.

And also it’s worth bearing in mind that the ATI cards all delivered a very playable gaming experience.

But, putting that aside, ATI agrees that we can do better. So we’ve dedicated a very substantial amount of engineering resource to make our OpenGL drivers faster, and we expect to be able to give a significant speed boost in the near future. Stay tuned!



hmmm, I thought I saw a post by Dave, but maybe it's that memory again
 