Another sexy interview with Richard Huddy

geo said:
I thought it was a great interview -- I really like this guy. Out of various interesting statements, I found this one the most intriguing:

"I'd point out that our drivers are the best there are (for example we have much fewer open bugs in Microsoft's database than the competition). . ."

#1 -- Can someone confirm this is true (i.e. the open bugs with MS) and provide the relevant numbers?

This is extraordinarily irrelevant. It's the typical use of statistics to paint yourself in a better light than the actual facts warrant.

The problem with this claim lies in the fact that not all bugs are created equal. There is a definite scale between "minor" bugs and "nasty" bugs, and the latter usually prevent you from playing a game at all.

Furthermore, since we don't even know the range of numbers, ATI could have 144 bugs and Nvidia could have 152 bugs. Sure, ATI is lower, but that's still way too many bugs for a retail product.

I'd much prefer claims about standards compliance and game compatibility. Showing that your hardware is compatible with 99.5% of games available, compatible with 99.5% of other hardware and motherboards, and correctly implements a higher percentage of the OpenGL and DirectX specs would be much more informative.

I suspect no one does this sort of reporting because it tends to highlight their product's deficiencies.

I remember when I had a Permedia2 board years ago... it was a nice board, but it had a couple of driver/feature bugs that made some apps display serious graphical corruption. My memory is poor, but I think it was an alpha texturing bug. Anyhow, it was just one bug... but it was pretty much a showstopper for some apps and games.
 
flf said:
I remember when I had a Permedia2 board years ago... it was a nice board, but it had a couple of driver/feature bugs that made some apps display serious graphical corruption. My memory is poor, but I think it was an alpha texturing bug. Anyhow, it was just one bug... but it was pretty much a showstopper for some apps and games.

(I know I'm going OT)

Permedia 2 was unable to do some blend modes. I can't remember which ones off the top of my head, but I expect they were ones that weren't really used at the time it was designed. The trouble was, Quake2 came along and used those blend modes to do coloured lighting. From then on, Permedia 2 "had HUGE BUGS", when in reality it was more a limitation of the design.
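(For context: the coloured-lighting/lightmap pass in Quake-era renderers is usually described as a multiplicative blend, glBlendFunc(GL_DST_COLOR, GL_ZERO) in OpenGL terms, where the light colour scales whatever is already in the framebuffer. The exact mode Permedia 2 lacked isn't stated above, so this is just an illustrative sketch of what that blend equation computes, not Quake2 or driver source:)

```python
def blend(src, dst, src_factor, dst_factor):
    """Standard OpenGL blend equation, per colour channel:
    out = src * src_factor + dst * dst_factor (clamped to 1.0)."""
    return tuple(
        min(1.0, s * sf + d * df)
        for s, d, sf, df in zip(src, dst, src_factor(src, dst), dst_factor(src, dst))
    )

# Blend-factor functions corresponding to GL_DST_COLOR and GL_ZERO.
DST_COLOR = lambda src, dst: dst                  # factor = current framebuffer colour
ZERO      = lambda src, dst: (0.0, 0.0, 0.0)      # factor = zero (dst contributes nothing)

# Multiplicative "modulate" blend: a warm orange light tints a grey wall
# that has already been drawn into the framebuffer.
lightmap   = (1.0, 0.5, 0.25)   # hypothetical light colour
wall_texel = (0.8, 0.8, 0.8)    # hypothetical framebuffer contents

lit = blend(lightmap, wall_texel, DST_COLOR, ZERO)
# Each channel is simply lightmap * framebuffer
```

If hardware can't evaluate a given source/destination factor pair, any game relying on it renders incorrectly, which to a user is indistinguishable from a driver bug.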

It's probably splitting hairs, but....
 
flf said:
This is extraordinarily irrelevant. It's the typical use of statistics to paint yourself in a better light than the actual facts warrant.

The problem with this claim lies in the fact that not all bugs are created equal. There is a definite scale between "minor" bugs and "nasty" bugs, and the latter usually prevent you from playing a game at all.

Furthermore, since we don't even know the range of numbers, ATI could have 144 bugs and Nvidia could have 152 bugs. Sure, ATI is lower, but that's still way too many bugs for a retail product.

I'd much prefer claims about standards compliance and game compatibility. Showing that your hardware is compatible with 99.5% of games available, compatible with 99.5% of other hardware and motherboards, and correctly implements a higher percentage of the OpenGL and DirectX specs would be much more informative.

I suspect no one does this sort of reporting because it tends to highlight their product's deficiencies.

The reason I find a metric like the one he's pointing at potentially valuable is that it isn't controlled by ATI or Nvidia, whereas the one you're proposing unfortunately would be. It seems to me that far too many of the "driver quality" discussions I've seen are based almost entirely on anecdotal evidence, and hence almost immediately dissolve into "and your momma dresses you funny!".

So I was trying to reach for something that might have potential to get us past that.
 
andypski said:
...

Hmm... that's odd on the 8500 - I'm not sure what could cause the difference.

But on the matter of precision in general I hope it is clear that the level of precision of the 9200/9000 etc. is actually chosen carefully, and is _very_ useful. I believe it is a significant step up from 'FX12' (but then, of course, I would... ;) )

Is that why the Freelancer demo looked so appealing? I found many aspects of the graphics execution extremely impressive (too bad the actual gameplay seemed so anemic).

- Andy.

Actually, I'm trying to "play" on that and perhaps gain some inside info by inciting you to brag. I wonder if it will work? :p
 
demalion said:
Is that why the Freelancer demo looked so appealing? I found many aspects of the graphics execution extremely impressive (too bad the actual gameplay seemed so anemic).
Of course ;)

Actually, I'm sure the Freelancer team would be kind of upset if I tried to claim that the reason their graphics look good is entirely down to our hardware...

Actually, I'm trying to "play" on that and perhaps gain some inside info by inciting you to brag. I wonder if it will work? :p
I don't know - what do you think? :LOL:
 