"An interview with Richard Huddy & Kevin Strange

Yup, Wavey's comment's the best so far. Cheers! :D

I'm looking forward to the next interview. Here's hoping they'll drop some of the marketing speak. I got 9 pages of flak from our German readers @ 3dcenter for "allowing this marketing drivel to get published," and I don't really need any more of that. On the other hand, hardly any bad feelings from the English blokes. I wonder--culture thingy, do Germans just take everything too seriously, or was the German translation shite?

93,
-Sascha.rb
 
I don't think many people were pleased to see the GeForce 2 effectively re-branded as the GeForce 4 MX.

Nope, but re-branding old chips isn't exactly an Nvidia-specific thing.
 
nggalai said:
Yup, Wavey's comment's the best so far. Cheers! :D

I'm looking forward to the next interview. Here's hoping they'll drop some of the marketing speak. I got 9 pages of flak from our German readers @ 3dcenter for "allowing this marketing drivel to get published," and I don't really need any more of that. On the other hand, hardly any bad feelings from the English blokes.

Hehe, classic.
 
nggalai said:
Yup, Wavey's comment's the best so far. Cheers! :D

I'm looking forward to the next interview. Here's hoping they'll drop some of the marketing speak. I got 9 pages of flak from our German readers @ 3dcenter for "allowing this marketing drivel to get published," and I don't really need any more of that. On the other hand, hardly any bad feelings from the English blokes. I wonder--culture thingy, do Germans just take everything too seriously, or was the German translation shite?

93,
-Sascha.rb


The English blokes were still too busy laughing at the bit early on where he claimed

"ATI is the prime supplier of 3D graphics hardware for PCs and consoles"

to be able to gather their thoughts on being critical.


After this marketing nonsense for the gullible, he seems to indicate that ATI are technology-driven and Nvidia are marketing-driven, but on the technology side Nvidia have SLI and SM 3.0, while ATI have a souped-up 9700 that overclocks well.

What he meant to say was that Nvidia are technology and marketing driven rather than just technology; we will see which is the best approach.
 
Regardless of their roots, complex chips don't fall out of trees. In the past 3 months ATI have made 4 in the desktop space alone - three of them use a process Nvidia still hasn't figured out, and another is the first time its process has been used in a graphics chip; in that time Nvidia have struggled to bring their single SM3.0 part to market. SLI is no biggie either - ATI had massively scalable parts long before!
 
dizietsma said:
After this marketing nonsense for the gullible, he seems to indicate that ATI are technology-driven and Nvidia are marketing-driven, but on the technology side Nvidia have SLI and SM 3.0, while ATI have a souped-up 9700 that overclocks well.

That depends on whether you believe Nvidia's SM3.0 and SLI are just marketing tickboxes or actually usable. Given what's come out over the last few days about Nvidia trying to pay developers not to use SM2.0 even on their own SM2.0 cards, it's just as likely that SM3.0 is just a marketing checkbox for Nvidia, as SM2.0 was for its last generation.
 
After this marketing nonsense for the gullible, he seems to indicate that ATI are technology-driven and Nvidia are marketing-driven, but on the technology side Nvidia have SLI and SM 3.0, while ATI have a souped-up 9700 that overclocks well.

What he meant to say was that Nvidia are technology and marketing driven rather than just technology; we will see which is the best approach.

Well, ATI has had MAXX going since the R300 series. You can have what, up to 64 chips running together?

Much better than Nvidia's version, so it looks like Nvidia is behind the curve on SLI :)

Nvidia may have the more advanced part this round, but for the last 2 years it has not, and with the next gen of products it can easily lose it again. Not to mention the fact that the X800s are not just 9700s that overclock well.

Meanwhile, ATI is working on an N5 chip, an Xbox 2 chip, and the R500, R600, and R700 chips, plus the refresh of the R42x, while putting out native AGP and native PCI-E versions of the R42x.

Nvidia has put out the NV40, which is still available only in limited quantities.

They spend way more money on TWIMTBP (The Way It's Meant To Be Played) than ATI does on Get In The Game (I only know of two games for that campaign).

So I would say it's Nvidia that's marketing-led. They have been for a while, since the GeForce 4s, IMHO.
 
nggalai said:
Yup, Wavey's comment's the best so far. Cheers! :D

I'm looking forward to the next interview. Here's hoping they'll drop some of the marketing speak. I got 9 pages of flak from our German readers @ 3dcenter for "allowing this marketing drivel to get published," and I don't really need any more of that. On the other hand, hardly any bad feelings from the English blokes. I wonder--culture thingy, do Germans just take everything too seriously, or was the German translation shite?

93,
-Sascha.rb

AFAICS the usual suspects in the more greenish of the two camps just practice the old art of "shoot the messenger". It wasn't a good interview, IMO. I'm still baffled. That was just way too over the top.

Maybe that's why the other camp shoots the messenger, too: they can't believe this is the same ATI they've come to know and like ...

That's hardly 3DCenter's fault, and neither is it your fault -- maybe it's my fault though :D

Mhrm.
 
Why is ATI going to implement SM3.0 in next-gen if it is not usable? Why doesn't ATI stay at 24-bit forever?

Were the notations on the Huddy presentation that was taken from the ATI website - the ones that seemingly mentioned trying to persuade developers to hold back on features until their hardware catches up - forged? Since developers aren't generally coding for games that come out tomorrow, doesn't that seem like trying to hold things back? Isn't the point for developers to write the best games possible?

People constantly complain that hardware is ahead of software. One way to offset that would be to code for upcoming hardware rather than just existing hardware. Or no?

And finally, by all indications NVIDIA is looking to spread the 6000 series across as many market segments as possible, as was the case with the FX. The FX happened to not be a good architecture, but the goal was clearly far from the GF4 MX. In contrast - the 9200?
 
There is no point in having features when there isn't enough performance to use them properly.
 
zeckensack said:
That's hardly 3DCenter's fault, and neither is it your fault -- maybe it's my fault though :D
Just because you suggested the pencil question? No way! :D

93,
-Sascha.rb
 
Voltron said:
Why is ATI going to implement SM3.0 in next-gen if it is not usable? Why doesn't ATI stay at 24-bit forever?

They'll go to these features when they can incorporate them as usable features, instead of marketing check boxes. Otherwise it is just a waste of your transistor budget.

Turn the question around. Why did Nvidia implement SM2.0 in NV3x when it was so slow as to be unusable? What were the devs thinking when they dropped NV3x-class hardware down to DX8 and SM1.x code paths?
 
Voltron said:
So the GF4 MX was OK then?

No. There is no reason to put out a card that doesn't have usable features.

At the time of the GeForce 4 MX, ATI had cards at the same price point that were more feature-rich.

We have a whole generation (Radeon 9500+) of cards that are capable of DX9 shaders. What is the big deal about ATI saying, hey guys, also code PS 2.0 and 2.0b paths for all upcoming games (see the sketch below)?

It is the biggest installed base of DX9 cards with usable DX9 features.
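
For what it's worth, "coding a 2.0b path" is closer to a compile-target switch than a rewrite. Here's a minimal sketch against the Direct3D 9 / D3DX helper library, picking the higher profile when the card reports support for it; the file name shader.hlsl and the entry point main are hypothetical, just for illustration:

// Minimal sketch: compile the same HLSL source as a ps_2_0 or ps_2_b
// pixel shader, depending on what the installed hardware can run.
// "shader.hlsl" and "main" are hypothetical names for illustration.
#include <d3dx9.h>

ID3DXBuffer* CompilePixelShaderPath(BOOL hardwareSupports2b)
{
    ID3DXBuffer* code = NULL;
    ID3DXBuffer* errors = NULL;

    // Pick the highest profile the card supports.
    const char* profile = hardwareSupports2b ? "ps_2_b" : "ps_2_0";

    HRESULT hr = D3DXCompileShaderFromFile(
        "shader.hlsl",  // same source feeds both paths
        NULL, NULL,     // no #defines, no custom include handler
        "main",         // entry point
        profile,        // "ps_2_0" or "ps_2_b"
        0,              // no extra compile flags
        &code, &errors, NULL);

    if (FAILED(hr))
    {
        // errors->GetBufferPointer() holds the compiler log on failure.
        if (errors) errors->Release();
        return NULL;
    }
    return code;    // pass GetBufferPointer() to CreatePixelShader()
}

As I understand it, 2.0b mostly raises the instruction and register limits over 2.0, so the extra path usually just gives the compiler room to breathe rather than requiring new shader code.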
 
I've talked to Richard before and there is a lot that isn't being said here, obviously.

Far be it from me to speak for Richard; however, the MX issue came up before and I don't think he has any issue with the capabilities of the chip. He had issues with what he was told to tell developers about its capabilities, even when developers came back specifically querying issues with that message - I think he felt it fundamentally violated what he was trying to achieve in a "developer relations" role (I believe this relates to it being sold as a hardware VS part when it wasn't - from what he says, that went as far as the developer relations message).

I'm just relaying the gist of a conversation I had with him. Treat that as you will.
 